Provisioning an AWS VPC with Terraform

- By Manisha Sahasrabudhe on May 18, 2018

This tutorial explains how to manually provision an AWS Virtual Private Cloud (VPC) using Terraform. Before you start, you should be familiar with basic Terraform concepts and with AWS VPC networking.

The best way to get started is to install Terraform and run scripts on your local machine to provision a VPC. The first section of this tutorial explains how to do that. However, manual execution is neither the most reliable nor the most efficient way to run Terraform scripts, so we will also look at the challenges involved and learn how to automate this workflow.

 

Step-by-step instructions

Follow the steps below in order to provision your VPC.

 

Step 1: Prep your machine

Install Terraform on the machine where you plan to run the scripts and make sure the terraform binary is available on your PATH. This machine also needs access to your AWS account; in this tutorial, the AWS credentials are supplied through terraform.tfvars in the next step.

Step 2: Prepare Terraform scripts

  • Terraform loads all files with the .tf extension in the current folder and combines them into a single configuration before executing it. In our example, we are using the following files:
    • terraform.tfvars supplies the values for all the dynamic variables needed
    • variables.tf declares those variables in Terraform format
    • vpc.tf is the actual script that provisions a VPC
├── terraform.tfvars
├── variables.tf
├── vpc.tf
  • If you do not have your own Terraform scripts, feel free to clone our sample repository here: https://github.com/devops-recipes/prov_aws_vpc_terraform
  • In our scenario, you only need to provide the values in the tfvars file and you should be good to go. In terraform.tfvars, replace these wildcards with your desired values: ${AWS_ACCESS_KEY_ID}, ${AWS_SECRET_ACCESS_KEY}, ${vpc_region}, ${vpc_name}, ${vpc_cidr_block}, ${vpc_public_subnet_1_cidr}, and ${vpc_access_from_ip_range}. A minimal sketch of how these files fit together is shown below.
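If you are not cloning the sample repo and want to write these files yourself, here is a minimal sketch of how they could fit together. The variable names mirror the wildcards listed above and the values are placeholders; the actual files in the sample repo may use different names and define additional resources such as route tables and security groups.

# variables.tf -- declares the variables that vpc.tf consumes (sketch)
variable "aws_access_key_id" {}
variable "aws_secret_access_key" {}
variable "vpc_region" {}
variable "vpc_name" {}
variable "vpc_cidr_block" {}
variable "vpc_public_subnet_1_cidr" {}
variable "vpc_access_from_ip_range" {}

# terraform.tfvars -- supplies the values; replace the placeholders with your own
aws_access_key_id        = "your-access-key-id"
aws_secret_access_key    = "your-secret-access-key"
vpc_region               = "us-east-1"
vpc_name                 = "demo_vpc"
vpc_cidr_block           = "10.10.0.0/16"
vpc_public_subnet_1_cidr = "10.10.10.0/24"
vpc_access_from_ip_range = "0.0.0.0/0"

# vpc.tf -- provisions the VPC and a public subnet (simplified sketch)
provider "aws" {
  region     = "${var.vpc_region}"
  access_key = "${var.aws_access_key_id}"
  secret_key = "${var.aws_secret_access_key}"
}

resource "aws_vpc" "vpc" {
  cidr_block           = "${var.vpc_cidr_block}"
  enable_dns_hostnames = true

  tags = {
    Name = "${var.vpc_name}"
  }
}

resource "aws_subnet" "public_subnet_1" {
  vpc_id     = "${aws_vpc.vpc.id}"
  cidr_block = "${var.vpc_public_subnet_1_cidr}"

  tags = {
    Name = "${var.vpc_name}_public_subnet_1"
  }
}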

 

Step 3: Apply your Terraform scripts

Execute the following command from the directory that contains your .tf files to run your Terraform scripts.

terraform apply -var-file=terraform.tfvars
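Before the first apply in a new working directory, Terraform also needs to initialize the directory and download the AWS provider plugin, and it can be useful to preview the changes before applying them, so the full sequence typically looks like this:

terraform init
terraform plan -var-file=terraform.tfvars
terraform apply -var-file=terraform.tfvars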

Verify in the AWS console that the VPC was provisioned.

 

Automating Terraform workflows

Manual execution is fine while getting started, but you'll run into some challenges if you continue doing this manually.

  • Reduced reusability: vpc.tf is a reusable script, i.e. it reads settings like region, name, and CIDR blocks from variables. This means that as long as you inject the right values through terraform.tfvars, the same script can be used to provision multiple VPCs. However, it also means that you need to be very careful to use the right tfvars file each time, and the number of these files will multiply over time. This erodes the reusability of your script. The right way to do this is to have a programmatic way to inject the right values based on context, as illustrated after this list.
  • Security concerns: The machine you use to run your scripts needs to be authenticated against the AWS account. If you want to provision using different credentials, you'll need to keep switching accounts or use different machines. The machines also need to be kept secure, since your AWS credentials are accessible on them unless you clean up after every execution.
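As a simple illustration of the reusability point above, you could keep one tfvars file per environment and select it at execution time. The file names and values below are hypothetical:

# staging.tfvars (hypothetical)
vpc_name       = "demo_vpc_staging"
vpc_cidr_block = "10.10.0.0/16"

# production.tfvars (hypothetical)
vpc_name       = "demo_vpc_prod"
vpc_cidr_block = "10.20.0.0/16"

terraform apply -var-file=staging.tfvars    # or -var-file=production.tfvars

This works, but someone still has to pick the right file and keep its values current for every environment, which is exactly the kind of bookkeeping an automated workflow can take over.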

In a nutshell, if you want to achieve frictionless execution of Terraform scripts that stay modular and reusable, you need to templatize your scripts and automate the workflow used to execute them.

To show you how to automate the provisioning of your AWS infrastructure, we have put together a step-by-step tutorial in our documentation:

Provision AWS VPC using Terraform

If you want to see a live demo of the Shippable platform and watch this scenario in action, schedule a demo with us:

Schedule a demo