How To Import an Existing Cloud Environment into Terraform

If you manage compute infrastructure in the cloud, you may end up inheriting an environment that isn't managed with an infrastructure-as-code solution.

A popular, well-supported option is Terraform. Terraform lets you model your infrastructure as configuration files, which can be committed to a source control repository.

Here are the steps I recently followed to take an existing environment in AWS and model some of the important resources in Terraform.

Use terraformer to perform the import

Terraform supports importing existing infrastructure via the command line, but a cursory review of the documentation makes it clear that you're expected to first write a stub resource block and then import the real resource into it using:

terraform import ...

over and over again, once for every resource you want to manage. If you have more than a few EC2 instances, you can imagine how quickly this becomes unwieldy.
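To illustrate, importing a single EC2 instance by hand looks roughly like this (the resource name and instance ID are placeholders, not values from a real environment). First the stub block:

resource "aws_instance" "example" {
  # arguments get filled in by hand after you inspect the imported state
}

and then the import itself:

terraform import aws_instance.example i-0abc1234def567890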

So instead, we'll use terraformer! Terraformer is a Go application, originally written by the Waze SRE team, that turns importing the environment into a single step. Install it using whatever distribution method suits you, and make sure the cloud CLI credentials (e.g. the AWS CLI's) are configured in your shell.
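For example, at the time of writing there is a Homebrew formula, and prebuilt binaries are published on the project's GitHub releases page; the profile name below is a placeholder:

# one way to install terraformer
brew install terraformer

# make sure terraformer sees the same credentials as the aws cli
export AWS_PROFILE=your-profile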

Then you can run an example command such as:

terraformer import aws -r ec2_instance,sg,vpc --regions=us-west-2

This uses the AWS provider and imports EC2 instances, security groups, and VPCs into a generated directory tree - generated/aws/us-west-2/...

It will also set up the Terraform state locally.

You may need to run

terraform state replace-provider registry.terraform.io/-/aws hashicorp/aws

in order to fix the provider reference - this appears to be an incompatibility between the state terraformer generates and newer versions of Terraform, and it may be resolved in future releases of terraformer.

You can now run terraform plan and terraform apply and see that the resources are mapped in your local state file. Although you could stop here and commit all of the files, including terraform.tfstate, to the repository, that isn't recommended when multiple people will be working on it. Instead, we should configure a remote backend and push the state there.
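As a quick sanity check, you can run the plan from each of the generated resource directories (the layout is shown below); something like:

cd generated/aws/us-west-2/ec2_instance
terraform init   # installs the aws provider plugin
terraform plan   # should show no (or only cosmetic) changes if the import matched reality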

Set up a remote backend

For those less familiar with Terraform: it has to maintain state, usually in a file, that serves as a source of truth mapping your configuration to real cloud resources. A full explanation of why Terraform needs state can be found in the official documentation.

Remote backends serve a similar function to remote source control repositories - they make it easier for multiple people to work with the state file. Since the state is just a file, a variety of cloud storage backends are supported out of the box.

Inside the us-west-2 directory of the tree generated above, you'll find a layout like this:

├── ec2_instance
│   ├── outputs.tf
│   ├── instance.tf
│   ├── provider.tf
│   ├── terraform.tfstate
│   └── variables.tf
├── main.tf
├── sg
│   ├── outputs.tf
│   ├── provider.tf
│   ├── security_group.tf
│   ├── security_group_rule.tf
│   ├── terraform.tfstate
│   └── variables.tf
└── vpc
    ├── outputs.tf
    ├── provider.tf
    ├── terraform.tfstate
    └── vpc.tf

Each of these directories has its own terraform.tfstate file. We'll need to edit the provider.tf in each of them to add the backend configuration.

Create an S3 bucket (ensure your CLI credentials have write permissions to it) and create a directory structure inside it like this (see the example commands after the listing):

aws
└── us-west-2
    ├── ec2_instance
    ├── sg
    └── vpc
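Creating the bucket is a one-liner with the AWS CLI (the bucket name is a placeholder and must be globally unique); enabling versioning is optional but gives you a history of state changes:

# create the bucket that will hold the remote state
aws s3 mb s3://your-s3-bucket-name --region us-west-2

# optional: keep a history of state versions
aws s3api put-bucket-versioning --bucket your-s3-bucket-name --versioning-configuration Status=Enabled

Strictly speaking the aws/us-west-2/... prefixes don't need to exist ahead of time - S3 creates them implicitly when Terraform writes the state objects - but laying them out up front keeps the bucket easy to browse.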

Then update the provider.tf in your local ec2_instance folder by adding the remote S3 backend:

terraform {
  backend "s3" {
    bucket = "your-s3-bucket-name"
    key    = "aws/us-west-2/ec2_instance"
    region = "us-west-2"
  }
  required_providers {
    aws = {
      version = "~> 3.52.0"
    }
  }
}

Make similar updates in the sg and vpc folders, changing the key to match each folder's path in the bucket (aws/us-west-2/sg and aws/us-west-2/vpc).

Once done, run terraform init -reconfigure in each of those directories. Terraform will notice the backend change and ask if you'd like to copy the existing state - you'll want to say "yes" to the prompt.
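If you'd rather not change into each folder by hand, a small shell loop works too (a sketch, run from the generated us-west-2 directory):

# migrate each resource folder's state to the s3 backend
for dir in ec2_instance sg vpc; do
  (cd "$dir" && terraform init -reconfigure)   # answer "yes" when prompted to copy state
done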

Once that's done, mission accomplished! You can delete the local terraform.tfstate* files and commit the rest of the files to version control.
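Before you commit, it's worth adding the usual Terraform entries to your .gitignore so local state and provider caches never sneak into the repository - a minimal sketch:

# local state and backups (the real state now lives in S3)
*.tfstate
*.tfstate.*

# provider plugins and other local working files
.terraform/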

Conclusion

Don't go writing your own script to try to import things - terraformer works very well and makes the process pretty painless. It saved me a lot of time and served as a very helpful crash course in getting up and running with real-world Terraform.