I am using Terraform with an AWS S3 backend to manage my infrastructure. I have two folders (folder1 and folder2), and I am trying to deploy resources in folder2 while keeping the resources created in folder1. Both folders are configured to use the same S3 backend (same bucket and state file).
However, when I run terraform apply in folder2, Terraform is attempting to destroy the resources that were created in folder1. I have verified that both folders point to the same S3 bucket and use the same key for the state file.
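For reference, both folders contain a backend block along these lines (the bucket, key, and region values here are placeholders):
terraform {
  backend "s3" {
    bucket = "my-terraform-state" # same bucket in both folders
    key    = "terraform.tfstate"  # same key in both folders
    region = "us-east-1"
  }
}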
Here are the steps I followed:
1. Ran terraform apply in folder1, which successfully created resources and stored the state in an S3 bucket.
2. Configured folder2 with the same S3 backend configuration (same bucket, same key for the state file).
3. Ran terraform init and terraform apply in folder2, expecting Terraform to deploy new resources without deleting the existing ones from folder1.
Instead, Terraform plans to destroy the resources from folder1, and terraform show in folder2 shows all the resources created from folder1.
Are there any steps I can take to make sure Terraform in folder2 recognizes the resources in the shared state and does not attempt to destroy them?
The issue arises because Terraform is designed to manage the complete set of infrastructure recorded in a single state file. When both folder1 and folder2 use the same backend configuration (same S3 bucket and key), Terraform compares that shared state against the current folder's configuration files: any resource present in the state but not defined in the current folder is treated as "orphaned" (removed from the configuration) and is planned for destruction.
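You can confirm this from folder2 with terraform state list: because it reads the same state file, it lists everything folder1 created, even though folder2's configuration defines none of those resources.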
To fix this, simply give each folder its own state key:
backend "s3" {
bucket = "same bucket"
key = "state/folder1/terraform.tfstate" # "state/folder2/terraform.tfstate"
region = "same region"
}
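After changing the key in folder2, run terraform init again; since the backend configuration changed, Terraform will ask you to reinitialize. Running terraform init -reconfigure points folder2 at the new, empty state instead of migrating folder1's existing resources into it.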
Then, if the reason you wanted a shared state was to pass some values between projects, you can read one project's outputs from the other with a terraform_remote_state data source, as in the example below:
data "terraform_remote_state" "folder1" {
backend = "s3"
config = {
bucket = "your-s3-bucket"
key = "state/folder1/terraform.tfstate"
region = "your-region"
}
}
resource "aws_other_resource" "example" {
depends_on = [data.terraform_remote_state.folder1]
some_property = data.terraform_remote_state.folder1.outputs.example_output
}
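Note that outputs.example_output only resolves if folder1 actually declares that output. A minimal sketch of what folder1 would need (the aws_vpc.main resource is just an illustrative placeholder):
# In folder1: export the value you want other projects to read
output "example_output" {
  value = aws_vpc.main.id # e.g. share the ID of a VPC created here
}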
https://developer.hashicorp.com/terraform/language/state/remote-state-data
So, in the end, you definitely need one state file per folder, but you can share values between projects by exporting outputs from one state and reading them in the other.