Terraform state - local workspaces to remote storage containers - Azure

I initially saved the Terraform state for multiple workspaces (one per environment) locally, since that part of the infrastructure was intended to be updated very rarely. However, I now want to store the state for each workspace in a separate Azure storage container (one per environment).
What is the best way to move the Terraform state for each workspace into its own remote storage, i.e. its own Azure storage container?
Options I tried:
Running terraform init -migrate-state with a backend pointing at remote storage moves the state files for all workspaces into the same storage container (e.g. dev); see the sketch below. After this, switching with terraform workspace select would not let me re-run terraform init to point the next workspace/environment at its own remote storage.
Using a separate backend file for each environment. Since we cannot have multiple backend blocks in the same configuration, this is not an option.
Can anyone suggest any other solutions to this? I want to avoid duplicating the code for each environment into separate folders and then reconfiguring the terraform state to point to the respective storage container.
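For reference, this is roughly the backend block and the commands I used (the resource group, storage account, and container names below are placeholders, not my real ones):
terraform {
  backend "azurerm" {
    resource_group_name  = "rg-terraform-state"   # placeholder names
    storage_account_name = "stterraformstate"
    container_name       = "tfstate-dev"
    key                  = "infra.tfstate"
  }
}
terraform init -migrate-state       # this moved every workspace's state into the dev container
terraform workspace select staging  # from here I couldn't re-run init against a different container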
Thanks!

Related

Force volume recreation with Terraform Cloud (VCS)

I have a Terraform Cloud account connected to a git repo (a VCS-backed workspace), so I can only use the VCS-driven run workflow. I have a VM with an attached volume and I would like to recreate the volume from scratch (yes, losing all data). I have read about the -replace plan option, but it cannot be used in my workspace.
So, what is the best way to re-create a volume with a Terraform VCS-backed workspace?
By the way, I'm using OpenStack as the cloud infrastructure and the official terraform-provider-openstack/openstack provider.
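For context, in a CLI-driven workflow this would just be something like the following (the resource address is hypothetical; mine differs):
terraform apply -replace='openstack_blockstorage_volume_v3.data_volume'   # hypothetical address; forces destroy and recreate of that volume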

Create a Google Cloud bucket and save Terraform state to it with the same Terraform script?

I'm new to Terraform, and am trying to use it to create and configure an entire project from scratch. We're currently treating one Google project as one environment.
It seems reasonable to store the Terraform remote state inside a bucket in the project that it is configuring, i.e. have Terraform create the Google Cloud project, create a bucket, and then store its own remote state in the bucket it just created. That also seems rather advanced and potentially chicken-and-egg.
Is it possible to store a Terraform script's remote state in the project that the script itself is creating?
You could use Terraform to create the project and bucket and then migrate the state into that bucket. But this is a chicken-and-egg scenario that raises the question: what happens if you need to delete/rebuild the bucket containing the state?
A more sensible approach would be to manually create a master project and remote state bucket. From there you would have a base for a project vending machine to spin up new projects and baseline config.
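Either way, the migration step itself is roughly the following sketch (bucket name and prefix are placeholders, and the bucket must already exist):
terraform {
  backend "gcs" {
    bucket = "my-master-tfstate-bucket"   # placeholder bucket name
    prefix = "env/dev"                    # placeholder prefix
  }
}
terraform init -migrate-state   # copies the existing local state into the bucket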
No - you cannot do this. After days of research, this isn't something that's possible.

Efficient way to manage Terraform state with an Azure storage container per pipeline

As part of the IaC workflow we are implementing through Terraform, for some of the common resources we provision for users, we want to create a centralized remote state store. We are using Azure cloud, so the default choice is Azure blob storage. We were initially thinking of creating one storage container per pipeline and storing the state there. But then there was another thought: create one container, create a directory structure per pipeline, and store the state there. I understand blob storage is by default a flat file system, but Azure storage also gives the option to enable a hierarchical file structure with ADLS Gen2. Has anyone attempted to store Terraform state with the hierarchical file system structure enabled in Azure? Is that a valid option at all? Also, can anyone suggest what the recommended approach would be in my scenario?
Thanks
Tintu
I've never tried ADLS Gen2 with its hierarchical feature. But since your requirement is to save the state files in the same container but within different folders, you can try specifying a different folder structure while configuring the backend in backend.tf:
terraform init -backend-config="key=$somePath/<tfstate-file-name>.tfstate"
And pass a different somePath value from different backend.tfvars files.
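For example, a rough sketch with placeholder names might look like this:
# backend.tf: key is left out so it can be supplied per pipeline
terraform {
  backend "azurerm" {
    resource_group_name  = "rg-tfstate"    # placeholder names
    storage_account_name = "sttfstate"
    container_name       = "tfstate"
  }
}
# pipeline-a.backend.tfvars (one such file per pipeline)
key = "pipeline-a/terraform.tfstate"
terraform init -backend-config=pipeline-a.backend.tfvars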
I hope this answers your question!

Copy files to Azure VMSS using GitHub Actions

Is there any way I can copy a set of files from my GitHub repo/Azure Blob Storage to an Azure VMSS (Windows Server) using GitHub Actions and then restart all the instances? Should I use a custom extension script to copy the files? Are there any other methods to copy? Please note: autoscaling is enabled on my VMSS.
Thank you
Hussain.
It seems you want to copy the files from your GitHub repo/Azure Blob Storage to an Azure VMSS, and the VMSS is configured with autoscaling, which means you want the copied files on all the instances, including the ones created by autoscaling.
If I'm right, then manually copying files is not the right way. You should know that all the instances of the VMSS are created from the configuration you set at creation time. If you just copy files to the existing instances, then when the VMSS auto-scales, the new instances will not have the copied files. There are two ways for you.
One is to create a custom VM image with the files copied into it, then use this image to create your VMSS.
Another is to use an Azure file share: if you mount the file share to the VMSS, all the files in the file share will be available on all the instances of the VMSS.
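Here is a rough example of the mount on a Windows instance (the storage account name, share name, and key are placeholders), e.g. pushed out through the Custom Script Extension:
rem mystorageaccount, myshare and <storage-account-key> below are placeholders
net use Z: \\mystorageaccount.file.core.windows.net\myshare /user:AZURE\mystorageaccount <storage-account-key> /persistent:yes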

Copy file to Azure File Share (Azure Storage)

In addition to my question Code sync from Azure Scale Set VM To Azure Storage, is there any way to copy files from one particular Scale Set VM to an Azure file share (Azure Storage) through ADO pipelines? Since it's a Scale Set server, I can't push every time from one VM. E.g., there will be 100 VMSS servers in the pool; when I try to push code through the pipeline, it should pick one server from the pool and push the code from that one. Is that possible?
No, we do not have this kind of built-in task. There is an Azure File Copy task, which copies application files and other artifacts to Microsoft Azure storage blobs or virtual machines (VMs).
When the target is Azure VMs, the files are first copied to an automatically generated Azure blob container and then downloaded into the VMs. The container is deleted after the files have been successfully copied to the VMs.
As a workaround, please use a custom script to copy files to Azure File Storage. In other words, if you are able to do the same thing locally, you should also be able to achieve it through an Azure DevOps pipeline.
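For instance, a script step in the pipeline could push the files with the Azure CLI (the share, folder, and account names are placeholders, and the key would come from a pipeline secret):
# placeholder share/folder/account names; $STORAGE_ACCOUNT_KEY is a pipeline secret
az storage file upload-batch \
  --destination my-file-share \
  --source ./drop \
  --account-name mystorageaccount \
  --account-key "$STORAGE_ACCOUNT_KEY"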
You may have to build your own extension or use scripts in the pipeline. Take a look at this similar third-party task: AzureDevOpsReleaseToFileStorage.
