We are developing an Azure Function that should run Terraform commands such as init, plan, and apply.
When we run these commands in PowerShell, we get the error below.
Error checking configuration: <nil>: Failed to read module directory; Module directory C:\home\site\wwwroot\databricks-user-sync-modules does not exist or cannot be read
My run.ps1 file includes the snippet below:
Write-Output (terraform --version)
Write-Output ((Get-ChildItem).Name)
Get-Content -Path main.tf
Write-Output (terraform init)
terraform plan -var-file dev.tfvars
How can I run Terraform in Azure Functions?
The error message you are receiving may or may not be related to your attempt to use terraform in Azure Functions.
I'd initially ask whether Azure Functions is the ideal solution to the problem you're trying to solve. The Functions runtime doesn't include the terraform binary, so it can't run Terraform CLI commands out of the box. You may have more success using Azure DevOps Pipelines or GitHub Actions to deploy your Terraform code.
Both Azure DevOps and GitHub can trigger CI/CD operations via a webhook:
https://docs.github.com/en/actions/using-workflows/events-that-trigger-workflows#repository_dispatch
and
https://learn.microsoft.com/en-us/azure/devops/release-notes/2020/pipelines/sprint-172-update
Unless I'm missing something obvious, you may need to provide more context around why you're using Azure Functions for this scenario.
Related
I'm learning Terraform, and in one of the tutorials for Terraform with Azure, a requirement was to log in with the az client. My understanding is that this was to create a Service Principal.
I was trying this with GitHub Actions, and my assumption was that the credentials obtained for the Service Principal would be enough to authenticate. When I ran terraform plan, everything worked out fine.
However, when I tried to do terraform apply, it failed until I explicitly added an az login step to the GitHub workflow job.
What am I missing here? Does terraform plan only compare the new configuration file against the state file, not the actual account? Or does it verify the state against the resource-group/subscription in Azure?
I was a little confused by the documentation on terraform plan.
I wrote infrastructure as code using Terraform and applied it successfully on Azure. Now I have created another 3 VMs using the same networking file and variable file already used in the previous IaC. How can I apply only these 3 VMs, without generating new errors or "already exists" messages, in the same subscription with the same variable/networking configuration?
Thanks
If I understand correctly, you can use the -target option:
terraform plan -target=<resource_address>
Note that -target respects dependencies.
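As a hypothetical sketch (the resource names are made up for illustration), if the three new VMs are declared as separate azurerm_virtual_machine resources, you can plan and apply just those addresses:

```hcl
# One of the three new VMs (name and attributes are illustrative).
resource "azurerm_virtual_machine" "vm4" {
  name                  = "vm4"
  location              = var.location
  resource_group_name   = var.resource_group_name
  network_interface_ids = [azurerm_network_interface.vm4.id]
  vm_size               = "Standard_B2s"
  # ... storage_os_disk, storage_image_reference, os_profile, etc.
}

# Then, on the command line, target only the new machines:
#   terraform apply -target=azurerm_virtual_machine.vm4 \
#                   -target=azurerm_virtual_machine.vm5 \
#                   -target=azurerm_virtual_machine.vm6
```

Because -target respects dependencies, the shared networking resources those VMs reference are taken into account, but the existing, unchanged resources are not recreated.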
I have worked with Terraform for AWS successfully before. Now I am trying to work with Azure and facing a few challenges. I have successfully authenticated to my Azure account using the Azure CLI. When I run the basic Terraform ARM provider .tf and do a terraform init, it just works. But when I put in any additional code, like container creation or blob creation .tf files, init fails and gives me the message below:
No available provider "azure" plugins are compatible with this Terraform version.
Error: no available version is compatible with this version of Terraform
Terraform version :
bash-3.2$ terraform -v
Terraform v0.12.19
+ provider.azurerm v1.38.0
I used version 1.38.0 and tried many others, but it still gives me the error.
There are two providers for the different Azure models.
The Azure Service Management (ASM) provider covers the classic model in Azure, which is not recommended for use now. It provides resources in the format azure_xxx.
The Azure Resource Manager (ARM) provider covers the Resource Manager model, which calls ARM, is well supported, and is the recommended one. It provides resources in the format azurerm_xxx.
You can also learn more about the ASM and ARM models in the document Azure Resource Manager vs. classic deployment: Understand deployment models and the state of your resources.
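To illustrate the difference (a minimal sketch; the resource names are examples), the "no available provider" error typically appears when a .tf file references a classic azure_xxx resource, which makes terraform init try to install the old "azure" provider. Sticking to azurerm_xxx resources keeps everything on the ARM provider:

```hcl
# ARM provider, pinned near the version from the question.
provider "azurerm" {
  version = "~> 1.38"
}

# An ARM-model resource: note the azurerm_ prefix.
resource "azurerm_storage_container" "example" {
  name                  = "content"
  storage_account_name  = "examplestorageacct" # example name
  container_access_type = "private"
}

# A classic-model resource such as azure_storage_container would
# instead pull in the old "azure" provider, which has no release
# compatible with Terraform 0.12.
```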
I have most of my Azure infrastructure managed with Terraform.
However, I am quickly finding that a lot of the small details are missing.
e.g. client secrets aren't fully supported https://github.com/terraform-providers/terraform-provider-azuread/issues/95
It doesn't seem possible to add an Active Directory Provider to APIM
How Do I Add Active Directory To APIM Using Terraform?
Creating the APIM leaves demo products on it that can't be removed
How Can I Remove Demo Products From APIM Created With Terraform?
etc, etc.
The solution to these seems to be to use the CLI,
e.g. https://learn.microsoft.com/en-us/cli/azure/ad/app/permission?view=azure-cli-latest#az-ad-app-permission-add
Or falling back to the REST API:
e.g.
https://learn.microsoft.com/en-us/rest/api/apimanagement/2019-01-01/apis/delete
How can I mix terraform with the CLI and REST API?
Can they be embedded in terraform?
Or do I just run some commands to run them after terraform has finished?
Is there a way to do these commands in a cross platform way?
Will running the CLI and REST API after terraform cause the state to be wrong and likely cause problems the next time terraform is run?
How can I mix terraform with the CLI and REST API?
You can use the Terraform provisioners local-exec or remote-exec. With these, you can run a script containing CLI commands or REST API calls. For more details, see local-exec and remote-exec. But be aware of a caveat: these provisioners just run the script and display its output; they do not expose that output to Terraform.
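For instance, a minimal local-exec sketch (the resource and command are illustrative); the CLI output is printed during apply but cannot be referenced elsewhere in the configuration:

```hcl
resource "azurerm_resource_group" "example" {
  name     = "example-resources"
  location = "West Europe"

  # Runs after the resource group is created; the output is
  # only displayed in the apply log, not captured as an attribute.
  provisioner "local-exec" {
    command = "az group show --name example-resources --query tags"
  }
}
```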
If you want to use the result of the script in the same Terraform file for other resources, you need to use the Terraform external data source, see the details here.
Update:
Here is an example.
Bash script file vmTags.sh:
#!/bin/bash
# Print the VM's tags as a JSON object of strings,
# which is the format the external data source expects.
az vm show -d -g myGroup -n myVM --query tags
Terraform external data source:
data "external" "test" {
  program = ["/bin/bash", "./vmTags.sh"]
}

output "value" {
  value = "${data.external.test.result}"
}
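Building on that, a hypothetical sketch of consuming the script's result elsewhere in the configuration (the resource group is made up): since az vm show --query tags prints a JSON object of strings, data.external.test.result can be used directly as a map:

```hcl
# Copy the queried VM tags onto another resource (illustrative).
resource "azurerm_resource_group" "tagged" {
  name     = "tag-copy-rg"
  location = "West Europe"
  tags     = data.external.test.result
}
```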
I am working on a project that will be deployed to my client's Microsoft Azure subscription, so I am currently testing Terraform to assist me when the time comes.
Create an Azure Function with Terraform that will trigger on blob storage input data
My question is about how to add the Azure Function's JavaScript/C# code to the Terraform script so that it is automatically deployed.
I checked the Terraform docs, but they weren't much help:
https://www.terraform.io/docs/providers/azurerm/r/function_app.html
Any ideas?
Terraform doesn't handle pushing code to Azure resources; that's usually done in a following step in the pipeline (e.g. 1. run terraform, 2. deploy code).
However, the Azure Function App does have the ability to connect directly to your repo, and the Terraform azurerm_function_app resource exposes the source_control property.
Terraform's azurerm_function_app documentation
So with Terraform you can configure the function app to pull the code directly from the repo when a change is detected.
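As a rough sketch (assuming an azurerm provider version whose azurerm_function_app resource supports the source_control block; all names and the repo URL are placeholders):

```hcl
# Function app that pulls code from a Git repo when a change
# is detected (names and URLs are placeholders).
resource "azurerm_function_app" "example" {
  name                       = "example-function-app"
  location                   = azurerm_resource_group.example.location
  resource_group_name        = azurerm_resource_group.example.name
  app_service_plan_id        = azurerm_app_service_plan.example.id
  storage_account_name       = azurerm_storage_account.example.name
  storage_account_access_key = azurerm_storage_account.example.primary_access_key

  source_control {
    repo_url           = "https://github.com/example-org/example-functions"
    branch             = "main"
    manual_integration = false
  }
}
```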
Microsoft's Azure Function Continuous Deployment documentation