I have a project that consists of an Azure webapp, a PostgreSQL on Azure, and multiple Azure functions for background ETL workflows. I also have a local Python package that I need to access from both the webapp and the Azure functions.
How can I structure configuration and script deployment for those resources from a single git repo?
Any suggestions or pointers to good examples or tutorials would be very appreciated.
All the Azure tutorials that I've seen are only for small and simple projects.
For now, I've hand-written an admin.py script that handles, e.g., the webapp and function deployments by building the Python package, creating a ZIP file for each resource, and doing ZIP deployments. This is getting messy, and it's growing more complex now that I want QA and PROD versions and need to pass secrets so that the DB is reachable. Is there either a nice way to structure this packaging / deployment, or a tool to help with it? For me, putting everything in Kubernetes is not the solution; for one thing, the DB already exists. Azure DevOps is not an option either, as we are using GitLab CI, so eventually I want a solution that can run on its CI/CD.
Not sure if this will be a complete answer, but here we go.
Instead of using a hand-written admin.py script, try using a YAML pipeline flow. For GitLab, the reference at https://docs.gitlab.com/ee/ci/yaml/ will get you started. From what you've indicated, I would recommend several jobs in your pipeline that build and package your web and function apps. For deployment, you can make use of environments (e.g. QA and PROD). Have a look at https://docs.gitlab.com/ee/ci/multi_project_pipelines.html as well, which illustrates how you can create downstream pipelines.
From a deployment standpoint, the current integration I've found between Azure and GitLab leaves me with two recommendations:
Leverage the script keyword in your YAML to continue zipping your artifacts, then use the Azure CLI (I would assume you can install the tooling during the pipeline) to do the ZIP deploy; there's a sketch of this after these two options.
Keep your code inside the GitLab repo and utilize Azure Pipelines to handle the CI/CD for you.
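To make the first option concrete, here is a minimal .gitlab-ci.yml sketch. Everything named here is an assumption: the service-principal variables (AZ_SP_ID, AZ_SP_SECRET, AZ_TENANT) would live in your GitLab CI/CD variables, the resource names are placeholders, and package.sh stands in for whatever your admin.py does today to build the ZIPs.

```yaml
# .gitlab-ci.yml (sketch) -- job names, variables and the package.sh helper are illustrative
stages:
  - build
  - deploy

build:
  stage: build
  image: python:3.11
  script:
    - pip install build && python -m build      # build the shared local package
    - ./package.sh webapp.zip functions.zip     # hypothetical helper that zips each resource
  artifacts:
    paths: [webapp.zip, functions.zip]

.deploy-template: &deploy
  stage: deploy
  image: mcr.microsoft.com/azure-cli
  script:
    - az login --service-principal -u "$AZ_SP_ID" -p "$AZ_SP_SECRET" --tenant "$AZ_TENANT"
    - az webapp deployment source config-zip -g "$RG" -n "$WEBAPP" --src webapp.zip
    - az functionapp deployment source config-zip -g "$RG" -n "$FUNCAPP" --src functions.zip

deploy-qa:
  <<: *deploy
  environment: qa
  variables: { RG: my-qa-rg, WEBAPP: my-qa-web, FUNCAPP: my-qa-func }

deploy-prod:
  <<: *deploy
  environment: production
  variables: { RG: my-prod-rg, WEBAPP: my-prod-web, FUNCAPP: my-prod-func }
  when: manual   # require a manual gate for PROD
```

Secrets like the DB connection string can live in the project's masked CI/CD variables, scoped per environment, so QA and PROD get different values without anything landing in the repo.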
I hope you find this helpful.
Related
I currently have an application in Amplify and I want to connect its repo, but the repo is in Azure, and I don't see an option in Amplify to integrate directly with Azure.
In your case, the best suggestion would be to use multiple remotes. No need to worry if you are not using GitHub, because the process and the commands are the same for any git host.
As you are using two different environments, you would push the same repo to both Azure Repos and AWS CodeCommit using standard git commands.
Once you have set up the multiple remotes, configuring a CI/CD pipeline is the better way to deploy on each push.
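As a minimal sketch of that setup (the org, region and repo names below are placeholders):

```sh
# Register both hosts as remotes of the same local repository
git remote add azure https://dev.azure.com/my-org/my-project/_git/my-app
git remote add codecommit https://git-codecommit.us-east-1.amazonaws.com/v1/repos/my-app

# Push the same branch to both remotes
git push azure main
git push codecommit main
```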
It's worth pointing out that you'll need to properly set up and configure your CI/CD pipeline. AWS provides a number of services to support this, including AWS CodePipeline, AWS CodeBuild, and AWS CodeDeploy.
I'm looking into using Terraform to automate setting up an environment for demos.
This works for a VM instance and can be fully automated, but management prefers to use Cloud Run with Docker containers.
Every article I read starts with manually building and registering a Docker container. I don't get that step: why can't that be automated with Terraform as well?
Terraform is a deployment tool. More or less, it invokes APIs to create, update or delete things. So, what do you want to do here? Take a container and deploy it on Cloud Run. Building sources, uploading files, and performing a git clone aren't actions Terraform is designed for.
It's not surprising to have a CI pipeline that builds things and, at the end, calls a CD tool for the deployment.
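As a rough sketch of that split (GitLab CI syntax is used purely for illustration; the $IMAGE registry path is an assumed variable, and the Terraform config is assumed to declare a google_cloud_run_service that takes the image tag as a variable):

```yaml
# .gitlab-ci.yml (sketch): CI builds and pushes the image; Terraform only deploys it
stages: [build, deploy]

build-image:
  stage: build
  image: docker:24
  services: [docker:24-dind]
  script:
    - docker build -t "$IMAGE:$CI_COMMIT_SHORT_SHA" .
    - docker push "$IMAGE:$CI_COMMIT_SHORT_SHA"

deploy-cloud-run:
  stage: deploy
  image:
    name: hashicorp/terraform:1.7
    entrypoint: [""]   # the image's default entrypoint is terraform itself
  script:
    - terraform init
    - terraform apply -auto-approve -var "image=$IMAGE:$CI_COMMIT_SHORT_SHA"
```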
I would like to perform the following steps on schedule (presumably using Azure Automation):
Provision a VM in Azure
Run a powershell script on that VM
Deprovision VM
Actually I have more steps, but I left only three for simplicity.
I am new to IaC and would appreciate your general guidance and advice.
Is this within the scope of Azure Automation, or do I need something else?
I would like to code everything in text format, keep it in Git, and update it automatically via pull requests.
Should I use Runbooks or DSC?
Regarding step 2, I cannot figure out how I can upload my PowerShell script to the newly created VM and run it locally. The script downloads some files and updates some remote resources.
Thanks,
Ruslan
There are a lot of options and tools to achieve your goal.
If you will be working strictly in the Azure cloud, the following tools are most commonly used for building an environment:
Azure PowerShell
Azure CLI
ARM templates
Each of them is very similar, but all are a little different, with their own benefits; they are all tools for building your virtual infrastructure. For configuring your resources there are other tools. Like you mentioned yourself, DSC is a tool for configuring virtual machines.
If you are planning to use GitHub to push your code, I would recommend ARM templates. You can very easily reuse your own or other templates by referencing them in your code. The syntax may be the 'hardest' of these options to learn compared to the CLI and PowerShell, but it is also the most frequently used.
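For example, a template checked into the repo can be deployed with a single CLI call (the resource group, location and file names here are placeholders):

```sh
az group create --name demo-rg --location westeurope
az deployment group create \
  --resource-group demo-rg \
  --template-file azuredeploy.json \
  --parameters @azuredeploy.parameters.json
```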
It is possible to build your environment and configure it in the same script using the Azure CLI, Azure PowerShell, or another open-source solution like Terraform, but this is not considered best practice.
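For the three steps in the question specifically, here is a rough Azure CLI sketch (all names are placeholders; az vm run-command invoke pushes a local script into the VM and runs it there, which addresses the upload question in step 2):

```sh
# 1. Provision a VM (image, credentials and names are placeholders)
az vm create --resource-group demo-rg --name demo-vm \
  --image Win2022Datacenter --admin-username azureuser --admin-password "$VM_PASSWORD"

# 2. Run a local PowerShell script inside the VM
az vm run-command invoke --resource-group demo-rg --name demo-vm \
  --command-id RunPowerShellScript --scripts @setup.ps1

# 3. Deprovision the VM (associated disks and NICs may need separate cleanup)
az vm delete --resource-group demo-rg --name demo-vm --yes
```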
A lot of starter scripts are publicly available on GitHub and in the Microsoft docs.
If you have any specific questions you can always send me a message; I am currently working on Azure automation myself.
I'd like to execute a command line app that I created as part of my CI builds.
The command line app is not related to the repository on which the build is running. (Basically, it's a tool that is supposed to send some project-related metadata to a company web application to gather some metrics.)
It seems I can execute it using a Command Line task, as explained here:
How to execute exe in VSTS Release pipeline empty process
The question is, however - how do I upload my custom tool into Azure Devops?
I know it's possible to create custom tasks (https://learn.microsoft.com/en-us/azure/devops/extend/develop/add-build-task?view=azure-devops), but that seems like quite a lot of effort, especially given that I have a console tool that already does what I need.
I do this by including a separate deployment folder in the git repo with my source code. This folder contains all the extra stuff (EXEs) that I need to call to deploy.
You could, if you wanted, keep these files in a different repo and/or publish them as a separate artifact, since a pipeline can consume any number of artifacts.
Then I include the deployment folder when publishing artifacts. In your deployment step you pull down the artifacts, and they include your EXE.
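In azure-pipelines.yml terms, the publishing part might look roughly like this (the folder and artifact names are illustrative):

```yaml
steps:
  - task: CopyFiles@2
    inputs:
      SourceFolder: 'deployment'    # the folder holding the extra EXEs
      TargetFolder: '$(Build.ArtifactStagingDirectory)/deployment'
  - task: PublishBuildArtifacts@1
    inputs:
      PathtoPublish: '$(Build.ArtifactStagingDirectory)'
      ArtifactName: 'drop'
```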
You're meant to be able to use things like NPM to install helper libraries on the fly, but none of my required libraries were ever supported.
You can also use a self-hosted agent, which is your own host (often an Azure VM). You install everything you need on it, then you install a DevOps self-hosted agent, which lets build pipelines use that machine.
Create a build of your exe
Upload your exe to blob storage.
Create a SAS token to access your blob.
In your build, create a task with a PowerShell script. In the PS script, download your EXE (and unzip it), copy it to Build.StagingDirectory/"yourToolFolder", and then run it from the same script. You probably want to pass it arguments like the location of the repo on the build agent.
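A rough sketch of that task in azure-pipelines.yml; the storage account, container, tool name and the ToolSas secret variable are all assumptions:

```yaml
steps:
  - powershell: |
      # Download the zipped tool from blob storage using a SAS token kept as a secret variable
      $url = "https://mystorage.blob.core.windows.net/tools/mytool.zip?$(ToolSas)"
      Invoke-WebRequest -Uri $url -OutFile "$(Build.StagingDirectory)\mytool.zip"
      # Unpack it and run it against the checked-out repo
      Expand-Archive "$(Build.StagingDirectory)\mytool.zip" -DestinationPath "$(Build.StagingDirectory)\yourToolFolder"
      & "$(Build.StagingDirectory)\yourToolFolder\mytool.exe" --repo "$(Build.SourcesDirectory)"
    displayName: 'Download and run the metrics tool'
```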
Another way to achieve this involves creating a deployment group and adding a server to the group where you have the access and privileges to upload your console app. It could be on-prem or in the cloud, depending on your requirements.
I would like to connect Azure Functions to a repository where, on each push to master, the functions would be redeployed along with their artifacts (Project.json, function.json, etc.). Is there anything already in place? If not, is anything planned?
Sean,
Since Azure Functions sits on top of App Service/Web Apps, this is a fully supported scenario and you can find detailed information on the deployment process here
I also have deployment scripts you can use with your function to make sure your packages are properly restored on deployment here (you can find more information about how to use the script here)
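For illustration, the same hookup can also be done from the Azure CLI (the app, resource group and repo URL are placeholders; depending on the repo host you may need credentials configured first):

```sh
# Connect the function app to a repo so pushes to the branch trigger a redeploy
az functionapp deployment source config \
  --name my-func-app --resource-group my-rg \
  --repo-url https://github.com/my-org/my-functions --branch master
```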
Hope this helps!