I have three Windows self-hosted runners for GitHub Actions.
I need to turn the machines on when a pipeline runs.
And I want to turn a machine off once the pipeline finishes.
But I want to make sure no other pipelines are running on it before turning the machine off.
Also, with Azure, unlike AWS, shutting the machine down from inside the OS doesn't stop the billing, so we need to deallocate it through the Azure portal/API. I thought about running that command at the end of the pipeline, but it needs another runner (either self-hosted or GitHub-hosted) to do that step; if I do it on the same runner, it won't report the return code correctly because the runner itself gets turned off.
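Roughly, the shape I'm considering is a workflow where a GitHub-hosted job deallocates the VM after the self-hosted job finishes. This is only a sketch of the idea: MY_RG, MY_VM, the AZURE_CREDENTIALS secret, and the build script are placeholders.

```yaml
name: build-on-self-hosted-vm

on: [push]

# Serialize runs of this workflow so one run doesn't deallocate
# a VM that another run is still using.
concurrency: self-hosted-vm

jobs:
  start-vm:
    runs-on: ubuntu-latest          # GitHub-hosted
    steps:
      - uses: azure/login@v2
        with:
          creds: ${{ secrets.AZURE_CREDENTIALS }}   # assumed service principal secret
      - name: Start the runner VM
        run: az vm start --resource-group MY_RG --name MY_VM

  build:
    needs: start-vm
    runs-on: [self-hosted, windows]  # the Azure VM
    steps:
      - uses: actions/checkout@v4
      - name: Build
        run: .\build.ps1             # placeholder build script

  stop-vm:
    # Runs on a GitHub-hosted runner, so deallocating the self-hosted VM
    # doesn't kill the job that reports the result.
    needs: build
    if: always()
    runs-on: ubuntu-latest
    steps:
      - uses: azure/login@v2
        with:
          creds: ${{ secrets.AZURE_CREDENTIALS }}
      - name: Deallocate (not just shut down) so billing stops
        run: az vm deallocate --resource-group MY_RG --name MY_VM
```

The concurrency group only serializes runs of this one workflow, though; if several workflows share a runner, I would still need to check whether the runner is busy (for example via the GitHub REST API for self-hosted runners) before deallocating.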
I have a Jenkins instance running on an AWS EC2 instance. Rather than use the built-in nodes (I have disabled them), I need to provision an Azure VM agent and perform builds and execute tests on that agent. Using https://plugins.jenkins.io/azure-vm-agents/ via a freestyle job, I was able to successfully provision an Azure VM agent on demand. However, I am unable to achieve the same via a multibranch pipeline (there is a separate repo for the Jenkinsfile and one for the code, using https://plugins.jenkins.io/remote-file/). It does not seem to kick off the Azure VM agent and fails waiting for an agent. If I enable the built-in node, it works. So it appears that the azure-vm-agents plugin is not being triggered for provisioning via the pipeline. Also, the link "For how to select agent in pipeline, refer to this doc." on the https://plugins.jenkins.io/azure-vm-agents/ page does not lead anywhere. Any thoughts much appreciated.
As a beginner in DevOps, I would like to know how to use one VM for Azure Pipelines runs. When a pipeline run starts, it always gets a fresh VM from Azure.
For caching and file-saving purposes, I want to use a reserved VM for pipeline runs.
I appreciate your suggestions and support.
In Azure DevOps, we can run a pipeline on either a hosted agent or a self-hosted agent.
Azure Pipelines provides a pre-defined agent pool named Azure Pipelines; this is the hosted option, and each time you run a pipeline you get a fresh virtual machine, which is discarded after one use.
For caching and file saving purposes, I want to use a reserved VM for pipeline run.
We could refer to this doc to install the self-hosted agent; it will keep the cache between runs.
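For example, once the agent is registered in a pool, the pipeline only needs to point at that pool instead of the hosted one. A minimal sketch, where 'MySelfHostedPool' is a placeholder pool name:

```yaml
trigger:
  - main

pool:
  name: MySelfHostedPool            # the pool the self-hosted agent was registered in
  demands:
    - agent.os -equals Windows_NT   # optional: pick a specific kind of agent

steps:
  # The working directory and machine-level caches on this VM
  # survive between runs, unlike on a Microsoft-hosted agent.
  - script: echo "Building on the reserved VM"
```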
You can set up a 'self-hosted agent'. That would be your own VM, which you have total control over. I'm not sure whether this will be any cheaper than hosted agents.
I used a self-hosted agent a while ago and saved some money by booting the VM only when needed. After a while it would shut down again.
Source: Self-hosted agents
Does Azure Pipelines allow custom actions like AWS CodePipeline does?
I want to create a job worker that will poll Azure Pipelines for job requests for this custom action, execute the job, and return the status result to Azure Pipelines.
Something similar to - https://docs.aws.amazon.com/codepipeline/latest/userguide/actions-create-custom-action.html
Tasks are the building blocks for defining automation in a build or release pipeline in Azure DevOps. There are many built-in tasks to enable fundamental build and deployment scenarios. If the existing tasks don't satisfy your needs, you can always build a custom task. Check Task types & usage for more details.
In addition, the Visual Studio Marketplace offers a number of extensions, each of which, when installed to your subscription or collection, extends the task catalog with one or more tasks. Furthermore, you can write your own custom extensions to add tasks to Azure Pipelines.
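For illustration, a custom or marketplace task is consumed in YAML the same way as a built-in one; the MyCustomTask@1 task and its input below are hypothetical.

```yaml
steps:
  # Built-in task
  - task: CopyFiles@2
    inputs:
      contents: '**/*.zip'
      targetFolder: '$(Build.ArtifactStagingDirectory)'

  # Hypothetical custom/marketplace task, referenced the same way
  - task: MyCustomTask@1
    inputs:
      someInput: 'value'
```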
Azure Pipeline Agents
When your pipeline runs, the system begins one or more jobs. An agent is computing infrastructure with installed agent software that runs one job at a time.
You have two options to choose from here: Microsoft-hosted agents or self-hosted agents.
An agent that you set up and manage on your own to run jobs is a self-hosted agent. Self-hosted agents give you more control to install dependent software needed for your builds and deployments. Also, machine-level caches and configuration persist from run to run, which can boost speed.
However, before you install a self-hosted agent you might want to see if a Microsoft-hosted agent pool will work for you. In many cases, this is the simplest way to get going.
With Microsoft-hosted agents, maintenance and upgrades are taken care of for you. Each time you run a pipeline, you get a fresh virtual machine. The virtual machine is discarded after one use. Microsoft-hosted agents can run jobs directly on the VM or in a container. Azure Pipelines provides a pre-defined agent pool named Azure Pipelines with Microsoft-hosted agents.
You can try it first and see if it works for your build or deployment. If not, you can use a self-hosted agent. Check this doc for more details.
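For example, choosing a Microsoft-hosted agent comes down to the pool setting in the pipeline YAML; this is only a sketch with a placeholder script step.

```yaml
# Microsoft-hosted: a fresh VM from the 'Azure Pipelines' pool on every run,
# discarded after the job finishes.
pool:
  vmImage: 'windows-latest'

steps:
  - script: echo "This VM is thrown away after the job"
```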
I will pull from the agent queue with my custom job worker and process the job. Is that possible in Azure Pipelines?
Based on my understanding of CodePipeline and Azure DevOps, I'm afraid a job worker like that shouldn't be necessary.
According to the document Create and add a custom action in CodePipeline, we can see that:
AWS CodePipeline includes a number of actions that help you configure build, test, and deploy resources for your automated release process. If your release process includes activities that are not included in the default actions, such as an internally developed build process or a test suite, you can create a custom action for that purpose and include it in your pipeline.
But for Azure DevOps, we do not need to create a job worker that polls the pipeline for job requests for this custom action. That's because the whole build/release process can be customized, so there is no need to add a job worker for additional custom actions.
Azure DevOps provides a lot of templates when we create a pipeline, and we can modify the pipeline directly to add, remove, or update tasks.
We can even start with a blank pipeline and customize the entire build/release process from scratch.
So we do not need to create a job worker for the custom action; just modify the pipeline directly.
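In other words, an "internally developed build process or a test suite" can simply be invoked as steps in the pipeline itself. A rough sketch; the script names are placeholders:

```yaml
pool:
  vmImage: 'ubuntu-latest'

steps:
  - checkout: self

  # Run an internally developed build process directly as a pipeline step
  - script: ./internal-build.sh        # placeholder script
    displayName: 'Internal build process'

  # Run a custom test suite the same way
  - script: ./run-custom-tests.sh      # placeholder script
    displayName: 'Custom test suite'
```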
We use Azure Pipelines to implement our continuous integration pipeline. The app is deployed to virtual machines that we need to provision and configure. There are tons of libraries, patches, configurations, and applications that we need to deploy on the target VMs before we get our code onto them.
The question is: what is the best tool to provision and configure these virtual machines? I was thinking of using Ansible AWX. Basically, the Azure pipeline would make a call to the AWX API, which would then take it from there and finalize things.
There is an Azure Pipelines extension that allows me to execute a playbook: https://github.com/microsoft/azure-pipelines-extensions/blob/master/Extensions/Ansible/Src/readme.md. But I would like to use AWX instead so that my Ansible/deployment code is decoupled from my pipeline.
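Roughly, what I had in mind is something like the step below, which launches an AWX job template through its REST API (POST /api/v2/job_templates/<id>/launch/). The template id, host, and variable names are placeholders, just to illustrate the idea.

```yaml
steps:
  # Launch AWX job template 42 (placeholder id) via the AWX REST API.
  - script: |
      curl -sf -X POST \
        -H "Authorization: Bearer ${AWX_TOKEN}" \
        -H "Content-Type: application/json" \
        -d '{"extra_vars": {"target_env": "staging"}}' \
        "https://${AWX_HOST}/api/v2/job_templates/42/launch/"
    displayName: 'Trigger AWX provisioning job'
    env:
      AWX_TOKEN: $(awxToken)   # secret pipeline variable (assumed)
      AWX_HOST: $(awxHost)     # e.g. awx.example.com (assumed)
```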
Any suggestions?
As far as I know, Ansible allows you to automate the deployment and configuration of resources in your environment. It could meet your needs.
As you said, Azure Pipelines supports running the playbook via the Ansible task (Ansible extension).
So I think you can complete both the VM configuration and the code deployment directly in the Azure pipeline.
If you want to separate these two steps, you can split them into two pipelines (VM configuration and code deployment). To avoid confusion between configuration and deployment code, you can also split them into two repos.
On the other hand, if you run the playbook in the Azure pipeline, the pipeline also supports adding tasks to change parameters in the playbook (e.g. Replace Tokens).
Here is an operation guide on using Ansible in Azure Pipelines.
By the way, if the virtual machine is an Azure VM, you could also use an ARM template to update the Azure VM resource.
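If you go the pipeline route, the simplest hedged sketch is a plain script step that runs the playbook from an agent that has Ansible installed (you can use the Ansible extension task instead). The pool name, playbook, and inventory paths below are placeholders.

```yaml
pool:
  name: MySelfHostedPool        # an agent with Ansible installed (assumed)

steps:
  - checkout: self

  # Run the provisioning/configuration playbook; paths are placeholders.
  - script: |
      ansible-playbook \
        -i inventories/staging/hosts.ini \
        playbooks/provision-vm.yml \
        --extra-vars "app_version=$(Build.BuildNumber)"
    displayName: 'Configure target VMs with Ansible'
```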
Personally, I would drop the AWX requirement. It's something else to manage and maintain, and an entirely separate interface too. Instead, just do your whole pipeline in one place... Azure DevOps. Pick one or the other. Tower doesn't have built-in source control, so I recommend ADO over it, but they'll both run Ansible and they'll both do it on your own control nodes. There's no reason to take an extra step with another tool; it adds way too much complexity.
I am trying to create a deployment plan for my web application in Azure that can support two environments (Dev/Staging). Basically, I want the code checked in by developers to be deployed to the Dev machine at the end of the day, and then the latest Dev changes to be merged into the staging branch; if no merge conflict happens, a new publish goes to the staging machine. Can anyone help me with where to start and which Azure feature I can use to serve this?
You could use Azure DevOps to trigger scheduled builds (at the end of the day) to perform the routine you are describing. You could also use releases and trigger them automatically from the builds that happen on a schedule.
You would also need to deploy VSTS agents on the machines, or allow VSTS agents to somehow talk to the machines, to upload code to them.
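For the "end of the day" part, a cron schedule in the pipeline YAML is enough. A minimal sketch; the time, branch name, and deployment script are placeholders.

```yaml
# Build/deploy the dev branch every weekday at 18:00 UTC,
# even if nothing changed (set always: false to skip idle days).
schedules:
  - cron: '0 18 * * Mon-Fri'
    displayName: 'End-of-day dev deployment'
    branches:
      include:
        - dev
    always: true

trigger: none        # only run on the schedule, not on every push

pool:
  vmImage: 'ubuntu-latest'

steps:
  - checkout: self
  - script: ./deploy.sh dev      # placeholder deployment script
    displayName: 'Deploy to the Dev machine'
```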