I am getting a "Run failed: Unable to establish SSH connection" error when I trigger my published Azure ML pipeline from an Azure Function App while the VM is stopped. I would expect the Azure ML pipeline to turn the virtual machine on automatically when I trigger it and shut the VM down when the process is done; otherwise it doesn't make much sense.
Sometimes I don't get the error at all and the pipeline just works perfectly.
Also, the pipeline works without a problem when I manually start the VM from the Azure portal before triggering it.
The published pipeline uses an Azure Data Science Virtual Machine (Ubuntu), and I use a username and password to access the VM.
I totally agree that it would be awesome if the ML service could do this, but I'm fairly certain it isn't supported.
Perhaps you could use the Function App and the Azure SDK to turn on the DSVM before the pipeline is triggered?
If you're interested in minimizing the time the VM is running, I highly recommend looking at AMLCompute. Provided you can define the desired runtime environment using either Anaconda or Docker, you can send your code along with that environment to AMLCompute, and it will automatically spin up the VM(s) you require and spin them down afterwards. You can also use your environment definition to create the same environment on the DSVM if need be.
My team uses AMLCompute heavily, and for me it solves the problem I think you're describing.
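To make that spin-up/spin-down behaviour concrete, here is a minimal sketch of an AMLCompute cluster definition, assuming the current Azure ML CLI (v2) and its YAML schema; the cluster name, VM size, and node counts are placeholders, not anything from the original setup:

```yaml
# compute.yml - a minimal AMLCompute cluster definition (AML CLI v2 schema).
# Create it with: az ml compute create --file compute.yml
$schema: https://azuremlschemas.azureedge.net/latest/amlCompute.schema.json
name: cpu-cluster                 # placeholder cluster name
type: amlcompute
size: STANDARD_DS3_v2             # placeholder VM size
min_instances: 0                  # scale to zero, so nothing runs (or bills) between jobs
max_instances: 2
idle_time_before_scale_down: 300  # seconds of idleness before nodes are released
```

With min_instances set to 0, the cluster allocates nodes when a run is submitted and releases them after the idle timeout, which is exactly the start-on-trigger, stop-when-done behaviour the question asks for.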
I am using third-party software for master data management. This software does not have APIs to perform deployments across environments; it has a command-line utility that needs to be installed on a machine, and you call its commands to perform the installation. I am looking to automate this with an Azure DevOps pipeline. What I am trying to do is as follows:
Create a Windows VM in Azure and install the command-line utility on it.
Store the script in a folder on the VM.
Use an Azure pipeline to call this script stored on the VM.
I don't even know if it is possible to do such things. I tried searching for how to call a script stored on a VM from an Azure pipeline but didn't find any useful links.
If anyone has done something like this, or has an idea of how it can be achieved, please help.
This can help you:
Running a command line on an Azure virtual machine is fully supported in Azure Pipelines. You can install a self-hosted agent on the VM.
Before that, I recommend creating a new agent pool for the self-hosted agents: go to Organization settings -> Agent pools -> click "Add pool" -> choose the "self-hosted" type.
Then refer to this document to complete the installation. When you configure the agent, choose the newly created agent pool.
After installation, create a pipeline, choose the newly created agent pool, and add a Command Line task to run your script. If you have several agents in the same pool, you can also set demands in the pipeline to target a specific agent, as in the sketch below.
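Putting those steps together, a minimal YAML pipeline might look like the following; the pool name, agent name, and script path are placeholders for your own setup:

```yaml
# azure-pipelines.yml - run a script that lives on the self-hosted agent's VM.
trigger:
  - main

pool:
  name: MySelfHostedPool              # the pool created under Organization settings
  demands:
    - Agent.Name -equals my-vm-agent  # optional: pin the run to one specific agent

steps:
  - task: CmdLine@2
    displayName: Run the utility script stored on the VM
    inputs:
      script: C:\scripts\install.cmd  # the script you placed on the VM
```

Because the agent process runs directly on your VM, the Command Line task executes with whatever is installed there, including your vendor's command-line utility.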
Reference: Is it possible to run a command line in an Azure Virtual Machine from an Azure DevOps pipeline?
I have a Jenkins instance running on an AWS EC2 instance. Rather than use the built-in nodes (I have disabled them), I need to provision an Azure VM agent and perform builds and run tests on that agent. Using https://plugins.jenkins.io/azure-vm-agents/ via a freestyle job, I was able to successfully provision an Azure VM agent on demand. However, I am unable to achieve the same via a multibranch pipeline (there is a separate repo for the Jenkinsfile and one for the code, via https://plugins.jenkins.io/remote-file/). The job does not seem to kick off the Azure VM agent and fails while waiting for an agent; if I enable the built-in node, it works. So it appears that, via the pipeline, the azure-vm-agents plugin is not being triggered to provision anything. Also, the link "For how to select agent in pipeline, refer to this doc." on the https://plugins.jenkins.io/azure-vm-agents/ page does not lead anywhere. Any thoughts much appreciated.
As a DevOps beginner, I would like to know how to use a single VM for Azure pipeline runs. When a pipeline run starts, Azure always provides a fresh VM.
For caching and file-persistence purposes, I want to use a reserved VM for the pipeline runs.
Appreciate your suggestions and support.
In Azure DevOps, you can run a pipeline on either a Microsoft-hosted agent or a self-hosted agent.
Azure Pipelines provides a predefined agent pool named Azure Pipelines. These are Microsoft-hosted agents: each time you run a pipeline you get a fresh virtual machine, and it is discarded after one use.
For caching and file-persistence purposes, I want to use a reserved VM for the pipeline runs.
For that, you can refer to this doc to install a self-hosted agent. Because the machine persists between runs, it keeps your cache and files, as sketched below.
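For example, once the agent is registered, a pipeline can target it and reuse the working directory from previous runs; the pool name and build script below are placeholders:

```yaml
pool:
  name: MyReservedVmPool  # the pool containing your reserved VM's agent

steps:
  - checkout: self
    clean: false          # keep sources and build outputs from previous runs
  - script: ./build.sh    # later runs find caches and files already on disk
```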
You can set up a 'self-hosted agent'. That would be your own VM, over which you have total control. I'm not sure whether this will be any cheaper than hosted agents.
I used a self-hosted agent a while ago and saved some money by booting the VM only when needed; after a while it would shut down again.
Source: Self-hosted agents
We use Azure Pipelines to implement our continuous integration pipeline. The app is deployed to virtual machines that we need to provision and configure. There are tons of libraries, patches, configuration settings, and applications that we need to deploy to the target VMs before our code gets onto them.
The question is: what is the best tool to provision and configure these virtual machines? I was thinking of using Ansible AWX. Basically, the Azure pipeline would make a call to the AWX API, which would then take it from there and finalize things.
There is an Azure Pipelines extension that allows me to execute a playbook: https://github.com/microsoft/azure-pipelines-extensions/blob/master/Extensions/Ansible/Src/readme.md. But I would like to use AWX instead, so that my Ansible deployment code is decoupled from my pipeline.
Any suggestions?
As far as I know, Ansible allows you to automate the deployment and configuration of resources in your environment. It could meet your needs.
As you said, Azure Pipelines supports running a playbook via the Ansible task (the Ansible extension).
So I think you can complete both the VM configuration and the code deployment directly in the Azure pipeline.
If you want to separate these two steps, you can split them into two pipelines (VM configuration and code deployment). To avoid confusion between configuration and deployment code, you can also split them into two repos.
On the other hand, if you run the playbook in the Azure pipeline, the pipeline also supports adding tasks to change parameters in the playbook (e.g. Replace Tokens).
Here is an operation guide on using Ansible in Azure Pipelines; a minimal example is sketched below.
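As an illustration, a pipeline step using that extension might look like this; the input names follow the extension's readme, and the playbook and inventory paths are placeholders:

```yaml
steps:
  - task: Ansible@0
    displayName: Run playbook on the build agent
    inputs:
      ansibleInterface: agentMachine                # use ansible installed on the agent itself
      playbookPathOnAgentMachine: ansible/site.yml  # placeholder playbook path
      inventoriesAgentMachine: file
      inventoryFileOnAgentMachine: ansible/inventory
```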
By the way, if the virtual machine is an Azure VM, you could also use an ARM template to update the VM resource, for example via the deployment task sketched below.
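A hedged sketch of that ARM deployment step; the service connection, resource group, and template path are placeholders:

```yaml
steps:
  - task: AzureResourceManagerTemplateDeployment@3
    inputs:
      deploymentScope: 'Resource Group'
      azureResourceManagerConnection: 'my-azure-service-connection'
      subscriptionId: '$(subscriptionId)'
      resourceGroupName: 'my-rg'
      location: 'eastus'
      templateLocation: 'Linked artifact'
      csmFile: 'templates/vm.json'   # ARM template describing the VM
      deploymentMode: 'Incremental'
```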
Personally, I would drop the AWX requirement. It's something else to manage and maintain, and an entirely separate interface too. Instead, just do your whole pipeline in one place: Azure DevOps. Pick one or the other. Tower doesn't have built-in source control, so I recommend ADO over it, but they'll both run Ansible and they'll both do it on your own control nodes. There's no reason to take an extra step with another tool; it adds way too much complexity.
I created a simple .NET Core console app whose repository is hosted in Azure DevOps. I have also created an Ubuntu VM, which I can successfully connect to, to receive the deployment.
I have managed to deploy the app from my local computer by cloning, building, and pushing it (via the scp command).
Now I would like to do this using an Azure DevOps pipeline.
I managed to build the app, but I can't seem to find guidance on how to execute the scp command (or an alternative) from the pipeline...
Edit 1:
OK, this is turning out to be an order of magnitude harder than I expected, and I'm giving up for now. I've been trying to figure this out for almost two work days. I can't believe that a task that takes 4-6 commands in a script on my local machine should require this much effort in a DevOps environment...
You can configure a deployment agent on your VM and use release management to copy and configure your application:
Deploy an agent on Linux
Define your multi-stage continuous deployment (CD) pipeline
Have a look at the Copy Files over SSH pipeline task, which supports SCP; a minimal example is sketched below.
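For instance, assuming an SSH service connection has been created in the project settings (the connection name and paths below are placeholders), the copy step, plus an optional remote command, could look like this:

```yaml
steps:
  - task: CopyFilesOverSSH@0
    inputs:
      sshEndpoint: my-ubuntu-vm   # SSH service connection to the VM
      sourceFolder: $(Build.ArtifactStagingDirectory)
      contents: '**'
      targetFolder: /home/azureuser/app
  - task: SSH@0                   # optional: run a command on the VM after the copy
    inputs:
      sshEndpoint: my-ubuntu-vm
      runOptions: commands
      commands: sudo systemctl restart myapp   # placeholder restart command
```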