I have a Python script that resides on an Azure VM. The script uses a few local files from this VM. I need to create an ADF pipeline that will execute this Python script on the VM. As the script is placed on-premises, I can't use any cluster activity of ADF. So, basically, the pipeline should connect to the VM and trigger the script execution. One option I could think of is using ADF's Custom Activity and triggering a PowerShell command from there to run the on-premises Python script, but I'm not sure how to connect to on-premises scripts.
Based on my research, you can run a Python script in an ADF Custom Activity. However, according to the official documentation, that relies on the Azure Batch Service: you need to put your scripts and dependencies in a folder path in the Azure Batch Service. So I don't think it is suitable for your situation of executing on-premises Python scripts.
Here is a workaround instead:
Step 1: expose an endpoint on the VM that executes your on-premises Python script; that way the local files it needs can still be accessed (a minimal sketch follows after these steps).
Step 2: set up a VPN gateway so there is a network channel between the on-premises side and Azure.
Step 3: use a Web activity in ADF to invoke the exposed endpoint and get the execution result.
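As an illustration of step 1, here is a minimal sketch in PowerShell of such an endpoint: a small HTTP listener on the VM that runs the local Python script when called. The port, URL path, and script path are placeholders, and in practice you would add authentication and keep the listener reachable only over the VPN from step 2.

$listener = [System.Net.HttpListener]::new()
$listener.Prefixes.Add('http://+:8080/run-script/')        # placeholder port and path
$listener.Start()
while ($listener.IsListening) {
    $context = $listener.GetContext()                       # wait for the ADF Web activity to call in
    # Run the local Python script; it can read its local files as usual
    $output = & python 'C:\scripts\my_script.py' 2>&1 | Out-String
    $bytes  = [System.Text.Encoding]::UTF8.GetBytes($output)
    $context.Response.OutputStream.Write($bytes, 0, $bytes.Length)
    $context.Response.Close()
}

The Web activity in step 3 would then simply issue a request to http://<vm-address>:8080/run-script/ and read the script output from the response body.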
I have a large PowerShell script that creates a machine and performs a bunch of configuration, and it relies on the Az module together with some custom modules I have written. I can run it from my machine, but I am exploring the possibility of running it from Azure and letting others run it without needing to fetch the latest version of the script and the dependent modules.
I have looked into Azure Functions, Logic Apps, and Pipelines, but I don't really know where to start or which one is the most suitable.
The workflow I would like to achieve is this:
A teammate would specify a machine name and trigger the script.
The script would then use Az modules and pull some modules from a git repo to create and configure a VM.
The teammate would receive some sort of feedback to show if the script was successful, maybe a log or an email notification.
Did you look at Runbooks running under an Automation Account? You can manage your source with Azure DevOps and use the Automation Account's Source Control integration. You can also use the Azure Arc agent on the VM to run your script.
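As a rough illustration of the Runbook route, here is a hedged sketch of a PowerShell runbook. It assumes the Automation Account has a managed identity with rights to create VMs; the resource group, location, image, and credential asset names are placeholders, and the real script would also import your custom modules.

param(
    [Parameter(Mandatory = $true)]
    [string]$MachineName                                  # supplied by the teammate when starting the runbook
)

# Sign in with the Automation Account's managed identity
Connect-AzAccount -Identity

# Local admin credentials for the new VM, stored as an Automation credential asset (placeholder name)
$adminCred = Get-AutomationPSCredential -Name 'VmAdminCredential'

# Create the VM; all values are illustrative placeholders
New-AzVM -ResourceGroupName 'my-resource-group' `
         -Name $MachineName `
         -Location 'westeurope' `
         -Image 'Win2019Datacenter' `
         -Credential $adminCred

Write-Output "VM '$MachineName' created."

The runbook's job output (or a follow-up email step) would give your teammate the success/failure feedback you describe.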
I am using third-party software for master data management. This software does not have APIs to perform deployments across environments. It has a command-line utility that needs to be installed on a machine, and its commands are called to perform the installation. I am looking to automate this using an Azure DevOps pipeline. What I am trying to do is as below:
Create a Windows VM in Azure and install the command-line utility on it.
Store the script in some folder in the VM.
Use an Azure pipeline to call this script stored in my VM.
I don't even know if it is possible to do such a thing. I tried searching the internet for how to call a script stored in a VM via an Azure pipeline, but didn't find any useful link.
If anyone has done such an activity or has an idea how it can be achieved, please help.
This can help you:
Running command lines in an Azure Virtual Machine is fully supported in Azure Pipelines. You could install a self-hosted agent on the VM.
Before that, I recommend creating a new agent pool for the self-hosted agents. Go to Organization settings -> Agent pools -> click "Add pool" -> choose the "Self-hosted" type.
Then you can refer to this document to complete the installation. When you configure the agent, choose the newly created agent pool.
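For reference, here is a hedged sketch of what the agent registration typically looks like when run in PowerShell on the VM; the organization URL, PAT, pool name, and agent name are placeholders, and the agent package itself is downloaded from the Agent pools page.

# Run on the VM, from the folder where the agent package was extracted (placeholder path)
cd C:\agent
# Register the agent against the new self-hosted pool and run it as a Windows service
.\config.cmd --unattended `
             --url https://dev.azure.com/YourOrganization `
             --auth pat --token '<your-PAT>' `
             --pool 'MySelfHostedPool' `
             --agent 'MyVmAgent' `
             --runAsService

Once the agent shows up as online in the pool, pipelines targeting that pool will run directly on the VM, where the command-line utility is installed.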
After installation, you can create a pipeline, choose the newly created agent pool, and add a Command Line task to run the command line. Of course, if you have many agents in the same agent pool, you can also set demands in the pipeline to target a specific agent.
Reference: Is it possible to run a command line in a Azure Virtual Machine from Azure DevOps pipeline?
I need to run various batch scripts on an on-premises Windows server. These scripts need to be called from an Azure service. We were thinking of using Datafactory to orchestrate the whole process but we are looking for a way to call these scripts.
Would it be possible using Azure Functions? Is it possible to somehow connect to the on-premises Windows machine and call the scripts in a secure way?
I have TFS 2015, and I was able to automate the build process from the branch and get the files from the drop folder.
It has releases for multiple projects, such as a Web API and a Windows Service.
I have an Azure VM on which I want to automate the deployment process (continuous delivery):
Deploy the Web API to IIS on the Azure VM.
Deploy the Windows Service on the Azure VM.
Run SQL scripts.
I have the credentials of the Azure VM. How can I perform the three steps above?
I have worked on a similar problem in the past, so I can probably help you out (MSFT, if it helps).
Web Api on IIS on Azure VM
This is almost completely automated in the form of the WinRM - IIS Web App Deployment task that you can find and add in your release definition. The link provides complete instructions on what parameters to provide and what tweaks are needed for Azure VMs compared to on-premises ones. There are a few prerequisites to running this task, like installing and configuring IIS on the VM, which the documentation discusses in detail. As a necessary input to this task, you need to provide the Web Deploy package, which I am assuming was generated as your build output. If not, you can refer to this SO post to get the required output. If you have parameters like connection strings that you wish to modify at deploy time, use a parameters.xml file in the above task.
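If your build does not already produce a Web Deploy package, a hedged sketch of the MSBuild invocation that generates one looks like this (project name and output path are placeholders, and it assumes msbuild is on the build agent's path):

# Produce a Web Deploy (.zip) package as the build output; values are placeholders
msbuild .\MyWebApi.csproj `
    /p:DeployOnBuild=true `
    /p:WebPublishMethod=Package `
    /p:PackageAsSingleFile=true `
    /p:PackageLocation=C:\BuildOutput\MyWebApi.zip

The resulting .zip is what you feed into the WinRM - IIS Web App Deployment task.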
Windows Service on Azure VM
There is no completely automated task for this requirement, but it is pretty straightforward. It can be achieved by using the PowerShell on Target Machines task along with the Azure File Copy task. For the first task, all that is required as input is the .exe of the Windows service that you wish to deploy, which should be generated as the output of your build process (build artifacts). Many of the remote machine inputs for this task are similar to the previous one, so you should not have any problems there. You will need to check in the PowerShell script that does the actual Windows service installation as part of the same Windows service project in your source code (Copy Local = True). This ensures that, as part of the build output, you have access to the PowerShell script, which you can use in the second task. Azure File Copy is required to copy your PowerShell script to the Azure VM so that the PowerShell task can execute it. Let's assume you copied the PowerShell script to the folder C:\Data\ on the Azure VM.
$serviceName = "MyWindowsService"
$exeFullName = "path\to\your\service.exe"
$serviceDisplayName = "MyWindowsService"
# Register the executable as a Windows service that starts automatically
$pss = New-Service -Name $serviceName -BinaryPathName $exeFullName -DisplayName $serviceDisplayName -StartupType Automatic
Add this content to the checked-in PowerShell file and name it installWindowsService.ps1. Then, in the PowerShell task, provide the path of the PowerShell file to execute as C:\Data\installWindowsService.ps1.
Run SQL Scripts on Azure VM
I haven't personally worked on this, so the best I can do is point you in the right direction. If you are using a DACPAC for your SQL deployment, you can use the WinRM - SQL Server Database Deployment task. If you just intend to execute scripts, use the remote PowerShell task from above and refer to this post, which will help you with running SQL commands through a PowerShell script.
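For the script-only route, a minimal hedged sketch using Invoke-Sqlcmd could look like the following; it assumes the SqlServer PowerShell module is installed on the machine that runs it, and the server, database, and file names are placeholders.

# Requires the SqlServer PowerShell module
Import-Module SqlServer
# Execute a SQL script file against the database on the VM (values are placeholders)
Invoke-Sqlcmd -ServerInstance 'localhost' `
              -Database 'MyDatabase' `
              -InputFile 'C:\Data\deploy.sql'

You would copy the .sql file to the VM the same way as the installation script above and run this through the PowerShell on Target Machines task.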
It seems you want the CD release process to pick up the artifacts published by your CI build and then deploy them to your IIS server/Windows services on the Azure VM.
If you've just completed a CI build, then you should create a new release definition that's automatically linked to the build definition.
Open the Releases tab of the Build & Release hub, open the + drop-down in the list of release definitions, and choose Create release definition.
For 2, write a PowerShell script to handle this; ensure the build outputs are available to copy from the 'Drop' folder of the build and that they are copied to C:\xxx\ on the target VM(s). For more detailed steps, please refer to this blog.
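As a hedged sketch of that copy step using PowerShell remoting from the agent (VM name, credential, and paths are placeholders; C:\xxx\ stands in for your real target folder):

# Copy the build output from the drop folder to the target VM; placeholders throughout
$cred    = Get-Credential                                   # credentials for the target VM
$session = New-PSSession -ComputerName 'my-azure-vm' -Credential $cred
Copy-Item -Path '\\buildserver\Drop\MyWindowsService\*' `
          -Destination 'C:\xxx\' `
          -ToSession $session -Recurse -Force
Remove-PSSession $session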
For 3, you could use the Azure SQL Database Deployment task. Either select the SQL script file on the automation agent, or on a UNC path that is accessible to the automation agent, or directly enter the inline SQL script to run against the Azure SQL Server database. Also take a look at the tutorial.
Not all of these tasks may be fully compatible with TFS 2015; you could upgrade your TFS version to get newer features, or customize your own build/release task to handle it.
We have a Windows application installed on an Azure VM. We want to execute that application using Azure so that we can monitor its execution through the Azure portal.
Is there any way to invoke an executable present on an Azure VM using an Azure Data Factory pipeline or some other Azure service?
What does the exe do? Is it a console app?
Generally, I think it's possible.
This can be achieved by using an ADF Custom Activity. You may rewrite your app as a custom activity, which will run on Azure Batch VMs.
If your app can't run in Azure Batch, you will have to enable something like PowerShell remoting so that the exe can be launched remotely. The caveat is that, even in this case, you will still need a Custom Activity / Azure Batch as the invoker, since an ADF pipeline itself can't make remote calls or run custom code.
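For the second option, a hedged sketch of what that remote launch could look like from the Custom Activity's code is below; the VM name, credential, and exe path are placeholders, and it assumes PowerShell remoting (WinRM) on the VM is enabled and reachable from the Batch pool.

# Launch the exe on the target VM over PowerShell remoting; values are placeholders
$cred = Get-Credential                                     # credentials for the VM
Invoke-Command -ComputerName 'my-azure-vm' -Credential $cred -ScriptBlock {
    # Start the application and wait for it to exit so the activity can report the result
    $proc = Start-Process -FilePath 'C:\Apps\MyApp.exe' -PassThru -Wait
    $proc.ExitCode
}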
Hope this will help.