I'd like to execute a command line app that I created as part of my CI builds.
The command line app is not related to the repository the build runs on. (Basically, it's a tool that sends some project-related metadata to a company web application to gather metrics.)
It seems I can execute it using a Command Line task, as explained here:
How to execute exe in VSTS Release pipeline empty process
The question is, however - how do I upload my custom tool into Azure Devops?
I know it's possible to create custom tasks (https://learn.microsoft.com/en-us/azure/devops/extend/develop/add-build-task?view=azure-devops), but that seems like quite a lot of effort, especially given that I already have a console tool that does what I need.
I do this by including a separate deployment folder in the git repo with my source code. This folder contains all the extra pieces (EXEs) that I need to call to deploy.
You could keep these files in a different repo and/or publish them as a separate artifact if you wish, since a release can consume any number of artifacts.
I then include the deployment folder when publishing artifacts. In your build step you pull down the artifacts, and they include your EXE.
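For illustration, a minimal Azure Pipelines YAML sketch of that approach, assuming the extra EXEs live in a deploy-tools folder (a hypothetical name) at the repo root:

```yaml
# Minimal sketch: publish a tools folder alongside the normal build output.
# "deploy-tools" is a hypothetical folder name at the repo root.
steps:
- task: CopyFiles@2
  inputs:
    SourceFolder: 'deploy-tools'
    Contents: '**'
    TargetFolder: '$(Build.ArtifactStagingDirectory)/deploy-tools'

- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'
```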
You're meant to be able to use things like NPM to install helper libraries on the fly, but none of my required libraries were ever supported.
You can also use a self-hosted agent, which is your own host (often an Azure VM). You install everything you need on it, then install the Azure DevOps self-hosted agent, which lets build pipelines use that machine.
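If you go the self-hosted route, the pipeline side is just a pool selection. A rough sketch; the pool name and the tool path are hypothetical:

```yaml
# Sketch: target a self-hosted pool where the tool is pre-installed.
pool:
  name: 'MyCompanyPool'

steps:
- script: 'C:\tools\MetricsUploader.exe --repo "$(Build.Repository.Name)"'
  displayName: 'Send project metadata'
```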
Create a build of your exe
Upload your exe to blob storage.
Create a SAS token to access your blob.
In your build, add a PowerShell task. In the script, download your exe (and unzip it), copy it to Build.StagingDirectory/"yourToolFolder", and then run it. You probably want to pass it arguments such as the location of the repo on the build agent.
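As a hedged sketch of that PowerShell task: the secret variable "toolSasUrl" (holding the SAS URL from the previous step) and the file names below are hypothetical:

```yaml
# Sketch: download the tool from blob storage via a SAS URL, unzip it, run it.
steps:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      $toolDir = "$(Build.StagingDirectory)\yourToolFolder"
      New-Item -ItemType Directory -Force -Path $toolDir | Out-Null
      # $(toolSasUrl) is a hypothetical secret pipeline variable
      Invoke-WebRequest -Uri "$(toolSasUrl)" -OutFile "$toolDir\tool.zip"
      Expand-Archive -Path "$toolDir\tool.zip" -DestinationPath $toolDir -Force
      # MyTool.exe and its arguments are placeholders
      & "$toolDir\MyTool.exe" --repo "$(Build.SourcesDirectory)"
```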
One way to achieve this is to create a deployment group and add to it a server where you have the access and privileges to upload your console app. It can be on-premises or in the cloud, depending on your requirements.
I have a large PowerShell script that creates a machine and performs a bunch of configuration; it relies on the Az module together with some custom modules I have written. I can run it from my machine, but I am exploring the possibility of running it from Azure and letting others run it without needing to fetch the latest version of the script and the dependent modules.
I have looked into Azure Functions, Logic Apps and Pipelines, but I don't really even know where to start and which one is the most suitable.
The workflow I would like to achieve is this:
A teammate would specify a machine name and trigger the script.
The script would then use Az modules and pull some modules from a git repo to create and configure a VM.
The teammate would receive some sort of feedback to show if the script was successful, maybe a log or an email notification.
Did you look at Runbooks running under an Automation Account? You can manage your source with DevOps and use the Automation Account's Source Control integration. You can also use the Azure Arc agent on a VM to run your script.
I'm using GitLab's CI/CD for building and deploying my application. As my deployment environment is GCP, I'm trying to use GCP Cloud Build to build and deploy my application to the cloud. In GitLab, I was using its environment file feature to store certain critical files. Is there an alternative to that in GCP's Cloud Build?
I'm storing some configuration files and some other files as environment files. How can I completely replace GitLab CI/CD with GCP Cloud Build?
You can in fact use Cloud Build for CI/CD and GitHub for source code management, the way you currently use GitLab; that way you can build and deploy your apps on GCP and still manage your source code.
The guide Similarities and differences between GitLab CI and Cloud Build explains how to build and deploy your code with Cloud Build. In a few words, the steps are:
Connect Cloud Build to your source repository
Select and authenticate to the repository that holds your source code
Add a trigger to automate the builds using a Dockerfile or Cloud Build config file in YAML format
Also, a brief explanation of some important fields (see the sketch after this list):
The steps field in the config file specifies a set of steps that you want Cloud Build to perform.
The id field sets a unique identifier for a build step.
The name field of a build step specifies a cloud builder, which is a container image running common tools. You use a builder in a build step to execute your tasks.
The args field of a build step takes a list of arguments and passes them to the builder referenced by the name field.
The entrypoint in a build step specifies an entrypoint if you don't want to use the default entrypoint of the builder.
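Putting those fields together, a minimal illustrative cloudbuild.yaml might look like this; the builder images, ids, and commands are placeholders, not your actual build:

```yaml
# Illustrative cloudbuild.yaml using the fields described above.
steps:
- id: 'run-tests'
  name: 'python'                # builder image (a container with common tools)
  entrypoint: 'python'          # override the builder's default entrypoint
  args: ['-m', 'pytest']
- id: 'build-image'
  name: 'gcr.io/cloud-builders/docker'
  args: ['build', '-t', 'gcr.io/$PROJECT_ID/my-app', '.']
```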
As for storing build outputs, you can refer to Storing images in Container Registry:
You can configure Cloud Build to store your built image in one of the following ways:
using the images field, which stores the image in the Container Registry after your build completes.
using the docker push command, which stores the image in the Container Registry as part of your build flow.
If your build produces artifacts like binaries or tarballs, as mentioned in Storing build artifacts, you can store them in Cloud Storage.
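A hedged sketch combining both ideas: the images field pushes the built image to Container Registry, and an artifacts section copies a binary to a Cloud Storage bucket (the bucket name and paths are placeholders):

```yaml
# Sketch: push the built image and store other build outputs in Cloud Storage.
steps:
- name: 'gcr.io/cloud-builders/docker'
  args: ['build', '-t', 'gcr.io/$PROJECT_ID/my-app', '.']
images: ['gcr.io/$PROJECT_ID/my-app']   # pushed after the build completes
artifacts:
  objects:
    location: 'gs://my-artifacts-bucket/'   # hypothetical bucket
    paths: ['bin/my-binary']                # hypothetical build output
```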
In case you want to use variables, I would refer to the site mentioned in a previous comment.
I have a .NET Core web app (NopCommerce 4.1) that I am attempting to set up a build and release pipeline for.
However, when I set up the pipeline, my deployment fails because it attempts to create a folder, but write rights do not exist. I have confirmed this with Kudu, where I get an error message (409) when attempting to create a folder via the cmd shell.
NopCommerce requires a couple of folders to be editable, but Azure Pipelines deploys a zip package and creates a folder structure that is read-only.
I want to deploy to dev, test, and prod environments with a folder structure that is editable (as NopCommerce creates folders and writes files to them dynamically).
I followed the following YAML structure:
https://damianbrady.com.au/2018/10/11/what-yaml-do-i-need-for-azure-pipelines/
Is there a way to create a build / deployment that will deploy either:
1. The files without zipping
2. Transfer a zip, unpack into a folder structure with execute/modify/create permissions
You could use the Azure Kudu zip API to do that.
Note: Kudu's zip API is not recommended for deployments.
The Kudu REST API is an effective way to move multiple files to your site, but zipdeploy is preferable for deployment. For more information, please refer to this document.
In your case, you could use a PowerShell task with a script that invokes the Kudu zip API. For information about how to invoke a REST API with PowerShell, please refer to this SO thread.
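As a rough sketch of that PowerShell task, posting a zip to the zipdeploy endpoint (the app name my-app and the $(kuduUser)/$(kuduPassword) secret variables are placeholders; the real values come from the app's publishing profile):

```yaml
# Sketch: push a zip to the app via Kudu's zipdeploy endpoint.
steps:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # Basic-auth header from the (hypothetical) publishing-profile credentials
      $pair  = "$(kuduUser):$(kuduPassword)"
      $token = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes($pair))
      Invoke-RestMethod `
        -Uri "https://my-app.scm.azurewebsites.net/api/zipdeploy" `
        -Method POST `
        -Headers @{ Authorization = "Basic $token" } `
        -InFile "$(Build.ArtifactStagingDirectory)\web.zip" `
        -ContentType "multipart/form-data"
```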
I am looking for some help to understand how to use Azure DevOps to deploy the application I am working on to a Windows VM.
Current process: Our code is in an Azure Git repo, and we have two QA servers that have already been set up. Each time, we go to the QA server and manually pull the latest code with a command-line git pull. Then we run a web page to upgrade/downgrade the database if the database script has been updated.
Goal: Use Azure DevOps to automate this process.
Here is what I would like to know:
1) With Azure DevOps, when deploying the code to a QA server, can we copy over only the changed files? The software package is pretty big, and it would take a long time to copy the whole thing.
2) How does Azure DevOps move files to the QA server: does it use git pull or a file copy?
3) When using Azure DevOps tools, can we trigger an HTTP(S) request?
4) Is there any tool I could use to check whether the Git repo has updates?
5) Is there any tool that supports if/else logic? We would trigger the HTTP(S) request only if the Git repo has changes.
I'd just like to get an overall idea.
1) As far as I know, there is no layering/caching.
2) How would it use git pull to download from a web server? It uses an HTTP request to download the package.
3) Not sure I understand the question, but you can have a script step in the deployment and do whatever you like (e.g., an HTTP(S) request); see the sketch after this list.
4) This question doesn't quite make sense. You can use the git command line, but I don't understand how that ties to the release process; you should build your code on commit and create a package that you later consume in the release process.
5) See 3 and 4.
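For point 3, a minimal sketch of a script step that fires an HTTP(S) request; the URL and payload are placeholders:

```yaml
# Sketch: a pipeline step that makes an HTTP(S) request.
steps:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      Invoke-RestMethod -Uri "https://example.com/api/notify" -Method POST `
        -Body (@{ build = "$(Build.BuildNumber)" } | ConvertTo-Json) `
        -ContentType "application/json"
```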
I'm setting up a test pipeline using VSTS, and now that I have my builds working with an Azure VM as a build agent, I want to also deploy to that machine.
I've tried using the Windows File Copy and IIS deploy tasks, but I understand this isn't a very good solution for security reasons, so I wondered what the best task would be to have the build/release agent on the machine copy the artefacts to the Azure-based VM and deploy locally to its IIS install.
I'd strongly suggest that you reconsider deploying your application to your build agent. This will make it extremely difficult to find issues due to missing DLLs or files, because the build server has everything. I suggest either creating another VM to deploy to or leveraging Azure's PaaS for web applications.
With all of that said, because you are working locally on the same VM, you can simply leverage the Copy Files task to move the files to where they need to be. To "deploy" the application, you can simply copy the output of the website to the IIS directory.
Another option would be to create a PowerShell script that would setup, configure and deploy the application to the local machine. In which case, you could simply leverage the PowerShell task.
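For illustration, a hedged sketch of that Copy Files task moving the site output into the local IIS directory (the paths are placeholders):

```yaml
# Sketch: copy the published site output to the local IIS folder on the same VM.
steps:
- task: CopyFiles@2
  inputs:
    SourceFolder: '$(Build.ArtifactStagingDirectory)/site'   # hypothetical output path
    Contents: '**'
    TargetFolder: 'C:\inetpub\wwwroot\myapp'                 # hypothetical IIS path
```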
The source (the Get sources section in the build definition) is downloaded to the build agent automatically during the build, so you don't need to copy the files to that machine through the Windows File Copy task. A simple workflow (see the sketch after this list) is:
Add NuGet task to restore packages
Add Visual Studio Build task (MSBuild Arguments: /p:SkipInvalidConfigurations=true /p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageLocation="$(build.artifactstagingdirectory)\\web.zip" /P:PackageTempRootDir="")
Add WinRM-IIS Web App Deployment task: (Web Deploy package: $(Build.ArtifactStagingDirectory)\web.zip)
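A rough YAML sketch of the restore and build steps above; task versions and solution paths may differ in your setup:

```yaml
# Sketch: restore packages and build a web deploy package.
steps:
- task: NuGetCommand@2
  inputs:
    command: 'restore'
    restoreSolution: '**/*.sln'
- task: VSBuild@1
  inputs:
    solution: '**/*.sln'
    msbuildArgs: >-
      /p:SkipInvalidConfigurations=true /p:DeployOnBuild=true
      /p:WebPublishMethod=Package
      /p:PackageLocation="$(Build.ArtifactStagingDirectory)\web.zip"
      /p:PackageTempRootDir=""
```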
As virusstorm said, you can copy files to another path on that machine through the Copy Files task.
On the other hand, by default the artifact is downloaded to the target machine if you are using a release, and you can consider deployment groups if the deployment machine is different from the build agent machine.