Azure Pipelines, self-hosted agent: can I use a zip utility from scripts?

I am working on Azure Pipelines using a Windows self-hosted agent behind a firewall. After creating the artefact, I want to zip the current version that exists in the target folder and store the zip in a shared folder in case we need to roll back or compare.
I don’t want to use a predefined task in the pipeline for that as the machine names and folders need to be hidden.
I created a PowerShell script that runs the 7-Zip utility, but I had to install it on a server and provide the full path to it, while I believe some zip utility already exists on the agent.
Are we allowed to reference a provided tool like that, and is there a variable pointing to it, or should I simply install it on the agent server?
Any other recommended approach?
Thanks.

There is no guarantee any tools that come with the current agent will:
remain the same version/compatible
keep shipping with the agent
The team building the agent tries to keep it as slim as acceptable. The agent also ships with the Azure DevOps Server DVD images, where every bit counts; tools have been removed in the past.
PowerShell
Depending on the features you need, PowerShell also has built-in archive support with Compress-Archive as of version 5, which has been around for some time now; that wouldn't require you to install anything on the server.
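For example (a minimal sketch; the paths are placeholders):
PS C:\> Compress-Archive -Path 'C:\folder\*' -DestinationPath 'C:\output.zip' -Force
PS C:\> Expand-Archive -Path 'C:\input.zip' -DestinationPath 'C:\output'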
On older PowerShell versions you can tap into the .NET Framework directly:
Zip:
PS C:\> Add-Type -A 'System.IO.Compression.FileSystem';
[IO.Compression.ZipFile]::CreateFromDirectory('C:\folder', 'C:\output.zip')
Unzip:
PS C:\> Add-Type -A 'System.IO.Compression.FileSystem';
[IO.Compression.ZipFile]::ExtractToDirectory('C:\input.zip', 'C:\output')
Windows
Windows also has built-in support for archiving with compress/expand.
You could also create a vhdx virtual drive, mount it and copy the files over. There are a number of options to turn on deduplication and a couple of other fancy features.
Very recent versions of Windows also come with tar.
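For example, a sketch using the bundled bsdtar (assuming its -a flag, which infers the compression format from the output extension):
PS C:\> tar -a -cf C:\output.zip -C C:\folder .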
Alternate tools
Short term: you can rely on the tools being there in the agent's external tools folder.
Long term: a Tool Installer Task (allowing you different versions per pipeline), a choco install, or a winget install is going to be more reliable. You can also install the tool directly on the server.
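For instance, a hedged one-liner from an elevated script (assuming the standard Chocolatey package name):
PS C:\> choco install 7zip -y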

Related

Create an Azure virtual machine with premade files and run them?

Is there a way in Azure to create a new virtual machine with preselected files that will always be there when the machine is provisioned, and also to run them?
I have a shell script that I have to run on new Ubuntu machines that I deploy and I was wondering if there's a way to make Azure already install Ubuntu with those files and maybe even run them.
You can store the files in a storage account and quickly get the files in your VM: https://learn.microsoft.com/en-us/azure/storage/files/storage-files-quick-create-use-windows. Alternatively, you can restore a backup of a VM that has all prerequisites installed: https://learn.microsoft.com/en-us/azure/backup/backup-azure-arm-restore-vms.
If this is not what you're looking for, I think you should create an ISO of your VM with all software/files installed that you want. This is however not straightforward, see the discussion here: https://serverfault.com/a/952930
If I understand correctly, you're looking for a way to run the shell script when creating a new VM. Then I recommend cloud-init; it can run your shell script to provision the VM at creation time. You can follow the notes here to use a shell script.
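As a hedged sketch with the Azure CLI (the resource names, image alias, and cloud-init file are assumptions), you pass the script as custom data at creation time:
PS C:\> az vm create --resource-group myRG --name myVM --image Ubuntu2204 --custom-data cloud-init.txt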

Setting Up Continuous Deployment of a WPF Desktop Application

For a project I am currently working on, I need to create a setup application for an existing desktop application. The setup application will be downloaded from a website, and will download required files to the correct locations. When the application is started, it will look for newer versions of these files, download them if any exist, then start the application.
I am using Visual Studio Online with TFVC, linked to Azure. I have a test application set up so that when I trigger a build, Release Management finds the build directory, and moves the files to Azure Blob Storage, but prepends a GUID to the file names being transferred. So what I have in my storage container is:
{Some GUID}/2390/Test.exe
{Some GUID}/2389/Test.exe
{Some GUID}/2387/Test.exe
...
What I want in my container is the latest version of Test.exe, so I can connect to the container, and determine whether I want to download or not.
I have put together a NullSoft installer that checks a website, and downloads files. I have also written a NullSoft "launcher" that will compare local file versions with versions on the website (using a version xml file on the website), and download if newer, then launch the application. What I need to figure out is how to get the newer files to the website after a build, with automation being one of the goals.
I am an intern, and new to deployment in general, and I don't even know if I'm going about this the right way.
Questions:
Does what I am doing make sense for what I am trying to accomplish?
We are trying to emulate ClickOnce functionality, but can't use ClickOnce due to the fact that the application dynamically loads a number of DLLs. Is there a way to configure ClickOnce to include non-referenced DLLs?
Is there a best practice for doing what I'm describing?
I appreciate any advice, links to references, or real-world examples.
You mention ClickOnce, which you investigated but can't use. Have you already tried an alternative, Squirrel? With Squirrel you can specify which files should be part of the installation, so you can explicitly include files even if you load them dynamically.
Link: https://github.com/Squirrel/Squirrel.Windows
Squirrel is a full framework for creating an auto-updating application and can work with Azure Blob Storage hosting (and also a CDN if you need to scale up).
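A rough sketch of that release flow (the package name, container, and storage account are hypothetical): releasify with Squirrel's CLI, then push the generated Releases folder to blob storage:
PS C:\> Squirrel --releasify MyApp.1.0.0.nupkg
PS C:\> az storage blob upload-batch -d releases -s .\Releases --account-name mystore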

Including installers in an Azure cloud service bloats the resulting package

I have a standard Azure web role that requires some 3rd-party software to be installed on the cloud service. The web role itself is very basic. The 3rd-party software has a few prerequisites, so everything is included in the role's content (a total of 5 MSI files) and is installed via an elevated startup task.
The software is installing successfully and everything's working, but including these MSIs as content for the webrole results in an 80MB *.cspkg file.
Excluding the installers yields a package size of 10MB. The total size of all the 5 MSI files is 20.5MB. I don't understand why including 20MB of installers results in an 80MB cloud service package.
I think the packaging process might be attempting to compress the files (unintentionally increasing the payload). Is what I'm seeing normal? Or is there any way of reducing the resulting package when installers are included?
The package for web roles is usually bigger than the one for worker roles because your role content is included twice: once in the approot folder and once in the sites\0\ folder.
As for your second thought, compression: this is true. An Azure package is just a regular zip file. You can safely rename yourpackage.cspkg to yourpackage.zip and browse the content.
However, your strategy of keeping installation files in the package is not effective at its root. What I suggest is that you put your 3rd-party installers in blob storage, download them first, and then install them. For downloading the content you can use various techniques, from PowerShell to the Azure Bootstrapper and probably many more.
Having any and all of your 3rd-party installers in blob storage is a recommended best practice for designing applications for Windows Azure.
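A minimal PowerShell sketch of that pattern (the blob URL and MSI name are assumptions; a real startup task would add error handling and a SAS token or storage credentials):
$url = 'https://mystore.blob.core.windows.net/installers/prereq.msi'
$msi = "$env:TEMP\prereq.msi"
# WebClient works even on older guest-OS PowerShell versions
(New-Object System.Net.WebClient).DownloadFile($url, $msi)
# /qn = silent install; run this from an elevated startup task
Start-Process msiexec.exe -ArgumentList "/i `"$msi`" /qn" -Wait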
As to the question of why your web application is copied twice in the Azure deployment package: it has its roots in the times when the Windows Azure web role used IIS Hostable Web Core rather than full IIS, and it is probably still there for backward compatibility.
I found that if you add the MSI to the \bin folder of the Azure project, it isn't copied multiple times, in this style:
https://learn.microsoft.com/en-us/azure/cloud-services/cloud-services-dotnet-install-dotnet
Using Visual Studio 2017, Azure SDK 2.9, Build Engine version 14.0.27530.0

OnStart vs Startup Script for batch file?

I have a Ruby on Rails application that needs to find a home in an Azure Worker Role.
I currently automate the deployment of the application with a batch file: it takes the Apache and Ruby installers, runs them, and then drops the RoR app in the appropriate directory. After the batch script finishes, Apache is serving the application via port 80.
I'm new to Azure and trying to figure out how to do this.
From my understanding, I have two options here: OnStart with the installation files in blob storage, or a startup script. I'm not sure how to do the latter, but I have located the OnStart method within the WorkerRole.vb file in the new Azure project I just created.
My question: Is it recommended to use OnStart to deploy the application (using the batch script)? If so, how would I go about integrating the script into the project? And - how do I get started with storing and referencing the files in blob storage?
I know these are super high-level questions. Any input or suggested reading would be super helpful. I have tried to google / search for relevant resources but haven't been able to find much. Thank you for your time!
Inside the OnStart() function it is better to do role-configuration things, i.e. IP binding, etc. However, if you want to install a runtime, download an application zip, or configure role-specific settings, it is best to use a startup task. Please visit my blog post "Windows Azure: Startup task or OnStart(), which to choose?" to learn more about it.
Now in your case it is best to use a startup task. What you can do is:
1. Create your RoR package as a zip and place it in Windows Azure Blob Storage.
2. Create a command batch file which will:
2.1 download the zip
2.2 unzip the content to a specific location
2.3 update the status back to Azure Blob Storage (optional)
3. In your OnStart() function you just need to configure RoR.
The code will look as below if you have a TCP endpoint named "RORWeb80" set to use port 80:
TcpListener RoRPortListener = new TcpListener(RoleEnvironment.CurrentRoleInstance.InstanceEndpoints["RORWeb80"].IPEndpoint);
RoRPortListener.Start();
I have written a sample app for a Tomcat/Java-based worker role which does exactly the same. So you can just replace the Tomcat zip file with your RoR zip and reuse the code exactly.
As long as you don't need admin-level access (e.g. modifying the registry, installing MSIs, etc.) you can do your setup from OnStart(), including launching your script. Just include the startup script with your project (don't forget to set Copy Local to true).
The same goes for a startup script: you call your cmd file, which then executes the sequence for you. And if you give it elevated permissions, you can run installers, modify registry settings, install custom perf counters, whatever.
In either case, you can keep your Apache zip, Ruby installers, etc. in blob storage and, at startup, download them to local storage. This saves you from bundling everything within the deployment, which gives you a few advantages (being able to update Ruby/Apache without redeploying, reduced package size, etc.).
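As a hedged sketch of that download-and-unzip step (the URL and paths are assumptions; the ZipFile type is the same .NET API shown in the first thread above):
$zip = "$env:TEMP\apache.zip"
# Pull the archive from blob storage down to local storage
(New-Object System.Net.WebClient).DownloadFile('https://mystore.blob.core.windows.net/runtime/apache.zip', $zip)
# Extract it next to the role's application files
Add-Type -A 'System.IO.Compression.FileSystem'
[IO.Compression.ZipFile]::ExtractToDirectory($zip, "$env:TEMP\apache")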
There's a sample app on CodePlex that demonstrates the basics of setting up Tomcat via a startup script. For one more example, you can look at the scripts installed via the Eclipse Windows Azure plugin for Java. These scripts are quite similar. The key is to have some way of downloading files from blob storage and then unzipping them. The CodePlex project I referred to points to a sample app that does simple blob downloading. The Eclipse packaging provides similar functionality in a .vbs app. Here's a snippet of one of my scripts from an Eclipse-based project:
SET SERVER_DIR_NAME=apache-tomcat-7.0.25
SET WAR_NAME=myapp.war
:: Map a short, stable path to the role's approot
rd "\%ROLENAME%"
mklink /D "\%ROLENAME%" "%ROLEROOT%\approot"
cd /d "\%ROLENAME%"
:: Unzip the JRE and Tomcat packages bundled with the role
cscript /NoLogo util\unzip.vbs jre7.zip "%CD%"
cscript /NoLogo util\unzip.vbs tomcat7.zip "%CD%"
:: Drop the WAR into Tomcat's webapps folder and start the server
copy %WAR_NAME% "%SERVER_DIR_NAME%\webapps\%WAR_NAME%"
cd "%SERVER_DIR_NAME%\bin"
set JAVA_HOME=\%ROLENAME%\jre7
set PATH=%PATH%;%JAVA_HOME%\bin
cmd /c startup.bat
The CodePlex project has a similar-looking script.
Don't forget: you'll need to set up an Input Endpoint for your role (part of the role properties).
To get blobs into blob storage, there are both free tools (like ClumsyLeaf CloudXplorer) and paid tools (such as Cerebrata's Cloud Storage Studio).
To download blobs to local storage, you can either write a few lines of .NET code (from OnStart) or just use the utility pointed to in the CodePlex project.

What is a good deployment tool for websites on Windows?

I'm looking for something that can copy (preferably only changed) files from a development machine to a staging machine and finally to a set of production machines.
A "what if" mode would be nice as would the capability to "rollback" the last deployment. Database migrations aren't a necessary feature.
UPDATE: A free/low-cost tool would be great, but cost isn't the only concern. A tool that could actually manage deployment from one environment to the next (dev->staging->production instead of from a development machine to each environment) would also be ideal.
The other big nice-to-have is the ability to only copy changed files - some of our older sites contain hundreds of .asp files.
@Sean Carpenter, can you tell us a little more about your environment? Should the solution be free? Simple?
I find robocopy to be pretty slick for this sort of thing. Wrap it up in a batch file and you are good to go. It's a glorified xcopy, but deploying my website isn't really hard: just copy out the files.
As far as rollbacks go: you are using source control, right? Just pull the old source out of there. Or, in your batch file, ALSO copy the deployment to another folder called "website yyyy.mm.dd" so you have a lovely folder ready to go in an emergency.
Look at the for command for details on how to get the parts of the date.
robocopy.exe
for /?
Yeah, it's a total "hack" but it moves the files nicely.
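A small PowerShell sketch of that idea (the paths are assumptions; robocopy's /L switch gives a "what if" preview, and the dated folder covers rollback):
$src   = 'C:\dev\MyWebsite'                      # hypothetical source
$dest  = '\\webserver\wwwroot\MyWebsite'         # hypothetical target
$stamp = Get-Date -Format 'yyyy.MM.dd'
# Snapshot the current site first, so rollback is just a copy back
robocopy $dest "\\webserver\backups\website $stamp" /E
# Mirror the source: only changed files are copied (add /L to preview)
robocopy $src $dest /MIR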
For some scenarios I used a freeware product called SyncBack (Download here).
It provides complex, multi-step file synchronization (file system or FTP etc., with compression etc.). The program has a nice graphical user interface. You can define profiles and group/execute them together.
You can set filters on file types, names, etc. and execute commands/programs after the job runs. There is also a job log provided as an HTML report, which can be emailed to you if you schedule the job.
There is also a professional version of the software, but for common tasks the freeware should do fine.
You don't specify if you are using Visual Studio .NET, but there are a few built-in tools in Visual Studio 2005 and 2008:
Copy Website tool -- basically a visual synchronization tool, it highlights files and lets you copy from one to the other. Manual, built into Visual Studio.
aspnet_compiler.exe -- lets you precompile websites.
Of course you can create a web deployment package and deploy as an MSI as well.
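For example, a hedged invocation of the precompiler (the virtual path and folders are assumptions; the tool lives under %windir%\Microsoft.NET\Framework\<version>):
PS C:\> aspnet_compiler -v /MySite -p C:\src\MySite C:\build\MySite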
I have used a combination of CruiseControl.NET, NAnt and MSBuild to compile, swap out configuration files for specific environments, and copy the files to a build output directory. Then we had another NAnt script to do the file copying (and run database scripts if necessary).
For a rollback, we would save all prior deployments, so theoretically rolling back just involved redeploying the last working build (and restoring the database).
We used UnleashIt (unfortunate name I know) which was nicely customizable and allowed you to save profiles for deploying to different servers. It also has a "backup" feature which will backup your production files before deployment so rollback should be pretty easy.
I've given up trying to find a good free product that works.
I then found Microsoft's SyncToy 2.0, which, while lacking in options, works well.
BUT I need to deploy to a remote server.
Since I connect with Terminal Services, I realized I can select my local hard drive when I connect, and then in Explorer on the remote server I can open \\tsclient\S\MyWebsite.
I then use SyncToy with that path and synchronize it with my server. It seems to work pretty well and fast so far...
Maybe rsync plus some custom scripts will do the trick.
Try Repliweb. It handles full rollback to previous versions of files. I've used it whilst working for a client who demanded its use, and I've become a big fan of it, particularly for:
Rollback to previous versions of code
Authentication and rules for different user roles
Deploy to multiple environments
Full reporting to the user via email / logs stating what has changed, what the current version is, etc.
