Create an Azure virtual machine with premade files and run them?

Is there a way in Azure to create a new virtual machine that always comes with a preselected set of files, and to run them automatically?
I have a shell script that I have to run on new Ubuntu machines that I deploy, and I was wondering whether there's a way to make Azure install Ubuntu with those files already in place and maybe even run them.

You can store the files in a storage account and quickly get them onto your VM: https://learn.microsoft.com/en-us/azure/storage/files/storage-files-quick-create-use-windows. Alternatively, you can restore a backup of a VM that has all prerequisites installed: https://learn.microsoft.com/en-us/azure/backup/backup-azure-arm-restore-vms.
If this is not what you're looking for, I think you should create an ISO of your VM with all the software/files you want already installed. This is, however, not straightforward; see the discussion here: https://serverfault.com/a/952930
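For the storage-account route, the Azure CLI can pull a whole file share down onto the VM in one call. A minimal sketch, assuming the az CLI is installed and authenticated on the VM; the account name, share name, and destination folder are placeholders:
az storage file download-batch --account-name mystorageacct --source myshare --destination ./files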

If I haven't misunderstood, you're looking for a way to run the shell script when creating a new VM. In that case I recommend cloud-init: it can run your shell script to provision the VM at creation time. You can follow the notes here to use the shell script.
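A minimal sketch of the cloud-init route, assuming the Azure CLI; the resource group, VM name, image alias, script path, and script contents are all placeholders:
# Build a cloud-init file that writes the script onto the VM and runs it on first boot
$cloudInit = @'
#cloud-config
write_files:
  - path: /opt/setup.sh
    permissions: '0755'
    content: |
      #!/bin/bash
      echo "first-boot provisioning" >> /var/log/setup.log
runcmd:
  - /opt/setup.sh
'@
Set-Content -Path cloud-init.txt -Value $cloudInit
# Create the VM with the script injected as custom data
az vm create --resource-group myRg --name myUbuntuVm --image Ubuntu2204 --custom-data cloud-init.txt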

Related

Synology DS120j autostart node.js server in daemon mode on system boot

How do I start a node.js server in daemon mode on a Synology DS120j NAS (and other Synology models, if it's similar), given that there appears to be no systemd?
You need to create an installer package (spk).
For that, see the related developer documents provided by Synology.
If you have Docker on your DiskStation, you can find a solution by KeesCBakker on GitHub.
Even though it is possible to create the required files in place on the DiskStation, I recommend creating an spk with a build environment such as the one provided by SynoCommunity.

Azure pipelines, Self-hosted agent, can I use a zip utility from scripts

I am working on Azure Pipelines using a Windows self-hosted agent behind a firewall. After creating the artefact, I want to zip the current version that exists in the target folder and store the zip on a shared folder, in case we need to roll back or compare.
I don’t want to use a predefined task in the pipeline for that as the machine names and folders need to be hidden.
I created a PowerShell script that runs the 7-zip utility, but I had to install it on a server and provide the full path to it, while I believe some zip utility already exists on the agent.
Are we allowed to reference provided tools like that, and is there a variable pointing to them, or should I simply install the tool on the agent server?
Any other recommended approach?
Thanks.
There is no guarantee any tools that come with the current agent will:
- remain the same version/compatible
- keep shipping with the agent
The team building the agent tries to keep the agent as slim as acceptable. It also ships with the Azure DevOps Server DVD images where every bit counts. Tools have been removed in the past.
PowerShell
Depending on the features you need, PowerShell also has built-in archive support with Compress-Archive as of version 5, which has been around for some time now; that wouldn't require you to install anything on the server.
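For example, a minimal sketch; the source folder and share path are placeholders (the timestamp in the name makes rollback comparisons easier):
PS C:\> Compress-Archive -Path 'D:\deploy\current\*' -DestinationPath "\\fileserver\rollback\site-$(Get-Date -Format 'yyyyMMdd-HHmm').zip"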
On older PowerShell versions (before 5) you can tap into the .NET Framework directly:
Zip:
PS C:\> Add-Type -A 'System.IO.Compression.FileSystem';
[IO.Compression.ZipFile]::CreateFromDirectory('C:\folder', 'C:\output.zip')
Unzip:
PS C:\> Add-Type -A 'System.IO.Compression.FileSystem';
[IO.Compression.ZipFile]::ExtractToDirectory('C:\input.zip', 'C:\output')
Windows
Windows also has built-in support for archiving with compress/expand.
You could also create a vhdx virtual drive, mount it and copy the files over. There are a number of options to turn on deduplication and a couple of other fancy features.
Very recent versions of Windows also come with tar.
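For example, a sketch with the bundled tar; the paths are placeholders:
PS C:\> tar -czf C:\backup\site.tar.gz -C C:\site .
PS C:\> tar -xzf C:\backup\site.tar.gz -C C:\restore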
Alternate tools
Short term: you can rely on the tools to be there in the external tools folder of the agent.
Long term: a Tool Installer Task (allowing you to use different versions per pipeline), a choco install, or a winget install is going to be more reliable. You can also install the tool directly on the server.
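For example, assuming Chocolatey or winget is available on the agent machine (the 7-Zip package IDs shown are the commonly published ones):
PS C:\> choco install 7zip -y
PS C:\> winget install --id 7zip.7zip --silent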

Setting up Windows Pycharm to use Git on my linux server

So I'm trying to do something probably pretty simple but can't figure it out, of course. I have a VM on my network running CentOS, and I installed Git using the guide. Now I'm on my Windows PC using PyCharm trying to set up Git, but the setup asks for where the git.exe file is, and I can't seem to navigate to my CentOS VM within PyCharm to point it anywhere. I tried \\<IP address> but that didn't work. Is there something I need to do on the Linux VM to allow the Windows PyCharm to reach it? I'm new to this on both sides, lol.
If I understand your question correctly, you want to access a Git repository on your local Linux network. In that case you need to be able to access those files from your Windows machine in order to push/fetch your changes to/from it. A simple way to do this could be to set up file sharing as explained here (link) and then clone and handle it like any local Git repository (see e.g. link; with local repos the file path is used instead of the URL).
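Once a share is reachable from Windows, cloning is just a path; and if SSH is enabled on the CentOS VM, cloning over SSH is a common alternative to a file share. A sketch; the host, IP, share, and repository names are placeholders:
PS C:\> git clone //centos-vm/share/myrepo.git
PS C:\> git clone user@192.168.1.50:/home/user/myrepo.git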
An even simpler way could be to create a hosted online Git repository (e.g. on GitHub), even if you are the sole contributor; maybe make it a private one if you don't want the contents to be public.

Merge two different Verdaccio storage folders (CouchDb databases)

I am trying to set up a completely offline Verdaccio installation that I can use as a proxy for NPM packages for a small team of developers and for our build machine.
My challenge is that I would like to be able to update the offline npm packages from time to time (to add more packages). To do this, the only possibility I have is to use another machine (a laptop) that is outside this isolated network and has a separate Verdaccio installation, and npm install the packages there. After installation I have the new tgz files in the storage folder. But my question is: how do I merge the storage folder of the offline Verdaccio installation with the storage folder of the online one? I cannot do this manually, especially when packages have a lot of dependencies.
Is there some replication that I could easily setup (I am not an expert in CouchDb) or even a plain CLI instruction I could use? Or is there a way to achieve this with Verdaccio or some other utility?
Please bear in mind that this network is completely isolated (without network access), even though it is also used for development.
I just got the craziest idea, and it could be a good solution:
Whenever I need to merge new storage folder content, I just copy it to a temporary location, start a new Verdaccio instance (with its storage at that location), and then set the uplink of the main instance to the new/temporary one.
This way I can run 'npm install' for the new packages against the main registry, and this automatically updates the main storage folder (merging the databases). At the end I simply stop the second Verdaccio instance and clean up the temp folder.
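A sketch of that setup, assuming the temporary instance listens on port 4874; the uplink name and paths are placeholders:
# Start the temporary instance (its config points at the copied storage folder)
verdaccio --config ./temp-config.yaml --listen 4874
# In the MAIN instance's config.yaml, proxy everything through the temporary one:
uplinks:
  temp:
    url: http://localhost:4874/
packages:
  '**':
    access: $all
    proxy: temp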
For exactly this problem I created this CLI tool: bulk-npm-publish.
How to use it:
1. Copy the storage folder to your isolated network.
2. Run bulk-npm-publish.
3. Follow the steps (if you pass -i, it will run interactively).
4. Run the outputted publish script.
With this tool you can skip already-existing packages.

How to install applications on azure VM

I have an Azure VM with IIS set up through an ADO (Azure DevOps) playbook. Is it possible to have the playbook also install applications like PuTTY? I'm using Ansible in the playbook as well.
I thought I might be able to use get_url but that will just download the installer.
If there is no way, then is the best practice to just build up the VM the way I need it and to create an image?
You could likely find a way to install programs using something like the Custom Script Extension, or by manually modifying the install files to skip any user input. That being said, the best option would be to build a VM with the required programs already installed and then capture an image of it. This will save you a lot of time and effort, so I suggest simply going that route instead.
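A sketch of the capture flow with the Azure CLI; the resource group, VM, and image names are placeholders (for a Linux VM, run 'waagent -deprovision+user' inside the VM first; for Windows, run sysprep):
az vm deallocate --resource-group myRg --name myVm
az vm generalize --resource-group myRg --name myVm
az image create --resource-group myRg --name myGoldenImage --source myVm
# New VMs can then be created from the captured image:
az vm create --resource-group myRg --name myNewVm --image myGoldenImage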
