I have a large PowerShell script that creates a machine and performs a bunch of configuration; it relies on the Az module together with some custom modules I have written. I can run it from my machine, but I am exploring the possibility of running it from Azure and letting others run it without needing to fetch the latest version of the script and its dependent modules.
I have looked into Azure Functions, Logic Apps and Pipelines, but I don't really know where to start or which one is the most suitable.
The workflow I would like to achieve is this:
A teammate would specify a machine name and trigger the script.
The script would then use Az modules and pull some modules from a git repo to create and configure a VM.
The teammate would receive some sort of feedback to show if the script was successful, maybe a log or an email notification.
Have you looked at a Runbook running in an Automation Account? You can manage your source with Azure DevOps and use the Automation Account's Source Control integration. You can also use the Azure Arc agent on a VM to run your script.
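As a minimal sketch, assuming the Automation Account has a managed identity with rights to create VMs and that the Az and custom modules have been imported into the account, a runbook for this workflow could look something like the following (the resource group, image, size and credential asset names are illustrative, not taken from your script):

```powershell
# Runbook sketch: create a VM from a teammate-supplied machine name.
# Assumes Az.Accounts/Az.Compute are imported into the Automation Account
# and its managed identity has Contributor rights on the resource group.
param(
    [Parameter(Mandatory = $true)]
    [string]$MachineName
)

# Sign in as the Automation Account's managed identity
Connect-AzAccount -Identity | Out-Null

# Illustrative values - replace with your own resource group, image, size, etc.
$resourceGroup = 'rg-lab-vms'
$location      = 'westeurope'
$cred          = Get-AutomationPSCredential -Name 'VmAdminCredential'

New-AzVM -ResourceGroupName $resourceGroup `
         -Name $MachineName `
         -Location $location `
         -Image 'Win2022Datacenter' `
         -Size 'Standard_D2s_v3' `
         -Credential $cred

Write-Output "VM '$MachineName' created."
```

A teammate could then start the runbook from the portal or a webhook with just the machine name, and the job output and streams give the success/failure feedback; an alert on job completion could turn that into an email.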
I am trying to better understand Azure Virtual Machine Scale Sets and how my company can use them. Currently we run custom software (a WPF program) that will need to be deployed and updated on all the VMs.
Is there an extension where I can set up the deployment of the WPF program?
Can I pull the files from a git repo to deploy?
How do I configure this?
Not directly. There is a Custom Script Extension and an Azure PowerShell DSC extension (and several others, like Chef) which allow you to do pretty much anything, but nothing built-in.
No, you cannot do that natively; you can use those extensions to do whatever you like. Or, better, you can use CI/CD systems (like Azure DevOps) to do that. You need to install an agent on the VM (in most cases) and then use that CI/CD system to deploy to the VMSS instances.
Another alternative is using images, which is the native way, but you need to prebuild the images (with Packer, perhaps).
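As a rough sketch of the Custom Script Extension route mentioned above (all resource names, URLs and the install script are placeholders, not a tested recipe), attaching the extension to an existing scale set with PowerShell could look like this:

```powershell
# Sketch: attach the Custom Script Extension to an existing VMSS so each
# instance downloads and runs an install script for the WPF app.
# Resource names and URLs below are placeholders.
$vmss = Get-AzVmss -ResourceGroupName 'rg-wpf' -VMScaleSetName 'vmss-wpf'

$settings = @{
    fileUris         = @('https://mystorage.blob.core.windows.net/deploy/install-wpf.ps1')
    commandToExecute = 'powershell -ExecutionPolicy Unrestricted -File install-wpf.ps1'
}

Add-AzVmssExtension -VirtualMachineScaleSet $vmss `
    -Name 'InstallWpfApp' `
    -Publisher 'Microsoft.Compute' `
    -Type 'CustomScriptExtension' `
    -TypeHandlerVersion '1.10' `
    -Setting $settings

# Push the updated model; instances pick it up on the next (automatic or manual) upgrade.
Update-AzVmss -ResourceGroupName 'rg-wpf' -VMScaleSetName 'vmss-wpf' -VirtualMachineScaleSet $vmss
```

The install script itself could pull from your Git repo, but the extension only runs when the model changes or an instance is reimaged, which is why a CI/CD agent or prebuilt images tend to work better for frequent updates.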
We are deploying into Azure using Octopus Deploy. We have been using it for more than a year, and about three weeks ago we suddenly started to get errors on a few deployments.
Microsoft.Web.Deployment.DeploymentDetailedClientServerException: Web Deploy cannot modify the file 'msvcr120.dll' on the destination because it is locked by an external process. In order to allow the publish operation to succeed, you may need to either restart your application to release the lock, or use the AppOffline rule handler for .Net applications on your next publish attempt. Learn more at: http://go.microsoft.com/fwlink/?LinkId=221672#ERROR_FILE_IN_USE
We have the web app running with Always On enabled, and we have the app setting 'MSDEPLOY_RENAME_LOCKED_FILES' set to 1, which in theory prevents this.
Does anyone know if something was changed in Azure or Octopus?
There are a number of reasons files may be locked during deployment. You should be able to get an idea of what may be locking files by using the Kudu process explorer, which you can access at the URL {yoursite}.scm.azurewebsites.net.
To avoid the locking issue altogether, you could make use of slots to achieve a zero-downtime deployment, if that's an option for you. In that case you could stop the site or enable App Offline, which should unlock any files and allow the deployment to succeed, after which a slot swap will make the deployment live. App Offline is preferred over MSDEPLOY_RENAME_LOCKED_FILES, but it will take the application offline during the deployment. Octopus also has support for this as an option on the Deploy an Azure Web App step itself, so it may be worth a try even without slots.
You can use custom pre/post-deployment scripts as part of your Deploy an Azure Web App step and call the Stop-AzureRmWebAppSlot, Start-AzureRmWebAppSlot and Switch-AzureRmWebAppSlot PowerShell cmdlets to achieve the above.
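For example, a hedged sketch of such pre/post-deployment scripts using those cmdlets (the resource group, app and slot names are placeholders):

```powershell
# Pre-deployment: stop the staging slot so Web Deploy doesn't hit locked files.
Stop-AzureRmWebAppSlot -ResourceGroupName 'rg-web' -Name 'my-webapp' -Slot 'staging'

# ... Octopus deploys the package to the (stopped) staging slot here ...

# Post-deployment: start the slot again and swap it into production.
Start-AzureRmWebAppSlot -ResourceGroupName 'rg-web' -Name 'my-webapp' -Slot 'staging'
Switch-AzureRmWebAppSlot -ResourceGroupName 'rg-web' -Name 'my-webapp' `
    -SourceSlotName 'staging' -DestinationSlotName 'production'
```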
An alternative may be to use zip deployments; however, the Deploy an Azure Web App Octopus step doesn't have first-class support for this quite yet. It can still be achieved using a Run an Azure PowerShell Script step along with a package reference, if this is what you are wanting to do.
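A minimal sketch of what that script could do, assuming you push the packaged zip to the Kudu zipdeploy endpoint with the site's publishing credentials (the site name, credentials and package path are placeholders):

```powershell
# Sketch: zip deploy from a Run an Azure PowerShell Script step via the Kudu API.
# Site name, credentials and package path are placeholders.
$user = 'publishing-user'
$pass = 'publishing-password'
$auth = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("${user}:${pass}"))

Invoke-RestMethod -Uri 'https://my-webapp.scm.azurewebsites.net/api/zipdeploy' `
    -Method POST `
    -Headers @{ Authorization = "Basic $auth" } `
    -InFile '.\package.zip' `
    -ContentType 'application/zip'
```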
I am investigating ways to automate deployment of a specific build of a product to a specific Azure Cloud Service or VM.
The following steps would be automated, with as little manual intervention as possible:
Create a Cloud Service or VM
Install a specific build of the product (as a standalone exe or Windows service, not IIS)
Tweak the configuration files(s)
Set up user account(s)
Run the exe/service
The code is currently in Visual Studio Online / TFS. We have Cruise Control .NET CI set up and we are looking at moving to TeamCity.
This will be used for the usual QA & Production type environments, but also for ad-hoc deployment e.g. if a trial feature has been added to the product and we want to deploy that to a new VM for a specific customer to play around with. Ideally we would be able to use the command line or a UI to pick the build, create the VM and specify any configuration changes.
One possible solution might be Octopus Deploy, although I don't think it would be able to actually create an Azure VM. I will probably also look at the Azure API, and also TFS deploy.
Basically, is this feasible, and are there any proven alternatives that I'm missing, to help narrow down my research?
Thanks in advance!
While Octopus Deploy can do many things, in this particular scenario of yours, you're asking it to do three types of work - release management, automated provisioning and configuration management. It's a fine line between automation awesomeness and a really sticky situation.
Of the tasks you're asking about, almost all of them can be done within Octopus today. I'd argue that it may even be possible to Create a Cloud Service or VM: if there's some PowerShell cmdlet/library that allows you to spin up VMs with authentication, odds are you can do it in Octopus - but it may not be the right tool for that job today. Why?
In my opinion, it blurs the boundary between Developers, DevOps and SysAdmins. Whichever configuration management tool you use (Chef, Puppet, Salt, etc.), it needs a whole layer of users with the expertise to back it up - expertise which the very developers who want such flexibility may not have. Secondly, right now this isn't a focus within Octopus (yet). I'd be hard pressed to say whether to use a tool such as Octopus based on what it can do versus what it should do.
It's really nice that Azure now has support for preinstalling the Octopus Tentacle on VMs. But that requires additional info, such as the server thumbprint, port and other supplementary configuration, in order to automate VM provisioning. Should that configuration management be under Octopus's control, or under something like Chef or Puppet? I honestly don't have an answer to this, but my feeling as of now is not Octopus. Someday, perhaps, but until this is really ready and fully tested and vetted, I'd wait it out (a little), at least with Octopus.
If you're the adventurous type, then by all means try out Octopus. I may do a PoC (proof of concept) of this infrastructure automation later this year, but to rely on it today for business/production usage as the primary means of infrastructure automation will be risky and require a lot of work and experimentation. Again, I'm not saying it cannot be done, I'm questioning whether it should be done within Octopus as of this response today.
From the Octopus Deploy side of things, is this feasible? Yes - it just hasn't quite been worked out yet. Looking at what you want to do, I'd say it's a two-phase process: 1. spinning up the new VM and attaching the Tentacle to the environment, and 2. running the deployment process on that new VM.
I'd also recommend checking out the Octopus blog. They're publicly talking about infrastructure automation. You can read about it here: http://octopusdeploy.com/blog/rfc-cloud-and-infrastructure-automation-support
I hope this response helps in some way.
A solution for automated deployment in Azure is to use ElasticBox.
I will skip the details of all the configuration options for Azure supported by ElasticBox, as they are detailed in the documentation section: http://elasticbox.com/documentation/deploying-and-managing-instances/using-azure/.
You only need to create a box (the abstraction unit ElasticBox uses to define the installation and configuration of a service or application deployment in any cloud) that takes care of the steps you need automated. In the end you deploy the VM with almost no manual intervention: just one click, or a command with some parameters.
A box includes the variables necessary for your deployment and your scripts (in this case probably PowerShell, but they could be Bash, Python, Perl, Java, etc.).
When you deploy the box you create to deploy your application, ElasticBox will:
Create a Cloud Service or VM. (ElasticBox takes care of provisioning the VM in your Azure subscription, or in any of your preferred cloud providers.)
Install a specific build of the product (as a standalone exe or Windows service, not IIS) -> This should be your install event script.
Tweak the configuration file(s) -> This should be part of your configure event script.
Set up user account(s) -> This should be part of your configure event script.
Run the exe/service -> This should be part of your start event script.
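To make that mapping concrete, a hypothetical event script of the kind you would drop into those box events might look like this; it is generic PowerShell rather than anything ElasticBox-specific, and the build URL, paths and service name are made up for illustration:

```powershell
# Hypothetical install/configure/start event script for the box.
# The build URL, install path and service name are placeholders.
$buildUrl    = 'https://builds.example.com/myproduct/1.2.3/MyProduct.zip'
$installDir  = 'C:\MyProduct'
$archivePath = Join-Path $env:TEMP 'MyProduct.zip'

# install event: fetch and unpack the specific build
Invoke-WebRequest -Uri $buildUrl -OutFile $archivePath
Expand-Archive -Path $archivePath -DestinationPath $installDir -Force

# configure event: tweak the config file
(Get-Content "$installDir\MyProduct.exe.config") -replace 'ENV_PLACEHOLDER', 'Production' |
    Set-Content "$installDir\MyProduct.exe.config"

# start event: register and start the Windows service
New-Service -Name 'MyProduct' -BinaryPathName "$installDir\MyProduct.exe" -StartupType Automatic
Start-Service -Name 'MyProduct'
```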
ElasticBox has a command-line tool that lets you deploy your boxes to VMs, and you can also manage your deployed VMs with it: https://pypi.python.org/pypi/ebcli
It also supports automatic termination of the VM after a custom time period.
This is quite a broad question, but certainly the goal is achievable via a number of methods. While a bit old, Tom Hollander's blog on automated deployments is a good starting place. I've seen Octopus Deploy used a lot, as well as TeamCity, but they all ultimately rely on Azure's PowerShell cmdlets, the Management Libraries in custom code, or pure REST API calls.
Just an FYI: one option is to do everything using the Azure Management API. I also like to reference the Azure Client Libraries in a VS project and do everything in C# code.
I have two Azure VMs running in a cloud service. They contain almost the same thing. Some TCP ports are also opened between them.
Is it possible to create a deployment package from this existing setup so that I can easily deploy it again at a later time? I.e. I want to be able to do this:
1. Create a deployment package from the existing setup*
2. Delete the whole existing cloud service, including the VMs
3. Deploy the package from step 1 and have everything created again.
*I can save one of the VMs to my Azure storage and use it as a template for both of them if that is easier.
How do I accomplish this, if it is possible?
Yes, you can take what you have as a template and use it to stand up multiple silos. But in IaaS, there isn't a notion of a deployment package. There are a few things you'll need to do...
1) understand how to take an existing VM and turn it into an image
2) use PowerShell or another DevOps-style automation suite (Chef/Puppet/etc.) to define and deploy your silo.
You seem specifically interested in how to create an image, so I'd recommend the tutorial we have published on this: http://www.windowsazure.com/en-us/documentation/articles/virtual-machines-capture-image-windows-server/ This of course presumes you're running Windows Server; a Linux version can be found at: http://www.windowsazure.com/en-us/documentation/articles/virtual-machines-linux-capture-image/
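For reference, with the classic (service management) PowerShell module that tutorial uses, the capture step is roughly the following, assuming the guest OS has already been sysprepped/generalized and shut down (the service, VM and image names are illustrative):

```powershell
# Classic (service management) sketch: capture a generalized VM as a reusable image.
# Assumes the guest has been sysprepped and the VM shut down; names are placeholders.
Save-AzureVMImage -ServiceName 'my-cloud-service' `
                  -Name 'my-vm-1' `
                  -ImageName 'my-silo-template-image' `
                  -OSState 'Generalized'
```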
The automation of a deployment depends on a great many things, so as a starting point I'd suggest familiarizing yourself with the management API: http://msdn.microsoft.com/en-us/library/windowsazure/ee460799.aspx
With the introduction of Resource Manager, you can now easily use JSON templates to deploy and redeploy resources in Azure. There are also starter templates available - https://azure.microsoft.com/en-us/documentation/templates/
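As a minimal sketch of that approach with the current Az module (the resource group, location and template file names are placeholders for your own template describing the two VMs and their ports):

```powershell
# Sketch: stand the whole setup up again from a Resource Manager template.
# 'silo.json' and its parameter file are placeholders for your own template.
New-AzResourceGroup -Name 'rg-silo' -Location 'westeurope' -Force

New-AzResourceGroupDeployment -ResourceGroupName 'rg-silo' `
    -TemplateFile '.\silo.json' `
    -TemplateParameterFile '.\silo.parameters.json'
```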