Execute complex script inside Azure ARM Template

Inside an ARM template I have a resource of "type": "Microsoft.Resources/deploymentScripts", which calls a custom PowerShell script.
When I use the standard Az cmdlets, everything works properly.
But when I want to use the AzureAD module (installing and importing it), the script fails during the ARM deployment.
My question is: how can I create a more complex PowerShell script (one that calls other modules) to use inside an ARM template?
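For reference, a minimal sketch of the kind of script content involved (illustrative only, not my actual script):

Install-Module -Name AzureAD -Force -Scope CurrentUser   # installing the extra module
Import-Module -Name AzureAD                              # importing it before use

Get-AzResourceGroup | Select-Object ResourceGroupName    # standard Az cmdlets like this work fine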
Thanks in advance for your help


Jenkins using PowerShell Azure Modules failing to run

I have a bit of a strange one here.
I have written a PowerShell script to pull Azure blob storage objects, which works absolutely fine when I run it manually via the console. However, if I run it from Jenkins, it calls the PowerShell function and starts running through it, but fails when using the Az.Storage cmdlet Get-AzStorageBlobContent.
The errors seem to be things like "Couldn't connect" or "Retry count exceeded", etc. I am unable to use the -Debug switch as it says it's interactive.
I have tested by including the access token inside the script, eliminating Jenkins from handling the secrets, but I get the same issue.
Jenkins is running the latest version, the latest PowerShell plugin, and the latest version of Java. It is also calling the 64-bit PowerShell session as expected.
I am also aware that there is a Jenkins Blob storage plugin; however, due to the amount of additional work required, it makes a lot more sense for me to use the PowerShell modules inside a PowerShell script.
I would really appreciate it if anyone has any ideas about this; it has been driving me nuts for weeks.
Many Thanks

Difference between Azure CLI and PowerShell

I am asking this question as a Windows user, but I request you not to limit the answer to Windows only.
I tried executing Azure CLI commands in PowerShell and they execute successfully, but not the other way around. In that case, why do we have two separate command sets? Why not just work in PowerShell? I have only tried some basic commands and they all work, except filter commands: | find only works in the CLI and | Select only works in PowerShell.
I know that Azure CLI is for cross-platform support. But is there any difference for a Windows user? Are there any consequences to running CLI commands in PowerShell?
Thanks in advance.
Here are my opinions from using both. In no way am I saying one is better than the other. They both have their pros and cons.
Azure CLI is a cross-platform command-line tool for managing Azure resources, and it runs on Windows, Mac, and Linux. This also means it can run inside Windows PowerShell. It's more flexible than Azure PowerShell since it's a binary and can run inside any OS's default shell.
Are there any consequences to running CLI commands in PowerShell?
Updating can be a bit of a pain. If you want to update it on Windows, you have to re-install the MSI following the instructions from Install Azure CLI on Windows. Updating is easier on other platforms, which makes me want to use only Azure PowerShell when I'm in Windows PowerShell. A workaround is to use WSL on Windows; then you can run Azure CLI on Linux inside a Windows machine. You can install WSL following the Windows Subsystem for Linux Installation Guide for Windows 10. I find updating the azure-cli package much easier on Linux using apt-get than the Windows equivalent. You can have a look at Install Azure CLI with apt on how to install the Azure CLI package on Linux.
Another difference is that you have to chain multiple commands with Azure CLI, such as az group list vs Get-AzResourceGroup from Azure PowerShell. You also can't run Get-Help with Azure CLI commands like you do with Azure PowerShell, which is a huge game changer for me, since I find the PowerShell help system very helpful for displaying in-depth information about PowerShell cmdlets. The Azure CLI help info is found with az --help, but it is not as comprehensive as Get-Help.
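A quick side-by-side sketch of both points (assuming you are logged in with both tools):

# Azure CLI: group + subcommand, help via --help
az group list --output table
az group list --help

# Azure PowerShell: a single cmdlet, help via Get-Help
Get-AzResourceGroup
Get-Help Get-AzResourceGroup -Full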
Tab completion also doesn't work in Azure CLI when using Windows PowerShell. This makes typing a bit of a pain as well, another reason I use WSL inside of Windows. You can have a look at the other alternatives at this Autocompletion support in Windows command line GitHub issue.
Azure PowerShell on the other hand is a set of PowerShell Cmdlets for managing Azure resources from the command-line, and only works in Windows PowerShell and PowerShell Core. This also means that if another OS such as Mac or Linux is running PowerShell Core, then it can run Azure PowerShell as well.
I have only tried some basic commands and they all work, except filter commands: | find only works in the CLI and | Select only works in PowerShell.
Select-Object, or the shorthand Select, is a PowerShell cmdlet, so it only works on PowerShell objects. find can search a string or text file. Furthermore, if you are searching for data from Azure CLI, you should use the --query parameter instead of find, since find is limited to searching strings inside text. Azure CLI uses the JMESPath query language to search for data inside the JSON output you receive. If you're comfortable with this query language, then searching for data using Azure CLI shouldn't be too much of an issue. Additionally, you can also use Azure CLI commands inside PowerShell scripts, but not vice versa.
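To illustrate the difference, a small sketch (the property names shown are just the usual ones returned for VMs):

# Azure CLI: shape the JSON output with a JMESPath --query
az vm list --query "[].{Name:name, RG:resourceGroup}" --output table

# Azure PowerShell: filter the returned .NET objects with Select-Object
Get-AzVM | Select-Object Name, ResourceGroupName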
Summary
If you deal with multiple platforms or want to write scripts with others who use different platforms, Azure CLI is a good choice. However, if you mainly deal with Windows systems and work with others who do so as well, then using Azure PowerShell is a good idea. If you're like me and have to use different platforms, then installing both is a good idea. If you still just want to use Azure PowerShell on different OS platforms, then you need PowerShell Core.
For simple tasks, like quickly looking up resources in Cloud Shell or writing quick scripts, Azure CLI is good to use and less verbose than Azure PowerShell. If you already use Bash a lot, this will feel more natural, and adding Azure CLI commands to existing scripts will be a simple task. As others have also said, there is nothing stopping you from adding Azure CLI commands to PowerShell scripts, which allows you to deserialize the JSON output into a PSCustomObject using ConvertFrom-Json.
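For instance, a minimal sketch (assuming az is on your PATH and you are logged in):

# Call Azure CLI from PowerShell and turn its JSON output into objects
$groups = az group list | ConvertFrom-Json
$groups | Select-Object name, location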
For more complex tasks, Azure PowerShell is preferable, since working with .NET objects/OOP principles is much easier than parsing the JSON text returned by Azure CLI. This is one reason I try to use Azure PowerShell when I can.
Azure CLI does benefit from being idempotent, so running the same command against the resources won't require any null checking like in Azure PowerShell. If this becomes an issue, then you can run ARM templates in Azure PowerShell, which are idempotent.
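To illustrate the null-checking point, a rough sketch (the resource group name is made up):

# Azure CLI: safe to run repeatedly, it simply ensures the group exists
az group create --name demo-rg --location westeurope

# Azure PowerShell: guard against the group already existing
if (-not (Get-AzResourceGroup -Name 'demo-rg' -ErrorAction SilentlyContinue)) {
    New-AzResourceGroup -Name 'demo-rg' -Location 'westeurope'
}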
Update
As @AimusSage helpfully pointed out in the comments, PowerShell 7.0 has recently been released, replacing the PowerShell Core / PowerShell 6.x naming. You can read more at Announcing PowerShell 7.0.
Another Idea
If you want to keep the OOP principles of Azure PowerShell, but use something that is easier for Linux sysadmins to adopt, then you can consider the Azure SDK for Python. I have used this in the past when I wanted to run scripts on a Linux host but didn't want to use Azure CLI or install PowerShell.
I like the previous answers; I just want to add a different point of view for people in the enterprise world who are forced to pick one:
In that case, why do we have two separate command sets? Why not just work in PowerShell?
Rephrasing: Both Az CLI and Az PowerShell just call the same set of APIs, the Azure APIs.
This is important because, theoretically and eventually, you will be able to do everything both ways.
So why does Microsoft create and maintain two ways of doing the same thing?
Martin Fowler once said: "but remember, the skill of the team will outweigh any monolith/microservice choice."
If you replace monolith/microservice with PowerShell/Bash, then this answers the question.
I believe there are people with decades of experience building systems with Bash, and there are other teams that are heavy users of PowerShell. Microsoft does not want either group to have to learn a whole new language just to be able to use Azure.
Summary:
If your team is familiar with PowerShell, go with PowerShell and do as much native PowerShell stuff as possible. This way you benefit from things like error handling, OOP concepts, environment settings, parallelization, etc. (see the sketch after this list).
If your team are Linux admins, heavy users of Jenkins, with millions of lines of Bash already automating other things and a lifetime of working with Bash, go with the CLI and keep consistency across all the tools you have already built.
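A minimal sketch of what that native PowerShell error handling can look like around an Az cmdlet (the resource names are made up; assumes you are already connected with Connect-AzAccount):

try {
    $vm = Get-AzVM -ResourceGroupName 'demo-rg' -Name 'demo-vm' -ErrorAction Stop
    $vm | Select-Object Name, ProvisioningState
}
catch {
    Write-Warning "Could not read the VM: $($_.Exception.Message)"
}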
Something I've noticed when doing the MS Azure training (AZ-900 Fundamentals and AZ-303 Azure Architect) is that the exercises are done in Azure CLI.
That's not to say it's better, but if you're wanting to do the exams it might be worth being familiar with it. For the record, I'm a PowerShell guy.
The existing answer is a silly way of looking at this question, and misleading.
The biggest difference is that Azure CLI is a binary (that can run on different platforms) and PowerShell is a shell that works across platforms. Azure PowerShell is a bunch of PowerShell modules; everything else derives from that.
find cannot work in the CLI, because there is no find in the CLI; it's not a shell. find works perfectly fine in PowerShell on Windows, because it is a binary in the Windows OS, whereas Select is a cmdlet in PowerShell and hence will not work in the command line on Windows (or in Bash on Linux).
Furthermore, if you are searching for data from Azure CLI, you should use the --query parameter instead of find
This is also debatable: the JMESPath query language is overcomplicated for no particular reason, and I don't know how compatible Azure CLI actually is with the official JMESPath documentation. I prefer to use PowerShell to run Azure CLI commands and just parse the output JSON with PowerShell. Obviously, you might not be as comfortable with PowerShell as I am, and you might not find this convenient.
Another issue with Azure PowerShell that does not seem to be the case with Azure CLI is the different module versions. I have spent days figuring out which version of which command needs to be what for which script in my pipeline. Seriously, the most ridiculous, asinine...

Get Source Code for referenced functions in Azure Runbook

I'm new to PowerShell and Azure Automation. Currently I have an Azure Automation account and it has a few runbook jobs. I'm trying to add new logic to an existing Azure runbook job by updating its PowerShell script. I see there are some functions, but unfortunately we didn't maintain the source code :(. As the runbook is currently running without any issues, I want to know how to get the source code of the referenced functions.
I searched in the Modules, Modules gallery, Python 2 packages, etc. in the Automation account used by this runbook, as well as under the Assets, Cmdlets, and Runbooks nodes (that you see in the Edit mode of the script in the portal), but couldn't find where these functions are defined. I see one module which I suspect has something related, but I'm not sure.
As an FYI, the functions are named like this:
GetClassicConnection,
GetRunAsConnection,
Set-Subscription $subcriptName
So here are my questions:
Is there a way to get the source code of all the referenced functions within this runbook's PowerShell script? Something like disassembling a .NET DLL using disassembler tools.
How can I see the source code of an existing module in the Automation account whose status is "Available" under the Modules section?
I have not had a reason yet to use Azure Runbooks; however, PowerShell Core is already open source and can be viewed on GitHub.
That being said, you can get the source code of local cmdlets, for example, this way...
Param
(
    # Prompt for a cmdlet name if one was not supplied
    [string]$CmdletName = (Get-Command -CommandType Cmdlet | Out-GridView -PassThru)
)

# Get the DLL if it is a compiled cmdlet
'Getting DLL if the entered cmdlet name is a compiled cmdlet'
(Get-Command $CmdletName).DLL

# Generate a proxy command to show the cmdlet's parameters / source-like view
'Getting cmdlet details / source code'
$metadata = New-Object System.Management.Automation.CommandMetadata (Get-Command $CmdletName)
[System.Management.Automation.ProxyCommand]::Create($metadata)
Note: Even with the above I've had issues with some cmdlets erroring out.
You can get the source code of local functions, for example, this way...
Param
(
    # Prompt for a function name if one was not supplied
    [string]$FunctionName = (Get-Command -CommandType Function | Out-GridView -PassThru)
)

(Get-Command -Name $FunctionName).ScriptBlock
For a DLL, one could use the same approach as for looking at any other .NET DLLs; it would be the same tools, ILSpy or dotPeek and the like.

How to remove custom script extensions on multiple Azure VMs in parallel?

I am working on a DevOps project to run QA PowerShell code as custom script extensions. I need to run it on multiple virtual machines (minimum 10). I figured out how to install the custom script extension on the VMs in parallel, but I did not find a solution to uninstall custom script extensions from the VMs in parallel. Please help. I am OK with an ARM template or using the Azure CLI.
One way would be to use jobs, something like this (rough sketch):
"vm1","vm2","vm3" | ForEach-Object {
    # Capture the current VM name so the background job can read it via $using:
    $vmName = $PSItem
    Start-Job -ScriptBlock {
        Remove-AzureRmVMCustomScriptExtension -ResourceGroupName xxx -VMName $using:vmName -Name extensionname -Force
    }
}
The above will work if you have AzureRM context autosave enabled, so the jobs can pick up your Azure credentials. ARM templates are not capable of removing a custom script extension; you might experiment with Complete mode, but it's a bit dangerous.
But honestly, you can just use forceUpdateTag to force the extension to rerun, without removing it.
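For completeness, a rough sketch of that rerun idea from the PowerShell side; this assumes the custom script extension cmdlet's -ForceRerun parameter as the counterpart of the template's forceUpdateTag, so verify it against your module version (the resource names and script URI are placeholders):

# Re-apply the same extension settings with a new ForceRerun value,
# which makes the extension execute again without uninstalling it first.
Set-AzureRmVMCustomScriptExtension -ResourceGroupName xxx -VMName vm1 -Name extensionname `
    -FileUri 'https://example.com/qa-script.ps1' -Run 'qa-script.ps1' -Location westeurope `
    -ForceRerun ([guid]::NewGuid().ToString())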

How to Create a VM and deploy an application in Azure Resource Group Template

I want to create a set of VMs using either the Resource Group Template in VS 2015 or utilise one of the azure-quickstart-templates as the basis for doing this.
My specific requirements are also to install a simple .exe application and modify its .ini file with a key that I want to pass from the template, i.e. seqno = copyindex() or similar.
Can anybody provide some guidance please?
For a simple Windows VM you can use either a DSC Extension or a Custom Script extension on the VM. Both of the samples in the azure-quickstart repo require you to figure out how to stage the artifacts needed for the extension (in this case the EXE and the script that installs the EXE).
If you go the VS 2015 route, you can start with a VM template, add the DSC or custom script extension and then the PowerShell script provided by Visual Studio will stage the artifacts for you if you make them part of the project.
You can also mix/match - grab a template from github, modify it in VS or take the VS PowerShell script and bring it into whatever workflow works best for you.
Note: one thing to keep in mind as well - you need to pass the location of that EXE into the script that does the install - that script/vm will need to know where to get it from. In the VS 2015 workflow you can use the parameter values of _artifactsLocation and just pass that value along to the installation script. If you start with a quickstart template, you have to manage that yourself.
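As a rough sketch of what the installation script itself could look like (entirely illustrative; the parameter names, file names, and .ini key are assumptions, not something the quickstart templates define):

Param
(
    [string]$ArtifactsLocation,   # e.g. the template's _artifactsLocation value
    [int]$SeqNo                   # e.g. copyIndex() passed in from the template
)

# Download the EXE from the staged artifacts location
$installer = Join-Path $env:TEMP 'myapp.exe'
Invoke-WebRequest -Uri "$ArtifactsLocation/myapp.exe" -OutFile $installer

# Run the installer silently (the switches depend on the actual application)
Start-Process -FilePath $installer -ArgumentList '/quiet' -Wait

# Update the application's .ini file with the sequence number passed from the template
$iniPath = 'C:\Program Files\MyApp\myapp.ini'
(Get-Content $iniPath) -replace '^seqno\s*=.*', "seqno = $SeqNo" | Set-Content $iniPath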
