I'm writing a PowerShell script to create a VM using the New-AzureRmResourceGroupDeployment cmdlet, as below.
New-AzureRmResourceGroupDeployment -Name VmDeployment `
-TemplateFile C:\template\template.json `
-TemplateParameterFile C:\template\parameters.json
This is used to create a VM. In parameters.json there are some parameters, like virtualMachineName and networkInterfaceName, which are hardcoded.
Now I'm trying to automate these scripts, i.e. they run on their own from a tool whenever some condition is met.
My requirement is that whenever this script runs, it has to increase the number in the VM name. Suppose the VM name is now VMName1; it has to be VMName2 the next time the script runs, VMName3 the time after that, and so on. Since the virtualMachineName parameter is hardcoded, this is not happening now. Is there any way I can pass virtualMachineName as a parameter in the script itself rather than taking it from the JSON file?
Any guidance is highly appreciated. Thanks!
You can definitely do this, and fortunately there are a handful of ways too.
Option 1: Pass inline parameters. The Azure PowerShell docs for templates say that you can use inline parameters together with a local parameter file, and the inline parameters take precedence. Relevant paragraph:
You can use inline parameters and a local parameter file in the same deployment operation. For example, you can specify some values in the local parameter file and add other values inline during deployment. If you provide values for a parameter in both the local parameter file and inline, the inline value takes precedence.
This is valuable because it provides you explicit control over the VM Name parameter, but it is up to the caller (you in this case) to pass an inline parameter. Please note this only works with local parameter files and not remote files (i.e. -TemplateParameterFile and not -TemplateParameterUri). The resulting command would look something like:
New-AzureRmResourceGroupDeployment -Name VmDeployment `
-TemplateFile C:\template\template.json `
-TemplateParameterFile C:\template\parameters.json `
-virtualMachineName VMName42
Option 2: Modify the original parameters.json. You can write some PowerShell (or Python, or your favorite scripting language) to parse parameters.json, find the VM name parameter, find the integer suffix, increment it, and overwrite the file with the new version, as in the sketch below. This has the benefit that you don't have to remember to pass an inline parameter, and you won't have to track the version number anywhere, since it is already stored in parameters.json. It has one major drawback: it modifies the original JSON, which can be dangerous.
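A minimal PowerShell sketch of option 2, assuming the parameter file layout from the question (a virtualMachineName parameter whose value ends in a number):

$paramPath = 'C:\template\parameters.json'
$params = Get-Content -Path $paramPath -Raw | ConvertFrom-Json

# Split the current name into prefix and numeric suffix, then increment
$currentName = $params.parameters.virtualMachineName.value
if ($currentName -match '^(?<prefix>.*?)(?<number>\d+)$') {
    $params.parameters.virtualMachineName.value = $Matches.prefix + ([int]$Matches.number + 1)
} else {
    $params.parameters.virtualMachineName.value = $currentName + '1'  # no suffix yet, start at 1
}

# Overwrite the original file (the drawback mentioned above)
$params | ConvertTo-Json -Depth 10 | Set-Content -Path $paramPath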
Option 3: Copy parameters.json and modify a temporary copy. You can write a script that copies parameters.json to a temporary JSON file and increments the VM name parameter during the copy, just like in option 2 (a sketch follows below). Pass this temporary file to New-AzureRmResourceGroupDeployment. This has the benefit of not modifying the original parameters.json, but it requires you to track the version number somewhere (e.g. another local file, a command-line parameter, an environment variable, etc.).
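And a rough sketch of option 3, tracking the running number in a small counter file (the counter file and temporary file names here are made up):

# Read and bump the counter stored outside parameters.json
$counterPath = 'C:\template\vmcounter.txt'
$counter = if (Test-Path $counterPath) { [int](Get-Content $counterPath) + 1 } else { 1 }
Set-Content -Path $counterPath -Value $counter

# Patch the VM name in a temporary copy only
$params = Get-Content 'C:\template\parameters.json' -Raw | ConvertFrom-Json
$params.parameters.virtualMachineName.value = "VMName$counter"
$tempParams = Join-Path $env:TEMP 'parameters.generated.json'
$params | ConvertTo-Json -Depth 10 | Set-Content -Path $tempParams

# Deploy with the temporary copy; the original parameters.json is untouched
New-AzureRmResourceGroupDeployment -Name VmDeployment `
    -TemplateFile C:\template\template.json `
    -TemplateParameterFile $tempParams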
For simplicity, I would recommend option 1. It already works out-of-the-box and does not require any external scripts.
Related
With an Azure Release Pipeline, in a task using PowerShell Script, I am able to set the values of variables and pass them to the next task using the command
Write-Host '##vso[task.setvariable variable=varResourceExists;isOutput=true;something'
However, when I put a similar command in a task that uses Azure PowerShell, it is no longer allowed and the task produces a warning:
2019-10-22T00:23:14.3080614Z ##[warning]'##vso[task.setvariable variable=varResourceExists;isOutput=true;something' contains logging command keyword '##vso', but it's not a legal command. Please see the list of accepted commands: https://go.microsoft.com/fwlink/?LinkId=817296
As a result, the variable varResourceExists cannot be set by my task. I have also tried a conventional PowerShell assignment:
$varResourceExists = 'something'; # this also does not work
Is there a way I can set this value in an Azure PowerShell script task so that the next task can reference it?
##vso[task.setvariable variable=varResourceExists;isOutput=true;something is not correct syntax. You're missing a closing ].
It should be ##vso[task.setvariable variable=varResourceExists;isOutput=true;]something
Here is how I solved my issue. In the pipeline's Azure PowerShell task, I can have code such as Write-Host '##vso[task.setvariable variable=varResourceExists;isOutput=true;]False';
In the Output Variable option of the task, I set a Reference Name of "step1".
Then, in the next step, I can do a conditional check using a Custom Condition.
I can also reference the variable in my code, for example Write-Host "The step1.varResourceExists says: $(step1.varResourceExists)";
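For reference, with the Reference Name set to "step1", the Custom Condition on the following task can be an expression along these lines (the expected value and exact expression will depend on your own pipeline setup):

and(succeeded(), eq(variables['step1.varResourceExists'], 'False'))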
I am trying to customize the output of my release pipeline by setting some environment variables in a task.
I found the following link:
https://learn.microsoft.com/en-us/azure/devops/pipelines/scripts/logging-commands?view=azure-devops&tabs=powershell
which however does not seem to work.
What I am doing is simply creating a pipeline with a single task (either Bash or PowerShell) and declaring the commands specified in the link in the inline version of the task.
Has anyone already successfully managed to make these commands work?
Am I doing something wrong and/or incomplete?
Does anyone have a better way to customise the pipeline with relevant information from a task? E.g. through the release name, or the description and/or tag of the specific release?
Edit:
Write-Host "##vso[task.setvariable variable=sauce;]crushed tomatoes"
Write-Host "##vso[task.setvariable variable=secretSauce;issecret=true]crushed tomatoes with garlic"
Write-Host "Non-secrets automatically mapped in, sauce is $env:SAUCE"
Write-Host "Secrets are not automatically mapped in, secretSauce is $env:SECRETSAUCE"
Write-Host "You can use macro replacement to get secrets, and they'll be masked in the log: $(secretSauce)"
This is the code, copied and pasted. I have also tried it as a script file, and it does not work either.
I use a hosted Windows agent.
When you set a new variable with the logging command, the variable is available only in the following tasks, not in the same task.
So split your script into two tasks: put the last three lines in the second task and you will see that the variables set in the first task come through, for example:
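Splitting the snippet from the question into two inline PowerShell tasks would look roughly like this:

Task 1:
Write-Host "##vso[task.setvariable variable=sauce;]crushed tomatoes"
Write-Host "##vso[task.setvariable variable=secretSauce;issecret=true]crushed tomatoes with garlic"

Task 2 (runs after Task 1):
Write-Host "Non-secrets automatically mapped in, sauce is $env:SAUCE"
Write-Host "Secrets are not automatically mapped in, secretSauce is $env:SECRETSAUCE"
Write-Host "You can use macro replacement to get secrets, and they'll be masked in the log: $(secretSauce)"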
This also puzzled me for a while. In the end I found out that if you want to modify $env:Path, you can call the special command task.prependpath using the logging-command syntax "##vso[task.prependpath]local directory path". You can find more of these special commands in their source:
https://github.com/microsoft/azure-pipelines-tasks/blob/master/docs/authoring/commands.md
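For example, something like this in one task (the directory here is just a placeholder) makes the path available to the tasks that follow it:

Write-Host "##vso[task.prependpath]C:\my\tools"
# later tasks in the job will see C:\my\tools at the front of $env:Path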
I'm trying to teach myself Azure PowerShell scripting and have an ultimate goal of setting up a script that reads in an Excel spreadsheet with specifications for Azure VMs to create (things like type of VM, tags, timezone, which AD group to add it to, etc.). If anybody has any tutorial references for this, that would be very helpful.
Currently, I'm falling on my face on what should be something relatively simple: exporting functions. I have opened PowerShell ISE and tried to run the following code (taken from one of the examples I found on MSDN):
Function New-Test
{
Write-Output 'I am New-Test function'
}
Export-ModuleMember -Function New-Test
function Validate-Test
{
Write-Output 'I am Validate-Test function'
}
function Start-Test
{
Write-Output 'I am Start-Test function'
}
Set-Alias stt Start-Test
Export-ModuleMember -Function Start-Test -Alias stt
But I get an error saying:
"Export-ModuleMember: Object Reference not set to an instance of an object"
I have tried saving this code out to a ps1 file named test and then navigating to the directory it's in and running "./test.ps1" but the same error comes up.
Any idea on what I'm doing wrong here? There is surely something fundamental that I am missing here.
I ran into this error too. For myself, I found out that Export-ModuleMember can't have any blank lines between it and the closing bracket of the last function.
Examples:
Import-Module Object Reference Error (blank line before Export-ModuleMember):
function Test-Import{
Write-Host "import function success"
}

Export-ModuleMember -Function Test-Import
Import-Module, no errors (Export-ModuleMember immediately after the closing bracket):
function Test-Import{
Write-Host "import function success"
}
Export-ModuleMember -Function Test-Import
In my case the solution was to reference the .psm1 instead of the .ps1 in the .psd1 file:
# Script module or binary module file associated with this manifest.
RootModule = 'pf-rootscript.psm1'
instead of
# Script module or binary module file associated with this manifest.
RootModule = 'pf-rootscript.ps1'
When you plan to write a lot of code in PowerShell scripts, then setting up your module and controlling what you export (or not) using Export-ModuleMember is the right thing to do. So if you're planning on building your own module to consume in further PowerShell scripts, then you're on the right track.
You haven't mentioned anything about the module definition or how/where exactly you are consuming these functions, so I think you are probably missing the part where you define your module first.
You can follow a step-by-step guide for doing that here: PowerShell: Building a Module, one microstep at a time
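As a minimal sketch (the module name and path here are made up): Export-ModuleMember only has meaning inside a module, so move the functions into a .psm1 file and import it, rather than running them from a plain .ps1 in the ISE.

# C:\Modules\TestModule.psm1  (hypothetical module file)
function Start-Test
{
    Write-Output 'I am Start-Test function'
}
Set-Alias stt Start-Test
Export-ModuleMember -Function Start-Test -Alias stt

# In a script or console session:
Import-Module C:\Modules\TestModule.psm1
stt    # the exported alias invokes Start-Test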
I am currently working on a pre-existing Kettle job that calls a PowerShell script that sends Azure HDInsight a Pig job. Get-AzureHdInsightJobOutput is part of this script and is now deprecated, so I am replacing it with Get-AzureRmHdInsightJobOutput. However, the new cmdlet has two parameter sets, one for display and one for download. I need an object to be returned in order to avoid making changes to the Kettle job.
I'm hoping to find out if the display parameters will return an object or if they just print out the results.
Yes, the Get-AzureRmHdInsightJobOutput cmdlet returns a string object when used with the Display parameter set, regardless of the DisplayOutputType; it does not just print the results to the console.
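A rough usage sketch (the cluster and storage values are placeholders, and the parameter names are as I recall them from the AzureRM.HDInsight module, so verify them against the source linked below):

$output = Get-AzureRmHdInsightJobOutput `
    -ClusterName 'my-hdinsight-cluster' `
    -JobId $job.JobId `
    -DefaultContainer 'my-container' `
    -DefaultStorageAccountName 'mystorageaccount' `
    -DefaultStorageAccountKey $storageAccountKey `
    -DisplayOutputType StandardOutput

# $output is a [string] object, not just console text
$output.GetType().FullName    # System.String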
You can refer to the source code of this cmdlet in the Azure PowerShell GitHub repository:
GetAzureHDInsightJobOutputCommand.cs
Hope this helps!
I'll explain the task that was requested of me:
I have two containers in Azure, one called "data" and one called "script". In the "data" container there's a txt file with data, and in the "script" container there's a script file.
Now I need to programmatically (from a WorkerRole) execute the script file with the content of the data file as parameters (for example, a script that accepts a string 's' and prints "Hello, 's'", where 's' is the string stored in the data file), and save the result of the run into another file in a third container called "result".
How do I do all this? I've already uploaded the files and created the blobs programmatically, but I can't seem to understand how to execute the file or how to save its result to another file.
Can I please have some help?
Thanks in advance
Here are the steps in pseudocode:
1. Retrieve the script from the blob (using DownloadToStream()).
2. Compile the script (I will leave this to you as I have no idea what format your script is).
3. Load the parameters from the blob (same as step 1).
4. Execute the script with those parameters.
If your scripts can be written as lambda expressions, then this becomes a lot easier, as you can turn them into Actions.
Edit based on your questions:
DownloadText() is no longer included in Azure Storage 2.0; you only have access to DownloadToStream(). Even if you are using an older version (say 1.7), I would recommend using DownloadToStream() in case you ever upgrade in the future. This will prevent having to refactor your code.
In terms of executing your script, it depends on what type of script it is. If it is C# code, you can use this example: Is it possible to dynamically compile and execute C# code fragments?. If you need to execute a different type of script, you would need to run it using Process.Start, and you can look at this example: http://www.dotnetperls.com/process-start
I do not have much experience with point number 2, but those are the approaches I have heard of and seen used.