Why is the Azure CLI adding 0 to my returned list?

I'm working with the Azure CLI to build out some automation around repo creation, etc. I'm using Python as a sort of wrapper around the various CLI commands to bundle up the automation. I want to write a simple check to see whether a repo with a given name already exists.
repoName comes from a system input and would be whatever the user wants to name their new repository.
So far I have this:
import os

azRepoListCmd = "az repos list --query \"[?contains(name, \'" + repoName + "\')].[name]\" --organization https://myOrganizationHere.visualstudio.com/ --project myProject -o tsv"
azRepoList = os.system(azRepoListCmd)
print(azRepoList)
What the above prints is:
test-project-2
0
What is this "0" and where does it come from? Expected result would just be the name or an empty array if it didn't find anything.

The 0 is the return code from running os.system, which doesn't capture output.
https://docs.python.org/3/library/os.html#os.system
So your azRepoList = ... line is what's actually printing the repo name (the command's own stdout goes straight to the console), and then the next line prints the return code.
What you want instead is subprocess.
import subprocess as sp
output = sp.getoutput("az repos list --query \"[?contains(name, 'PartsUnlimited')].[name]\" --project \"Parts Unlimited\" -o tsv")
print(output)
PartsUnlimited
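If you also want the exit code, or want to branch on whether anything came back, subprocess.run gives you both. A minimal sketch along the lines of the question's original command (the organization and project values are the question's placeholders):

import subprocess

repoName = "test-project-2"  # would come from user input in the real script

result = subprocess.run(
    [
        "az", "repos", "list",
        "--query", f"[?contains(name, '{repoName}')].[name]",
        "--organization", "https://myOrganizationHere.visualstudio.com/",
        "--project", "myProject",
        "-o", "tsv",
    ],
    capture_output=True,  # captured instead of spilling to the console
    text=True,
)

# tsv output is empty when nothing matched, so this doubles as the existence check
repo_exists = bool(result.stdout.strip())
print(repo_exists, result.returncode)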

Related

Tabular Editor - Set "Shared Expression" value from Tabular Editor CLI (with Azure DevOps)

I need to change the value of a parameter in a TOM. I am using Azure DevOps with steps that include the Tabular Editor CLI. I have written a one-line script that should be able to change the value of a shared expression. (Maybe a shared expression is read-only?)
The script to be executed:
Model.Expressions["CustomerNameParameter"].Expression = "\"some value\" meta [IsParameterQuery=true, Type=\"Text\", IsParameterQueryRequired=true]";
It returns an error whenever Azure DevOps tries to run it:
It cannot find the CustomerNameParameter in the model.
My build looks like this:
Starting: Build Mode.bim from SourceDirectory
==============================================================================
Task : Command line
Description : Run a command line script using Bash on Linux and macOS and cmd.exe on Windows
Version : 2.201.1
Author : Microsoft Corporation
Help : https://learn.microsoft.com/azure/devops/pipelines/tasks/utility/command-line
==============================================================================
Generating script.
Script contents:
TabularEditor.exe "D:\a\1\s" -B "D:\a\1\a\Model.bim"
========================== Starting Command Output ===========================
"C:\Windows\system32\cmd.exe" /D /E:ON /V:OFF /S /C "CALL "D:\a\_temp\ba31b528-d9a3-42cc-9099-d80d46d1ffe6.cmd""
Tabular Editor 2.12.4 (build 2.12.7563.29301)
--------------------------------
Dependency tree built in 113 ms
Loading model...
Building Model.bim file...
Finishing: Build Mode.bim from SourceDirectory
Your script looks good. Have you tried executing it within the Tabular Editor UI?
Perhaps the parameter is named differently in your model. You can use the following script in the CLI to output the list of parameters:
foreach(var expr in Model.Expressions) Info(expr.Name);
Running that in the CLI against the model prints the name of each expression, so you can confirm the exact parameter name.
What I did to fix this was create a separate script (SharedExpressions.csx) and manually add a line like this for each expression:
Model.Expressions["Start Date - Sales"].Expression = "#datetime(2019, 1, 1, 0, 0, 0) meta [IsParameterQuery=true, Type=\"DateTime\", IsParameterQueryRequired=true]";
In this case my expressions are datetime, but you can change the type to anything you like. Just be very careful with the quote placement.
Then in my pipeline I use the following script to execute the changes:
start /wait TabularEditor.exe "$(System.DefaultWorkingDirectory)/Tabular Model/model/model.bim" -S "$(System.DefaultWorkingDirectory)/Tabular Model/scripts/SharedExpressions.csx" -D -V
My tabular model uses a shared expression (parameter) to set the connection string. The following solution worked for me. I updated the command line script in the release pipeline as follows:
echo var connectionString = Environment.GetEnvironmentVariable("SQLDWConnectionString"); >> SetConnectionStringFromEnv.cs
echo Model.Expressions["ServerName"].Expression = "\"" + connectionString +"\"" + " meta [IsParameterQuery=true, Type=\"Text\", IsParameterQueryRequired=true]"; >> SetConnectionStringFromEnv.cs
TabularEditor.exe "_$(Build.DefinitionName)\drop\Model.bim" -S SetConnectionStringFromEnv.cs -D "%ASConnectionString%" "$(ASDatabaseName)" -O -C -P -R -M -W -E -V

Execute Azure command in PowerShell without writing error to console?

I am using a PowerShell script in a pipeline, and the problem I have is with this query:
$value = $(az appconfig kv show -n ThisisEnv --key thisisconfigkey) | ConvertFrom-Json
What this query does is get the data for the key if it exists. If the key doesn't exist, it gives an error like:
ERROR: Key 'abcdefg' with label 'None' does not exist.
That is working as expected, but in the pipeline, when the key doesn't exist, it prints an error to the console. The pipeline sees that as an error and marks the run as failed. Is there a way I can make it work?
Is there a way I can stop it from printing to the console? Is there a PowerShell operator that lets me get the value from the Azure command without printing anything to the console?
You could try redirecting the standard error stream using 2> $null:
$value = $(az appconfig kv show -n ThisisEnv --key thisisconfigkey 2> $null) | ConvertFrom-Json
This will suppress the error in the console. You might also want to set powerShellIgnoreLASTEXITCODE on the Azure CLI task so that the pipeline run doesn't fail, or, as a workaround, set $LASTEXITCODE to 0.
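If you end up wrapping the same az call from Python instead (as in the first question above), the analogous move is to discard stderr and check the return code yourself. A minimal sketch, reusing the question's placeholder store and key names:

import json
import subprocess

# Same placeholder app configuration store and key as above.
cmd = ["az", "appconfig", "kv", "show", "-n", "ThisisEnv", "--key", "thisisconfigkey"]

# stderr is captured (and ignored) so a missing key prints nothing;
# the return code tells us whether the key was found.
result = subprocess.run(cmd, capture_output=True, text=True)
value = json.loads(result.stdout) if result.returncode == 0 else None
print(value)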

Publishing test results to Azure (VS Database Project, tSQLt, Azure Pipelines, Docker)

I am trying to fully automate the build, test, and release of a database project using Azure Pipeline.
I already have a Visual Studio solution which consists of three database projects. The first project is the database, which contains the tables, stored procedures, functions, data, etc.. The second project is the tSQLt framework (v 1.0.5873.27393 if anyone is interested). And finally the third project is the tSQLt tests.
My goal here is to check the solution into source control and have the pipeline automatically build the solution, deploy the dacpacs to a build server (Docker in this case), run the tSQLt tests, and publish the results back to the pipeline.
My pipeline works like this:
Build the Visual Studio solution
Publish the artifacts
Set up a Docker container running Ubuntu and SQL Server
Install SqlPackage
Deploy the dacpacs to the SQL instance
Run the tSQLt tests
Publish the test results
Everything up to publishing the results is working, but on this step I got the following error:
[warning]Failed to read /home/vsts/work/1/Results.xml. Error : Data at the root level is invalid. Line 1, position 1.
I added another step in the pipeline to display the content of the Results.xml file. It appears like this:
XML_F52E2B61-18A1-11d1-B105-00805F49916B
----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
<testsuites><testsuite id="1" name="MyNewTestClassOne" tests="1" errors="0" failures="0" timestamp="2021-02-01T10:40:31" time="0.000" hostname="f6a05d4a3932" package="tSQLt"><properties/><testcase classname="MyNewTestClassOne" name="TestNumberOne" time="0.
I'm not sure whether the column name and dashes should be in the file, but I'm guessing not. I added another step to remove them, leaving just the XML. But this then gave me a different error to deal with:
##[warning]Failed to read /home/vsts/work/1/Results.xml. Error : There is an unclosed literal string. Line 2, position 1.
This one is a little easier to spot, because as you'll see above, the XML is truncated.
Here is the part of my pipeline which runs the tSQLt tests and outputs the results to Results.xml:
- script: |
    sqlcmd -S 127.0.0.1,1433 -U SA -P Password.1! -d StagingDB -Q 'EXEC tSQLt.RunAll;'
  displayName: 'tSQLt - Run All Tests'
- script: |
    cd $(Pipeline.Workspace)
    sqlcmd -S 127.0.0.1,1433 -U SA -P Password.1! -d StagingDB -Q 'SET NOCOUNT ON; EXEC tSQLt.XmlResultFormatter;' -o 'tSQLt_Results.xml'
  displayName: 'tSQLt - Output Results'
I've researched so many blogs and articles on this, and most people are doing the same thing. Some people use PowerShell instead of sqlcmd, but given I'm using an Ubuntu machine that isn't an option here.
I am all out of options, so I am looking for a little help on this.
You are dealing with two problems here: there is noise in your result set that is not XML, and your XML result is truncated after 256 characters. I can help you with both.
What I am doing is basically this:
/opt/mssql-tools/bin/sqlcmd \
-S "localhost, 31114" -U sa \
-P "password" \
-d dbname \
-y0 \
-Q "BEGIN TRY EXEC tSQLt.RunAll END TRY BEGIN CATCH END CATCH; EXEC tSQLt.XmlResultFormatter" \
| grep -w "<testsuites>" \
| tee "resultfile.xml"
A few things to note:
-y0 is important. It sets the maximum width of the XML result to unlimited, up from the default of 256 characters.
The grep makes sure you only get the XML line and not the noise around it.
If you want to run only a subset of your tests, you need to amend the SQL query being passed in; but other than that, this is a catch-all one-liner to run all tests and get the results in XML format, readable by Azure DevOps.
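If you are orchestrating these steps from a Python wrapper rather than a shell script (as in the first question), the filter-and-save part of that one-liner might look roughly like the sketch below; the server, credentials, and file names are just the placeholders from the shell example above:

import subprocess

# Placeholder connection details, copied from the shell example above.
cmd = [
    "/opt/mssql-tools/bin/sqlcmd",
    "-S", "localhost,31114", "-U", "sa", "-P", "password", "-d", "dbname",
    "-y", "0",  # unlimited column width so the XML is not cut off at 256 chars
    "-Q", "BEGIN TRY EXEC tSQLt.RunAll END TRY BEGIN CATCH END CATCH; EXEC tSQLt.XmlResultFormatter",
]
output = subprocess.run(cmd, capture_output=True, text=True).stdout

# Keep only the line carrying the XML document, dropping headers and row-count noise.
xml_lines = [line for line in output.splitlines() if "<testsuites>" in line]
with open("resultfile.xml", "w") as f:
    f.write("\n".join(xml_lines))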

Can't set variable using Azure CLI within DevOps Release Pipeline

I'm trying to save a key value to pass on to the next step in my release pipeline, but no matter what I do I can't save the result of my command to a variable. I've already checked many of the articles here dealing with this with no success. Here is what I am trying:
$KEY=(az storage account show-connection-string --key primary -n myStorageAccount -g myResourceGroup --query "connectionString" -o tsv)
echo "Attempting to set variable"
echo $KEY
echo ##vso[task.setvariable variable=AZURE_STORAGE_CONNECTION_STRING;]$KEY
echo $AZURE_STORAGE_CONNECTION_STRING
Running on a Windows agent, by the way. I've tried all kinds of variations: SET KEY=, SET $KEY=, SET $(KEY)=, $KEY=, $(KEY)=, KEY=; none of them work. Likewise, I've tried referencing the variable differently in the echo statements with no luck. If I just run the az storage account command, I do get back the connection string. But either I get that $KEY is not a recognizable command, or, if I'm using SET, echo simply gives me back $KEY and the vso line gives me nothing.
I can accomplish most of this, including saving to a variable, in Azure Cloud Shell (via the syntax $KEY= and echo $KEY), but of course that doesn't help my pipeline. Any idea of the proper syntax to get this value into my next release pipeline step, or is there another method to accomplish this?
If you are using Azure CLI version 1.*, try using the following script:
for /f "tokens=1 USEBACKQ" %%F in (`Yourcommand`) do echo ##vso[task.setvariable variable=AZURE_STORAGE_CONNECTION_STRING;]%%F
If you are using Azure CLI version 2.*, you can also use a PowerShell command:
$KEY= & YourCommand
Write-Output("##vso[task.setvariable variable=AZURE_STORAGE_CONNECTION_STRING;]$KEY")
Check this thread for some more details.
Hope this helps.
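As a side note, the ##vso[task.setvariable] logging command works from any task that writes to stdout, so if your release step happens to be a Python script (as in the first question), a minimal sketch might be (the storage account and resource group names are the question's placeholders):

import subprocess

# Placeholder account and resource group names from the question.
key = subprocess.getoutput(
    "az storage account show-connection-string --key primary "
    "-n myStorageAccount -g myResourceGroup --query connectionString -o tsv"
).strip()

# Writing the logging command to stdout makes the value available to later
# steps as $(AZURE_STORAGE_CONNECTION_STRING).
print(f"##vso[task.setvariable variable=AZURE_STORAGE_CONNECTION_STRING;]{key}")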

Azure CLI: choose an option from a table

I am trying to get a table output with numbers from the Azure CLI, which gives this as an output:
Number    Location     Name
--------  -----------  --------------
1         somewhere    ResourceGroup1
2         somewhere    ResourceGroup2
The code I have right now is
az group list --query '[].{location:location, name:name}'
The output I'm getting right now is:
Location    Name
----------  ---------------
somewhere   ResourceGroup1
somewhere   ResourceGroup2
My end goal is that if you choose number 1, you select the corresponding name, so I can use that later in the script.
For your issue, there is no single Azure CLI command that can achieve it, but you can do it with a script. For example, you can use a shell script:
#!/bin/bash
az group list --query '[].{location: location, name: name}' -o table >> output.txt
# This command just adds line numbers inside the file; it's optional.
cat -n output.txt >> result.txt
# Print the group name on a specific line (pass the line number in via -v);
# the content is the same as output.txt, just numbered.
awk -v line="$line" '{if (NR == line) print $3}' result.txt
Hope this will be helpful.
You can use a contains expression (JMESPath) in the query to filter the results:
filter=resource_group_name
filterExpression="[?contains(name, '$filter')].name"
az group list --query "$filterExpression" -o tsv
This is a much better way compared to the approaches in the other answers.
More reading:
http://jmespath.org/specification.html#filterexpressions
http://jmespath.org/specification.html#built-in-functions
From what I understand, you are trying to create a variable from the output to use later. You do not need to put it in a table first. Using the same example you have, you could do something like below:
gpname="$(az group list --query [0].name --output tsv)"
az group show -n $gpname
Good Luck.....
Information from the comments:
What you are looking for is more Linux than Azure. I am not a Linux CLI expert, but here is a basic script that you can build on.
#!/bin/bash
gpnames="$(az group list --query [].name --output tsv)"
PS3='Select a number: '
select gpname in $gpnames
do
    az group show -n "$gpname"
    break   # remove this if you want the menu to repeat
done
Hope this helps......
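And if you would rather keep the selection logic in the Python wrapper from the first question instead of bash, here is a minimal sketch of the same numbered-menu idea (nothing Azure-specific, just subprocess and input()):

import subprocess

# List the resource group names, one per line (tsv output).
names = subprocess.getoutput("az group list --query [].name --output tsv").splitlines()

# Print a numbered menu and read the user's choice.
for i, name in enumerate(names, start=1):
    print(f"{i}  {name}")
choice = int(input("Select a number: "))
gpname = names[choice - 1]

# Use the chosen name in a follow-up command.
print(subprocess.getoutput(f"az group show -n {gpname}"))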
