Tabular Editor - Set "Shared Expression" value from Tabular Editor CLI (with Azure DevOps)

I need to change the value of a parameter in a TOM. I am using Azure Devops with steps that include Tabular Editor CLI. I have written a one-line script that should be able to change the value of a Shared Expression. (Maybe a shared expression is read only?)
The script that will be executed:
Model.Expressions["CustomerNameParameter"].Expression = "\"some value\" meta [IsParameterQuery=true, Type=\"Text\", IsParameterQueryRequired=true]";
It returns an error whenever Azure DevOps tries to run it:
It cannot find the CustomerNameParameter in the model.
My build looks like this:
Starting: Build Mode.bim from SourceDirectory
==============================================================================
Task : Command line
Description : Run a command line script using Bash on Linux and macOS and cmd.exe on Windows
Version : 2.201.1
Author : Microsoft Corporation
Help : https://learn.microsoft.com/azure/devops/pipelines/tasks/utility/command-line
==============================================================================
Generating script.
Script contents:
TabularEditor.exe "D:\a\1\s" -B "D:\a\1\a\Model.bim"
========================== Starting Command Output ===========================
"C:\Windows\system32\cmd.exe" /D /E:ON /V:OFF /S /C "CALL "D:\a\_temp\ba31b528-d9a3-42cc-9099-d80d46d1ffe6.cmd""
Tabular Editor 2.12.4 (build 2.12.7563.29301)
--------------------------------
Dependency tree built in 113 ms
Loading model...
Building Model.bim file...
Finishing: Build Mode.bim from SourceDirectory

Your script looks good. Have you tried executing it within the Tabular Editor UI?
Perhaps the parameter is named differently in your model. You can use the following script in the CLI to output the list of parameters:
foreach(var expr in Model.Expressions) Info(expr.Name);
Running this in the CLI lists every shared expression defined in the model, so you can verify the exact name.

What I did to fix this was to create a separate script (SharedExpressions.csx) and (manually) add a line like this for each expression:
Model.Expressions["Start Date - Sales"].Expression = "#datetime(2019, 1, 1, 0, 0, 0) meta [IsParameterQuery=true, Type=\"DateTime\", IsParameterQueryRequired=true]";
In this case my expressions are datetime, but you can change the type to anything you like. Just be very careful with the quote placement.
Then in my pipeline I use the following script to execute the changes:
start /wait TabularEditor.exe "$(System.DefaultWorkingDirectory)/Tabular Model/model/model.bim" -S "$(System.DefaultWorkingDirectory)/Tabular Model/scripts/SharedExpressions.csx" -D -V

My tabular model uses a shared expression (parameter) to set the connection string. The following solution worked for me. I updated the command line script in the release pipeline as follows:
echo var connectionString = Environment.GetEnvironmentVariable("SQLDWConnectionString"); >> SetConnectionStringFromEnv.cs
echo Model.Expressions["ServerName"].Expression = "\"" + connectionString +"\"" + " meta [IsParameterQuery=true, Type=\"Text\", IsParameterQueryRequired=true]"; >> SetConnectionStringFromEnv.cs
TabularEditor.exe "_$(Build.DefinitionName)\drop\Model.bim" -S SetConnectionStringFromEnv.cs -D "%ASConnectionString%" "$(ASDatabaseName)" -O -C -P -R -M -W -E -V
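For reference, the two echo lines above write a two-line C# script that Tabular Editor then executes. Below is a minimal local sketch of what ends up in the generated file, written in sh purely for illustration (the actual pipeline task runs cmd.exe, and the expression name ServerName is specific to that model):

```shell
# Recreate the generated script file locally to inspect its contents.
# The quoting mirrors what the cmd.exe echo lines produce.
{
  echo 'var connectionString = Environment.GetEnvironmentVariable("SQLDWConnectionString");'
  echo 'Model.Expressions["ServerName"].Expression = "\"" + connectionString + "\"" + " meta [IsParameterQuery=true, Type=\"Text\", IsParameterQueryRequired=true]";'
} > SetConnectionStringFromEnv.cs
cat SetConnectionStringFromEnv.cs
```

The first line reads the connection string from the environment at script run time, so the pipeline only has to expose the variable to the process.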

Related

Publishing test results to Azure (VS Database Project, tSQLt, Azure Pipelines, Docker)

I am trying to fully automate the build, test, and release of a database project using Azure Pipelines.
I already have a Visual Studio solution which consists of three database projects. The first project is the database, which contains the tables, stored procedures, functions, data, etc. The second project is the tSQLt framework (v 1.0.5873.27393 if anyone is interested). And finally the third project is the tSQLt tests.
My goal here is to check the solution into source control, and have the pipeline automatically build the solution, deploy the dacpacs to a build server (Docker in this case), run the tSQLt tests, and publish the results back to the pipeline.
My pipeline works like this:
Build the Visual Studio solution
Publish the artifacts
Set up a Docker container running Ubuntu & SQL Server
Install SQLPackage
Deploy the dacpacs to the SQL instance
Run the tSQLt tests
Publish the test results
Everything up to publishing the results is working, but on this step I got the following error:
##[warning]Failed to read /home/vsts/work/1/Results.xml. Error : Data at the root level is invalid. Line 1, position 1.
I added another step in the pipeline to display the content of the Results.xml file. It appears like this:
XML_F52E2B61-18A1-11d1-B105-00805F49916B
----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
<testsuites><testsuite id="1" name="MyNewTestClassOne" tests="1" errors="0" failures="0" timestamp="2021-02-01T10:40:31" time="0.000" hostname="f6a05d4a3932" package="tSQLt"><properties/><testcase classname="MyNewTestClassOne" name="TestNumberOne" time="0.
I'm not sure whether the column name and dashes should be in the file, but I'm guessing not. I added another step to remove them, leaving just the XML. But that then gave me a different error to deal with:
##[warning]Failed to read /home/vsts/work/1/Results.xml. Error : There is an unclosed literal string. Line 2, position 1.
This one is a little obvious to spot, because as you'll see above, the XML is incomplete.
Here is the part of my pipeline which runs the tSQLt tests and outputs the results to Results.xml:
- script: |
    sqlcmd -S 127.0.0.1,1433 -U SA -P Password.1! -d StagingDB -Q 'EXEC tSQLt.RunAll;'
  displayName: 'tSQLt - Run All Tests'
- script: |
    cd $(Pipeline.Workspace)
    sqlcmd -S 127.0.0.1,1433 -U SA -P Password.1! -d StagingDB -Q 'SET NOCOUNT ON; EXEC tSQLt.XmlResultFormatter;' -o 'tSQLt_Results.xml'
  displayName: 'tSQLt - Output Results'
I've researched so many blogs and articles on this, and most people are doing the same. Some people use PowerShell instead of sqlcmd, but given I'm using an Ubuntu machine, that isn't an option here.
I am all out of options, so I am looking for a little help on this.
You are dealing with two problems here: there is noise in your result set that is not XML, and your XML result is truncated after 256 characters. I can help you with both.
What I am doing is basically this:
/opt/mssql-tools/bin/sqlcmd \
-S "localhost, 31114" -U sa \
-P "password" \
-d dbname \
-y0 \
-Q "BEGIN TRY EXEC tSQLt.RunAll END TRY BEGIN CATCH END CATCH; EXEC tSQLt.XmlResultFormatter" \
| grep -w "<testsuites>" \
| tee "resultfile.xml"
A few things to note:
-y0 is important. It sets the length of the XML result column to unlimited, up from the default of 256 characters.
grep makes sure you only get the XML and not the noise around it.
If you want to run only a subset of your tests, you need to amend the SQL query being passed in, but other than that, this is a catch-all one-liner to run all tests and get the results in XML format, readable by Azure DevOps.
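To see why the grep stage matters, here is a self-contained simulation: the printf stands in for sqlcmd's output (the column-name header, the dashed ruler, then the single-line XML), and the same grep | tee stage keeps only the XML line:

```shell
# Simulated sqlcmd output: header noise followed by the XML result line,
# filtered so only the well-formed XML reaches the result file.
printf '%s\n' \
  'XML_F52E2B61-18A1-11d1-B105-00805F49916B' \
  '------------------------------------------' \
  '<testsuites><testsuite id="1" name="MyNewTestClassOne" tests="1"/></testsuites>' \
| grep -w "<testsuites>" \
| tee "resultfile.xml"
```

The resulting file contains only the `<testsuites>` line, which is what the Publish Test Results task can parse.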

Eclipse define build variable in pre-build

I created a simple shell script file in /tmp/test.sh:
#!/bin/sh
echo 'aaaa'
In Eclipse->C/C++ Build->Settings->Build Steps->Pre-build steps->Command, I added TEST=/tmp/test.sh.
Then in C/C++ Build->Settings->Tool Settings->Cross GCC Compiler->Command line pattern, I tried to use this variable as -D ${TEST}, expecting it to pass -D aaaa (the output of the shell script).
But in the build console I didn't see that -D aaaa had been passed to gcc; in fact, I didn't see -D at all.
Why is that? I want to set a variable in the pre-build step that holds the output of a shell script, and use it in the gcc command line pattern.
How can I do that?
Apparently the CDT managed build does not allow running external commands to assign environment variables, build variables, or command-line flags, so executing the script at that stage doesn't work.
This is not a direct answer to the question of how to do it with the gcc command line pattern, but if you really only want to pass defines to the preprocessor with results you got from the script, you can use the script instead to update or create a header file that is included somewhere, e.g.:
#!/bin/sh
echo "#ifndef __MYVARHEADER__" > ../include/myvarheader.h
echo "#define __MYVARHEADER__" >> ../include/myvarheader.h
echo "#define aaaa" >> ../include/myvarheader.h
echo "#endif" >> ../include/myvarheader.h
In Eclipse->C/C++ Build->Settings->Build Steps->Pre-build steps->Command, just execute /tmp/test.sh.
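A self-contained sketch of that approach follows; the header and macro names here are made up for illustration, the real script would write into the project's include directory, and the value could come from any command:

```shell
#!/bin/sh
# Generate a header whose define carries the script's output,
# so the compiled code can pick the value up via #include.
VALUE=aaaa   # stand-in for e.g. $(git rev-parse --short HEAD)
{
  echo "#ifndef MYVARHEADER_H"
  echo "#define MYVARHEADER_H"
  echo "#define TESTVAR $VALUE"
  echo "#endif"
} > myvarheader.h
cat myvarheader.h
```

Using a single redirected block (rather than one `>>` per line) also guarantees the header is rewritten from scratch on every build.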

Why are pipes not working in PowerShell for various NodeJS CLI tools?

I am trying out the following widely used tools:
prettyjson
prettier
For example when I run the following on Powershell:
echo '{"a": 1}' | prettyjson
The terminal just keeps waiting for input until CTRL+C is pressed, and it exits without the expected output.
The workaround is to add .cmd to the command or just use cmd instead:
echo '{"a": 1}' | prettyjson.cmd
Outputs
a: 1
This seems to be a known limitation and a pull request is available:
https://github.com/npm/cmd-shim/pull/43

Problem running a command from Rundeck (Linux)

On a Linux server I am running the following command without any error and getting the result:
[xxxxx@server1 ~]$ grep -o "\-w.*%" /etc/sysconfig/nrpe-disk
-w 15% -c 7%
[xxxxx@server1 ~]$
I want to run the same command from Rundeck's command line interface, with the same xxx user, which has sudo rights too.
The command executed from Rundeck gives an "option invalide" (invalid option) error:
option invalide -- '.'
Utilisation : grep [OPTION]... MOTIF [FICHIER].
I tried many different approaches, such as escaping the . sign, running it with sudo, using an absolute path, double quotes, single quotes, etc. I still receive the same output, even though the command works locally on the server. What's the way to fix it?
You can do that by putting it in an inline script ("Script" step) or by calling an external script with the command content ("Script file or URL" step).
Another way is to use the cat tool to print the file and capture the output with a log filter: click the tiny gear icon at the left of the step, click "Add Log Filter", select "Key/value data", use the regex .*(-w .*%).* as the pattern, give the data a name (e.g. diskdata), and tick the "Log data" checkbox. You then get the output you want, and can print the captured value with echo ${data.diskdata} in a later step.
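As a sketch of the inline-script route, the same grep runs unchanged inside a "Script" step. Here a sample file stands in for /etc/sysconfig/nrpe-disk so the snippet is self-contained (the sample line is a made-up nrpe threshold entry):

```shell
# Create a stand-in for /etc/sysconfig/nrpe-disk and run the same grep.
printf 'check_disk -w 15%% -c 7%%\n' > nrpe-disk.sample
grep -o "\-w.*%" nrpe-disk.sample
```

The -o flag prints only the matched portion, so the output is just `-w 15% -c 7%`, without the rest of the line.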

Automate VIM EDITOR and update values using shell script (similar to editing crontab via shell script)

I am trying to automate a vi edit for a command, similar to editing crontab via a shell script, but it is not working for me so far.
Here is the final JSON I want, with admin set to true:
'{"name":"SQLSRVR","admin":"true","json_class":"Chef::ApiClient","chef_type":"client"}'
As you can see below, the EDITOR environment variable has to be set or passed as the command line option -e:
[root@vrhost user]# knife client edit SQLSRVR
ERROR: RuntimeError: Please set EDITOR environment variable
[root@vrhost user]# knife client edit
USAGE: knife client edit CLIENT (options)
-s, --server-url URL Chef Server URL
-k, --key KEY API Client Key
--[no-]color Use colored output, defaults to enabled
-c, --config CONFIG The configuration file to use
--defaults Accept default values for all questions
-d, --disable-editing Do not open EDITOR, just accept the data as is
-e, --editor EDITOR Set the editor to use for interactive commands
-E, --environment ENVIRONMENT Set the Chef environment
-F, --format FORMAT Which format to use for output
-u, --user USER API Client Username
--print-after Show the data after a destructive operation
-V, --verbose More verbose output. Use twice for max verbosity
-v, --version Show chef version
-y, --yes Say yes to all prompts for confirmation
-h, --help Show this message
FATAL: You must specify a client name
The command below opens vim so you can change "admin": false to "admin": true:
[root@vrhost user]# knife client edit SQLSRVR -e vim
{
"name": "SQLSRVR",
"admin": false,
"json_class": "Chef::ApiClient",
"chef_type": "client"
}
I am trying to do this through a shell script and would like to automate it; I have tried many options but have had no luck so far.
[root@vrhost ~]# (echo ^[:g/false/s/false/true/^[:wq!^M) | knife client edit SQLSRVR -e vim
Vim: Warning: Input is not from a terminal
Object unchanged, not saving
or
[root@vrhost user]# echo (^[echo '{"name":"SQLSRVR","admin":"true","json_class":"Chef::ApiClient","chef_type":"client"}'^[:w q!^M) | knife client edit SQLSRVR -e
[root@vrhost ~]# knife client show SQLSRVR
admin: false
chef_type: client
json_class: Chef::ApiClient
name: SQLSRVR
This is very similar to automating crontab editing via a shell script, but it has not been working for me.
Unless you really need special Vim capabilities, you're probably better off using non-interactive tools like sed, awk, or Perl / Python / Ruby / your favorite scripting language here.
That said, you can use Vim non-interactively, using silent batch mode.
vim -T dumb --noplugin -n -es -S "commands.ex" "filespec"
Instead of reading the commands from an external script via -S "commands.ex", you can also give a few commands directly via -c cmd1 -c cmd2. See :help -s-ex for more information.
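As a hypothetical sketch (vim must be installed on the box, and the file names here are made up), the Ex commands for the edit in this question could look like this:

```shell
# Sample buffer standing in for what knife hands to the editor.
printf '{\n  "name": "SQLSRVR",\n  "admin": false\n}\n' > client.json
# Ex script: flip "admin": false to true on every matching line, then save.
cat > commands.ex <<'EOF'
g/"admin": false/s/false/true/
wq!
EOF
# Where vim is available, apply it non-interactively:
# vim -T dumb --noplugin -n -es -S commands.ex client.json
```

Note that the Ex script ends with wq! so batch mode writes the file and exits instead of hanging.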
Check out
$ knife client edit --help
[...]
-d, --disable-editing Do not open EDITOR, just accept the data as is
So I guess you can change the values without editing in vim. Just:
get the client data in json format.
replace needed values with sed.
upload the data from file.
Code:
$ knife client show -Fj SQLSRVR > SQLSRVR.json
$ sed -i.old "s/\"admin\": false,/\"admin\": true,/" SQLSRVR.json
$ knife client edit -d SQLSRVR < SQLSRVR.json
Something like that.
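Here is a local dry run of that kind of sed substitution on a hand-written sample of the JSON (no knife required; the sample mirrors the client document shown earlier in the question):

```shell
# Sample of what `knife client show -Fj SQLSRVR` might return.
cat > SQLSRVR.json <<'EOF'
{
  "name": "SQLSRVR",
  "admin": false,
  "json_class": "Chef::ApiClient",
  "chef_type": "client"
}
EOF
# Flip the admin flag in place, keeping a backup with the .old suffix.
sed -i.old 's/"admin": false,/"admin": true,/' SQLSRVR.json
grep '"admin"' SQLSRVR.json
```

The -i.old flag edits the file in place while saving the original as SQLSRVR.json.old, which is handy if the upload step fails and you need to roll back.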
Here are some links for reference:
i) http://mirror.hep.wisc.edu/stable/chef/chef-server-webui/app/controllers/clients_controller.rb
ii) http://www.rubydoc.info/github/opscode/chef/master/Shell/Extensions - tried but unable to get it to work
I finally did the following (it does give a 409 the second time it is called, but I did not need to run it a second time):
# call to below rb, CLIENTNAME is the name of the client and STATE is true/false
$ knife exec clienttransform.rb CLIENTNAME STATE
$ cat clienttransform.rb
Chef::Config[:solo] = false

class Company
  class TransformClient
    attr_accessor :clientname
    attr_accessor :isclientadmin

    def initialize(client_name, is_client_admin)
      @clientname = client_name
      @isclientadmin = is_client_admin
    end

    def transform
      client = Chef::ApiClient.load(@clientname)
      # puts "client.name : " + client.name
      # puts "client.admin : " + client.admin.to_s
      # puts "XX - clientname : " + @clientname
      # puts "XX - isclientadmin : " + @isclientadmin.to_s
      boolisclientadmin = !!@isclientadmin
      client.admin(boolisclientadmin)
      client.save()
    end
  end
end

client_name = ARGV[2].to_s()
is_client_admin = ARGV[3].to_s()
# puts "YY - client_name : " + client_name
# puts "YY - is_client_admin : " + is_client_admin
trc = Company::TransformClient.new(client_name, is_client_admin)
trc.transform
exit 0
Just set your editor and it will work. In my case I use the vim editor, which is why my command was as follows:
export EDITOR=vim
