I'm just starting to explore the Databricks API. I've created a .netrc file as described in this doc, and the API works with it for other operations like "list clusters" and "list jobs". But when I try to query the details of a particular job, it fails:
$ curl --netrc -X GET https://<my_workspace>.cloud.databricks.com/api/2.0/jobs/get/?job_id=job-395565384955064-run-12345678
{"error_code":"INVALID_PARAMETER_VALUE","message":"Job 0 does not exist."}
What am I doing wrong here?
The job ID should be a numeric identifier, but you're providing the job cluster name instead. You need to use the first number (395565384955064) from that name as the job ID in the REST API. Also, remove the / after get - it should be /api/2.0/jobs/get?job_id=<job-ID>
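With both fixes applied, the request looks like this (same .netrc auth; substitute your own workspace host):
$ curl --netrc -X GET https://<my_workspace>.cloud.databricks.com/api/2.0/jobs/get?job_id=395565384955064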
$ curl --netrc -X GET https://<my_workspace>.cloud.databricks.com/api/2.0/jobs/get/?job_id=job-395565384955064-run-12345678
In this command, it looks like the job name (an alphanumeric value) has been passed instead of the job_id. You can find the job_id in the Jobs page of the workspace UI.
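If you need to look the IDs up programmatically, the List Jobs endpoint returns the numeric job_id for every job in the workspace:
$ curl --netrc -X GET https://<my_workspace>.cloud.databricks.com/api/2.0/jobs/list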
I am using a DevOps pipeline to build and deploy to different environments.
For one environment I am encountering an issue where I am using a pipeline variable with $$ in the value.
For Example:
Password pipeline variable with value = $omeCla$$Password
When I deploy it fails, and when I check the logs the password is displayed as $omeCla$Password. So basically, when two $ are together it drops one $.
For all variables I am using the regex __VariableValue__ and it's working fine.
I have tried:
$omeCla$\$Password to try to escape it, and it displays as $omeCla$\$Password. So basically, \ doesn't work.
I tried '$omeCla$$Password' to try to escape it, and it displays as '$omeCla$Password'.
I want to keep this value as a normal pipeline variable before review
So basically how can I escape this?
Or should I add a Secret Token in the Replace Tokens task (see screenshot below) and then make the pipeline variable secret? If so, what should I set as the Secret Token? Also, in app.config in my repo, what should I use instead of the regex __VariableName__ that I use for normal variables?
The solution was to use four $. So if you have $$ together you need to write $$$$.
Example: $omeCla$$$$Password
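A minimal sketch of the same idea, assuming the collapsing behavior is identical when the variable is defined in YAML rather than in the pipeline UI (names taken from the question):
variables:
  # each intended literal "$$" is written as "$$$$";
  # the pipeline collapses every "$$" pair back to a single "$"
  Password: '$omeCla$$$$Password'   # resolves to $omeCla$$Password at runtime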
#JaneMa-MSFT as requested
https://developercommunity.visualstudio.com/content/problem/1296808/azure-pipeline-how-to-escape-special-characters-in.html
I'd like to send myself a text when a job is finished. I understand how to change the job name so that the .o and .e files have the appropriate name. But I'm not sure if there's a way to change the job ID from a string of numbers to a specified key so I know which job it is. I usually have a lot of different jobs going at once, so it's difficult to remember all the different job ID numbers. Is there a way in the .pbs script to change the job ID so that when I get the message I can see which job it is rather than just a string of numbers?
If you are using Torque and add the -N flag, then you can add a name to the job. It will still use the numeric portion of the job id as part of the output and error filenames, but this allows you to add something to help you distinguish among your jobs. For example:
$ echo ls | qsub -N whatevernameyouplease
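In a .pbs script the same naming is done with a #PBS directive, and Torque's -m/-M mail options cover the notify-on-finish part. A minimal sketch; the address and the email-to-SMS gateway are assumptions, not something Torque provides:
#!/bin/bash
#PBS -N protein_fold_run              # job name; shows up in qstat and in the .o/.e filenames
#PBS -m ae                            # send mail when the job aborts (a) or ends (e)
#PBS -M 5551234567@txt.example.com    # hypothetical carrier email-to-SMS gateway

cd $PBS_O_WORKDIR
./run_analysis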
First of all, thanks for taking the time to help me out on this one.
I have a 12,300-long list of snapshots and am working on deleting certain ones, so I'm trying to list them all first through the CLI.
I want to get the SnapshotId, the StartTime, and, from the tags, the 'Name'.
I tried quite a few queries, but all of them result in null for the name :/
This is my latest one:
aws ec2 describe-snapshots --query 'Snapshots[*].{ID:SnapshotId,Time:StartTime,Name:Tags[?Key=='Name'].Value[*]}'
Is this something one can do? Or should I query all key pairs and then filter them out with --filters?
A few issues to consider:
Beware of the type of quote marks around the key names (backticks, not single quotes).
Force a single value out of the tag array.
You should specify --owner-ids, otherwise all accessible snapshots will be listed (including ones that don't belong to your account).
This command works:
aws ec2 describe-snapshots --query 'Snapshots[*].{ID:SnapshotId,Time:StartTime,Name:Tags[?Key==`Name`]|[0].Value}' --owner-ids <YOUR-ACCOUNT-ID>
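If you would rather pre-filter on the server side (the --filters route mentioned in the question), this variant should also work; the tag-key filter restricts the listing to snapshots that carry a Name tag at all:
aws ec2 describe-snapshots --owner-ids <YOUR-ACCOUNT-ID> --filters Name=tag-key,Values=Name --query 'Snapshots[*].{ID:SnapshotId,Time:StartTime,Name:Tags[?Key==`Name`]|[0].Value}'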
Trying to monitor filesystems with Zabbix. I found this: https://github.com/vintagegamingsystems/Zabbix-Read-Only-Filesystem-Check
and have been trying to implement it. But I don't understand, given this user parameter: UserParameter=checkro[*],/etc/zabbix/scripts/checkro.sh $1
what the item key should be. According to the documentation, checkro should work, but I keep getting Status Unsupported. I tried posting this on the Zabbix forums, but it takes 3-5 days for them to approve my post :/
EDIT: Files changed: /etc/zabbix/zabbix_agentd.conf - I added a line for the UserParameter and added the checkro.sh script. I restarted Zabbix afterwards (it's a container, so technically I restarted the container).
What I was expecting was for checkro[something] to be supported as an item key, but it isn't.
[*] indicates that this item key takes parameters. The script has this line: mountPoint=$1.
Thus the item key should have the mountpoint passed as a parameter like so:
checkro[/home]
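You can verify that the key resolves before creating the item in the frontend; both commands below are standard Zabbix tools (the host name is a placeholder):
zabbix_agentd -t 'checkro[/home]'                 # test the key locally where the agent runs
zabbix_get -s <agent-host> -k 'checkro[/home]'    # query the agent from the server side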
Maybe too late, but I just used this script. It works only for / and /boot. If your filesystem is on, say, /dev/mapper devices, it does not work.
The GitHub API allows us to search users by different parameters, and one of those parameters is location. Running the following query will give all the users living in Pakistan:
curl https://api.github.com/search/users?q=location:pakistan
Now I would like to get all the users that live in either Pakistan or India, but it seems that GitHub doesn't define a way to put an or between Pakistan and India.
I have tried the following queries, but these aren't working:
curl https://api.github.com/search/users?q=location:pakistan&location:india
curl https://api.github.com/search/users?q=location:(pakistan|india)
Your first attempt is close, but doesn't work because location isn't its own HTTP GET argument. The entire string location:pakistan is the value to the q parameter.
When you do ?q=location:pakistan&location:india you are actually submitting something like
q has the value location:pakistan
location:india is a key, but has no value
Instead, join multiple location keys with + or %20:
curl https://api.github.com/search/users?q=location:pakistan+location:india
Now the entire location:pakistan+location:india string is passed as the value to the q key.
A literal space can work too, but then you have to escape it or wrap the arguments in quotes.
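For example, with the space written as %20 and the whole URL quoted (quoting also keeps the shell from misinterpreting characters like the & in the earlier attempt):
curl 'https://api.github.com/search/users?q=location:pakistan%20location:india'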