Passing variables to curl command in child_process.exec fails - node.js

I was trying to use child_process.exec to call curl with a long command in order to send some data to an API. Something similar to the following example:
exec('git log --oneline | wc -l', function(error, stdin, stderr) {
  if (stdin > 1) {
    exec('curl -H "Content-Type: application/json" -X POST -d \'{"value1": "\'"$arg"\'"}\' https://maker.ifttt.com/trigger/{event}/with/key/<my-key>', { "env" : {"arg": stdin } });
  }
})
So if a git repo has more than one line in its git log output, a POST request is sent to some API (here, a simple webhook on ifttt.com), passing a variable (arg) along in the process.
Note that this is my best attempt; in general I was struggling quite a bit with escaping single and double quotes. In this particular case, the HTTP request was not sent correctly because the body includes a line break:
POST / HTTP/1.1
Host: <some-host>
User-Agent: curl/7.50.1
Accept: */*
Content-Type: application/json
Content-Length: 16
{"value1": "2
"}
In the end, I had to use an external bash script:
exec('./send_request.sh $arg', { "env": {"arg": stdin } });
but I'm still very curious about how to make it work within the same js file.
If it helps, I'm running node 6.11.0 and curl 7.52.1.

Try:
exec('git log --oneline | wc -l', function(error, stdin, stderr) {
  if (stdin > 1) {
    exec('curl -H "Content-Type: application/json" -X POST -d \'{"value1": "\'"$arg"\'"}\' https://maker.ifttt.com/trigger/{event}/with/key/<my-key>', { "env" : {"arg": stdin.replace(/\n/g, '') } });
  }
})
What is happening is that your variable 'stdin' (you should rename it to 'stdout', since that is what exec actually passes you) has a \n at the end of it, and that newline ends up inside the JSON body.
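If you want to sidestep the quote-escaping entirely, another option is to skip the shell for the second call and hand curl its arguments directly via execFile, building the JSON body in JavaScript. A minimal sketch under the same assumptions as the question (the IFTTT URL placeholders are unchanged):
const { exec, execFile } = require('child_process');

exec('git log --oneline | wc -l', function(error, stdout, stderr) {
  if (parseInt(stdout, 10) > 1) {
    // Build the body in JS so no shell quoting or escaping is needed;
    // trim() removes the trailing newline from wc -l.
    const body = JSON.stringify({ value1: stdout.trim() });
    execFile('curl', [
      '-H', 'Content-Type: application/json',
      '-X', 'POST',
      '-d', body,
      'https://maker.ifttt.com/trigger/{event}/with/key/<my-key>'
    ], function(err) {
      if (err) console.error(err);
    });
  }
});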

Related

How to get a list of subfolders and files in jfrog Artifactory

I am looking to fetch the subfolders and files inside a JFrog Artifactory repo, and for that I am running the below script in Groovy:
def test = sh(script: "curl -u uname:password -X POST -k https://artifactory.xxxx.com/artifactory/api/search/aql -d 'items.find({\"type\" : \"file\",\"\$or\":[{\"repo\" : {\"\$match\" : \"war*\"}, \"repo\" : {\"\$match\" : \"web*\"} }]}).include(\"name\",\"repo\",\"path\",\"size\").sort({\"\$desc\": [\"size\"]}).limit(10)'", returnStdout: true).trim()
echo "The list is ${test}"
But it's not returning any value.
Any solution would be helpful.
Thanks
You can use api/storage to get the children of an artifact path.
For example, if your Artifactory has a Maven repository called maven-prerelease-local, you can open
https://artifactory.xxxx.com/maven-prerelease-local in a browser and it will list the files and folders under it.
By adding api/storage to the URL, it will return a JSON response instead.
def test = sh(script: """
curl -u uname:password -X GET -k \
"https://artifactory.xxxx.com/api/storage/maven-prerelease-local/com/xxx/xxx/"
""", returnStdout: true).trim()
echo "The list is ${test}"
To get detailed information about the existing subfolders under a specific directory/repository, you can run the JFrog CLI with a file spec, as follows.
$ jfrog rt search --spec=test.aql
[Info] Searching artifacts...
[Info] Found 1 artifact.
[
  {
    "path": "delta-generic-local/alpha/beta",
    "type": "folder",
    "created": "2022-08-04T13:53:36.173Z",
    "modified": "2022-08-04T13:53:36.173Z"
  }
]
And the spec file includes the following content.
$ cat test.aql
{
  "files": [
    {
      "aql": {
        "items.find": {
          "type": "folder",
          "repo": {"$eq": "delta-generic-local"},
          "path": {"$eq": "alpha"}
        }
      }
    }
  ]
}
I am guessing you are in special-character-escaping hell. Put your query in a *.aql file and then point to it. See below.
// Create the aql file and write the query to it
writeFile file: 'sizeQuery.aql', text: 'items.find({"type":"file"}).sort({"$desc":["size"]}).limit(10)'
// Pass the aql file to your curl command
sh 'curl -u uname:password -H "Content-Type: text/plain" -X POST -d @sizeQuery.aql "https://artifactory.xxxx.com/artifactory/api/search/aql"'

How to use curl to post a linting request with the contents of .gitlab-ci.yml to the gitlab api?

I am trying to make a curl request to the gitlab.com API to lint a .gitlab-ci.yml file, but I am receiving a bad request response: {"status":400,"error":"Bad Request"}
#!/usr/bin/env bash
PAYLOAD=$( cat << JSON
{ "content":
$(<$PWD/../.gitlab-ci.yml)
JSON
)
echo "Payload is $PAYLOAD"
curl --include --show-error --request POST --header "Content-Type: application/json" --header "Accept: application/json" "https://gitlab.com/api/v4/ci/lint" --data-binary "$PAYLOAD"
Has anyone managed to successfully lint a .gitlab-ci.yml via a bash script? I also tried wrapping the content payload in braces and received the same response.
Update
I think what is happening is that the GitLab CI lint endpoint expects the contents of the .gitlab-ci.yml file to be converted to JSON for the POST request. See here
Modified the script to use Ruby to convert the YAML to JSON before sending, and this works for a simple .gitlab-ci.yml. However, when using the YAML file for my project it gives an error: {"status":"invalid","errors":["(\u003cunknown\u003e): did not find expected ',' or ']' while parsing a flow sequence at line 1 column 221"]} When I use the GitLab web page for linting, the file is valid.
{"content": "{ \"stages\": [ \"build\", \"test\", \"pages\", \"release\" ], \"variables\": { \"DOCKER_DRIVER\": \"overlay2\" }, \"services\": [ \"docker:19.03.11-dind\" ], \"build:plugin\": { \"image\": \"docker:19.03.11\", \"stage\": \"build\", \"before_script\": [ \"echo \"$CI_JOB_TOKEN\" | docker login -u gitlab-ci-token --password-stdin \"$CI_REGISTRY\"\" ].....
Column 221 is \"image\": \"docker:19.03.11\" in the above JSON extract, specifically at the closing escaped quote. I think it is a problem with incorrectly escaped quotes?
#!/usr/bin/env bash
json=$(ruby -ryaml -rjson -e 'puts JSON.pretty_generate(YAML.load(ARGF))' < .gitlab-ci.yml)
# escape quotes
json_content=$(echo $json | perl -pe 's/(?<!\\)"/\\"/g')
# Add the content object for the GitLab linter
json_content='{"content": "'${json_content}'"}'
echo "${json_content}"
curl --include --show-error --request POST \
--header "Content-Type: application/json" \
--header "Accept: application/json" \
"https://gitlab.com/api/v4/ci/lint" \
--data-binary "$json_content"
Second Update
Using the above bash script this yaml file:
stages:
- test
test:
stage: test
script:
- echo "test"
gets converted to this json:
{"content": "{ \"stages\": [ \"test\" ], \"test\": { \"stage\": \"test\", \"script\": [ \"echo \"test\"\" ] } }"}
When this is sent to the api receive the following json error response:
{"status":"invalid","errors":["(\u003cunknown\u003e): did not find expected ',' or ']' while parsing a flow sequence at line 1 column 62"]}%
Got it working finally using the following script:
#!/usr/bin/env bash
json=$(ruby -ryaml -rjson -e 'puts(YAML.load(ARGF.read).to_json)' custom_hooks/valid.yml)
# escape quotes
json_content=$(echo $json | python -c 'import json,sys; print(json.dumps(sys.stdin.read()))')
echo $json_content
# Add the content object for the GitLab linter
json_content="{\"content\": ${json_content}}"
# Output escaped content to file
echo $json_content > custom_hooks/input.json
echo "Escaped json content written to file input.json"
curl --include --show-error --request POST \
--header "Content-Type: application/json" \
--header "Accept: application/json" \
"https://gitlab.com/api/v4/ci/lint" \
--data-binary "$json_content"
N.B. I will be tweaking the script to read the file from system args rather than the fixed file location custom_hooks/valid.yml. Also, the JSON response needs parsing using jq or a Python/Ruby shell command. I am including this script on the off chance that it will help others.
The problem was that initially I was sending the YAML contents of the file directly to the API:
{ "content": { <contents of .gitlab-ci.yml> } }
It looks as though GitLab accepts YAML converted to an escaped JSON string in its API. So I used Ruby to convert the YAML to JSON and then used Python to escape the resulting JSON produced by Ruby. Finally I was able to use curl to send the escaped JSON string to the GitLab API for validation.
Not sure if Ruby has something equivalent to Python's json.dumps, but this solution allows me to validate gitlab-ci. Next stage: hook it up to git pre-commit hooks / server-side pre-receive (if possible!) to prevent invalid .gitlab-ci.yml files from breaking the CI pipeline.
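It turns out Ruby's bundled json library does have an equivalent: String#to_json (or JSON.generate) escapes a string the same way Python's json.dumps does. A rough sketch of a single one-liner along those lines (untested here, and assuming the same .gitlab-ci.yml in the current directory):
ruby -ryaml -rjson -e 'puts({ "content" => YAML.load(ARGF.read).to_json }.to_json)' < .gitlab-ci.yml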
Newbie to Ruby... since posting the original answer I have had a go at creating a Ruby script that can be used from pre-commit hooks etc. Now it only requires Bash and Ruby:
#!/usr/bin/env ruby
require 'json'
require 'net/http'
require 'optparse'
require 'yaml'
=begin
POST to GitLab api for linting ci yaml
Params:
+url+ :: Api url
+yaml+ :: Yaml payload for linting
Returns:
Json validation result from API for HTTP response Success
Aborts with HTTP Message for all other status codes
=end
def call_api(url, yaml)
  uri = URI.parse(url)
  req = Net::HTTP::Post.new(uri)
  req.content_type = 'application/json'
  req['Accept'] = 'application/json'
  req.body = JSON.dump({"content" => yaml.to_json})
  https = Net::HTTP.new(uri.host, uri.port)
  https.use_ssl = true
  https.verify_mode = OpenSSL::SSL::VERIFY_PEER
  response = https.request(req)
  case response
  when Net::HTTPSuccess
    puts "request successful"
    return JSON.parse response.body
  when Net::HTTPUnauthorized
    abort("#{response.message}: invalid token in api request?")
  when Net::HTTPServerError
    abort("#{response.message}: server error, try again later?")
  when Net::HTTPBadRequest
    puts "Bad request..." + req.body
    abort("#{response.message}: bad api request?")
  when Net::HTTPNotFound
    abort("#{response.message}: api request not found?")
  else
    puts "Failed validation\nJSON payload :: #{req.body}\nHTTP Response: #{response.message}"
    abort("#{response.message}: failed api request?")
  end
end
=begin
Display exit report and raise the appropriate system exit code
Params:
+status+ :: Validation status string. Legal values are valid or invalid
+errors+ :: String array storing errors if yaml was reported as invalid
Returns:
Exits with 0 when successful
Exits with 1 on validation errors or fails to parse legal status value
=end
def exit_report(status, errors)
  case status
  when "valid"
    puts ".gitlab-ci.yml is valid"
    exit(0)
  when "invalid"
    abort(".gitlab-ci.yml is invalid with errors:\n\n" + errors.join("\n"))
  else
    abort("A problem was encountered parsing status : " + status)
  end
end
=begin
Load yaml file from path and return contents
Params:
+path+ :: Absolute or relative path to .gitlab-ci.yml file
=end
def load_yaml(path)
  begin
    YAML.load_file(path)
  rescue Errno::ENOENT
    abort("Failed to load .gitlab-ci.yml")
  end
end
=begin
Parse command line options
Returns:
Hash containing keys: {:yaml_file,:url}
=end
def read_args()
  options = {}
  OptionParser.new do |opt|
    opt.on('-f', '--yaml YAML-PATH', 'Path to .gitlab-ci.yml') { |o| options[:yaml_file] = o }
    opt.on('-l', '--url GitLab url', 'GitLab API url') { |o| options[:url] = o }
  end.parse!
  options
end
=begin
Load yaml to send to GitLab API for linting
Display report of linting retrieved from api
Returns:
Exits with 0 upon success and 1 when errors encountered
=end
def main()
  # try and parse the arguments
  options = read_args()
  unless !options.has_key?(:yaml_file) || !options.has_key?(:url)
    # try and load the yaml from path
    puts "Loading file #{options[:yaml_file]}"
    yaml = load_yaml(options[:yaml_file])
    # make lint request to api
    puts "Making POST request to #{options[:url]}"
    response_data = call_api(options[:url], yaml)
    # display exit report and raise appropriate exit code
    unless !response_data.has_key?("status") || !response_data.has_key?("errors")
      exit_report response_data["status"], response_data["errors"]
    else
      puts "Something went wrong parsing the json response #{response_data}"
    end
  else
    abort("Missing required arguments yaml_file and url, use -h for usage")
  end
end
# start
main
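A hypothetical invocation (assuming the script above is saved as lint_ci.rb and made executable):
./lint_ci.rb -f .gitlab-ci.yml -l https://gitlab.com/api/v4/ci/lint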

Convert curl command into a working script

I am using an API from check-host.net to ping a website.
My issue right now is that I have no idea how to transform the curl command into a working Python script. I tried different approaches that I found on here, but sadly none of them gave me the output I am looking for.
Working curl command:
curl -H "Accept: application/json" \ https://check-host.net/check-tcp?host=smtp://gmail.com&max_nodes=1
The response looks something like this:
{ "ok": 1, "request_id": "29", "permanent_link":
"https://check-host.net/check-report/29", "nodes": {
"7f000001": ["it","Italy", "Marco"] } }
You have to send an Accept: application/json header in your request. You can also use the built-in JSON decoder in requests.
import requests

headers = {
    'Accept': 'application/json'
}
r = requests.get('https://check-host.net/check-tcp?host=smtp://gmail.com&max_nodes=1', headers=headers)
print(r.json())
Output
{'nodes': {'us2.node.check-host.net': ['us', 'USA', 'New Jersey', '199.38.245.218', 'AS394625', 'OK']}, 'ok': 1, 'permanent_link': 'https://check-host.net/check-report/a462c3ck399', 'request_id': 'a462c3ck399'}

POST request containing CSR fails in Bash

I've written a bash script that sends a POST request to a server. The request contains a certificate signing request and the server signs it and returns a certificate.
When I copy and paste the CSR text in the POST's body, then the POST request is successful. But when I read the CSR from a variable, then the POST request fails. I've attached a snippet of the program below.
PROGRAM - Bash
openssl req -new -newkey rsa:2048 -nodes -out cert.csr -keyout priv.key -subj "/C=MyCountry/ST=MyState/L=MyCity/O=MyCompany/OU=MyDept/CN=MyComp"
if [ $? == 0 ]; then
  csr=$(<cert.csr)
fi

response=$(curl -o - -s -w "%{http_code}\n" -X POST \
  https://xxx.xxx.com/URI-END-POINT \
  -H "authorization: $token" \
  -H "content-type: application/json" \
  -d '{
        "digicert": {
          "csr": "'$csr'",
          "profileName": "pn123",
          "signatureHash": "sh123",
          "userPrincipalName": "pn123",
          "validationScopeId": "vsi123"
        },
        "IccId": "sim123",
        "MacAddress": "mac123"
      }')

if [ $? -eq 0 ]; then
  status=$(echo $response | tail -c 4)
  if [ "$status" == "$http_success" ]; then
    echo -e "Request for certificate SUCCESS"
  else
    echo -e "Request for certificate FAILED with return code $status"
  fi
else
  echo -e "Request for certificate FAILED"
fi
OUTPUT - Bash
curl: option -----END: is unknown
curl: try 'curl --help' or 'curl --manual' for more information
In the above script, if I replace the line "csr": "'$csr'", with "csr": "----BEGIN CERTIFICATE REQUEST---- XXXXXXX ----END CERTIFICATE REQUEST----", then this will work fine.
Can you help me debug this?
Thanks!
Maybe the string in $csr is being mangled by the shell: since it is not inside double quotes, it gets word-split, and the resulting string is something different than expected.
For a start, try to see if $csr is the same as "$csr".
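To see why this matters: in the curl command above, '$csr' ends up outside any double quotes, so the shell word-splits the multi-line value into separate arguments, and curl tries to parse "-----END" as an option, which matches the error output. A minimal illustration with a hypothetical value:
csr=$'-----BEGIN CERTIFICATE REQUEST-----\nXXXXXXX\n-----END CERTIFICATE REQUEST-----'
echo $csr | wc -l      # unquoted: word splitting collapses the newlines -> prints 1
echo "$csr" | wc -l    # quoted: all three lines are preserved -> prints 3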
To post the contents of a file, use jq to generate the JSON blob for you: this will take care of any necessary quoting automatically. The output of jq is piped directly to curl by using the @- argument for the -d option. (An @-prefixed string indicates the name of a file curl should read from; - is the alias for standard input.)
response=$(jq -n --arg csr "$(<cert.csr)" '{
  digicert: {
    csr: $csr,
    profileName: "pn123",
    signatureHash: "sh123",
    userPrincipalName: "pn123",
    validationScopeId: "vsi123"
  },
  IccId: "sim123",
  MacAddress: "mac123"
}' |
curl -o - -s -w "%{http_code}\n" -X POST \
  https://xxx.xxx.com/URI-END-POINT \
  -H "authorization: $token" \
  -H "content-type: application/json" \
  -d @-
)

Windows curl string formatting

I am trying to use curl to make an HTTP POST request.
The request contains some environment variables. Here is the command:
curl -X POST -u username:pass -H "Content-Type: application/json" -d "{ \"fields\": { \"project\": { \"key\": \"myproject\" }, \"summary\": \"${var1.name} - ${var2.name}\", \"description\": \"Testing testing!:\n${url}\", \"issuetype\": { \"name\": \"Task\" }}}" http://myurl.com/rest
The information is sent, but the ${var1.name} and ${var2.name} are being sent as literal strings and not as their actual values.
The command is run on Windows, so that is why I am escaping the quotes. Could that be a problem as to why they're being sent as literal strings?
Windows environment variables are dereferenced as %var1% and %var2%. This works:
C:\>set var1.name=test1
C:\>set var2.name=test2
C:\>set var
var1.name=test1
var2.name=test2
curl.exe -X POST -u username:pass -H "Content-Type: application/json" -d "{ \"fields\": { \"project\": { \"key\": \"myproject\" }, \"summary\": \"%var1.name% - %var2.name%\", \"description\": \"Testing testing!:\n${url}\", \"issuetype\": { \"name\": \"Task\" }}}" http://myurl.com/rest
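Note that ${url} in the description field uses Unix-style syntax as well; if it is also meant to be an environment variable on Windows, it would need the same treatment (%url%), since cmd.exe does not expand ${...}.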
