This command works fine:
json -I -f ./src/environments/build.json -e 'this.patch++'
I'm trying to create a custom build npm command in my package.json file that runs this command before the actual build. But first I tried to run just the json command on its own, to see if it works, and it doesn't :/
package.json
{
  ...
  "scripts": {
    ...
    "svrge-build-dev": "json -I -f ./src/environments/build.json -e 'this.patch++'",
    ...
  }
}
Then I get this output (which is exactly the same as when I run the json command by itself), which means the command is definitely running:
> web-client@1.0.0 svrge-build-dev D:\repos\test\web-client
> json -I -f ./src/environments/build.json -e 'this.patch++
json: updated "./src/environments/build.json" in-place  // <- this is exactly the same
However, the build.json file is not being updated, and there are no errors in the terminal.
Any idea how I can get it to work? I can't seem to find anything about this.
I would appreciate any help; I've been scratching my head for hours.
Tom
Re-stating what I said in the comment: try replacing the single quotes with escaped double quotes:
{
  // ...
  "scripts": {
    // ...
    "svrge-build-dev": "json -I -f ./src/environments/build.json -e \"this.patch++\"",
    // ...
  }
}
What triggered that thought was this line in the output:
> json -I -f ./src/environments/build.json -e 'this.patch++
After doing some digging, I think it depends on the OS and/or command-line interpreter: the output shows the command that was run, but without the closing single quote.
After some searching, it appears that this is indeed a bug: see this issue and this issue. You might want to give the maintainers a heads-up about it; the last issue I've linked to makes it seem like it's a Windows-specific problem.
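To illustrate the difference between the two interpreters (cmd.exe is what npm typically uses to run scripts on Windows):
C:\> echo 'this.patch++'
'this.patch++'
$ echo 'this.patch++'
this.patch++
cmd.exe passes the single quotes through as literal characters, while a POSIX shell strips them during quote removal, so on Windows the expression reaches json still wrapped in quote characters.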
Related
I am building a script to update files on Bitbucket using the REST API.
My problems are:
Running the command with the subprocess library and running it directly on the command line give two different results.
If I run the command on the command line and then inspect my commits in the Bitbucket app, I can see a commit message and an issue.
If I run the command through the subprocess library, I end up with no commit message and no issue: the commit message defaults to "edited by bitbucket" and the issue is null.
This is the command:
"curl -X PUT -u user:pass -F content=#conanfile_3_f62hu.py -F 'message=test 4' -F branch=develop -F sourceCommitId={} bitbucket_URL".format(latest_commit)
The other problem is that I need to pass a file as the content in order to update it.
If I pass it like above, it works. The problem is that I am generating the file content as a raw string and creating a temporary file with that content.
And when I pass that file via a variable, the request does not pick up the content of the file.
My code:
import os
import subprocess
import tempfile

content = b'some content'
current_dir = os.getcwd()
# create the temp file in the current directory so curl can find it by name
temp_file = tempfile.NamedTemporaryFile(suffix=".py", prefix="conanfile", dir=current_dir)
# split the full Windows path on backslashes to recover just the file name
temp_file.name = temp_file.name.split("\\")
temp_file.name = [x for x in temp_file.name if x.startswith("conanfile")][0]
# prefix with "#" to match the working command-line invocation
temp_file.name = "#" + temp_file.name
temp_file.write(content)
temp_file.seek(0)
# latest_commit comes from an earlier API call (not shown)
update_file_url = "curl -X PUT -u user:pass -F content={} -F 'message=test 4' -F branch=develop -F sourceCommitId={} bitbucket_url".format(temp_file.name, latest_commit)
subprocess.run(update_file_url)
Basically I'm passing the file like before, just passing the name as the content, but it does not work.
If I print the command, everything looks good, so I don't know why neither the commit message nor the file content gets set.
Update:
I was able to pass the file; my mistake was that I was not passing it as temp_file.name.
But I could not solve the problem with the message.
What I found is that the message only takes the first word; if there is a space followed by another word, the rest is ignored.
The space is causing the problem.
I found the solution: if anyone else runs into this, you need to use a \ to escape double quotes around the message value.
Example: '-F message=\"Updated with latest dependencies\"'
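For reference, this is roughly how the command string has to look so the space in the message survives when subprocess hands it to curl (user, commit id, and URL are placeholders taken from the question):
curl -X PUT -u user:pass -F content=#conanfile_3_f62hu.py -F "message=Updated with latest dependencies" -F branch=develop -F sourceCommitId=<latest_commit> bitbucket_url
Because subprocess passes the whole string to the OS, only double quotes are recognized as grouping the message argument; the single quotes that a POSIX shell would have handled are not interpreted.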
I run the following function in (Git) Bash under Windows:
function config_get_container_values() {
    local project_name=$1
    local container_name=$2
    #local container_name="gitea"
    echo "###"
    buildcmd="jq -r \".containers[]."
    echo "$buildcmd"
    buildcmd="${buildcmd}${container_name}"
    echo "$buildcmd"
    buildcmd="${buildcmd}foobar"
    echo "$buildcmd"
    echo "###"
}
The output of this is the following. For whatever reason, after using the variable to extend the string, it starts to overwrite $buildcmd. I also tried this with everything on one line, as well as with the append operator (+=). Every time, the same result.
###
jq -r ".containers[].
jq -r ".containers[].gitea
foobar".containers[].gitea
###
The really strange thing is: when I enable the line local container_name="gitea", everything works as expected. The output is:
###
jq -r ".containers[].
jq -r ".containers[].gitea
jq -r ".containers[].giteafoobar
###
When I put all of this into a new file, it also works as expected. So I think something goes wrong in the thousands of lines before this function is called. Any idea what could be the cause of this behavior?
Regards
Dave
This is not how you should build up the command, DOS line endings aside. Use --arg to pass the name into the filter as a variable. For example:
config_get_container_values() {
    local project_name=$1
    local container_name=$2
    jq -r --arg n "$container_name" '.containers[][$n+"foobar"]'
}
config_get_container_values foo gitea < some.json
If the function is invoked with
config_get_container_values proj gitea
it produces the "expected" output. If it is invoked with
config_get_container_values proj $'gitea\r'
it produces output that looks like the first output example. $'gitea\r' expands to a string that consists of 'gitea' followed by a Carriage return (CR) character.
One possible cause of the problem is that the container name (gitea) was read from a file that had Windows/DOS line endings (CR-LF). Problems like that are common. See the first question ("Check whether your script or data has DOS style end-of-line characters") in the "Before asking about problematic code" section of the Stack Overflow 'bash' Info page.
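If the container name can arrive with a stray CR (e.g. read from a file with DOS line endings), one defensive tweak, just as a sketch assuming bash, is to strip it inside the function:
config_get_container_values() {
    local project_name=$1
    local container_name=${2%$'\r'}    # drop a trailing carriage return, if present
    jq -r --arg n "$container_name" '.containers[][$n+"foobar"]'
}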
I have a Jenkins console output that looks like this:
Started by remote host 10.16.17.13
Building remotely on ep9infrajen201 (ep9) in workspace d:\Jenkins\workspace\Tools\Provision
[AWS-NetProvision] $ powershell.exe -NonInteractive -ExecutionPolicy ByPass "& 'C:\Users\user\AppData\Local\Temp\jenkins12345.ps1'"
Request network range: 10.1.0.0/13
{
"networks": [
"10.1.0.0/24"
]
}
Finished: SUCCESS
I get this from a curl command that I run to check JENKINS_JOB_URL/lastBuild/consoleText.
My question is: for the sake of some other automation I am doing, how do I get just "10.1.0.0/24" so I can assign it to a shell variable using Linux tools?
Thank you
Since you listed jq among the tags of your duplicate question, I'll assume you have jq installed. You have to clean up your output to get JSON first, then get to the part of the JSON you need. awk does the former, jq the latter.
.... | awk '/^{$/{p=1}{if(p){print}}/^}$/{p=0}' | jq -r .networks[0]
The awk script looks for { on its own line to turn on a flag p, prints the current line while the flag is set, and switches the flag off when it encounters } on its own line.
EDIT: Since this output was generated on a Windows machine, it has DOS line endings (\r\n). To convert those before awk, additionally pipe through dos2unix.
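Putting it together with the curl call from the question (JENKINS_JOB_URL stands in for the actual job URL, and dos2unix handles the \r\n line endings mentioned above):
network=$(curl -s "$JENKINS_JOB_URL/lastBuild/consoleText" | dos2unix | awk '/^{$/{p=1}{if(p){print}}/^}$/{p=0}' | jq -r '.networks[0]')
echo "$network"    # 10.1.0.0/24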
I am trying to make a simple npm script that runs ESLint, checks whether it's running in CI, and outputs the results to a file if it is.
This works to output the results to the terminal:
"lint": "eslint src --cache --format $(if [ -z ${SOMEVAR} ]; then echo \"stylish\"; else echo \"checkstyle\"; fi)",
But I want to save the results to a file with > checkstyle.xml if a certain environment variable is present.
Is there a way to tack this onto that command? I've tried several ways, but had no luck getting the file to be written.
Edit:
I was able to get this working by adding --color | tee checkstyle.xml, which writes the XML file regardless of the environment variable's value and displays a colorized version in the terminal. This is not ideal, but it does work. Open to other ideas, though.
I found this great chart that shows which combinations of output you can use together to achieve this: https://askubuntu.com/a/731237/541276
Do you mean something like this?
if [ "$somevar" ]; then exec >checkstyle.xml; fi; eslint ...
I wrote a crontab job to make 3 POST requests every 10 minutes with cURL; here is the pseudocode:
#!/bin/sh
echo `date` >>/tmp/log
curl $a >>/tmp/log
curl $b >>/tmp/log
curl $c >>/tmp/log
That is all the code, but after the first echo to my /tmp/log, the rest of the output was saved to a randomly named file like "A6E0U9~D". It doesn't happen all the time, and I have no clue why. :(
PS: I don't actually use "$a"; I use a raw string copied from the Chrome Dev Tools, and one of them was added below. Every single line's output is fine on its own; the only problem is that some of the output gets redirected to a randomly named file.
(The cURL command has been removed because it contained my login cookie.)
Not really a solution, but you can redirect the output of everything at once, rather than repeatedly appending to the same file.
#!/bin/sh
{
    date
    curl ...
    curl ...
    curl ...
} > /tmp/log
The benefit here is that all the output will appear in the same file, whether that file is /tmp/log or an oddly named file. If you still end up with another file aside from /tmp/log, then you know there must be a problem with one of the curl calls.
(Note that capturing and re-printing the output of date is redundant.)
In order to run each curl in parallel, you'll need to save the output from each, and concatenate them once all have finished.
#!/bin/sh
{
    date
    # create the temp files in the parent shell; assigning them inside a
    # backgrounded command would happen in a subshell and the names would
    # not be visible to the later cat
    tmp1=$(mktemp)
    tmp2=$(mktemp)
    tmp3=$(mktemp)
    curl ... > "$tmp1" &
    curl ... > "$tmp2" &
    curl ... > "$tmp3" &
    wait
    cat "$tmp1" "$tmp2" "$tmp3"
} > /tmp/log
rm "$tmp1" "$tmp2" "$tmp3"