I am trying to run this line on a linux machine:
curl --request GET \ --url 'https://www.tenable.com/downloads/api/v2/pages/nessus/files/Nessus-10.4.2-ubuntu1404_amd64.deb' \ --output 'nessus.deb'
but I am getting this error:
curl: (3) Host name ' --url' contains bad letter
Warning: Binary output can mess up your terminal. Use "--output -" to tell
Warning: curl to output it to your terminal anyway, or consider "--output
Warning: " to save to a file.
I ended up using: wget https://www.tenable.com/downloads/api/v2/pages/nessus/files/Nessus-10.4.2-ubuntu1404_amd64.deb
but it seems I should not have done it this way because now the subsequent line I want to run isn't working: dpkg -i nessus.deb
The --output option on your curl command saved the download as 'nessus.deb', which is why the install step was 'dpkg -i nessus.deb'.
However, because you used wget without an output option, the file was not saved under that name; it was saved under the file name taken from the URL. So to install it, you would use:
'dpkg -i <the name wget saved it as>'
For example, you would have put in
'wget https://www.tenable.com/downloads/api/v2/pages/nessus/files/Nessus-10.4.2-ubuntu1404_amd64.deb'
Then it would say something like
''Nessus-10.4.2-ubuntu1404_amd64.deb' saved'
So you would then put in
'dpkg -i Nessus-10.4.2-ubuntu1404_amd64.deb'
and it would install the package. In short, the reason the line was not working is that the file was there, but its name did not match.
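To make that concrete, here is a minimal sketch of both routes, using only the URL and file names from the question (run dpkg as root or with sudo if your account needs it). The original curl error almost certainly came from the backslash line continuations being pasted onto a single line, so the command needs either real newlines after each backslash or no backslashes at all:
# Route 1: re-run the download with curl and proper line continuations, then install under the chosen name
curl --request GET \
  --url 'https://www.tenable.com/downloads/api/v2/pages/nessus/files/Nessus-10.4.2-ubuntu1404_amd64.deb' \
  --output 'nessus.deb'
dpkg -i nessus.deb
# Route 2: keep the file wget already downloaded and install it under the name it was saved as
dpkg -i Nessus-10.4.2-ubuntu1404_amd64.deb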
Hi, I am writing an automated script in test.sh, attempting to download a file. It works fine when I use all hard-coded strings, but it does not work with variables. Below is my code example:
#!/bin/bash
USER="admin"
PWD="adminpass"
curl -v -k -u ${USER}:${PWD} ${NEXUS_URL}/${SP1}/60/${SP1}-60.zip --output ${SP1}-60.zip
The above code is not working; it is not able to download my file. But if I put it as:
curl -v -k -u "admin":"adminpass" ${NEXUS_URL}/${SP1}/60/${SP1}-60.zip
--output ${SP1}-60.zip
Then it works. So how do I get the variable credentials working with this curl command?
Thanks
Option 1
The double quotes in USER="admin" are shell syntax, so the parameter expansion will not include them. If you want literal quote characters in the value, you can use:
#!/bin/bash
USER='"'admin'"' #single quote, double quote, single quote
PASS='"'adminpass'"'
curl -v -k -u ${USER}:${PASS} ${NEXUS_URL}/${SP1}/60/${SP1}-60.zip
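As a side note, a minimal sketch of the more common pattern, in case literal quote characters are not actually required by the server: quote each expansion so the shell hands every value to curl as a single word (this assumes NEXUS_URL and SP1 are defined or exported elsewhere in the script, as in the question):
#!/bin/bash
# Plain values; no embedded quote characters needed when the expansions themselves are quoted
USER='admin'
PASS='adminpass'
curl -v -k -u "${USER}:${PASS}" "${NEXUS_URL}/${SP1}/60/${SP1}-60.zip" --output "${SP1}-60.zip"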
Option 2
Alternatively, you can create a .netrc file and use curl -n as follows:
Documentation from https://ec.haxx.se/usingcurl-netrc.html
Create .netrc containing the following and place it in your home directory.
machine something.com
login admin
password adminpass
Run the command
curl -n -k ${NEXUS_URL}/${SP1}/60/${SP1}-60.zip --output ${SP1}-60.zip
curl will automatically look for the .netrc file. You can also specify the file path with curl --netrc-file <netrc_file_path>
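Putting it together, a rough sketch (the machine name is a placeholder and should match the host in NEXUS_URL; restricting the file to mode 600 is the usual convention, since it contains a password):
# Create ~/.netrc with the credentials and lock down its permissions
cat > ~/.netrc <<'EOF'
machine something.com
login admin
password adminpass
EOF
chmod 600 ~/.netrc
# -n tells curl to read credentials for the matching host from ~/.netrc
curl -n -k "${NEXUS_URL}/${SP1}/60/${SP1}-60.zip" --output "${SP1}-60.zip"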
I am using the curl command to call a REST API. I want to post data, and my curl command looks like:
curl –x POST -u 'username:PW' -k -H "Content-Type:application/json" -d '{"json-input":{"handler":"getContent","image":true,"video":false,"text":false,"source":"1","lage":"testlage1"}}' -i http://localhost:8080/com.knime.enterprise.server/rest/v4/jobs/3fd2ca61-c173-4160-a20d-45c387f65f64
I am getting following message:
curl: (6) Could not resolve host: xn--x-5gn curl: (6) Could not
resolve host: POST
The character before the X is wrong: it is supposed to be an ASCII minus ('-', ASCII code 0x2d / 45), not the Unicode dash character (U+2013) used in the question.
curl treats every command-line argument that doesn't start with a minus as a URL, which makes it convert the dash-X string to an IDN host name and try it. It then goes on to try "POST" as a host name, since that follows the dash-X. Neither of those host names can be resolved, which is what the curl error messages you see are telling you.
And finally: don't use -X POST when you do a POST with -d (or with -F)! Just remove -X POST entirely and things will work better.
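Putting those two fixes together, a corrected version of the command from the question would look like this (ASCII hyphens throughout, and no -X POST since -d already makes it a POST; the credentials, JSON body and URL are the ones from the question):
curl -u 'username:PW' -k -H "Content-Type:application/json" -d '{"json-input":{"handler":"getContent","image":true,"video":false,"text":false,"source":"1","lage":"testlage1"}}' -i http://localhost:8080/com.knime.enterprise.server/rest/v4/jobs/3fd2ca61-c173-4160-a20d-45c387f65f64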
Check the codepage settings of the terminal software in use and compare them to the host settings.
In our case we saw the same strange host name error returned for a simple
curl -v http://{hostname}:{port}
We discovered the problem was the dash character: the codepage specified in PuTTY was not consistent with the codepage used on the host, so curl treated the -v expression as a host name.
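A quick way to compare both sides, as a rough sketch: print the locale and character map in the remote session and compare them with the remote character set configured in the terminal (in PuTTY that setting is under Window > Translation):
# On the host: show the locale and character map the session is using
locale
locale charmap
# Then compare with the terminal emulator's setting, e.g. PuTTY: Window > Translation > "Remote character set"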
I want to download an image accessible from this link: https://www.python.org/static/apple-touch-icon-144x144-precomposed.png into my local system. Now, I'm aware that the curl command can be used to download remote files through the terminal. So, I entered the following in my terminal in order to download the image into my local system:
curl https://www.python.org/static/apple-touch-icon-144x144-precomposed.png
However, this doesn't seem to work, so obviously there is some other way to download images from the Internet using curl. What is the correct way to download images using this command?
curl without any options will perform a GET request and simply write the data from the specified URI to standard output; it does not save the file to your local machine.
When you do,
$ curl https://www.python.org/static/apple-touch-icon-144x144-precomposed.png
You will receive binary data:
|�>�$! <R�HP#T*�Pm�Z��jU֖��ZP+UAUQ#�
��{X\� K���>0c�yF[i�}4�!�V̧�H_�)nO#�;I��vg^_ ��-Hm$$N0.
���%Y[�L�U3�_^9��P�T�0'u8�l�4 ...
In order to save this, you can use:
$ curl https://www.python.org/static/apple-touch-icon-144x144-precomposed.png > image.png
to store that raw image data inside of a file.
An easier way, though, is to just use wget.
$ wget https://www.python.org/static/apple-touch-icon-144x144-precomposed.png
$ ls
.
..
apple-touch-icon-144x144-precomposed.png
For those who don't have wget or don't want to install it, curl -O (capital "O", not a zero) will do the same thing as wget. For example, my old netbook doesn't have wget, and it would be a 2.68 MB install that I don't need.
curl -O https://www.python.org/static/apple-touch-icon-144x144-precomposed.png
If you want to keep the original file name, use uppercase -O:
curl -O https://www.python.org/static/apple-touch-icon-144x144-precomposed.png
If you want to save the remote file under a different name, use lowercase -o:
curl -o myPic.png https://www.python.org/static/apple-touch-icon-144x144-precomposed.png
Create a new file called files.txt and paste the URLs one per line. Then run the following command.
xargs -n 1 curl -O < files.txt
source: https://www.abeautifulsite.net/downloading-a-list-of-urls-automatically
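As a small illustration (the second URL is just a made-up placeholder to show the one-URL-per-line format), files.txt might look like this, and xargs then runs curl -O once per line:
# files.txt
https://www.python.org/static/apple-touch-icon-144x144-precomposed.png
https://example.com/another-image.png
# Download each one, keeping the remote file names
xargs -n 1 curl -O < files.txt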
For those who got a "permission denied" error for the save operation, here is the command that worked for me:
$ curl https://www.python.org/static/apple-touch-icon-144x144-precomposed.png --output py.png
try this
$ curl https://www.python.org/static/apple-touch-icon-144x144-precomposed.png > precomposed.png
I'm learning shell scripting! To practice, I've tried downloading a Facebook page using curl in an Ubuntu terminal.
t.sh content
vi#vi-Dell-7537(Desktop) $ cat t.sh
curlCmd="curl \"https://www.facebook.com/vivekkumar27june88\""
echo $curlCmd
($curlCmd) > ~/Desktop/fb.html
I get this error when running the script:
vi#vi-Dell-7537(Desktop) $ ./t.sh
curl "https://www.facebook.com/vivekkumar27june88"
curl: (1) Protocol "https not supported or disabled in libcurl
But if I run the command directly, it works fine.
vi#vi-Dell-7537(Desktop) $ curl "https://www.facebook.com/vivekkumar27june88"
<!DOCTYPE html>
<html lang="hi" id="facebook" class="no_js">
<head><meta chars.....
I would appreciate it if anyone could let me know what mistake I am making in the script.
I've verified that the curl library has SSL enabled.
When $curlCmd is expanded without eval, the double quotes embedded in the string are treated as literal characters rather than as shell quoting, so curl receives "https://... as part of the URL (which is why the error complains about protocol "https).
Try eval:
curlCmd="curl 'https://www.facebook.com/vivekkumar27june88' > ~/Desktop/fb.html"
eval $curlCmd
Create your script t.sh as this single line only:
curl -k "https://www.facebook.com/vivekkumar27june88" -o ~/Desktop/fb.html
As per man curl:
-k, --insecure
(SSL) This option explicitly allows curl to perform "insecure" SSL connections and transfers.
All SSL connections are attempted to be made secure by using the CA certificate bundle
installed by default. This makes all connections considered "insecure" fail unless -k,
--insecure is used.
-o file
Store output in the given filename.
As @Chepner said, go read BashFAQ #50: I'm trying to put a command in a variable, but the complex cases always fail!. To summarize, how you should do things like this depends on what your goal is.
If you don't need to store the command, don't! Storing commands is difficult to get right, so if you don't need to, just skip that mess and execute it directly:
curl "https://www.facebook.com/vivekkumar27june88" > ~/Desktop/fb.html
If you want to hide the details of the command, or are going to use it a lot and don't want to write it out each time, use a function:
curlCmd() {
curl "https://www.facebook.com/vivekkumar27june88"
}
curlCmd > ~/Desktop/fb.html
If you need to build the command piece-by-piece, use an array instead of a plain string variable:
curlCmd=(curl "https://www.facebook.com/vivekkumar27june88")
for header in "${extraHeaders[@]}"; do
curlCmd+=(-H "$header") # Add header options to the command
done
if [[ "$useSilentMode" = true ]]; then
curlCmd+=(-s)
fi
"${curlCmd[#]}" > ~/Desktop/fb.html # This is the standard idiom to expand an array
If you want to print the command, the best way to do it is usually with set -x:
set -x
curl "https://www.facebook.com/vivekkumar27june88" > ~/Desktop/fb.html
set +x
...but you can also do something similar with the array approach if you need to:
printf "%q " "${curlCmd[#]}" # Print the array, quoting as needed
printf "\n"
"${curlCmd[#]}" > ~/Desktop/fb.html
Install the following software on Ubuntu 14.04:
sudo apt-get install php5-curl
sudo apt-get install curl
Then run sudo service apache2 restart.
Check that phpinfo() shows curl enabled ("cURL support: enabled").
Then use this command in your shell script:
RESULT=$(curl -L "http://sitename.com/dashboard/?show=api&action=queue_proc&key=$JOBID" 2>/dev/null)
echo "$RESULT"
You will get the response.
Thank you.
I am trying to recursively download files from a specific website, and I am encountering an error I've never seen before, one that Google comes up blank on. The command I'm entering is:
wget -m -p -E -k -K -np http://www.slac.stanford.edu/~timb/500/1f_3f_production/ae_1f/E0500-TDR_ws.Pae_ea.Gwhizard-1.95.eB.pL.I37470/
and the output is:
Conversion from 'ANSI_X3.4-1968' to 'ANSI_X3.4-1968' isn't supported
zsh: segmentation fault (core dumped) wget -m -p -E -k -K -np
The error seems to occur no matter what arguments I use. More strangely, it has no problems if I download each file in the directory individually. Does anybody have an idea what this error means?
Your exact command works for me on the Windows 7 command line; I get 9 files. Google reports some problems with zsh. Can you try bash?
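If switching shells entirely is inconvenient, one quick way to test that suggestion might be to run the exact same command through bash explicitly (same flags and URL as in the question):
bash -c 'wget -m -p -E -k -K -np http://www.slac.stanford.edu/~timb/500/1f_3f_production/ae_1f/E0500-TDR_ws.Pae_ea.Gwhizard-1.95.eB.pL.I37470/'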