Wget in bash script with 406 Not Acceptable Error - linux

On my CentOS 6.2 system I have this bash script:
[le_me]$ cat get.nb
#! /bin/bash
/usr/bin/wget -O /var/www/html/leFile.xml http://www.leSite.com/leFeed.xml
[le_me]$ source getFeeds.nb
: command not found
--2012-06-22 12:46:18-- http://www.leSite.com/leFeed.xml%0D
Resolving www.leSite.com... 1.2.3.4
Connecting to www.leSite.com|1.2.3.4|:80... connected.
HTTP request sent, awaiting response... 406 Not Acceptable
2012-06-22 12:46:18 ERROR 406: Not Acceptable.
The strange thing for me is that when I run this command
/usr/bin/wget -O /var/www/html/leFile.xml http://www.leSite.com/leFeed.xml
in the console, everything works fine and the file is downloaded without a problem.
I googled it and noticed this %0D, which is supposed to be a carriage return character, so I tried putting a space after the link, like so: http://www.leSite.com/leFeed.xml[spaceChar]
and the file was downloaded, but I'm concerned about the command not found output and about wget trying to fetch that carriage return at the end (which, of course, I know happens because of the space, but at least I got the file I originally wanted):
[le_me]$ source get.nb
: command not found
--2012-06-22 13:05:26-- http://www.leSite.com/leFeed.xml
Resolving www.leSite.com... 2.17.249.51
Connecting to www.leSite.com|2.17.249.51|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 35671435 (34M) [application/atom+xml]
Saving to: “/var/www/html/leFile.xml”
100%[=================================>] 35,671,435 37.2M/s in 0.9s
2012-06-22 13:05:27 (37.2 MB/s) - “/var/www/html/leFile.xml” saved [35671435/35671435]
--2012-06-22 13:05:27-- http://%0D/
Resolving \r... failed: Name or service not known.
wget: unable to resolve host address “\r”
FINISHED --2012-06-22 13:05:27--
Downloaded: 1 files, 34M in 0.9s (37.2 MB/s)
Can anyone shed some light on this please?

Your script file apparently has DOS-style line endings, and the carriage return character is interpreted as just another character on the command line. If there is no space after the URL, it is interpreted as the last character of the URL; if there is a space, it is interpreted as a separate one-character parameter.
You should save your script file with UNIX-style line endings. How you do that depends on your editor.
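For example, a CRLF-ended script can be converted with dos2unix or, if that isn't installed, with GNU sed. A sketch on a throwaway file (sample content, not the OP's actual script):

```shell
# Create a script with DOS-style CRLF line endings (sample content)
printf '#!/bin/bash\r\necho hello\r\n' > get.nb

# Strip the trailing carriage return from every line, in place (GNU sed)
sed -i 's/\r$//' get.nb

# Inspect the result: od should no longer show any \r characters
od -c get.nb
```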

I'd suggest to quote the URL.
/usr/bin/wget -O /var/www/html/leFile.xml 'http://www.leSite.com/leFeed.xml'

The
: command not found
error suggests there is a problem around the http:// part. As a rule of thumb I always quote URLs when using them on the command line, since they often contain bash/shell special characters.
In my case this command works without the 406 problem (with a real http address). Copy and paste the exact address; it probably contains something that triggers the error.
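To illustrate why quoting matters (with a hypothetical URL): an unquoted & would background the command and the shell would parse the rest as a variable assignment, while a quoted URL survives as a single argument.

```shell
# Unquoted:  wget http://example.com/feed.xml?a=1&b=2
#   runs 'wget ...?a=1' in the background, then evaluates 'b=2' as an assignment.
# Quoted, the whole URL stays one word:
url='http://example.com/feed.xml?a=1&b=2'
echo "$url"
```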

If none of the other answers work, try an alternative for wget
curl -o /var/www/html/leFile.xml 'http://www.leSite.com/leFeed.xml'

Related

Why am I sometimes getting a bad hostname?

This is something really bizarre. We have a shell script that does server configuration on every Linux box, and it contains this line:
#!/bin/bash
...
hostname=`hostname -f 2>/dev/null`
Most of the time, this line of the script returns the correct host name value, such as:
+ hostname=xyz.companyname.com
But a couple of times I have seen the whole configuration fail, because it gives back output like this:
+ hostname=xyz.companyname.COM
I don't know why the last piece of the domain name becomes upper-case.
I don't see anything suspicious in the /etc/hosts file. Any idea what could make this happen?
Thanks,
Jack
Check /etc/hosts.
My understanding is that hostname -f can retrieve the hostname from DNS or /etc/hosts; under which condition(s), I don't know.
But you may have a
123.45.67.89 xyz.companyname.com xyz.companyname.COM
or something similar in there.
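One way to spot a case-insensitive duplicate like that is to lower-case every hostname column and look for repeats. A sketch against a sample hosts file (the entries are hypothetical, echoing the answer's example):

```shell
# Sample /etc/hosts-style data (hypothetical addresses and names)
cat > hosts.sample <<'EOF'
127.0.0.1 localhost
123.45.67.89 xyz.companyname.com xyz.companyname.COM
EOF

# Lower-case every hostname column, then report names listed more than once
awk '{for (i = 2; i <= NF; i++) print tolower($i)}' hosts.sample | sort | uniq -d
```

Run against the real /etc/hosts, any output means the file maps the same name twice with different capitalization.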

Alert!: Unsupported URL scheme! error when sending bulk sms using lynx

Team, kindly help with the error Alert!: Unsupported URL scheme! when sending bulk SMS in a Linux bash script. The lynx command works fine for a static URL. This is what I have got below:
#!/bin/bash
a="lynx -dump 'http://localhost:13013/cgi-bin/sendsms?from=8005&to="
b="&username=tester&password=foobar&smsc=smsc1&text=Test+mt+update'"
for i in cat numbers.txt;do $a$i$b;echo sent $i; done;
numbers.txt:
258909908780
256789123456
676675234789
The problem is the single quote before http:. Quotes are not processed after expanding variables, so it's being sent literally to lynx. There's no 'http URL scheme, hence the error message.
Remove the quotes before http: and after +update.
#!/bin/bash
a="lynx -dump http://localhost:13013/cgi-bin/sendsms?from=8005&to="
b="&username=tester&password=foobar&smsc=smsc1&text=Test+mt+update"
for i in $(cat numbers.txt);do $a$i$b;echo sent $i; done;
For more information about this, see
Setting an argument with bash
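A slightly more robust version of that loop quotes the expanded URL and reads the numbers with read, so special characters like & never reach the shell parser. A sketch using the question's host, credentials, and numbers (lynx swapped for echo so it runs standalone):

```shell
#!/bin/bash
# Sample recipient list (the numbers from the question)
printf '%s\n' 258909908780 256789123456 676675234789 > numbers.txt

base='http://localhost:13013/cgi-bin/sendsms?from=8005'
rest='&username=tester&password=foobar&smsc=smsc1&text=Test+mt+update'

while read -r num; do
  url="${base}&to=${num}${rest}"
  echo "would send: $url"     # replace echo with: lynx -dump "$url"
done < numbers.txt
```

Because the URL is built in a variable and expanded inside double quotes, the & characters stay literal instead of backgrounding the command.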

I get a scheme missing error with cron

when I use this to download a file from an ftp server:
wget ftp://blah:blah@ftp.haha.com/"$(date +%Y%m%d -d yesterday)-blah.gz" /myFolder/Documents/"$(date +%Y%m%d -d yesterday)-blah.gz"
It says "20131022-blah.gz saved" (it downloads fine), however I get this:
/myFolder/Documents/20131022-blah.gz: Scheme missing (I believe this error prevents it from saving the file in /myFolder/Documents/).
I have no idea why this is not working.
Save the filename in a variable first:
OUT=$(date +%Y%m%d -d yesterday)-blah.gz
and then use -O switch for output file:
wget ftp://blah:blah@ftp.haha.com/"$OUT" -O /myFolder/Documents/"$OUT"
Without the -O, the output file name looks like a second file/URL to fetch, but it's missing http:// or ftp:// or some other scheme to tell wget how to access it. (Thanks @chepner)
Also, if wget takes long enough that the date rolls over mid-download, a filename computed a second time would differ from the one already being saved, which is another reason to compute it once and reuse the variable.
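A minimal sketch of that pattern (the host and paths are hypothetical; GNU date is assumed for -d yesterday):

```shell
# Compute the name once; both the remote path and the local path reuse $OUT,
# so they cannot disagree even if the clock rolls over mid-run.
OUT="$(date +%Y%m%d -d yesterday)-blah.gz"
echo "$OUT"
# wget "ftp://user:pass@ftp.example.com/$OUT" -O "/myFolder/Documents/$OUT"
```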
In my case I had it working with the npm module http-server.
And discovered that I simply had a leading space before http://.
So this was wrong " http://localhost:8080/archive.zip".
Changed to working solution "http://localhost:8080/archive.zip".
In my case, in cPanel, I used:
wget https://www.blah.com.br/path/to/cron/whatever

Error running make: missing separator (did you mean TAB instead of 8 spaces?)

I'm trying to get the PHP phar command-line tool installed on my Debian VM, as described here:
(1) download the php-src, I assume it's in /tmp/php/src
(2) make the dir /tmp/phar
(3) Save this as /tmp/php-src/ext/phar/Makefile.
(4) cd /tmp/php-src/ext/phar
(5) run sudo make
Now after step 5 I get an error:
:/tmp/php-src/ext/phar# make
Makefile:11: *** missing separator (did you mean TAB instead of 8 spaces?). Stop.
As far as I know, there can be two possible causes for this error message:
Missing tabs in the makefile. I've tested the file with od -t c Makefile; it contains no tabs (\t).
It could be a bug in make v3.81 needing a patch, or an upgrade to the still-unstable ("Warning: This package is from the experimental distribution.") v3.82. I've downloaded and installed it (dpkg -i make_3.82-1_amd64.deb), but the error still occurs.
What causes the error? How can it be avoided?
Thx
(Answered in a comment: See Question with no answers, but issue solved in the comments (or extended in chat))
@Beta wrote:
The line should begin with a tab, not a bunch of spaces.
The OP wrote:
I've replaced all 8-spaces sequences with tabs and can execute the make script now.
I used:
sed -i 's/^        /\t/' Makefile
(Note that the cat Makefile | sed "s/ /\t/" > Makefile form is unsafe: the shell truncates Makefile for writing before cat reads it, and the expression only replaces the first single space on each line.)
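To check the substitution on a throwaway copy (a sample makefile with a hypothetical target; GNU sed assumed for -i and \t):

```shell
# A makefile whose recipe line is indented with 8 spaces (the failing case)
printf 'all:\n        echo built\n' > Makefile.sample

# Replace the leading 8-space run with a tab, in place (GNU sed)
sed -i 's/^        /\t/' Makefile.sample

# od now shows a \t at the start of the recipe line
od -c Makefile.sample
```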

remy inliner command line tool returns path.existsSync is now called `fs.existsSync`

I'm trying to use inliner command line tool locally to combine some files. But I get the following error message in the console.
path.existsSync is now called `fs.existsSync`
So i went into /usr/local/lib/node_modules/inliner/bin/inliner and changed line 65 from:
if (path.existsSync(url))
to
if (fs.existsSync(url))
but I still get the same error message. Can anybody give me a hint about what is wrong and how I can fix it?
There is already a question about this here, but it didn't fix my problem. Or am I editing the wrong file?
Cheers
:fab
I got inliner working by using the -i option:
#-i, --images don't encode images - keeps files size small, but more requests
inliner -i http://fabiantheblind.info/coding.html > test2.html
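If editing one file doesn't clear the error, other files in the package may still call the old API. A quick way to list them, demonstrated on a miniature sample tree (the real install path under node_modules varies):

```shell
# Recreate a miniature module layout (sample files only)
mkdir -p demo/bin demo/lib
echo 'if (fs.existsSync(url)) {}'  > demo/bin/inliner
echo 'if (path.existsSync(p)) {}' > demo/lib/util.js

# List every file that still uses the renamed API
grep -rl 'path\.existsSync' demo
```

Running the same grep over the installed module directory would show whether the edited file was the only caller.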
