I'm trying to download a tar.gz file from a GitHub repo with curl, but it's evidently downloading plain ASCII, so I can't gunzip or untar the file (as the file command shows; see the third line of my build log below).
One other important detail is that this is running inside an AWS CodeBuild instance. On my Mac, however, I can download this with curl just fine, and it is a proper tar.gz file.
Here's the command I'm running:
curl -Lk0s https://github.com/gohugoio/hugo/releases/download/v0.49/hugo_0.49_Linux-64bit.tar.gz -o /tmp/hugo.tar.gz
The full stack trace is:
[Container] 2018/12/03 05:39:44 Running command curl -Lk0s https://github.com/gohugoio/hugo/releases/download/v0.49/hugo_0.49_Linux-64bit.tar.gz -o /tmp/hugo.tar.gz
[Container] 2018/12/03 05:39:45 Running command file /tmp/hugo.tar.gz
/tmp/hugo.tar.gz: ASCII text, with no line terminators ***[NB. This is the output of the file command]***
[Container] 2018/12/03 05:39:45 Running command tar xvf /tmp/hugo.tar.gz -C /tmp
tar: This does not look like a tar archive
gzip: stdin: not in gzip format
tar: Child returned status 1
tar: Error is not recoverable: exiting now
[Container] 2018/12/03 05:39:45 Command did not exit successfully tar xvf /tmp/hugo.tar.gz -C /tmp exit status 2
[Container] 2018/12/03 05:39:45 Phase complete: INSTALL Success: false
[Container] 2018/12/03 05:39:45 Phase context status code: COMMAND_EXECUTION_ERROR Message: Error while executing command: tar xvf /tmp/hugo.tar.gz -C /tmp. Reason: exit status 2
What am I doing wrong here?
-L works for me:
curl -L https://github.com/gohugoio/hugo/releases/download/v0.49/hugo_0.49_Linux-64bit.tar.gz -o /tmp/hugo.tar.gz
I tried it without any flags first and it downloaded the redirection page.
Adding -L to follow redirects gave me a well-formed, complete .tar.gz file that decompressed cleanly into a folder with a few files in it:
$ ls -l
total 41704
-rw-r--r-- 1 xxxxxxxxxxx staff 11357 Sep 24 05:54 LICENSE
-rw-r--r-- 1 xxxxxxxxxxx staff 6414 Sep 24 05:54 README.md
-rwxr-xr-x 1 xxxxxxxxxxx staff 21328256 Sep 24 06:03 hugo
UPDATE: I didn't initially try your set of params (-Lk0s), assuming it wouldn't work for me either. But I've just tried it, and it works for me: I get the same .tar.gz I got with -L, and it decompresses correctly. Please cat the contents of the text file that gets downloaded and show at least some of it here. It's probably an error of some sort being sent back as plain text or HTML.
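To see what actually got downloaded before handing it to tar, a guard like the sketch below helps (paths are invented for the demo; the "bad" file simulates a saved redirect or error page). In a real build you would also pass curl -f so HTTP errors exit non-zero instead of saving the error body to disk.

```shell
d=/tmp/curldemo
rm -rf "$d" && mkdir -p "$d"
mkdir "$d/src" && echo hello > "$d/src/a.txt"
tar czf "$d/good.tar.gz" -C "$d/src" .            # a real gzip archive
printf '<html>302 Found</html>' > "$d/bad.tar.gz" # what a saved redirect page looks like

for f in "$d/good.tar.gz" "$d/bad.tar.gz"; do
  if file "$f" | grep -q 'gzip compressed'; then
    echo "$f: ok, extracting"
    tar xzf "$f" -C "$d"
  else
    echo "$f: not gzip, refusing to extract"
    head -c 80 "$f"; echo
  fi
done
```

The same file-then-extract check works verbatim in a CodeBuild buildspec and turns a confusing tar failure into a readable message showing the first bytes of the bogus download.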
Related
I'm having a problem installing the Rocket.Chat server on elementary OS.
I'm trying to unpack the .tgz file, but I get an error.
Listing:
sudo apt-get install npm curl graphicsmagick
curl -L http://rocket.chat/releases/latest/download/ -o rocket.chat.tgz
tar zxvf rocket.chat.tgz
After the last command I get this error:
gzip: stdin: not in gzip format
tar: Child returned status 1
tar: Error is not recoverable: exiting now
Running file rocket.chat.tgz gives:
rocket.chat.tgz: HTML document, UTF-8 Unicode text
Use this URL instead:
curl -L https://releases.rocket.chat/latest/download -o rocket.chat.tgz
This is the first time I am using wkhtmltopdf.
It works fine on my local Windows PC, but I can't make it work on the live Linux shared-hosting server.
I've tried all kinds of solutions I found on Google, but nothing works.
I'm a Windows user, so I don't know anything about Linux, but I do have SSH access to the host server, and the host confirmed that this is the right file to use on their server.
First I fetched the file, and then I tried to extract it, with no success:
wget https://github.com/wkhtmltopdf/wkhtmltopdf/releases/download/0.12.4/wkhtmltox-0.12.4_linux-generic-amd64.tar.xz
tar xvjf wkhtmltox-0.12.4_linux-generic-amd64.tar.xz
bzip2: (stdin) is not a bzip2 file.
tar: Child returned status 2
tar: Error is not recoverable: exiting now
tar -xvjf wkhtmltox-0.12.4_linux-generic-amd64.tar.xz
bzip2: (stdin) is not a bzip2 file.
tar: Child returned status 2
tar: Error is not recoverable: exiting now
tar xvJf wkhtmltox-0.12.4_linux-generic-amd64.tar.xz
tar (child): xz: Cannot exec: No such file or directory
tar (child): Error is not recoverable: exiting now
tar: Child returned status 2
tar: Error is not recoverable: exiting now
So I extracted the file on my local Windows machine.
I manually created these folders on the host server:
usr/local/bin.
I uploaded the two binary files from my local machine to this bin folder.
Then I tried a few ways to run it:
./wkhtmltopdf-amd64 http://www.example.com example.pdf
-bash: ./wkhtmltopdf-amd64: No such file or directory
/wkhtmltopdf-amd64 http://www.example.com ex.pdf
-bash: /wkhtmltopdf-amd64: No such file or directory
./wkhtmltopdf http://www.example.com ex.pdf
-bash: ./wkhtmltopdf: No such file or directory
/wkhtmltopdf http://www.example.com ex.pdf
-bash: /wkhtmltopdf: No such file or directory
Please, can someone help me make this work?
Make sure the wkhtmltopdf binary has execute permission. Assuming you have copied it into /usr/local/bin:
chmod +x /usr/local/bin/wkhtmltopdf
then run
/usr/local/bin/wkhtmltopdf --version
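One caveat: "No such file or directory" for a file that is visibly present usually means the binary is for the wrong architecture or its runtime loader is missing (compare the output of file on the binary with uname -m), whereas a missing execute bit produces "Permission denied". A tiny sketch of the latter, using an invented /tmp/tool path:

```shell
# Create a tiny script without touching its permissions.
rm -f /tmp/tool
printf '#!/bin/sh\necho running\n' > /tmp/tool
/tmp/tool 2>&1 || true   # "Permission denied": the file exists but isn't executable
chmod +x /tmp/tool       # grant execute permission
/tmp/tool                # now it runs and prints "running"
</tmp/dev/null 2>/dev/null || true
```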
I'm a bit confused with the -c flag using bunzip2.
The following line of code works well:
ls -l
-> -rw-r--r-- 1 root root 163 Oct 25 13:06 access_logs.tar.bz2
bunzip2 -c access_logs.tar.bz2 | tar -t
When I attempt to run this without the -c flag:
bunzip2 access_logs.tar.bz2 | tar -t
I get the message:
tar: This does not look like a tar archive
tar: Exiting with failure status due to previous errors
But ls -l then shows:
-rw-r--r-- 1 root root 10240 Oct 25 13:06 access_logs.tar
Documentation says:
The left side of the pipeline is bunzip2 -c access_logs.tbz, which
decompresses the file, but the -c option sends the output to the
screen. The output is redirected to tar -t.
According to the manual:
-c --stdout
Compress or decompress to standard output.
It seems that the decompression also works without the -c flag?
I'm confused about what you're confused about: you observed the answer to your question, and you also read it in the documentation.
Without -c, bunzip2 decompresses the xx.bz2 file and saves the result as the file xx. With -c, it does not create a file, but sends the result to stdout instead. If there is a pipe, |, then instead of being printed to the terminal (which would be a mess), the output becomes the input of the program on the right side of the pipe. That is exactly what you saw: without -c, bunzip2 wrote access_logs.tar to disk and sent nothing down the pipe, so tar read empty input and complained that it does not look like a tar archive.
You can also check the file type:
file access_logs.tar.bz2
Check the bzip2 manual.
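A small side-by-side sketch of the two behaviours (file names invented, and this assumes bzip2/bunzip2 are installed):

```shell
# Work in a scratch directory so the demo is self-contained.
d=/tmp/bz2demo
rm -rf "$d" && mkdir -p "$d" && cd "$d"
echo "some log data" > access.log
tar cf access_logs.tar access.log
bzip2 -k access_logs.tar            # -k keeps the .tar and creates access_logs.tar.bz2

# With -c: decompressed bytes go to stdout; the .bz2 stays on disk.
bunzip2 -c access_logs.tar.bz2 | tar -t   # lists access.log

# Without -c: bunzip2 writes the decompressed file to disk (replacing the
# .bz2) and sends nothing to stdout, so piping it into tar would fail.
rm access_logs.tar
bunzip2 access_logs.tar.bz2
tar -tf access_logs.tar                   # the .tar is back, the .bz2 is gone
```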
I have to transfer a file from server A to server B and then trigger a script on server B. Server B sits behind a load balancer that sends you to either server B1 or B2, and you can't tell which.
So far I have been doing this:
sftp user@Server
put file
exit
and then running the command below to trigger the target script:
ssh user@Server "script.sh"
But the problem, as I said, is that it's load balanced: sometimes I put the file on one server and the script gets triggered on the other. How can I overcome this?
I'm thinking of something like the following:
ssh user@server "Command for sftp; sh script.sh"
i.e., if the put and the trigger happen in the same server session, the problem above goes away. How can I do sftp inside an ssh connection? Otherwise, any other suggestions?
If you're just copying a file up and then executing a script, and it doesn't have to happen as two separate commands, you can do:
gzip -c srcfile | ssh user@remote 'gunzip -c >destfile; script.sh'
This gzips srcfile, sends it through ssh to the remote end, gunzips it on that side, then executes script.sh.
If you want more than one file, you can use tar rather than gzip:
tar czf - <srcfiles> | ssh user@remote 'tar xzf -; script.sh'
If you want to get results back from the remote end and they're files, you can just replicate the tar after the script:
tar czf - <srcfiles> | ssh user@remote 'tar xzf -; script.sh; tar czf - <remotedatafiles>' | tar xzf -
i.e. create a new pipe from ssh back to the local environment. This only works if script.sh doesn't generate any output; if it does, you have to redirect it, for example to /dev/null, to prevent it from corrupting the tar stream:
tar czf - <srcfiles> | ssh user@remote 'tar xzf -; script.sh >/dev/null; tar czf - <remotedatafiles>' | tar xzf -
You can use the scp command first to upload your file and then call the remote command via ssh:
$ scp filename user@machine:/path/to/file && ssh user@machine 'bash -s' < script.sh
This example uploads a local file, but there is no problem running it from server A.
You could create a fifo (named pipe) on the server and start a program that tries to read from it. The program will block without eating any CPU.
From sftp, try to write to the pipe. The write itself will fail, but the listening program will wake up and can then check for uploaded files.
# ls -l /home/inp/alertme
prw------- 1 inp system 0 Mar 27 16:05 /home/inp/alertme
# date; cat /home/inp/alertme; date
Wed Jun 24 12:07:20 CEST 2015
<waiting for 'put'>
Wed Jun 24 12:08:19 CEST 2015
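A minimal local sketch of that blocking behaviour (paths invented): the reader opens the fifo and sleeps, using no CPU, until a writer shows up on the other end.

```shell
d=/tmp/fifodemo
rm -rf "$d" && mkdir -p "$d"
mkfifo "$d/alertme"
# The reader blocks here until something opens the pipe for writing.
( cat "$d/alertme" > "$d/got.txt"; echo "woke up" >> "$d/got.txt" ) &
sleep 1
echo "file arrived" > "$d/alertme"   # unblocks the reader
wait
cat "$d/got.txt"
```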
Transfer testing with tar gzip compression vs. ssh's own compression, using pv as a pipe meter (apt-get install pv).
Tested on a site folder holding about 80k small images, total folder size about 1.9 GB.
Using the non-standard SSH port 2204.
1) tar with gzip, no ssh compression:
tar cpfz - site.com|pv -b -a -t|ssh -p 2204 -o cipher=none root@remoteip "tar xfz - -C /destination/"
The pv meter started at 4 MB/s and degraded to 1.2 MB/s by the end. pv showed about 1.3 GB of bytes transferred (the folder's total size is 1.9 GB).
2) tar without gzip, ssh compression:
tar cpf - site.com|pv -b -a -t|ssh -p 2204 root@remoteip "tar xf - -C /destination/"
The pv meter started at 8-9 MB/s and degraded to 1.8 MB/s by the end.
When I untar Doctrine:
-rw-r--r-- 1 root root 660252 2010-10-16 23:06 Doctrine-1.2.0.tgz
I always get these error messages:
root@X100e:/usr/local/lib/Doctrine/stable# tar -xvzf Doctrine-1.2.0.tgz
.
.
.
Doctrine-1.2.0/tests/ViewTestCase.php
Doctrine-1.2.0/CHANGELOG
gzip: stdin: decompression OK, trailing garbage ignored
Doctrine-1.2.0/COPYRIGHT
Doctrine-1.2.0/LICENSE
tar: Child returned status 2
tar: Error is not recoverable: exiting now
The untar operation works, but I always get these error messages.
Any clue what I'm doing wrong?
I would try to gunzip and untar separately and see what happens:
mv Doctrine-1.2.0.tgz Doctrine-1.2.0.tar.gz
gunzip Doctrine-1.2.0.tar.gz
tar xf Doctrine-1.2.0.tar
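For context, that particular warning means extra bytes follow the end of the gzip stream; the archive itself decompresses fine. A sketch that reproduces it (file names invented):

```shell
d=/tmp/tgdemo
rm -rf "$d" && mkdir -p "$d" && cd "$d"
echo data > f.txt
gzip f.txt                      # produces f.txt.gz
printf 'junk' >> f.txt.gz       # append stray bytes after the gzip stream
# gunzip still recovers the payload; it warns about the trailing bytes on
# stderr and exits with status 2, which is what makes tar report an error.
gunzip -c f.txt.gz > out.txt 2> warn.txt || true
cat out.txt
cat warn.txt
```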
It's possible your tar file is not gzipped. I just had this same error, but all I had was a plain old tar file, so try removing the z from your flags. The z flag gunzips your tar file in addition to whatever the other flags request. I.e., try:
tar -xvf Doctrine-1.2.0.tgz
Notice the z removed from -xvzf.
If you got "Error is not recoverable: exiting now", you might have specified an incorrect path reference.
[me@host ~]$ tar -xvf nameOfMyTar.tar -C /someSubDirectory/
tar: /someSubDirectory: Cannot open: No such file or directory
tar: Error is not recoverable: exiting now
[me@host ~]$
Make sure you provide correct relative or absolute directory references, e.g.:
[me@host ~]$ tar -xvf ./nameOfMyTar.tar -C ./someSubDirectory/
./foo/
./bar/
[me@host ~]$
Try fetching the archive with wget; I had the same issue when downloading the archive through a browser. I just copied the archive link and ran this command in a terminal:
wget http://PATH_TO_ARCHIVE
The problem is that you do not have bzip2 installed. The tar program relies on this external program to do bzip2 compression.
How to install bzip2 depends on the system you are using. For example, on Ubuntu:
sudo apt-get install bzip2
The GNU tar program does not itself know how to compress an existing file such as user-logs.tar (bzip2 does that). What tar can do is open a pipe to an external compression program (gzip, bzip2, xz), send the tar archive through the pipe, and let the compression utility compress the data it reads from tar and write the result to the file name tar specifies.
Alternatively, the tar and compression utility can be the same program: BSD tar does its compression using libarchive (they're not really distinct except in name).
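The piping that tar does internally can also be written out by hand; these two commands produce equivalent archives (file names invented):

```shell
d=/tmp/tardemo
rm -rf "$d" && mkdir -p "$d" && cd "$d"
echo hi > a.txt
tar czf one.tar.gz a.txt               # tar drives gzip through a pipe internally
tar cf - a.txt | gzip > two.tar.gz     # the same thing, spelled out
# Both archives list the same contents:
tar tzf one.tar.gz
tar tzf two.tar.gz
```

The explicit-pipe form is also the workaround when tar's built-in flag is unavailable: any compressor that reads stdin and writes stdout can take gzip's place on the right side of the pipe.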
Use sudo:
sudo tar -zxvf xxxxxxxxx.tar.gz
If you see these error messages, an easy way to fix the issue:
remove the downloaded file,
download the file again,
then extract it again: tar -xzvf (or tar -xvf) FreeFileSync**.tar.gz
I had the same error:
tar -xvfz processed.json.gz
tar: z: Cannot open: No such file or directory
tar: Error is not recoverable: exiting now
It turned out the file had the .gz extension but wasn't actually compressed. Removing the .gz let me open it.
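There are actually two separate pitfalls in that command, and both are easy to reproduce locally (file names invented):

```shell
d=/tmp/flagdemo
rm -rf "$d" && mkdir -p "$d" && cd "$d"
echo hello > payload
tar cf real.tar payload                # a plain, uncompressed tar
rm payload

# Pitfall 1: flag order. In "-xvfz", f consumes the next bundled character,
# so tar looks for an archive literally named "z".
tar -xvfz real.tar 2>&1 || true        # tar: z: Cannot open...

# Pitfall 2: a misleading extension. Asking for gzip (-z) on a file that
# isn't actually gzipped fails, whatever the file is called.
cp real.tar fake.tar.gz
tar -xzf fake.tar.gz 2>&1 || true      # may fail: not in gzip format

# Letting tar detect the format works regardless of the name:
tar -xf fake.tar.gz
cat payload
```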