My script does not unzip a file - Linux

I downloaded a file in .tar.gz format from a server.
My script has the commands:
tar -zxvf data.tar.gz
rm -rf data.tar.gz
When I run these commands manually, the archive is extracted, its contents are displayed, and then data.tar.gz is deleted.
However, after running the script, when I go into the folder where it is saved, data.tar.gz is still there, and none of its contents are present.
What I do not get is why it is neither extracted nor deleted.
Part of the script:
OUT=$(date +%Y%m%d -d yesterday)-blah.gz
wget ftp://blah:blah@ftp.haha.com/"$OUT" -O /myFolder/Documents/"$OUT"
tar -zxvf /myFolder/Documents/"$OUT"
# when the file is extracted it produces 2 files called abcd and efgh
# I don't need abcd or the original archive, so remove them
rm -rf /myFolder/Documents/"$OUT"
rm -rf /myFolder/Documents/abcd
UPDATE: When I run this with cron, it does not work. However, when I run it manually with bash, it works.

The problem is that you're not specifying the output directory for tar: it extracts into the current working directory of the process, not into the directory the archive lives in. That also explains the cron behaviour, since cron starts jobs from your home directory, not from wherever you tested the script. You have several fixes available, but the easiest is:
OUT=$(date +%Y%m%d -d yesterday)-blah.gz
wget ftp://blah:blah#ftp.haha.com/"$OUT" -O /myFolder/Documents/"$OUT"
cd /myFolder/Documents/
tar -zxvf "$OUT"
# when the file is extracted it produces 2 files called abcd and efgh
# I don't need abcd or the original archive, so remove them
# remember we cd'd here; after we're done ...
rm -rf "$OUT"
rm -rf abcd
#go back
cd -

It seems you don't launch your script from the directory your archive is in.
Use the cd command in your script to move to that directory before running the tar and rm commands.
Edit:
tar -zxvf /myFolder/Documents/"$OUT"
tar will extract the archive into the current working directory, not into /myFolder/Documents.
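Instead of cd'ing, you can also pass tar's -C option, which tells it where to extract; the script then works the same no matter which directory cron starts it in. A minimal sketch (the mktemp directory stands in for /myFolder/Documents, and demo/efgh stands in for the archive's real contents):

```shell
set -e
work=$(mktemp -d)                 # stand-in for /myFolder/Documents
mkdir "$work/demo"
echo hi > "$work/demo/efgh"
tar -czf "$work/data.tar.gz" -C "$work" demo   # fabricate a demo archive
rm -r "$work/demo"
# The fix: -C tells tar where to extract, so the result no longer
# depends on the script's working directory (cron starts in $HOME)
tar -xzf "$work/data.tar.gz" -C "$work"
rm -f "$work/data.tar.gz"
```

With -C, the cd/cd - pair in the other answer becomes unnecessary.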

Removing a file that has the same name as a folder after using tar

I created a script that finds files, moves them to a folder, compresses that folder with tar, and then deletes the original folder. But after running the script, the folder was removed and a file with the same name as the folder was created. When I run the commands one by one, everything is fine and this file does not appear. I added an rm command to the script to remove it, but that did not work either.
I don't know why this file was created.
My script is below:
#!/bin/bash
cd /home/tuan/testrm/9/
mkdir compressdir
sudo find . -type f -name "*.txt" -print | xargs -I {} mv {} compressdir > /dev/null 2>&1 &
tar -cvzf compresslog.tar.gz --remove-files compressdir
mv compresslog.tar.gz compresslog-`date +"%d%m%Y"`.tar.gz
rm -rf compressdir
I want to know why this file was created and how to prevent it from happening.
You should remove the & at the end of the sudo find line, and it will work.
The & makes that command run in the background.
Root cause: the find/xargs pipeline and the tar -> mv -> rm commands run concurrently, so tar archives compressdir while find is still moving files into it.
If you check the compresslog.tar.gz that the script generated, you will find it is empty or broken, or that it contains the same content as one of the files find was still moving.
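A corrected sketch of the script, with the trailing & removed so the moves complete before tar runs (the mktemp directory stands in for /home/tuan/testrm/9/, and the sample files are fabricated for illustration):

```shell
set -e
dir=$(mktemp -d)              # stand-in for /home/tuan/testrm/9/
cd "$dir"
touch a.txt b.txt keep.log    # fabricated sample data
mkdir compressdir
# No trailing "&": the move pipeline must finish before tar starts.
# -print0 / -0 also make the pipeline safe for names with spaces.
find . -maxdepth 1 -type f -name "*.txt" -print0 |
    xargs -0 -I {} mv {} compressdir
tar -czf "compresslog-$(date +%d%m%Y).tar.gz" compressdir
rm -rf compressdir
```

Because every command now runs in order, the archive is complete and no stray file with the folder's name is left behind.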

Download, extract and copy file from a folder that has a version in its name

I'm writing a bash script that downloads a compressed archive from a generic URL (through which new releases of the software are automatically published), extracts it, and copies a file called wimboot to a folder.
This is what I currently have:
sudo curl http://git.ipxe.org/releases/wimboot/wimboot-latest.tar.gz -o ./wimboot.tar.gz
sudo tar -zxvf ./wimboot.tar.gz #Extracts into a folder called "wimboot-2.5.2-signed", in it is the file I need ("wimboot").
cd ./wimboot*/
sudo cp wimboot /my-folder/
But this doesn't work. Is there a method that will allow me to do this?
You can ask tar for a file listing (the -t option), which you can then grep for wimboot; that should also give you the relative path to the file. A naive first try would be something like:
src_file=$(tar tf wimboot.tar.gz | grep wimboot)
cp "$src_file" my_folder/
But you will probably want to add some error checking, and probably a more specific grep expression, to ensure you get exactly the file you're after.
There's also no need to extract the entire archive. You can just ask tar to extract the file you're interested in:
tar zxf wimboot.tar.gz "$src_file"
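Putting the two steps together with basic error checking might look like the sketch below. To keep it self-contained, it fabricates an archive with the layout the question describes (wimboot-&lt;version&gt;-signed/wimboot); the version number and target folder are assumptions:

```shell
set -e
work=$(mktemp -d); cd "$work"
# Fabricate an archive shaped like wimboot-latest.tar.gz
mkdir wimboot-2.5.2-signed
echo binary > wimboot-2.5.2-signed/wimboot
tar -czf wimboot.tar.gz wimboot-2.5.2-signed
rm -r wimboot-2.5.2-signed
mkdir my_folder
# List the archive, pick out the member path, extract only that member.
# Anchoring on "/wimboot$" avoids matching the top-level directory entry.
src_file=$(tar -tzf wimboot.tar.gz | grep '/wimboot$') ||
    { echo "wimboot not found in archive" >&2; exit 1; }
tar -xzf wimboot.tar.gz "$src_file"
cp "$src_file" my_folder/
```

This works regardless of which version number appears in the folder name inside the archive.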
I've built on top of the other answers and this worked for me:
#Create a temporary folder, this folder must be empty!
sudo mkdir temp
#Download the archive and save it in the temporary folder.
sudo curl http://git.ipxe.org/releases/wimboot/wimboot-latest.tar.gz -o ./temp/wimboot-latest.tar.gz
#Extract the downloaded archive to the temporary folder.
sudo tar xvf ./temp/wimboot-latest.tar.gz -C ./temp
#Search for and copy files with the name "wimboot" to the web directory.
sudo find ./temp/ -name 'wimboot' -exec cp {} /var/www/ \;
#Delete the temporary folder.
sudo rm -Rf temp
I do not recommend this for large archives.

Extract tar file after copy without success

I'm copying a tar file from folder A to folder B, and when the copy
finishes I want to extract it. I tried the following in my shell script, which doesn't work. Any idea?
cp /home/i557956/A/file.tar /home/i557956/B
tar -xvf /home/i557956/B/file.tar
The copy succeeds, but the tar is not extracted in the B folder...
Try to move into the B folder before extracting:
cp /home/i557956/A/file.tar /home/i557956/B
(cd /home/i557956/B/ && tar -xvf file.tar)
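Alternatively, tar's -C flag changes into the target directory for you, so no subshell or cd is needed. A sketch under stand-in paths (the mktemp directories replace /home/i557956/A and /home/i557956/B, and the payload file is fabricated):

```shell
set -e
A=$(mktemp -d)                 # stand-in for /home/i557956/A
B=$(mktemp -d)                 # stand-in for /home/i557956/B
echo data > "$A/payload"       # fabricated archive contents
tar -cf "$A/file.tar" -C "$A" payload
cp "$A/file.tar" "$B/"
# -C makes tar extract into B regardless of the current directory
tar -xf "$B/file.tar" -C "$B"
```

Both the subshell form and -C leave your script's own working directory unchanged.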

Can tar extraction erase a sibling directory?

I made several backups of different directories with Backup Manager, e.g. /home/user1, /home/user2, ...
It gives me some tar files. The contents of a tar file look like:
home/user1/
home/user1/.profile
home/user1/.bash_history
home/user1/.bash_logout
...
I tried to test the restoration with something like :
tar -xvzf home.user1.tar.gz -C home/user1
But the command above recreates the whole structure inside the chosen directory, which gives /home/user1/home/user1/filename1.
So I guess I should run the command against the home directory (/home) instead of the user directory. But is there any risk of erasing other users' directories in /home?
Thanks for your time.
Actually, tar does not erase data by default. But any files contained in the tar archive will overwrite existing files of the same name. Likewise, an existing sub-directory's contents are left untouched if the archive contains no files matching them.
mkdir -p foo/bar/
touch foo/file1 foo/bar/file1
tar -cf foo.tar foo/
rm -rf foo
mkdir -p foo/bar/
touch foo/file2 foo/bar/file2
tar -xf foo.tar
ls foo foo/bar/
As one can see, both file1 and file2 are present, and the newly unarchived directory did not overwrite the old one. Here is the output of ls on my system:
foo:
bar file1 file2
foo/bar/:
file1 file2
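For the original restore problem, GNU tar's --strip-components option drops leading path elements, so the home/user1/ prefix stored in the archive does not get recreated inside the target directory. A self-contained sketch (the restore directory stands in for the real /home/user1, and the archive is fabricated to match the listing in the question):

```shell
set -e
work=$(mktemp -d); cd "$work"
mkdir -p home/user1
echo x > home/user1/.profile            # fabricated backup content
tar -czf home.user1.tar.gz home/user1
mkdir restore                           # stand-in for /home/user1
# --strip-components=2 removes the leading home/user1/ from every
# member, so files land directly in the target directory
tar -xzf home.user1.tar.gz -C restore --strip-components=2
```

This avoids extracting into /home at all, so other users' directories are never touched.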

linux shell tar unwanted extra directories

I have the following problem:
I have directories a/b/c, and inside c many text files.
I want to make a .tar.gz file in directory a/mydir containing the c directory, and then extract it to that same directory to create a/mydir/c (with all the files inside).
I am in directory a and run: (shell)
~:$ tar -czf mydir/output.tar.gz b/c
~:$ tar -zxf mydir/output.tar.gz -C mydir
but the result is the directory tree a/mydir/b/c (with the files inside).
The problem is that I don't want directory b in the middle, just c with all its contents.
This works for me. Create the data:
mkdir -p a/b/c
echo 42 > a/b/c/file.dat
Archive
tar zc -f c.tar.gz -C a/b c
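The full round trip might look like the sketch below: creating with -C a/b stores only "c" in the archive (no "b" prefix), and extracting with -C mydir places it where you want. Note the case of the flags, since lowercase -c means "create" while uppercase -C means "change directory":

```shell
set -e
top=$(mktemp -d); cd "$top"
mkdir -p a/b/c a/mydir
echo 42 > a/b/c/file.dat                  # fabricated sample data
# Create: -C a/b means tar records paths relative to a/b, so the
# archive contains c/file.dat with no leading b/
tar -czf a/mydir/output.tar.gz -C a/b c
# Extract: uppercase -C sends the contents into a/mydir
tar -xzf a/mydir/output.tar.gz -C a/mydir
```

The result is a/mydir/c/file.dat, with no b directory in the middle.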
Assuming you created the a/b/c directories, try this command from directory a.
It archives the files under b/c into out.tar.gz,
creates a new directory mydir under b and extracts the files into it,
and finally removes out.tar.gz from a:
# tar -cvzf out.tar.gz b/c/* ; mkdir -p b/mydir ; tar -xvzf out.tar.gz -C b/mydir/ ; rm -rf out.tar.gz
Thanks!