Linux: zip selected folders and create a download link for the zipped file

My current directory contains web, api, logs, and some-backup-directory. I want to zip only the web and api directories into a single archive and create a direct download link for it, so I can download it over HTTP from anywhere; downloading over an FTP connection takes much longer and also doesn't let me do other tasks on the server at the same time. These are the commands I am using to zip the files on the server:
zip -r mybackup-web.zip /home/projects/web
zip -r mybackup-api.zip /home/projects/api
But that creates two zip files, and I need both directories in one archive.
I am using Windows 7 locally and Debian 8 on the server, and I connect with PuTTY to run the server commands.

Using zip
What you are doing actually works according to zip's man page:
zip -r <target> <dir1> # Add files from dir1 to archive
zip -r <target> <dir2> # Add files from dir2 to archive
If you execute both commands from the same working directory, the second command updates the existing zip file rather than creating a new one.
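Applied to the paths in the question (a sketch, assuming both commands are run from the same working directory so the second one finds the archive the first one created):
zip -r mybackup.zip /home/projects/web
zip -r mybackup.zip /home/projects/api
The second invocation adds the api tree to mybackup.zip, leaving you with a single archive containing both directories.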
Using tar
You could also use tar:
tar -zcvf <target>.tar.gz <dir1> <dir2> ...
Flags:
c: Create a new archive containing the specified items
v: Produce verbose output (OPTIONAL)
f: Write the archive to the specified file
z: Compress using gzip
In your case:
tar -zcvf mybackup.tar.gz /home/projects/web /home/projects/api
You can later extract it using:
tar -zxvf mybackup.tar.gz
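For the download-link part of the question, one lightweight option (a sketch, assuming Python 3 is installed on the Debian server and the chosen port is reachable from your machine) is to serve the directory holding the archive over HTTP temporarily:
cd /home/projects
python3 -m http.server 8080
While that command runs, the archive can be fetched in a browser or with any HTTP client at http://<server-ip>:8080/mybackup.tar.gz; stop the server with Ctrl+C once the download has finished.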

Related

Tar command keeps bundling up entire directory path

I have a few sub-directories with files inside each of them in /home/user/archived/myFiles that I'm trying to bundle into a single tar file. The issue is, it keeps bundling a full directory path instead of just everything in the myFiles folder.
When I untar the file, I just want all the bundled sub-directories/files inside to appear in the directory I extracted the file rather than having to go through a series of folders that get created.
Instead, when I currently untar the file, I get a "home" folder and I have to go through /home/user/archived/myFiles to reach all the files.
I tried using the -C flag as suggested in "Tar a directory, but don't store full absolute paths in the archive", where you pass the full path minus the last folder, and then the name of the last folder that contains everything you want bundled. But that tar command doesn't work; I get a "no such file or directory" error.
#!/bin/bash
archivedDir="/home/user/archived/myFiles"
tar -czvf "archived-files.tar.gz" "${archivedDir}"/*
rm -vrf "${archivedDir}"/*
# Attempt with -C flag
#tar -cvf "${archivedDir}/archived-files.tar.gz" -C "${archivedDir}" "/*"
So for example, if I did an ls on /home/user/archived/myFiles, and it listed two directories called folderOne and folderTwo, and I ran this bash script and did an ls on /home/user/archived/myFiles again, that directory should only contain archived-files.tar.gz.
If I extracted the tar file, then folderOne and folderTwo would appear.
As I explained already here, you should first change into that directory and then create the archive.
So change your script to something like:
archivedDir="/home/user/archived/myFiles"
cd "$archivedDir"
tar -czvf "../archived-files.tar.gz" *
This creates the archive in the parent directory, so the rm that follows will not delete it.
The extraction should be something like:
archivedDir="/home/user/archived/myFiles"
cd "$archivedDir"
tar -xzvf "../archived-files.tar.gz"
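Alternatively, the -C flag from the question can do the directory change for you; a rough equivalent (note the trailing . rather than a glob, since a * would be expanded by the shell against your current directory, not $archivedDir):
archivedDir="/home/user/archived/myFiles"
tar -czvf "/home/user/archived/archived-files.tar.gz" -C "$archivedDir" .
This tells tar to switch into $archivedDir first and archive its contents, so the stored paths start at ./ rather than home/user/archived/myFiles.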

Untar file with the same name every time a new file is uploaded via FTP

I am using ZoneMinder on a Raspberry Pi 3 for motion detection with an IP camera.
ZoneMinder has an FTP upload option. The problem is that ZoneMinder keeps uploading tar files to a server directory, and I want a program/script that continuously checks that directory and untars every newly uploaded file.
The file names look like:
ipcam-2044.tar, and the next file will be ipcam-2045.tar
If I understood correctly, this script should be the one you're looking for:
It just extracts all tar files and deletes the source archives:
for file in *.tar; do tar xf "${file}" && rm "${file}"; done
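If it needs to keep running and pick up new uploads as they arrive, one simple approach (a sketch, assuming the uploads land in the current directory and that polling every few seconds is acceptable) is to wrap that line in a loop:
#!/bin/bash
# Poll the upload directory and extract any new ipcam-*.tar that appears
while true; do
    for file in ipcam-*.tar; do
        [ -e "$file" ] || continue    # skip when the glob matches nothing
        tar xf "$file" && rm "$file"  # extract, then remove the source archive
    done
    sleep 5
done
One caveat: this can race with an upload that is still in progress, so in practice you may want to wait until the file size stops changing (or use inotifywait from the inotify-tools package) before extracting.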

Structure of .tar files

I am trying to run some benchmarks which take a tar file as input.
There is a runme.sh inside the tar file that needs to be modified, and the folder then has to be made into a .tar again.
The original benchmark works, but the modified one doesn't. I believe it is a problem with the file format.
Note: my modification is not what creates the problem. If I just unpack the working tar and tar it again without any modification, it does not work. Surprisingly, the size of the new file changes.
What I tried:
Running the file command on the working and non-working tar files.
Both returned the same "POSIX tar archive".
Running the command tar cvf folder_name.tar folder_name/
Does not work.
What works:
I am on Ubuntu 14.04. I double-clicked the tar, edited the file I wanted directly and updated the archive. This works, but it is not a feasible solution as I have a large number of files and want to write a script to automate it.
[Screenshot of how it works with the GUI]
Does the original tar file include the top-level directory name? It doesn't look like it from your screenshot. If you re-create the tar file with a top level directory, as indicated by point 2 in the things you tried, the structure won't be the same, and whatever program is trying to consume the tar file won't be able to parse it.
How do you test "If I just uncompress the working tar and tar it again without modification, it does not work." In a GUI or in a shell? If in a shell - what exact commands do you use?
In a shell, you can get the contents of the tarball with the command tar -tf filename.tar. If all the files it lists start with the same folder name, your tarball includes a top-level directory. If it just lists various files and subdirectories, it doesn't. (Tarballs that don't are an abomination, but if whatever you are using them for requires it, you'll just have to cope.)
I'm guessing that if you do this on your original tar file and your modified, non-working tar file, the results will differ.
The following should work in a shell if you have/need a tarball without a toplevel directory:
$ mkdir workdir
$ cd workdir
$ tar -xf ../tarball.tar
<edit your file however you like>
$ tar -cf ../tarball-new.tar *
$ cd ..
$ rm -r workdir
In case you have/need a tarball with a toplevel directory, the following should suffice:
$ tar -xf tarball.tar
$ cd toplevel_directory
<edit your file however you like>
$ cd ..
$ tar -cf tarball-new.tar toplevel_directory
$ rm -r toplevel_directory
Edit: I'm glad it worked for you. The point is, of course, that tar includes the paths of the files it stores, not just the filenames. So if you need a flat list of files, you need to run tar in the directory containing those files, giving all of them as arguments to tar. If you try to take the shortcut of going up a level and only specifying the directory name to pack up, tar will include the directory name in the archive.
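To automate that from a script, here is a minimal sketch for the no-top-level-directory case; the sed line is a hypothetical placeholder for whatever change runme.sh actually needs, and tarball.tar stands in for each benchmark archive:
#!/bin/bash
set -e
mkdir workdir
tar -xf tarball.tar -C workdir
# hypothetical edit; replace with your real modification to runme.sh
sed -i 's/OLD_VALUE/NEW_VALUE/' workdir/runme.sh
# repack from inside workdir so the member paths stay flat, as in the original
( cd workdir && tar -cf ../tarball-new.tar * )
rm -r workdir
Looping this over all your benchmark tarballs then avoids the manual GUI step.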

Using wget on a directory

I'm fairly new to the shell and I'm trying to use wget to download a .zip file from one directory to another. The only file in the directory I am copying from is the .zip file. However, when I use wget IP-address/directory it downloads an index.html file instead of the .zip. Is there something I am missing to get it to download the .zip without having to state it explicitly?
wget is a utility for downloading files from the web.
You mentioned you want to copy from one directory to another. Do you mean both are on the same server/node?
In that case you can simply use the cp command.
And if you want it from another server/node (a file transfer), you can use scp or ftp.
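If the file really does live on a remote machine that you can only reach over HTTP, wget can still fetch it without the exact name being known in advance; a sketch (flags: -r recurse, -np don't ascend to the parent directory, -nd don't recreate the directory tree locally, -A keep only files matching the pattern):
wget -r -np -nd -A '*.zip' http://IP-address/directory/
This follows the index.html the server returns for the directory listing and keeps only the .zip file it links to.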

Unable to untar a file?

I have written a shellscript which tries to pull a tar file from an ftp server and untar it locally. I need to extract specific files from the tar archive. The filename of the tarfile contains a date; I need to be able to select a tar file based on this date.
abc_myfile_$date.tar is the format of the file I am pulling from the ftp server.
My current code looks like this:
for host in ftpserver
do
ftp -inv $host <<END_SCRIPT
user username password
prompt
cd remotepath
lcd localpath
mget *myfile_$date*.tar
quit
END_SCRIPT
done
for next in `ls localpath/*.tar`
do
tar xvf $next *required_file_in_tar_file*.dat
done
When I run the script, I am not able to untar the files.
I am able to get a single tar file from the FTP server only if I specify its exact name. I would like to get any file that has myfile_$date in its name, and then extract it to a local path to pull out the specific files in that tar archive whose names match my required_files.
You get the .tar file, but you decompress it with the z option. Compressed files (those that require z) normally have a .tar.gz suffix. Try:
tar xvf $next *required_file_in_tar_file*.dat
Firstly, if you want to use wildcards for the file name that you're getting from the server, you need to use mget instead of get. Wildcard file expansion (the *) does not work for the get command.
Once you have pulled the file, the tar operation will work as expected. Most modern versions of Linux/BSD have a 'smart' tar which doesn't need the 'z' flag to say that the tar file is compressed; they figure out on their own that the tarball is compressed and uncompress it automatically, provided the appropriate compression/decompression tool is on the system (bzip2 for .bz2 files, gzip for .gz files).
I'm not quite sure, but does the FTP protocol not have a command mget if you want to download multiple files? (instead of get)
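On the extraction side, note that recent GNU tar versions only treat member-name arguments as patterns when --wildcards is given, and an unquoted glob would be expanded by the shell against the local directory before tar sees it. A sketch of the extraction loop with both taken into account:
for next in localpath/*.tar
do
    tar xvf "$next" --wildcards '*required_file_in_tar_file*.dat'
done
Quoting the pattern keeps the shell from expanding it, so tar itself matches it against the archive members.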
