Untar file with same name every time a new file is uploaded via FTP - Linux

I am using ZoneMinder on a Raspberry Pi 3 for motion detection with an IP camera.
ZoneMinder has an FTP upload option. The problem is that ZoneMinder keeps uploading tar files to a server directory, and I want a program/script that continuously checks that directory and untars every newly uploaded file, which always follows the same naming pattern.
The file names look like this:
ipcam-2044.tar, and the next file will be ipcam-2045.tar.

If I understood correctly, this script should be the one you're looking for:
it simply extracts all tar files and deletes the source archives.
for file in *.tar; do tar xf "${file}" && rm "${file}"; done
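To keep checking the directory, you could wrap that in a simple polling loop; here is a minimal sketch, assuming the uploads land in /path/to/uploads (a placeholder path, adjust it and the interval to your setup). inotifywait from the inotify-tools package would be an event-driven alternative.
#!/bin/bash
# Poll the upload directory, extract any new tar files, then delete the archives.
cd /path/to/uploads || exit 1            # hypothetical upload directory
while true; do
    for file in *.tar; do
        [ -e "$file" ] || continue       # no .tar files present yet
        tar xf "$file" && rm "$file"     # a partially uploaded file fails here and is retried on the next pass
    done
    sleep 5                              # check again every 5 seconds
done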

Related

Creating a flat tar file holding every filename in directory starting with "a"

I am using a command terminal inside a VirtualBox CentOS 7 Linux system. I am attempting to create a tar file in a separate directory that contains all the files in my current directory starting with the letter "a".
I have tried tar -cvf fileName.tar /newDirectory ls a*, but I think I'm doing something wrong. I assume this should take only one line in the terminal to execute; does anybody know the right way to do it?
The first parameter is the tar file name (full path) and the second is the files you want to include. Try it:
tar -cvf newDirectory/fileName.tar a*
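You can then verify the archive contents with, for example:
tar -tvf newDirectory/fileName.tar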

Linux zip selected folder and create a download link for zipped file

My current directory contains web, api, logs, and some backup directories. I want to zip only the web and api directories into a single archive and create a direct download link for it, so I can download it over http:// from anywhere; downloading over an FTP connection takes more time and also doesn't allow me to do other tasks on the server at the same time. I am using these commands to zip the files on the server:
zip -r mybackup-web.zip /home/projects/web
zip -r mybackup-api.zip /home/projects/api
But this creates two zip files, and I need both directories in one.
I am using Windows 7 locally and Debian 8 on the server. I use PuTTY to connect to the server and execute commands.
Using zip
What you are doing actually works according to zip's man page:
zip -r <target> <dir1> # Add files from dir1 to archive
zip -r <target> <dir2> # Add files from dir2 to archive
If you execute both commands from the same working directory, the second command updates the existing zip file rather than creating a new one.
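With the paths from the question, running both commands from the same directory gives you a single archive:
zip -r mybackup.zip /home/projects/web   # creates mybackup.zip with the web files
zip -r mybackup.zip /home/projects/api   # adds the api files to the same mybackup.zip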
Using tar
You could also use tar:
tar -zcvf <target>.tar.gz <dir1> <dir2> ...
Flags:
c: Create a new archive containing the specified items
v: Produce verbose output (OPTIONAL)
f: Write the archive to the specified file
z: Compress using gzip
In your case:
tar -zcvf mybackup.tar.gz /home/projects/web /home/projects/api
You can later extract it using:
tar -zxvf mybackup.tar.gz

Create new TAR file in current directory to a new directory without using "mv" (shell script)

I'm currently setting up a backup manager to automatically archive directories from a web server. I'm searching for a way to create a new tar file of the current directory in a new (different) directory, without using mv after the archiving process. See my command:
dcreate=$(date +%Y_%d_%m)
tar -cvpzf backup_$dcreate.tar.gz plugins/folder_to_archive/
This command works fine, but I'm now struggling with how to have the archive placed in a different directory as soon as the archiving process finishes, for example:
plugins/plugin_name/ to plugins/backups/
Any help appreciated.
Regards
The -f option of tar sets the destination file for the archive; it can be any path you want, i.e., you can change
-cvpzf "backup_$dcreate.tar.gz"
to
-cvpzf "plugins/backups/backup_$dcreate.tar.gz"
if you want your new archive to be created in plugins/backups/
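Putting it together with the command from the question (assuming the plugins/backups/ directory already exists):
dcreate=$(date +%Y_%d_%m)
tar -cvpzf "plugins/backups/backup_$dcreate.tar.gz" plugins/folder_to_archive/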

Using wget on a directory

I'm fairly new to the shell and I'm trying to use wget to download a .zip file from one directory to another. The only file in the directory I am copying from is the .zip file. However, when I use wget with the IP address/directory, it downloads an index.html file instead of the .zip. Is there something I'm missing to get it to download the .zip without having to state the file name explicitly?
wget is a utility for downloading files from the web.
You mentioned that you want to copy from one directory to another. Do you mean both are on the same server/node?
In that case you can simply use the cp command.
And if you want to transfer it from another server/node, you can use scp or ftp.
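For example (the host name and paths below are placeholders):
cp /path/to/source/file.zip /path/to/destination/                     # same machine
scp user@remote-host:/path/to/source/file.zip /local/destination/     # over SSH from another machine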

Unable to untar a file?

I have written a shellscript which tries to pull a tar file from an ftp server and untar it locally. I need to extract specific files from the tar archive. The filename of the tarfile contains a date; I need to be able to select a tar file based on this date.
abc_myfile_$date.tar is the format of the file I am pulling from the ftp server.
My current code looks like this:
for host in ftpserver
do
ftp -inv "$host" <<END_SCRIPT
user username password
prompt
cd remotepath
lcd localpath
mget *myfile_$date*.tar
quit
END_SCRIPT
done
for next in `ls localpath/*.tar`
do
tar xvf $next *required_file_in_tar_file*.dat
done
When I run the script, I am not able to untar the files.
I am able to get a single tar file from the FTP server only if I mention its exact name. I would like to get the file whose name contains myfile_$date, and then extract it to a local path to pull out the specific files in that tar archive whose names contain my required_files.
If you are extracting with the z option, drop it: z is for gzip-compressed archives, which normally have a .tar.gz suffix, but you are getting a plain .tar file. Try
tar xvf $next *required_file_in_tar_file*.dat
Firstly, if you want to use wildcards for the file name you're getting from the server, you need to use mget instead of get. Wildcard file expansion (the *) does not work with the get command.
Once you have pulled the file, the tar operation will work as expected. Most modern versions of Linux/BSD ship a 'smart' tar that doesn't need the z flag to be told the archive is compressed; it figures out that the tarball is compressed and decompresses it automatically, provided the appropriate compression/decompression tool is on the system (bzip2 for .bz2 files, gzip for .gz files).
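Note also that some GNU tar versions only match wildcards in member names during extraction when the --wildcards option is given, so a variant worth trying (with the same placeholder names as in the question) is:
for next in localpath/*.tar
do
  tar xvf "$next" --wildcards '*required_file_in_tar_file*.dat'
done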
I'm not quite sure, but doesn't the FTP protocol have an mget command (instead of get) for downloading multiple files?
