Linux untar with prefix [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. If you believe the question would be on-topic on another Stack Exchange site, you can leave a comment to explain where the question may be able to be answered.
Closed 7 years ago.
I have multiple tars that I want to untar into a folder and then prepend a prefix to the extracted directory names. The problem is that I don't know the name of the folder each one creates on the target system, since these are build tars and they carry a date-timestamp inside. Here is what I tried:
tar xfz <filename>-*.tar.gz -C $UNTAR_LOCATION
so this creates a folder like 20140909-0900 under the target UNTAR_LOCATION. How can I prepend a prefix to the date-timestamp?
Note: there will be multiple folders with different date-timestamps under UNTAR_LOCATION, and I want to add the same prefix to all of them.

With versions of tar that support the --transform flag you should be able to use something like this:
tar -xzf <filename>-*.tar.gz -C "$untar_location" --transform='s,^,prefix,'
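A minimal end-to-end sketch of the `--transform` approach, assuming GNU tar (the flag is a GNU extension). The archive name, the `20140909-0900` folder, and the `myapp-` prefix are made up for the demo:

```shell
# Build a throwaway tarball that mimics a build archive, then extract it
# with a "myapp-" prefix prepended to every member name (GNU tar only).
set -e
workdir=$(mktemp -d)
mkdir -p "$workdir/20140909-0900"
echo hello > "$workdir/20140909-0900/build.txt"
tar -czf "$workdir/build.tar.gz" -C "$workdir" 20140909-0900

untar_location=$(mktemp -d)
tar -xzf "$workdir/build.tar.gz" -C "$untar_location" --transform='s,^,myapp-,'
ls "$untar_location"   # the extracted folder is now myapp-20140909-0900
```

Because the sed-style expression anchors at `^`, the prefix is applied to every member path in the archive, so the whole tree lands under the renamed top-level directory.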

Here's how to do it with pax, the portable archiver:
gzip -cd filename.tar.gz | ( cd "$untar_location" && pax -r -s,^,prefix-, )
Most implementations of pax also have a -z option to filter through gzip, in which case it becomes:
( cd "$untar_location" && pax -zrf filename.tar.gz -s,^,prefix-, )

Related

Include specific file from excluding folder [closed]

Closed 5 years ago.
I am using rsync to copy a folder from source to destination.
I am able to use exclude successfully:
$ rsync -av --exclude='*/deploy/scb_pdm/*' --exclude='*/logs/*' $COPY_SRC_DIR $COPY_DEST_DIR
server-4.5.0/conf/wrapper.conf
server-4.5.0/deploy/
server-4.5.0/deploy/scb_pdm/
server-4.5.0/deploy/scb_pdm/director.properties
server-4.5.0/deploy/scb_pdm/ocollate_static.madconfig
server-4.5.0/lib/
server-4.5.0/lib/blue-marble-4.5.0.201511121524.jar
Now I am stuck. How can I exclude only
server-4.5.0/deploy/scb_pdm/ocollate_static.madconfig
I'll start with the generic usage of rsync to exclude a directory.
To achieve that, you need the --exclude flag:
rsync -arv --exclude cache/ your_src_dir/ your_dest_dir/
Applied to your case, this will exclude the specific file, ocollate_static.madconfig:
rsync -arv --exclude='*/deploy/scb_pdm/ocollate_static.madconfig*' --exclude='*/logs/*' $COPY_SRC_DIR $COPY_DEST_DIR
You can also consider another flag:
--delete-excluded also delete excluded files from dest dirs
Another option is excluding multiple files and dirs at the same time:
$ rsync -avz --exclude your_file1.txt --exclude dir3/file4.txt source/ destination/
For more detail on exclude patterns, see the rsync man page.

Linux: how do I tar to a specific location? [closed]

Closed 7 years ago.
I am currently backing up my system. If I use
> tar cvpzf backup.tgz --exclude=/proc --exclude=/lost+found --exclude=/backup.tgz --exclude=/mnt --exclude=/sys /
It says the system doesn't have enough space.
Right now, I have a USB drive mounted at
~/mnt/sdc1
How do I tar the backup.tgz to this specific location?
You will want to do two things. First, ensure that your drive is mounted; then just provide the complete path to your backup location. E.g.
if mount | grep -q ~/mnt/sdc1; then   # or just grep -q sdc1 (if it is the only sdc1)
    tar cvpzf ~/mnt/sdc1/backup.tgz ...
fi
Note: replace the ellipsis with the rest of your tar command.

tar two directories with the same name to one archived directory [closed]

Closed 9 years ago.
What I'm doing is:
tar -czf etc.tar.gz /etc /usr/local/etc
And when I extract it I will have two directories:
1) etc
2) usr
What I want is to end up with only one etc directory after extracting, containing the contents of both source directories.
Thanks.
Is there any other way than creating a temporary directory with the merged files from /etc and /usr/local/etc and then removing it?
cd /
tar -cf /path/to/etc.tar etc/        # archive /etc under the member name "etc/"
cd /usr/local
tar -rf /path/to/etc.tar etc/        # append /usr/local/etc, also as "etc/"
cd /path/to
gzip etc.tar
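The steps above can be verified end to end with throwaway directories; the `etc` trees and file names here are made up for the demo, but the create-then-append (`-c` then `-r`) trick is exactly the one shown:

```shell
# Build two directories both named "etc", archive the first, append the
# second, and confirm extraction merges them into a single etc/ tree.
set -e
root=$(mktemp -d)
mkdir -p "$root/etc" "$root/usr/local/etc"
echo one > "$root/etc/a.conf"
echo two > "$root/usr/local/etc/b.conf"

out=$(mktemp -d)
( cd "$root" && tar -cf "$out/etc.tar" etc/ )            # first tree as "etc/"
( cd "$root/usr/local" && tar -rf "$out/etc.tar" etc/ )  # append second, same name
gzip "$out/etc.tar"

extract=$(mktemp -d)
tar -xzf "$out/etc.tar.gz" -C "$extract"
ls "$extract/etc"   # contains a.conf and b.conf in one directory
```

This works because tar member names are just relative paths: two member sets both rooted at `etc/` overlay into one directory on extraction. Note that `-r` requires an uncompressed archive, which is why gzip runs last.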

Copying local files with curl [closed]

Closed 4 years ago.
Is there a way to copy local files with curl? I need it to work as an alternative to the cp command.
This is a bit strange, but I'm working in an environment where cp is not available.
You could say:
curl -o /path/to/destination file:///path/to/source/file
This would copy /path/to/source/file to /path/to/destination.
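A runnable sketch of the `file://` trick, assuming curl is available; the payload and temp paths are made up for the demo:

```shell
# Copy a local file with curl's file:// scheme, no cp required.
# Note the URL needs an absolute path: file:// plus /tmp/... gives
# the three-slash form file:///tmp/...
set -e
src=$(mktemp)
echo "payload" > "$src"
dst=$(mktemp -d)/copy.txt

curl -s -o "$dst" "file://$src"
```

The `-s` flag just suppresses the progress meter; `-o` names the destination, mirroring `cp src dst`.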
You can use rsync:
rsync -avz /path/to/src/file /path/to/dst/dir
You can also use tar:
cd /path/to/src/dir; tar -cpf - sourcefile | (cd /path/to/dest/dir; tar -xpf -)
If your tar supports -C:
cd /path/to/src/dir; tar -cpf - sourcefile | tar -xpf - -C /path/to/dest/dir

Hashing a Directory in Linux? [closed]

Closed 9 years ago.
Is there any command in Linux to calculate the SHA1 hash of a directory which contains files and directories (and those directories in turn contain further files and directories)?
tar cf - $DIRECTORY|sha1sum
Deficiencies/advantages (depending on your perspective):
$DIRECTORY must be exactly the same in both cases (so you must use relative paths).
This takes into account file modification dates, not just file contents.
I think you should be able to use this:
find . -type f -exec sha1sum {} \;
Just replace the "." with your directory.
File by file you mean?
$ cd my_folder
$ sha1sum *
d73c8369c7808f7e96561b4c18d68233678f354f xxx.txt
5941a4f547f69b4b6271a351242ce41b3e440795 yyy.txt
Or of all the files together?
$ cat my_folder/* | sha1sum
7713154076812602f6f737cf5ad5924813182298
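Combining the ideas above gives a single content-only directory hash, a sketch rather than a standard recipe: hash every file, sort for a stable order, then hash the resulting list. Unlike the `tar | sha1sum` approach, this ignores modification dates; the helper name and file contents are made up for the demo:

```shell
# Metadata-independent directory hash: per-file sha1sums, sorted into a
# stable order, hashed once more into a single digest.
set -e
dir=$(mktemp -d)
echo alpha > "$dir/xxx.txt"
echo beta  > "$dir/yyy.txt"

hashdir() {
    ( cd "$1" && find . -type f -print0 | sort -z \
        | xargs -0 sha1sum | sha1sum | cut -d' ' -f1 )
}

hash1=$(hashdir "$dir")
touch "$dir/xxx.txt"      # bump mtime only; contents unchanged
hash2=$(hashdir "$dir")   # unchanged: only paths and contents matter
```

The `cd` into the directory keeps the recorded paths relative, so two copies of the same tree hash identically regardless of where they live; `sort -z` pairs with `-print0` to keep odd filenames safe.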
