Copying local files with curl [closed] - linux

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. If you believe the question would be on-topic on another Stack Exchange site, you can leave a comment to explain where the question may be able to be answered.
Closed 4 years ago.
Is there a way to copy local files with curl? I need it to work as an alternative to the cp command.
This is a bit unusual, but I'm working in an environment where cp is not available.

You could use:
curl -o /path/to/destination file:///path/to/source/file
This copies /path/to/source/file to /path/to/destination.
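As a quick sanity check, here is a minimal sketch (the /tmp paths are made up for the demo; it assumes your curl build supports the file:// scheme, which stock builds do):

```shell
# Create a throwaway source file, copy it with curl, and verify the copy.
printf 'hello\n' > /tmp/curl-copy-src.txt
curl -s -o /tmp/curl-copy-dst.txt file:///tmp/curl-copy-src.txt
cmp /tmp/curl-copy-src.txt /tmp/curl-copy-dst.txt && echo "copy OK"
```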

You can use rsync:
rsync -avz /path/to/src/file /path/to/dst/dir
You can also use tar:
cd /path/to/src/dir; tar -cpf - sourcefile | (cd /path/to/dest/dir; tar -xpf -)
If your tar supports -C:
cd /path/to/src/dir; tar -cpf - sourcefile | tar -xpf - -C /path/to/dest/dir
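A minimal runnable sketch of the tar-pipe variant, using throwaway /tmp paths:

```shell
# Copy sourcefile from one directory to another via a tar pipe.
mkdir -p /tmp/tar-copy-src /tmp/tar-copy-dst
printf 'data\n' > /tmp/tar-copy-src/sourcefile
(cd /tmp/tar-copy-src && tar -cpf - sourcefile) | (cd /tmp/tar-copy-dst && tar -xpf -)
cat /tmp/tar-copy-dst/sourcefile
```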

Related

Use wget or curl for multiple downloads [closed]

Closed 2 years ago.
I would like to download all the audio files on this VoxForge web page. Unfortunately I don't understand how to download them all into a folder of my choice with a single terminal command using wget or, alternatively, curl. I tried wget like this, without success:
wget http://www.repository.voxforge1.org/downloads/it/Trunk/Audio/Main/16kHz_16bit/ -P /home/user/download/
Doing it this way, I get only an HTML index file.
curl http://www.repository.voxforge1.org/downloads/it/Trunk/Audio/Main/16kHz_16bit/ | awk -F \" '/tgz/ { print "http://www.repository.voxforge1.org/downloads/it/Trunk/Audio/Main/16kHz_16bit/"$6 }' | xargs wget
Curl the index page and parse the output with awk to get the full download address of each tgz file, then pipe the list through xargs to wget to download them.
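To illustrate what the awk stage does, here is a sketch against a fabricated index line (the real field number, $6 above, depends on the page's exact HTML; with this simpler markup the href lands in $2):

```shell
# Split on double quotes and rebuild the full URL for each .tgz link.
line='<li><a href="it-0001.tgz">it-0001.tgz</a></li>'
echo "$line" | awk -F'"' '/tgz/ { print "http://www.repository.voxforge1.org/downloads/it/Trunk/Audio/Main/16kHz_16bit/" $2 }'
```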
The following command should work for you.
wget --directory-prefix=download_folder --no-directories --mirror --no-parent http://www.repository.voxforge1.org/downloads/it/Trunk/Audio/Main/16kHz_16bit

How to quickly move to a real directory using a soft link directory in linux? [closed]

Closed 7 years ago.
I have a directory /var/real-dir
I've created a soft link to it like this:
ln -s /var/real-dir /var/virtual-dir
Given that my working directory is /var/virtual-dir, I'm looking for a way to cd to /var/real-dir with as little typing as possible.
You can use cd -P .
Note that this only updates the PWD and OLDPWD environment variables; the kernel-level current directory remains unchanged.
Alternatively, you can use the -P option with the initial cd like cd -P /var/virtual-dir.
You can:
cd "$(readlink -f .)"
If this is too much typing, you can create a helper function in your .bashrc, like this:
cdlink() {
  cd "$(readlink -f .)"
}
Then source ~/.bashrc or start a new shell, and you can simply type:
cdlink
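A sketch of what the helper resolves, using made-up /tmp paths (readlink -f is GNU coreutils; some BSD readlink implementations lack it):

```shell
# Build a symlinked directory and resolve it the way cdlink would.
mkdir -p /tmp/real-dir
ln -sfn /tmp/real-dir /tmp/virtual-dir
cd /tmp/virtual-dir
readlink -f .
```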

Linux untar with prefix [closed]

Closed 7 years ago.
I have multiple tars that I want to untar into a folder and then prepend a prefix to the resulting directories. The problem is that I don't know the name of the folder each one will create on the target system, since these are build tars and they have a date-timestamp inside. Here is what I tried:
tar xfz <filename>-*.tar.gz -C $UNTAR_LOCATION
This creates a folder like 20140909-0900 under the target UNTAR_LOCATION. How can I prepend a prefix to the date-timestamp?
Note: there will be multiple folders with different date-timestamps under UNTAR_LOCATION, and I want to add the same prefix to all of them.
With versions of tar that support the --transform flag you should be able to use something like this:
tar -xzf <filename>-*.tar.gz -C "$untar_location" --transform='s,^,prefix,'
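A self-contained sketch of the --transform approach (GNU tar only; the "build-" prefix and /tmp paths are made up for the demo):

```shell
# Build a sample archive whose top directory is a timestamp, then
# extract it with a prefix prepended to every member path.
mkdir -p /tmp/xform-src/20140909-0900 /tmp/xform-dst
touch /tmp/xform-src/20140909-0900/file
tar -czf /tmp/xform.tar.gz -C /tmp/xform-src 20140909-0900
tar -xzf /tmp/xform.tar.gz -C /tmp/xform-dst --transform='s,^,build-,'
ls /tmp/xform-dst
```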
Here's how to do it with pax, the portable archiver:
gzip -cd filename.tar.gz | ( cd "$untar_location" && pax -r -s,^,prefix-, )
Most implementations of pax also have a -z option to filter through gzip, in which case it becomes:
( cd "$untar_location" && pax -zrf filename.tar.gz -s,^,prefix-, )

Linux: how do I tar to a specific location? [closed]

Closed 7 years ago.
I am currently backing up my system. If I use
> tar cvpzf backup.tgz --exclude=/proc --exclude=/lost+found --exclude=/backup.tgz --exclude=/mnt --exclude=/sys /
it says the system doesn't have enough space.
Right now, I have a USB drive mounted at
~/mnt/sdc1
How do I write backup.tgz to this specific location?
You will want to do two things. First, ensure that your drive is mounted; then just provide the complete path to your backup location. E.g.
if mount | grep -q ~/mnt/sdc1; then  # or just grep -q sdc1 (if it is the only sdc1)
  tar cvpzf ~/mnt/sdc1/backup.tgz ...
fi
Note: replace the ellipsis with the rest of your tar command.
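A runnable sketch of the idea, with throwaway /tmp paths standing in for the USB mount and the system tree:

```shell
# /tmp/usb stands in for ~/mnt/sdc1; archive a sample tree onto it.
mkdir -p /tmp/usb /tmp/sys-root/etc
touch /tmp/sys-root/etc/hosts
tar -czpf /tmp/usb/backup.tgz -C /tmp/sys-root etc
tar -tzf /tmp/usb/backup.tgz
```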

tar two directories with the same name to one archived directory [closed]

Closed 9 years ago.
What I'm doing is:
tar -czf etc.tar.gz /etc /usr/local/etc
And when I extract it I will have two directories:
1) etc
2) usr
What I want is to end up with only a single etc directory after extracting, containing the contents of both directories.
Thanks.
Is there any other way than creating temporary directory with merged files from /etc and /usr/local/etc and then removing it?
cd /
tar -cf /path/to/etc.tar etc/   # archive /etc
cd /usr/local
tar -rf /path/to/etc.tar etc/   # append /usr/local/etc under the same etc/ name
cd /path/to
gzip etc.tar
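A self-contained sketch of the append (-r) technique, with fabricated /tmp stand-ins for /etc and /usr/local/etc:

```shell
# Two directories both named etc/, appended into one archive.
mkdir -p /tmp/merge-a/etc /tmp/merge-b/etc
touch /tmp/merge-a/etc/a.conf /tmp/merge-b/etc/b.conf
(cd /tmp/merge-a && tar -cf /tmp/etc-merged.tar etc/)
(cd /tmp/merge-b && tar -rf /tmp/etc-merged.tar etc/)
tar -tf /tmp/etc-merged.tar
```

Extracting etc-merged.tar then yields a single etc/ directory containing both files.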
