Why does this file not get downloaded into the specified location? - linux

I am downloading the file at the link below. I am using Ubuntu 12.04 and used the following command to download it.
wget -p /home/ubuadmin/CUDA http://developer.download.nvidia.com/compute/cuda/5_5/rel/installers/cuda_5.5.22_linux_32.run
Below is my command line input and output.
root@ubuserver3:/home/ubuadmin# wget -p /home/ubuadmin/CUDA http://developer.download.nvidia.com/compute/cuda/5_5/rel/installers/cuda_5.5.22_linux_32.run
/home/ubuadmin/CUDA: Scheme missing.
--2014-03-11 08:06:28-- http://developer.download.nvidia.com/compute/cuda/5_5/rel/installers/cuda_5.5.22_linux_32.run
Resolving developer.download.nvidia.com (developer.download.nvidia.com)... 23.62.239.35, 23.62.239.27
Connecting to developer.download.nvidia.com (developer.download.nvidia.com)|23.62.239.35|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 686412076 (655M) [application/octet-stream]
Saving to: `developer.download.nvidia.com/compute/cuda/5_5/rel/installers/cuda_5.5.22_linux_32.run'
100%[======================================>] 686,412,076 663K/s in 16m 56s
2014-03-11 08:23:24 (660 KB/s) - `developer.download.nvidia.com/compute/cuda/5_5/rel/installers/cuda_5.5.22_linux_32.run' saved [686412076/686412076]
FINISHED --2014-03-11 08:23:24--
Total wall clock time: 16m 56s
Downloaded: 1 files, 655M in 16m 56s (660 KB/s)
It says the download is completed, but I can't find the file in that folder. I am accessing this server remotely using PuTTY, and using WinSCP to see the file structure. What has gone wrong? Why is the file missing even though it was downloaded?

To set the target folder, use -P (upper case) instead of -p.
From man wget:
-P prefix
--directory-prefix=prefix
Set directory prefix to prefix. The directory prefix is the directory
where all other files and subdirectories will be saved to, i.e. the
top of the retrieval tree. The default is . (the current directory).
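With the corrected flag, the command from the question should save the file into /home/ubuadmin/CUDA (wget creates the prefix directory if it does not exist):
wget -P /home/ubuadmin/CUDA http://developer.download.nvidia.com/compute/cuda/5_5/rel/installers/cuda_5.5.22_linux_32.run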

cd /home/ubuadmin
mkdir test
cd test
wget http://developer.download.nvidia.com/compute/cuda/5_5/rel/installers/cuda_5.5.22_linux_32.run
After that, run this command:
stat cuda_5.5.22_linux_32.run
and check the output to confirm the file is there.

The "p" needs to be capitalized. Keep in mind that Linux is case-sensitive in most aspects.

Related

What options to use with rsync to sync only new files to a remote NTFS drive?

I run a mixed Windows and Linux network with various desktops, notebooks, and Raspberry Pis. I am trying to establish an off-site backup between a local Raspberry Pi and a remote Raspberry Pi. Both run DietPi/Raspbian and have an external NTFS HDD to store the backup data. As the data to be backed up is around 800 GB, I already mirrored the data onto the external HDD initially, so that only the new files have to be sent to the remote drive via rsync.
I have now tried various combinations of options, including --ignore-existing, --size-only, -u, -c, and of course combinations of other options like -avz etc.
The problem is: none of the above really changes anything; the system tries to upload all the files (although they already exist remotely), or at least a good number of them.
Could you give me a hint how to solve this?
I do this exact thing. Here is my solution to this task.
rsync -re "ssh -p 1234" -K -L --copy-links --append --size-only --delete pi@remote.server.ip:/home/pi/source-directory/* /home/pi/target-directory/
The options I am using are:
-r - recursive
-e - specifies the remote shell (ssh) used to transmit data; note that my ssh command uses a non-standard port 1234, as specified by -p inside the -e flag
-K - keep directory links
-L - copy links
--copy-links - the long form of -L, so a duplicate flag it would seem...
--append - this will append data onto smaller files in case of a partial copy
--size-only - this skips files that match in size
--delete - CAREFUL - this will delete local files that are not present on the remote device..
This solution will run on a schedule and will "sync" the files in the target directory with the files from the source directory. To test it out, you can always run the command with --dry-run, which will not make any changes at all and only show you what would be transferred and/or deleted.
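For example, a dry run of the command above (same host and paths) would be:
rsync --dry-run -re "ssh -p 1234" -K -L --copy-links --append --size-only --delete pi@remote.server.ip:/home/pi/source-directory/* /home/pi/target-directory/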
All of this info and additional details can be found in the rsync man page: man rsync
NOTE: I use ssh keys to allow connection/transfer without having to respond to a password prompt between these devices.
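If you have not set up keys yet, a minimal sketch (assuming the same host and non-standard port as above) is:
ssh-keygen -t ed25519                      # generate a key pair; accept the default file location
ssh-copy-id -p 1234 pi@remote.server.ip    # install the public key on the remote Pi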

No files have been transferred after rsync

When I ran rsync in the following way, no files were transferred?!
rsync -rv -e "ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null -i /home/user/.ssh/myrsd.pem" /cygdrive/c/user/local/temp/somefolder root@xx.xx.xx.xx:/
terminal output:
sending incremental file list
sent 118 bytes received 26 bytes 96.00 bytes/sec
total size is 1,560 speedup is 10.83
rsync works only on deltas: if a file already exists in the destination folder and is identical to the file in the source, it won't be copied; only new/updated files will be transferred.
So if all the files are already there, rsync will have nothing to do.
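A quick way to see this behaviour (hypothetical local paths) is to run the same copy twice; -t keeps modification times so rsync's quick check can match files, and the second run lists nothing under "sending incremental file list":
rsync -rtv /tmp/src/ /tmp/dst/   # first run copies everything
rsync -rtv /tmp/src/ /tmp/dst/   # second run transfers nothing, files are identical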
The culprit is the missing slash after the local folder, 'somefolder' in this case. It should be '/cygdrive/c/user/local/temp/somefolder/' instead of '/cygdrive/c/user/local/temp/somefolder'.
In the former case the output shows no files after "sending incremental file list", while the latter shows the files transferred:
sending incremental file list
xx/xx/myfile
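To illustrate the trailing-slash rule with hypothetical paths: without the slash, rsync copies the folder itself into the destination; with the slash, it copies the folder's contents:
rsync -rv /tmp/somefolder user@host:/dest/    # creates /dest/somefolder/...
rsync -rv /tmp/somefolder/ user@host:/dest/   # puts the contents directly into /dest/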

Rsync to Amazon Linux EC2 instance - failed: No such file or directory

I want to upload the content of one directory to my Amazon EC2 with rsync:
rsync -r -t -v --progress -z -s -e "ssh -i /home/mostafa/keyamazon.pem" /home/mostafa/splitfiles ubuntu@ec2-64-274-161-87.compute-1.amazonaws.com:~/splitfiles
but I receive the following error message:
sending incremental file list
rsync: link_stat "/home/mostafa/splitfiles" failed: No such file or directory (2)
rsync: change_dir#3 "/home/ubuntu//~" failed: No such file or directory (2)
rsync error: errors selecting input/output files, dirs (code 3) at main.c(712) [Receiver=3.1.0]
and if I do a dry run with grsync, it works correctly
In rsync the trailing / is very important. Also, rsync usually defaults to ssh when one of the destinations contains a host.
So you can get rid of the -e and -s options; the -t option already preserves modification times.
Your command could be written as:
rsync -rtvz --progress /home/mostafa/splitfiles/ ubuntu@ec2-64-274-161-87.compute-1.amazonaws.com:splitfiles/
Notice the trailing /'s, provided that you have ssh configured to read the private key from your home directory.
On Ubuntu you can add the key to the SSH agent with:
ssh-add [key-file]
This will save you having to specify the key file every time you ssh into the AWS machine.
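With the key file from this question, that would be (assuming an ssh-agent is running):
ssh-add /home/mostafa/keyamazon.pem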
The errors seem to say that on the local machine you don't have a source directory and the destination doesn't exist.
I completed this task with FileZilla instead; it is easier to use.
You are at home (~); if you cd ../ towards root, you will be able to run the command.

How to set the destination directory in crontab, with cPanel

I want to use cron to download and unzip a file, then move it to a directory on my server.
What I did:
wget http://geolite.maxmind.com/download/geoip/database/GeoLiteCity.dat.gz -
|gunzip GeoLiteCity.dat.gz &&
mv GeoLiteCity.dat 37.18.176.133/~work/wp-content/plugins/shabatkeeper/GeoLiteCity.dat
What I get:
--2014-12-22 11:00:03-- http://-/
Resolving -... failed: Name or service not known.
wget: unable to resolve host address `-'
FINISHED --2014-12-22 11:00:03--
Downloaded: 1 files, 11M in 1.1s (10.7 MB/s)
How do I set the destination directory?
Answer - what I did:
1. Downloaded the file with
wget http://geolite.maxmind.com/download/geoip/database/GeoLiteCity.dat.gz
2. Entered the cPanel file browser and checked where the file GeoLiteCity.dat.gz was.
3. Then saw which directory was home and what it is called, and found that the path I needed is
/home/work/www/
That's it.
This should do what you're looking for:
wget -q -O- http://geolite.maxmind.com/download/geoip/database/GeoLiteCity.dat.gz | gunzip > /path/to/where/you/want/GeoLiteCity.dat
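As a crontab entry, a minimal sketch might be (the daily 11:00 schedule and the /home/work/www/ destination are assumptions taken from the question; -O- writes the download to stdout so gunzip can read it from the pipe):
0 11 * * * wget -q -O- http://geolite.maxmind.com/download/geoip/database/GeoLiteCity.dat.gz | gunzip > /home/work/www/GeoLiteCity.dat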

How to create a trimmed directory tree while using wget for FTP download

I am using wget to download files from FTP.
The FTP folder is named /var/www/html/.
Inside this folder is a tree of folders & files, ~20 levels deep.
I am trying to download all of this via FTP (I have no ssh access) with wget.
wget --recursive -nv --user user --password pass ftp://site.tld/var/www/folder/
This command runs OK, but it creates a folder structure:
~/back/site.tld/var/www/html/my-files-and-folders-here
Question:
Is there any way to tell wget not to create ~/site.tld/var/www/html/, but to put the whole tree in the current folder, i.e. ~/back/my-files-want-here/? I.e., to trim/cut a certain path?
Thanks
Look for --no-host-directories and --cut-dirs in the manpage.
This should work as expected (maybe you have to increase/decrease --cut-dirs):
wget --recursive --no-verbose --no-host-directories --cut-dirs=3 --user user --password password ftp://site.tld/var/folder
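For reference, using the question's URL as a hypothetical example: --no-host-directories drops the site.tld/ component, and --cut-dirs=3 then strips the first three remote directories, so a file at ftp://site.tld/var/www/folder/sub/file.txt would be saved as sub/file.txt under the current directory.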
