Download non web accessible file with wget - linux

Is it possible to download a file in say /home/... using wget to my local machine? I'm pretty newbish on the bash shell side so perhaps this is just a matter of using the options correctly. What I've gleaned is that something like this should work, but my tests aren't downloading the file locally but placing it within the folder I'm using wget in:
root@mysite [/home/username/public_html/themes/themename/images]# wget -O "tester.png" "http://www.mysite.com/themes/themename/images/previous.png"
--2011-09-08 14:28:49-- http://www.mysite.com/themes/themename/images/previous.png
Resolving www.mysite.com... 173.193.xxx.xxx
Connecting to www.mysite.com|173.193.xxx.xxx|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 352 [image/png]
Saving to: `tester.png'
100%[==============================================================================================>] 352 --.-K/s in 0s
2011-09-08 14:28:49 (84.3 MB/s) - `tester.png' saved [352/352]
Perhaps the above is a bad example, but I can't figure out how to use wget (or some other command) to get something from a non-web-accessible directory (it's a backup file). Is wget the correct command for this?

wget uses the HTTP (or FTP) protocol to transfer its files, so no, you can't use it to transfer anything which is not available through those services. What you should do is use scp. It uses SSH, and you can use it to get any file (which you have permission to read, that is).
Say you want /home/myuser/test.file from the computer mycomp, and you want to save it as test.newext. Then you'd invoke it like this:
scp myuser@mycomp:/home/myuser/test.file test.newext
You can do a lot of other nifty stuff with scp so read the manual for more possibilities!
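For instance, to pull down a whole backup directory recursively, or to connect on a non-standard SSH port (host, port and paths here are only placeholders):
scp -r myuser@mycomp:/home/myuser/backups ./backups
scp -P 2222 myuser@mycomp:/home/myuser/test.file test.newext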

This belongs on superuser, but you want to use scp to copy the file to your local machine.
When a file isn't web accessible, you can't get it with wget.

Related

how to use wget to download directory with latest timestamp

I have an internal Linux HTTP server where directories with a specific naming convention will be uploaded on a daily basis from a remote site.
URL: http://10.10.10.10/test
Contents
test123
test124
test125
test126
All directories have a date and timestamp as well. Is there any way I can download the latest directory starting with test to my local machine using wget or curl? In this example it is test126.
Kindly help
wget doesn't do that automatically, but you can do it in two steps:
download http://10.10.10.10/test, parse it, and get the last entry
feed the result to wget -r
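A rough sketch of those two steps, assuming the server generates a standard index page for /test and the directory names sort correctly as plain strings (the grep pattern and flags below are just one way to do it):
# grab the index page, pull out the testNNN names, keep the highest one
latest=$(wget -qO- http://10.10.10.10/test/ | grep -oE 'test[0-9]+' | sort -V | tail -n 1)
# then mirror just that directory
wget -r -np -nH --cut-dirs=1 "http://10.10.10.10/test/$latest/"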
In these cases, though, the best solution is to set a symlink on the server that always points to the latest directory, in your case:
http://10.10.10.10/test/latest -> http://10.10.10.10/test/test126
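Setting that up on the server is a one-liner (the paths here are illustrative); re-run it, or script it, whenever a new directory arrives:
cd /var/www/html/test && ln -sfn test126 latest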

Apache and selinux, use of linux command in retrieving files

How can we access an Apache server using a Linux command to retrieve a file?
The file to be retrieved has been copied to a directory.
Without knowing more about what you are doing, I would suggest looking into wget or curl.
For example, to copy a file available on a web server via URL to the current directory using the wget command:
wget http://www.example.com/path/to/file.txt
or, if you are accessing your web server by IP address:
wget http://192.168.1.1/path/to/file.txt
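curl works much the same way; with -O it saves the file under its remote name in the current directory:
curl -O http://www.example.com/path/to/file.txt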

what is the use of "wgetrc" in a wget download

What is the purpose of "wgetrc" in the commands below?
-sh-3.00$ WGETRC=/hom1/spyga/spp/wgetrc_local wget --directory-prefix=/home1/spyga/spp/download ftp://127.0.0.1/outgoing/DATA.ZIP
The wgetrc_local file contains the credentials for the FTP server.
Normally I download the files from the FTP server using the command below.
-sh-3.00$ wget --ftp-user=xyz --ftp-password=12345 ftp://localhost/outgoing/DATA.ZIP
What is the difference between the above commands?
Please help me understand the commands.
Thank you.
The first command simply specifies an alternative configuration file to use instead of the default ~/.wgetrc. You could also specify it using --config=/hom1/spyga/spp/wgetrc_local as argument to wget.
This file can contain wgetrc commands that change the behaviour of wget. In this case it's probably done so the user and password don't have to be supplied on the command line. Especially on multi-user systems it is a security risk to pass passwords on the command line, as they can possibly be viewed by other users, so it's a little better to store them in a file with restricted access permissions instead. That way only processes started by the owner of the file can access it.
Another use of the wget startup file is to change its default settings, user agent, etc.
It's all documented in the wget manual.
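Given the command above, wgetrc_local presumably holds something along these lines (the exact entries are an assumption, not quoted from the question), kept readable only by its owner:
ftp_user = xyz
ftp_password = 12345
and you would lock the file down with:
chmod 600 /hom1/spyga/spp/wgetrc_local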

How to send a file to SharePoint from Linux, creating non-existent directories

I have a problem while sending a file from Linux to SharePoint. Everything is fine if I am uploading to an existing directory; I use this method:
curl --ntlm --user username:password --upload-file myfile.xls https://sharepointserver.com/sites/mysite/myfile.xls
Unfortunately the problem arises when I point the target to a non-existent directory, like:
curl --ntlm --user username:password --upload-file myfile.xls https://sharepointserver.com/sites/mysite/nonexist/myfile.xls
I would like it to create all necessary directories on the path. I've tried to use the "--create-dirs" curl option, but it doesn't work.
Any ideas how to achieve the goal? It doesn't have to be curl actually; I can use a different method available on Linux.
As the name (CLIENT URL) suggests, you will not be able to create new directories on remote SERVERS over http/https while uploading files.
For downloads over http/https, the --create-dirs option applies only to the local machine, creating new local directories (for instance, when you are downloading content onto your local Linux machine).
However, when using ftp/sftp to a server, you will be able to create new directories on the remote server.
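For example, if the target machine happened to also expose SFTP (a plain SharePoint site normally does not, so treat this purely as a sketch, and it assumes your curl was built with SFTP support), --ftp-create-dirs would build the missing remote path during the upload:
curl --ftp-create-dirs -u username:password -T myfile.xls sftp://sftpserver.example.com/sites/mysite/nonexist/myfile.xls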

how to transfer a file to my server directly from another server

Hey guys, what is the easiest way to transfer a file to my server directly from another server? This way I won't download the file to my PC and then upload it to my server. The requested file would look like http://www.examplesite.com/file.zip
My server is running Linux, but I don't have SSH access.
So how can I do this?
And thanks guys :D
Without SSH it will be very difficult. Possibly rsync might work, if it's on both servers with daemons set up. RCP (remote copy) exists; it's similar to SCP without the SSH part, but I doubt it's installed due to security concerns.
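For what it's worth, pulling from an rsync daemon would look roughly like this (the module and paths are hypothetical, and it still requires some way to run commands on your own server):
rsync -av rsync://www.examplesite.com/files/file.zip /path/on/my/server/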
You have to start a shell on your server. Then try:
man wget
And use:
wget http://www.examplesite.com/file.zip
If you cannot get access to a shell, then tell us exactly what control you have over your server.
my2c
