How to download files from a Linux server from the command line?

I'd like to start using my localhost to develop from. I am trying to work out the best way to sync my local folder with the files directory on the remote web server. In some cases there will be 10,000+ files.
This is not for component files such as php, css, javascript etc. This is for content and media files which I do not wish to use git/svn for.
Thanks

I would recommend using rsync. It's built specifically for remote synchronization tasks. Take a look at the compression and differential modes.

There is always rsync
http://linux.about.com/library/cmd/blcmdl1_rsync.htm
It is built for exactly this kind of task, and optimized for syncing when there are small deltas to large file sets.
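A hedged example (the host name and paths below are placeholders, not from the question) of pulling a remote content directory down to a local copy:
# -a preserves permissions and timestamps, -v is verbose, -z compresses in transit
# --delete removes local files that no longer exist on the server, so use it with care
rsync -avz --delete user@example.com:/var/www/site/files/ ~/sites/site/files/
Run again later, it only transfers the files that have changed.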

If rsync is out of the question you can try wget; it has some nifty features.
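For example, a recursive fetch of a directory might look something like this (the URL and credentials are made up):
# --mirror recurses and keeps timestamps, --continue resumes partial downloads,
# --no-parent stops wget from climbing above the starting directory
wget --mirror --continue --no-parent ftp://user:password@example.com/files/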

Related

Updating a website through SSH

I'm only partially familiar with shell and my command line, but I understand the usage of * when uploading and downloading files.
My question is this: If I have updated multiple files within my website's directory on my local device, is there some simple way to re-upload every file and directory through the put command to just update every single file and place files not previously there?
I'd imagine that I'd have to somehow
put */ (to put all of the directories)
put * (to put all of the files)
and change permissions accordingly
It may also be in my best interest to first clear the directory so I have a true update, but then there's the problem of resetting all the permissions for every file and directory. I would think it would work in a similar manner, but I've had problems with it and I don't fully understand the use of the -r recursive option.
Basically, such functionality is exactly what the rsync tool was built for. And that tool can also be used in a "secure shell way", as outlined in this tutorial.
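A minimal sketch of such a push over ssh (the host and directories are assumptions, not taken from the question):
# -a keeps permissions and timestamps, so there is no separate chmod step afterwards
rsync -avz -e ssh ./public_html/ user@example.com:/var/www/site/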
As an alternative, you could also look into sshfs. That is a utility that allows you to "mount" a remote file system (over ssh) in your local system. It would then be completely transparent to rsync that it is syncing a local and a remote file system; for rsync, you would just be syncing two directories!
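For instance, a rough sketch with made-up host and mount point names:
# mount the remote directory locally over ssh
mkdir -p ~/mnt/remote-site
sshfs user@example.com:/var/www/site ~/mnt/remote-site
# rsync now just sees two local directories
rsync -av ./public_html/ ~/mnt/remote-site/
# unmount when finished
fusermount -u ~/mnt/remote-site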
Long story short: don't even think about implementing such "sync" code yourself. Yes, rsync itself requires some studying; like many Unix tools it is extremely powerful, so you have to be diligent when using it. But the thing is: it is a robust, well-tested tool, and the time required to learn it will pay off pretty quickly.

GNU make's install target to push files on a remote SSH?

I'm working on a project that needs to be tested on an embedded Linux system. After every little change, I have to scp all files to the device over an SSH connection. Can you suggest a more convenient way to deploy files on a remote target? For example, some trick with make's install command:
make install INSTALL='scp 192.168.1.100:/'
or something.
If you can use scp, you can probably also use rsync, specifically rsync over ssh. The advantage of rsync is that it builds a delta of the source and destination files and transfers only what is necessary. After changing very little, that is a considerable benefit. I'd probably only invoke it if the build completes without error, like make ... && upload (where upload could be a script covering the details of the transfer).
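One possible sketch, assuming a hand-written deploy target in the Makefile (the target name, build directory and destination path are made up; recipe lines must start with a tab):
# push only changed build artifacts to the device after a successful build
deploy: all
	rsync -avz ./build/ root@192.168.1.100:/opt/app/
Then make deploy builds and uploads in one step.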
Just for completeness, sshfs is often quite useful. You can mount a remote folder visible over ssh on to a folder on your local hard disk. Performance is not great, but certainly serviceable enough for a deploy step, and it's transparent to all tools.

Get whole domain content with wget or other commands in linux?

I would like to copy a client project, but I only have FTP access. Normally I'd do it with SSH access, but in this case it's not possible. The problem is the size of the project (nearly 3 GB).
Is there a solution to copy the project to my server only with FTP-access?
The size isn't the problem here. Because of the encryption, an SSH transfer actually produces more overhead than an FTP transfer, so the answer is: of course you can use FTP for file transfers, even large ones. FTP was meant for this.
The more important concern is security. If you normally use SSH for file transfers, you presumably have security in mind (FTP would have been faster than SSH). If your provider supports SFTP, you could use it as an alternative.
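If wget is available on the machine that should receive the copy, a hedged sketch of mirroring the project over FTP (host, credentials and path are placeholders) would be:
# -m mirrors recursively, -nH drops the host name from the local directory layout
wget -m -nH --user=ftpuser --password=secret ftp://ftp.example.com/project/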

How do I upload my website to different servers at the same time?

I want to create mirrors for my website and I want to know if there is a way to upload my website to different servers at the same time (through FileZilla, wget or other tools).
Look at using rsync
Mirroring with rsync
When you deploy code you need to make sure that your code changes land on all your servers. One way to do this would be to use a tool like Capistrano if you're inclined to Ruby, or Fabric or Puppet if you prefer Python.
The other way is to deploy to one server and periodically mirror it over to the other servers using rsync, as Paul suggested.
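A small sketch of that periodic mirroring step (the server names and paths are invented) could simply loop over the mirrors:
# push the deployed tree from the primary server to each mirror
for host in mirror1.example.com mirror2.example.com; do
    rsync -avz --delete /var/www/site/ deploy@"$host":/var/www/site/
done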

Edit files on server with Eclipse

I'm trying to figure out how to do this with Eclipse. We currently run SVN and everything works great, but I'd really like to cut my SSH requests in half and use Eclipse to modify some files directly on the server. I'm using the build of Eclipse below... how can I do this?
Eclipse for PHP Developers
Build id: 20100218-1602
Update
I have no intention of eliminating SVN from the equation, but when we need to make a hotfix, or run a specific report or function as a one-time thing, I'd much rather use Eclipse than terminal for modifying that kind of thing.
Have a look at How can I use a remote workspace over SSH? on the Eclipse wiki. I'm just quoting the summary below (read the whole section):
Summing up, I would recommend the following (in order of preference):
VNC or NX when available remotely, Eclipse can be started remotely, and the network is fast enough (try it out).
Mounted filesystem (Samba or SSHFS) when possible, the network is fast enough, and the workspace is not too huge.
rsync when offline editing is desired, sufficient tooling is available locally, and no merge issues are expected (single user scenario).
RSE on very slow connections or huge workspaces where minimal data transfer is desired.
EFS on fast connections when all tooling supports it, and options like VNC, a mounted filesystem, or rsync are not available.
But whatever you experiment with, don't bypass the version control system.
You could use something like SSHFS, but really, it's a better idea to use some kind of source control system instead of editing files directly on the server. If Subversion isn't sufficient, perhaps you might try a DVCS like Git or Mercurial.
