ftp push changed files to server - linux

Is there an easy way to automatically search recursively through a directory and upload any changed files via FTP to my live server, into their correct locations?
The CLI is ideally what I'm looking for.
I'm tired of manually hunting down the files I've changed and uploading them individually or in a queue; I'm trying to make this quick and painless.

If the server is under your control, you might want to try rsync instead of FTP.
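For instance, a minimal sketch assuming SSH access to the live server and a web root of /var/www/site (hypothetical names):

# Recurses through ./site/ and transfers only files that have changed.
rsync -avz ./site/ user@liveserver:/var/www/site/

With SSH keys set up, this runs non-interactively and touches nothing that hasn't changed.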

rsync is the way to go for keeping directories in sync. +1 for Frederic.
The other way to go is change management, like Subversion. Once you set it up, files checked in over time can be brought to production with a simple "svn up" command.
Subversion: http://www.wandisco.com/subversion/os/downloads
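For completeness, a hedged sketch of that flow, assuming a repository at http://svnserver/repos/site and a web root of /var/www/site (both hypothetical):

# One-time setup on the live server: check out a working copy into the web root.
svn checkout http://svnserver/repos/site /var/www/site
# From then on, deploying everything checked in since is a single command:
svn up /var/www/site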

Related

Using Haxe to edit a remote file?

I've searched haxelib for a library I could use for remotely editing a file on a server over an SSH connection with Haxe, or for listing the files in a directory.
Has anyone done this with Haxe?
I want to build a desktop app to create a YAML editor that will change the settings files of several servers, using a frontend like haxe-ui.
Ok, there are probably a lot of ways you could do it, but I would suggest separating your concerns:
desktop app to create a yaml editor
Ok, that's a fine use case for Haxe / a programming language. Build an editor, check.
change settings files (located on) several servers
Ok, so you have options here. Either
Make the remote files appear as local files via some network file system, or
Copy the files locally, edit them, and copy them back, or
Roll your own network-enabled service that runs on each server, receives commands, and modifies the files.
Random aside: Given that these are settings files, you probably also want to restart some service after changes are made.
I'd say option 2 is the easiest, and there are several ways to do even that:
Use scp to bring the settings files to a local location, edit them locally, and then push them back (see the sketch after this list). If you set up SSH keys, you won't have to bother with passwords.
Netcat is another tool for pushing bytes (aka files) over the network. It's simpler than scp, but it has no security measures.
Or, get creative / crazy, and say, "my settings files will all be stored in a git repo. The 'sync' process will be a push / pull setup."
There are simply lots of ways to get this done.
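To make option 2 concrete, a minimal scp round-trip, assuming a host server1 with key-based SSH auth and a settings file at /etc/myapp/settings.yaml (all names hypothetical):

scp server1:/etc/myapp/settings.yaml ./settings.yaml   # pull the file down
$EDITOR ./settings.yaml                                # edit it locally
scp ./settings.yaml server1:/etc/myapp/settings.yaml   # push it back
ssh server1 'sudo systemctl restart myapp'             # restart the affected service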

How can I work on files on my server and keep them in sync?

I have set up a development web server using VMware and Debian. It's all set up fine, but I have a problem.
I need to be able to work with the files on the server, or a copy of them, but it's important that both sets of files stay in sync. For example, if I'm working on index.php in my text editor, I don't want to have to upload it over FTP after every change, and I don't want to manually keep track of which files I've edited, etc.
Any ideas on how I can achieve this?
Besides version control, you can achieve this with sshfs. It basically mounts a remote directory in your local system.
More info:
http://en.wikipedia.org/wiki/SSHFS
https://www.digitalocean.com/community/tutorials/how-to-use-sshfs-to-mount-remote-file-systems-over-ssh
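A minimal sketch, assuming a user dev on host devserver and a web root of /var/www (hypothetical names):

mkdir -p ~/www-local
sshfs dev@devserver:/var/www ~/www-local   # remote files now appear under ~/www-local
# ...edit ~/www-local/index.php with any local editor; changes hit the server directly...
fusermount -u ~/www-local                  # unmount when finished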
After much searching, I felt the best solution for my case was to use lsyncd to upload files to the development server any time a change is made.
Although I use Git, setting up a Git server and having to commit and push every time I make a change isn't what I want to be doing. With lsyncd I can still use Git on my local machine to keep track of the project.
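For reference, a hedged sketch of the lsyncd approach using its built-in rsync-over-SSH layer (host and paths hypothetical):

# Watches the local tree and mirrors every change to the server as it happens.
lsyncd -rsyncssh ~/projects/mysite dev@devserver /var/www/mysite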

How to keep the history of files changed on an rsync server

How can I instruct the rsync server to keep a copy of the old versions of files that were updated?
Background info:
I have a simple RSYNC server running on Linux which I am using as a backup of a large file system (many TB). Let's call it the backup server.
On the source server, we run daily:
$ rsync -avzc /local/folder user@backup_server::remote_folder
In theory, no files should change on the source server; we should only receive new files. Nonetheless, it is possible that some updates are legitimate (very, very seldom). If rsync detects a change, it overwrites the old version of the file on the backup server with the new one. Now, here is the problem: if the change was a mistake, I lose the data and have no way to recover it.
Ideally, I'd like the rsync server to keep a backup of the replaced files. Is there a way to configure that?
My backups are local to the same machine (but on a different drive, mounted at /backup/).
I use --backup-dir=/backup/backups-`date +%F`/ but then it starts nesting things rather than leaving a flat set of backups-yyyy-mm-dd/ directories in the /backup/ folder.
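One hedged way around the nesting, assuming the live copy is synced into /backup/current/ so the dated directories sit outside the destination tree (paths hypothetical):

# Files that would be overwritten are moved into a per-day directory first.
rsync -avzc --backup --backup-dir=/backup/backups-$(date +%F) \
      /local/folder /backup/current/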
If someone has a similar issue, there is an easy solution:
Run a simple cron job that changes access rights on the destination machine.
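One hedged reading of that suggestion as a crontab entry, assuming the intent is to strip write permission from the backed-up files so they cannot be silently overwritten (paths and schedule hypothetical):

# Every night at 01:00, make everything under /backup/ read-only.
0 1 * * * find /backup -type f -exec chmod a-w {} +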

Deploy Mercurial Changes to Website Hosting Account

I want to move only the website files changed since the published revision to a hosting account, using SSH or FTP. The hosting account is Linux based but does not have any version control installed, so I can't simply do an update there, and the solution must run on the local development machines.
I'm essentially trying to do what http://www.deployhq.com/ does, but for free. I want to publish changes without having to re-upload everything or manually choose the files to move. I'm open to simply using a bash script that compares versions and copies each file (how? I'm not that great with bash), since we'll be using Linux for development, but something with a web interface would be nice.
Thanks in advance for the help!
This seems more like a job for rsync than for hg, given that the target doesn't have hg installed.
Something like so:
rsync -avz /path/to/local/files/ remote_host:/remote/path/
This transfers all files recursively (recursion is implied by -a) from /path/to/local/files/ and places them in /remote/path. The -a flag preserves file attributes, and -z compresses the data in transit.
rsync takes care of only transferring files that have changed. Be sure to watch for trailing slashes when specifying source paths; they matter (see the illustration below).
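A quick illustration of the trailing-slash behaviour (hypothetical paths):

rsync -avz /path/to/local/files  remote_host:/remote/path/   # creates /remote/path/files/
rsync -avz /path/to/local/files/ remote_host:/remote/path/   # copies the contents into /remote/path/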

What's the best way to move to a new Perforce server?

My home Perforce server died. I set up a new one.
The project I set it up to support died in the planning phase. The contents of the depot at that point were some prototype code and we never got to setting up a disaster recovery plan.
The dev machines still have the existing code on them. As much as possible, I'd like the change of servers to be transparent to the developers: use the same depots and the same directories, just change the name of the server to connect to and get back to work.
What do I need to do in order to make this happen?
I assume you don't have access to the Perforce depot files from your dead server? I assume you know that you will lose all your history.
If that's the case, all you need to do is set up the new server, create a user / client with the same root path as your original clientspec was using on your dev machine, and check all the files into Perforce. Pretty simple really...
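A minimal sketch of that re-seeding step, assuming the new server is at newserver:1666 and a workspace dev_ws whose Root matches the old client root (all names hypothetical):

export P4PORT=newserver:1666
p4 client dev_ws                      # recreate the workspace spec with the old Root
find . -type f -print | p4 -x - add   # open every existing file for add
p4 submit -d "Re-seed depot after server loss"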
You may need to rebind any SCM bindings you have in tools like Visual Studio, but that's about it.
What Shane suggested will populate the depot with one person's version of the files. But if you have another user who also has a copy then you'll need a couple of extra steps.
Firstly, just set one machine up as suggested by Shane.
You now need to get the second user set up. If you are confident that the version of the code user 2 has exactly matches what you put in the new server, then just create a client spec (probably with the same name as before) and sync using the "Force" flag. This will overwrite all the files on user 2's machine and, more importantly, ensure Perforce knows which versions you really have.
However, if you are in any doubt as to any differences in the code, do not do the initial sync from the second user's machine. Instead, set up the client spec, then use the "Reconcile offline work" option: in P4V, select the workspace; it's a right-click option. Then just walk through the subsequent dialog to sort out what you need.
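Command-line equivalents of those two options, for reference (depot path hypothetical):

p4 sync -f //depot/project/...      # force sync: overwrites local files and fixes the have-list
p4 reconcile //depot/project/...    # reconcile offline work; review, then: p4 submit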
Finally, if you want a very quick & dirty backup system for your server, I've posted some notes on my blog; it should take you just a couple of minutes to set up.
