Update / download the current file from SVN - Linux

I had to make an emergency change to a file without going through an SVN commit. That is, instead of making the change on my local computer and committing it, I edited the file directly on the server (the emergency was the reason).
Tell me, how can I get that file from the server onto my local computer?

If the directory on your server is a working copy and your local workspace is a working copy (and both WCs are bound to the same repository URL), you can commit from the server and update from the local machine.
If your server isn't a WC, you have more of a headache syncing the diverged changes.
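For illustration, assuming both working copies are bound to the same repository URL (the paths and commit message are placeholders), the round trip could look like this:

# On the server, commit the emergency edit:
$ svn commit -m "emergency fix made directly on the server" /var/www/site
# On the local machine, pull the committed change down:
$ svn update ~/work/site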

Related

How can I work on files on my server and keep them in sync?

I have set up a development web server using VMware and Debian. It's all set up fine, but I have a problem.
I need to be able to work with the files on the server, or a copy of them. But it's important that both sets of files are in sync. For example, if I'm working on index.php in my text editor, I don't want to have to upload it over FTP each time, and I don't want to manually keep track of which files I've edited, etc.
Any ideas on how I can achieve this?
Besides version control, you can achieve this with sshfs. It basically mounts a remote directory into your local system.
More info:
http://en.wikipedia.org/wiki/SSHFS
https://www.digitalocean.com/community/tutorials/how-to-use-sshfs-to-mount-remote-file-systems-over-ssh
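As a rough sketch, mounting the server's web root locally might look like this (the host and paths are assumptions):

# Mount the remote directory over SSH:
$ sshfs user@devserver:/var/www ~/devserver-www
# Edit the files as if they were local, then unmount when done:
$ fusermount -u ~/devserver-www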
After much searching, I felt the best solution in my case was to use lsyncd to upload files to the development server any time a change is made.
Although I use git, I felt that setting up a Git server and having to commit and push every time I make a change isn't what I want to be doing. Using lsyncd, I'm able to use git on my local machine to keep track of the project.
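For what it's worth, lsyncd can even be started without a config file, straight from the command line (the host and paths here are assumptions):

# Watch the local project and continuously push changes over rsync+ssh:
$ lsyncd -rsyncssh ~/projects/mysite devserver /var/www/mysite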

How to keep the history of files changed on an rsync server

How can I instruct an rsync server to keep a copy of the old versions of files that were updated?
Background info:
I have a simple RSYNC server running on Linux which I am using as a backup of a large file system (many TB). Let's call it the backup server.
On the source server, we run daily:
$ rsync -avzc /local/folder user@backup_server::remote_folder
In theory, no files should be changed on the source server; we should only receive new files. But nonetheless, it is possible that some updates are legitimate (very, very seldom). If rsync detects a change, it overwrites the old version of the file on the backup server with the new one. Now, here is the problem: if the change was a mistake, I lose the data and have no ability to recover it.
Ideally, I'd like that rsync server keeps a backup of the replaced files. Is there a way to configure that?
My backups are local to the same machine (but different drive on a mount point of /backup/)
I use --backup-dir=/backup/backups-`date +%F`/ but then it starts nesting things rather than leaving a flat set of backups-yyyy-mm-dd/ directories in the /backup/ folder.
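One way around the nesting (a sketch, with paths as assumptions) is to keep the dated backup directories outside the tree rsync writes into, so the next run never sweeps them up:

# Replaced files go to a dated directory beside the live copy, not inside it:
$ rsync -avzc --backup --backup-dir=/backup/history/$(date +%F)/ /local/folder/ /backup/current/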
If someone has a similar issue, there is an easy solution:
Execute a simple cron job that changes the access rights on the destination computer.
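One interpretation of that suggestion, as a crontab sketch (the path is an assumption), is to strip write permission from everything already received, so a later run cannot silently replace it:

# Run nightly at 01:00; make already-received files read-only,
# leaving directories writable so new files can still arrive:
0 1 * * * find /backup/remote_folder -type f -exec chmod a-w {} +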

Perforce overwrote code that I forgot to checkout. How do I recover it?

I forgot to check out a source code file before modifying it.
When I got the latest revision, Perforce overwrote that file, so my work is totally lost.
Is it possible to recover the file?
For future use, update your client workspace so that you specify "noallwrite, noclobber". If noclobber is set, Perforce will not overwrite your writable un-opened files: http://www.perforce.com/perforce/doc.current/manuals/cmdref/client.html
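For reference, those options live in the client spec, which you can edit like this (the Options line shown is Perforce's default, which already includes noclobber):

$ p4 client
# In the spec form that opens, check the Options line:
# Options:  noallwrite noclobber nocompress unlocked nomodtime normdir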
Only if your editor or your operating system saved a copy, or it was modified long enough ago that it made its way into your backups. Perforce will not make copies of such files; it blindly assumes that you will always honestly tell it when you want to edit a file.
If you are using Eclipse, it's possible to retrieve the local version using Compare With -> Local History. It helped me.
This happened to me recently. For some reason, after I ran p4 sync on my workspace and did p4 resolve, I noticed that my changes to a file were missing. I'm not sure whether my changes were not saved or I hadn't checked out the file, but I really remember that my changes were saved. :(
I have been using Visual Studio for development, and unlike Eclipse it doesn't have local history. Luckily, that file was a JavaScript file, and I had been testing my application in Internet Explorer. Since IE caches some internet data, like js files, what I did was check the directory where it saves temporary files (Internet Options -> Browser history settings); there you'll see different versions of the saved files. I did recover my files! It was really just luck!
After that incident, I installed a Visual Studio plugin that stores the local history of files every time they are saved. http://visualstudiogallery.msdn.microsoft.com/226c2108-9da9-407d-b90d-9783040d27b8
The best way to avoid these cases is to branch your files out into a separate dev line during development and submit every milestone you accomplish incrementally. That way you'll always have versions of the important changes you make during development. Afterwards, you can integrate it back to the parent branch/mainline.
http://answers.perforce.com/articles/KB_Article/Branching-Codelines-and-Merging-Changes
Hope this helps!
Only if you ran the following command (a force sync) would Perforce update ALL of your files, including ones that are writable. The only exception is that any file you have opened in Perforce will not be overwritten. So if your file was made writable with an OS command rather than with p4 open, it will get overwritten by p4 sync -f.
p4 sync -f
The other possibility is that you did a plain p4 sync and Perforce still overwrote your writable files (which were not opened with p4) because your workspace settings don't have noallwrite and noclobber specified. Usually these settings are specified by default, so that Perforce doesn't clobber writable files.

TortoiseSVN - delete local file

I am new to TortoiseSVN, and I'm working on a group project. The main files are located on a separate server. How can I delete working folders on my machine without affecting the files on the server?
Deleting your local working folders will never delete the files on the SVN server unless you do an svn commit to the server, so you can safely delete the working folders on your machine.
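Before deleting, it may be worth a quick check that nothing uncommitted would be lost (the path is a placeholder):

# M = locally modified, ? = unversioned; both exist only on your machine:
$ svn status /path/to/working/copy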
You can delete the folders/files on your machine; as long as you don't tell SVN, the server will never know. Which means: just don't commit and you should be fine.
When I'm learning a new source control system, I usually zip the local directory before I delete it ... just in case I screw up. Then, once I'm sure I didn't screw up, I stop making the zips.

How to do version control via FTP?

I have a web dev client using a shared host that doesn't allow shell access, and thus no access to SVN, Git, etc. I've tried to convince him to move to one of the many cheap options that allow it, but he won't do it. If I use version control on my staging server, are there any tools that will allow me to replicate the changes to production via FTP? Locally I have both Mac and Windows, and the staging server is Linux, so I need something that works on any of those platforms...
Using your Linux staging server you could keep a separate checked out copy that you use specifically for that host and then use a utility to mirror that directory with the host server.
LFTP is useful for this kind of thing. It's available for most Linux distributions and includes a 'mirror' function:
Mirror specified source directory to local target directory. If target directory ends with a slash, the source base name is appended to target directory name. Source and/or target can be URLs pointing to directories.
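A minimal sketch of pushing a checked-out copy to the host over FTP (host, credentials, and paths are assumptions):

$ lftp -u username,password ftp.example.com -e "mirror -R /srv/staging/site /public_html; quit"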
Some kind of FTP mirror software is what you need. I haven't tested it, but a quick search gave me this Java application. You could run that over your up-to-date checked-out repository.
A good tool for keeping an SVN repo and an FTP copy in sync is svn2web. May I suggest creating a separate branch for the production copy and merging to that branch when uploading to the production server.
You probably need to write a batch file that is able to
Export the SVN repository
Upload the exported files to your Linux server via FTP
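A rough sketch of such a script, assuming lftp is available (the repository URL, credentials, and paths are placeholders):

#!/bin/sh
# 1. Export a clean copy of the tree (no .svn metadata):
svn export --force http://svn.example.com/repo/trunk /tmp/site-export
# 2. Mirror the exported tree to the host over FTP:
lftp -u username,password ftp.example.com -e "mirror -R /tmp/site-export /public_html; quit"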
Short of finding or implementing some FUSE-based CoW file system that supports immutable versions, I'd just find another (more developer-friendly) host. As far as I know, no FTP server supports this natively, nor can I think of any elegant means of putting it in place with script hackery.
I could be wrong.
This question (and answer) really helped me just now as I implemented version control via gitolite on a separate server and lftp.
Here’s what I did:
Set up gitolite on my Ubuntu staging server
Created a base repo (i.e. foo.git) on the staging server
Cloned foo.git into a working directory on the staging server
Cloned foo.git into a working directory on my local development machine
Developed locally
Pushed changes to the foo.git repo on the staging server
On the staging server, logged into the working directory and pulled in the changes from foo.git
lftp-ed into the shared host (like you mention above)
Once in the shared host, ran:
mirror -R --only-newer --delete --parallel=10 /source/directory/ /target/directory
Notes on the mirror command options:
-R - this pushes the source/directory to the target/directory (without it, mirror pulls from target to source; think reverse).
--only-newer - without this option, even if you only changed one file, the mirror command will send all the files in the source directory over to the target directory; with it, only the changed (newer) files are transferred over the wire.
--delete - deletes files that are no longer in the source directory but still in the target directory. One of my pushes involved deleting expired assets; without this option, the same files would have stayed put on my shared host after executing the mirror command.
--parallel=10 - transfers 10 files at once (instead of 1 by default). This made the process much faster.
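Put together, one deploy cycle from the steps above might look like this (host names and paths are assumptions):

# On the local machine: push the day's work to the staging server's repo:
$ git push origin master
# In the staging server's working directory: pull the changes in:
$ git pull
# Still on the staging server: mirror the working directory out to the shared host:
$ lftp -u username,password sharedhost.example.com -e "mirror -R --only-newer --delete --parallel=10 /var/www/foo /public_html; quit"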
While this is what worked for me, I'm sure there are ways to improve on it. I was grateful for this question and thought I'd share my experience.
Rsync will do this over an FTP connection. You probably already have it installed if you’re on a Unix-like system.
