Export SVN repository over FTP to a remote server - linux

I'm using the following command to export my repository to a local path:
svn export --force svn://localhost/repo_name /share/Web/projects/project_name
Is there any, quite easy (Linux newbie here) way to do the same over FTP protocol, to export repository to a remote server?
The last parameter of svn export AFAIK has to be a local path, and AFAIK this command does not support giving paths in the form of URLs, for example:
ftp://user:pass@server/path/
So, I think some script has to be used here to do the job.
I have asked some people about that, and was advised that the easiest way is to export the repository to a local path, transfer it to the FTP server and then purge the local path. Unfortunately, I failed after the first step (export to a local path! :) So, the supporting question is whether this can be done on the fly, or whether it really has to be split into two steps: export + FTP transfer?
Someone also advised me to set up a local SVN client on the remote server and do a simple checkout / update from my repository. But that is a solution only if everything else fails, as I want to extract the pure repository structure, without the .svn files I would get by going that way.
BTW: I'm using a QNAP TS-210, a simple NAS device with a very limited Linux on board, so many command-line tools, as well as GUI tools, are not available to me.
EDIT: This is the second question in my "chain". Even if you help me to succeed here, I won't be able to automate this job (as I intend to) without your help on the question "SVN: Force svn daemon to run under different user". Can someone also take a look there, please? Thank you!

Well, if you're using Linux, you should be able to mount an ftpfs. I believe there was a module in the Linux kernel for this. Then I think you would also need FUSE.
Basically, if you can mount an ftpfs, you can write your svn export directly to the mounted folder.
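For example, a minimal sketch assuming the FUSE-based curlftpfs is available on the NAS (that is an assumption; it may need to be installed separately, e.g. via ipkg/Optware):
# mount the remote FTP account on a local directory (curlftpfs is an assumption,
# any FTP FUSE filesystem would work the same way)
mkdir -p /mnt/ftp
curlftpfs ftp://user:pass@server/path /mnt/ftp

# export straight into the mounted folder
svn export --force svn://localhost/repo_name /mnt/ftp/project_name

# unmount when done
fusermount -u /mnt/ftp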

Not sure about FTP, but SSH would be a lot easier and should have better compression. An example of sending your repo over SSH might look like:
svnadmin dump /path/to/repository | ssh -C username@servername 'svnadmin -q load /path/to/repository/on/server'
The URL where I found that info is Martin Ankerl's site.
[update]
Based on the comment from @trejder on the question, to do an export over SSH, my recommendation would be as follows:
Run svn export to a folder locally, then use the following command:
cd && tar czf - src | ssh example.com 'tar xzf -'
where src is the folder you exported to, and example.com is the server.
This will take the files in the source folder, tar and gzip them, and send them over SSH; on the other end the files are extracted directly onto the machine.
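Put together with the export step from the original question, a rough end-to-end sketch (the paths and the user@example.com destination are placeholders) could be:
# 1. export the repository to a temporary local folder
svn export --force svn://localhost/repo_name /tmp/project_name

# 2. stream the exported tree over SSH (tar + gzip locally, extract on the far side)
cd /tmp && tar czf - project_name | ssh user@example.com 'cd /var/www && tar xzf -'

# 3. purge the temporary copy
rm -rf /tmp/project_name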

I wrote this a while back - maybe it would be of some use here: exup

Related

Automating an install of Apache Ant

I've manually installed ANT on many servers simply by unzipping the ant files in a location and setting up the ~/.bash_profile to configure the users' path to see it.
I need to automate the setup now on servers which do not have internet connectivity.
We are using Nolio for deployment, but I don't care if the automation is done via nolio. If it can be scripted, I can easily just make Nolio call the script.
I don't think editing the users' .bash_profiles is a good way to do the automation.
So, assuming I get Ant on to the servers and unzip it, what's the best way to install it so that all users will have access to it?
You can try using pssh (parallel ssh). It's pretty awesome. Create a file with all your remote hosts, then run:
pssh -h hosts.txt "command1 && command2 && command3"
You can use pscp to deliver scripts, then use pssh to execute them. Works very well. Alternatively, you could become a puppet master and work everything off puppet. You can do some cool stuff with it, like automating builds based on hostname convention. LAMP build? Name the host web01.blarg.awesome or whatever, setup puppet to recognize it based on a regex, then deliver the appropriate packages.
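As a rough sketch (hostnames, the Ant version and paths are examples, and it assumes the remote account can write to /opt and /etc/profile.d), delivering and unpacking Ant with pscp/pssh and exposing it to all users via /etc/profile.d instead of per-user .bash_profile edits might look like:
# hosts.txt lists one server per line
pscp -h hosts.txt apache-ant-1.9.4-bin.tar.gz /tmp/

# unpack system-wide and make it visible to every login shell
pssh -h hosts.txt 'tar xzf /tmp/apache-ant-1.9.4-bin.tar.gz -C /opt && printf "export ANT_HOME=/opt/apache-ant-1.9.4\nexport PATH=\$PATH:\$ANT_HOME/bin\n" > /etc/profile.d/ant.sh'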
GL.

linux (red hat) compare directories and copy over files that are different

I basically want rsync, but don't have the luxury of being able to install it.
But I need a way to deploy files from one server to another. I edit one or more files on one server and then need to copy all modified files to another server by comparing files that aren't the same (and being able to exclude .htaccess files)
Does anyone know of an easy way to do this?
Thanks,
Scott
(I will assume that you have shell access to both servers)
You do not need to install rsync system-wide. You can install it in your home-directory. First get a copy of the rsync binary for your distribution:
You can extract it from the rsync RPM package using rpm2cpio and cpio
You can copy it from another RedHat installation
You can copy it from another Linux installation for the same platform; there is a strong possibility that it will work fine
Then you need to permanently modify the PATH environment variable so that the rsync command is found by your shell. If you do that for your user accounts in both servers, you can use rsync normally without the need for root privileges.
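A rough sketch of the first option (the RPM file name, paths and server names are placeholders):
# extract the rsync binary from the RPM into a private bin directory
mkdir -p ~/bin && cd ~/bin
rpm2cpio /path/to/rsync-x.y.z.rpm | cpio -idmv
mv ./usr/bin/rsync ~/bin/ && rm -rf ./usr

# make it permanently visible to the shell
echo 'export PATH=$HOME/bin:$PATH' >> ~/.bash_profile
source ~/.bash_profile

# then deploy, skipping .htaccess files; if the remote login shell does not
# pick up ~/bin, add --rsync-path=~/bin/rsync
rsync -av --exclude='.htaccess' /path/to/site/ user@otherserver:/path/to/site/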
If you have access to install rsync on one server, that's the minimum you need.
If not, the question is what tools do you currently have available? scp? sftp? ftp? ssh? telnet? find?
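If rsync really is out of reach on both ends, a crude substitute using only find, tar and ssh (paths and the modification window are placeholders) might look like:
# copy everything modified in the last day, skipping .htaccess files
cd /path/to/site
find . -type f -mtime -1 ! -name '.htaccess' -print0 \
  | tar czf - --null -T - \
  | ssh user@otherserver 'cd /path/to/site && tar xzf -'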

Creating a local Fedora repository with *Anonymous* rsync

I am trying to setup a local Fedora repository in a local LAN network.
Unfortunately I cannot run rsync in daemon mode because I'm behind a firewall over which I have no control.
Could anyone guide me on how to setup rsync using shell?
I tried the mirrors listed at http://mirrors.fedoraproject.org/publiclist, but I get prompted for passwords. I thought these were supposed to allow anonymous access. What am I doing wrong?
Let's say I want to create a local repository for the Fedora 13 i386 OS; what command would I need to issue on my local system?
Thanks in advance.
The mirrors on the website you linked should all be standard rsync:// mirrors without any passwords.
Example:
mkdir os updates
rsync -av rsync://fedora.mirror.netriplex.com/Fedora/releases/13/Everything/i386/os/ os/
rsync -av rsync://fedora.mirror.netriplex.com/Fedora/updates/13/i386/ updates/
If you use cobbler, it will make sure the repositories are up to date for you so long as you cron a "cobbler sync".
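If you don't use cobbler, a plain cron entry re-running the same rsync commands keeps the local mirror current; for example (mirror host and local paths as above):
# crontab -e: refresh the local updates mirror every night at 03:00
0 3 * * * rsync -av --delete rsync://fedora.mirror.netriplex.com/Fedora/updates/13/i386/ /path/to/updates/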

Managing user configuration files across multiple computers

I commonly work on multiple computers. I have various configuration files, eg, .bashrc, .gitconfig, .irbrc, .vimrc and configuration folders, eg, .vim/ that contain valuable customizations.
Sometimes I want small variations in configuration between the different computers.
I want to use version control to manage these different files.
Do others use version control to manage their configuration files?
What are some hints that might make this easier?
What's the most elegant way of dealing with variations between the computers?
I'm comfortable with git; any other suggestions?
I keep a folder at ~/config/ which is a bzr repository. I push/pull the repository between my various computers to sync it up. I have an install script which I use to make symlinks to my home directory:
#! /bin/sh
# link all files to the home directory, asking about overwrites
cd `dirname $0`
SCRIPT_DIR=`pwd`
SCRIPT_NAME=`basename $0`
FILES=`bzr ls --versioned --non-recursive`
cd $HOME
for FILE in $FILES; do
ln --symbolic --interactive $SCRIPT_DIR/$FILE
done
rm $HOME/$SCRIPT_NAME   # no need for a symlink to the install script itself
If you want to use git instead of bzr, you can instead use:
FILES=`git ls-tree --name-only HEAD`
(I had to ask SO to figure that out)
EDIT: I don't actually do this anymore, now I have a dotfiles repo on github, with a nice rake install script that someone else wrote.
At the moment, I use a cloned git repo. To keep things simple, the only file that needs to vary between the different machines is .bashrc. It's nice if there can be just one version of this file that responds differently on different machines. Thus, in my .bashrc:
if [ $(hostname) == 'host1' ]; then
    # things to do differently on host1.
elif [ $(hostname) == 'host2' ]; then
    # things to do differently on host2.
fi
This obviously has some limitations (such as that a different technique would be required for .vimrc or other config files needing customization), but it works fairly well.
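One workaround for the non-shell files, under the same hostname-switch idea (the ~/config repo path here is just an example), is to keep per-host variants under version control and have .bashrc link the right one into place:
# e.g. in .bashrc: if a host-specific vimrc is tracked, link it to ~/.vimrc
if [ -f ~/config/vimrc.$(hostname -s) ]; then
    ln -sf ~/config/vimrc.$(hostname -s) ~/.vimrc
fi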
If you use git, you could define an "origin" repo to be the master and then do a clone on each computer you work on. You could use a branch for every computer to hold that computer's set of config files.
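A minimal sketch of that workflow (the repository URL and branch naming are placeholders):
# one-time setup on each machine
git clone ssh://yourserver/path/to/dotfiles.git ~/dotfiles
cd ~/dotfiles
git checkout -b $(hostname -s)   # per-machine branch for local tweaks

# later, pick up shared changes from the master branch
git fetch origin
git merge origin/master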
With CfEngine you can manage config files across machines and also do many more things!
The learning curve is maybe a bit high but worth it if you have to manage/update/maintain a pool of computers running linux regularly.
Easy. Use Dropbox for that:
http://www.nixtutor.com/linux/sync-config-files-across-multiple-computers-with-dropbox/
I use slack for a similar situation. slack allows definition of roles/subroles so you can manage files with small variation either through a cloned file or patch. The slack directory is then managed by git in my deployment.
Here are some dotfile managers:
homesick: Based on Ruby
homeshick: Same as first one, but without ruby dependency
dfm: Written in Perl
Git with branches for individual computers, with automated sync at login, seems like a good solution to me.
I've used etckeeper for versioning configurations, but I've never actually expanded to user configurations.
This kind of question comes up occasionally, and I've never seen a tool to handle this common use case, so I wrote a script that uses git and symlinks to manage these files.
See http://github.com/bstpierre/dotfiles
It is not perfect. There is currently a bug related to handling directories, and there is no support yet for variations across computers.
Before using any tool of this nature, make sure you have good backups!
I think what you want could be similar to what I've been doing...
Make a directory in your home folder called .host_configs/. This is version controlled; in my case it lives in a special folder on a central computer and I scp it down onto any new machine. Inside it, make a folder for every host that you want different configurations for. The folder for each host should be named after the short hostname of that machine. So in your git repo you have:
.host_configs/
homecomp1/
girlfriendcomp1/
workcomp1/
workcomp2/
In each host-specific folder, put the .vimrc, .irbrc, etc. configuration files for that specific box.
Also, in each host folder make a file called .[SHORT_HOST]_rc. For instance, if your machine is named "sane", have a file named .sane_rc. This file will contain the lines that would normally be in .bashrc but are unique to that host. For instance, if it's a Mac and it needs alias ls='ls -GF' instead of alias ls='ls --color=auto' (which works on most *nix machines for ls with colors), put that line in the .[SHORT_HOST]_rc for that machine, along with whatever special functions, declarations, etc. would normally go into .bashrc or .profile (or .zshrc, .tcshrc, as the case may be). So the version-controlled ~/.host_configs/ folder looks like:
.host_configs/
homecomp1/
.homecomp1_rc #special shell configs for this hostname
.vimrc #you know the rest
.irbrc
.Xresources
girlfriendcomp1/
.girlfriendcomp1_rc
.vimrc
.bubblebathrc
workcomp1/
.workcomp1_rc
.bashrc
.vimrc
workcomp2/
.workcomp2_rc
.bashrc
.vimrc
I use the same barebones $HOME/.bashrc (or ~/.tcshrc etc.) on all of my machines. I just take the basic one that comes with the distro in question and move all of the host-specific configuration into the .host_configs/[SHORT_HOST]/.[SHORT_HOST]_rc file.
Put this at the bottom (of $HOME/.bashrc):
export SHORT_HOST="sane"
for file in `find ~/.host_configs/$SHORT_HOST -name ".*"`
do
  ln -sf $file $HOME/`basename $file`
done
source ~/.${SHORT_HOST}_rc
(This finds all of the dot-files for the host and makes symlinks in home pointing into the ~/.host_configs/foo_host folder.)
Your dot files are in their normal location but they are symlinked into version control. The above also sources all of the lines in your .[SHORT_HOST]_rc file into .bashrc.
You can commit back to git from the ~/.host_configs/ folder whenever you have changes.
That's what it looks like in shell, which is probably all you need, but if you need other features, I would write something that uses the same principles (sourcing an external .rc file into .bashrc and symlinking all the config files to the structured version control folder) in something more versatile/less ugly than shell. So instead of the above in your .bashrc, there could be:
export SHORT_HOST="sane"
ruby ~/import_conf.rb $SHORT_HOST
...and write your import_conf.rb to do more complex conf management, like placing a specific configuration file in some directory besides home, or handling a config folder like .ssh/ or .subversion/. That's what I do; it's pretty elegant for me, but there may be better solutions. Dropbox with some creative symlinks is also an excellent idea, though you're relying on a third party and you need to be in a graphical environment. Also note there are inconsistencies between what you can do with symlinks + Dropbox on Linux and shortcuts + Dropbox on Windows, if you implement something that needs to play with Windows.
Now there is also vcsh
From the README:
vcsh - manage config files in $HOME via fake bare git repositories
[...]
vcsh allows you to have several git repositories, all maintaining their working trees in $HOME without clobbering each other. That, in turn, means you can have one repository per config set (zsh, vim, ssh, etc), picking and choosing which configs you want to use on which machine.
Works perfectly, but may be a bit daunting if you are not an experienced git user.
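A minimal sketch of how it is used (the repository name and remote URL are placeholders; check the vcsh documentation for the details):
# create one fake-bare repository per config set and track files in $HOME
vcsh init vim
vcsh vim add ~/.vimrc ~/.vim
vcsh vim commit -m 'initial vim config'
vcsh vim remote add origin git@example.com:vim-config.git
vcsh vim push -u origin master

# on another machine, pull in only the config sets you want
vcsh clone git@example.com:vim-config.git vim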
Most of these answers address syncing, but not how to tailor the files for a specific device. filetailor is an open-source Python program for this exact issue. Based on a YAML configuration file, it can make small changes to the files using device-specific variables or device-specific comments in the files. Then use another program such as Syncthing or Git to transfer the files.
For example, the following line would be commented out on every device except the one with hostname device1.
alias MYHOME='/home/dev1home/' #{filetailor device1}
Disclaimer: I had this same issue and made filetailor to solve it.

Is there an ftp plugin for gedit that will let me work locally?

I'm trying to switch from a Windows environment to Linux. I'm primarily a PHP developer, but I do know quite a bit about other languages such as CSS, XHTML and JavaScript. I need a way of editing my files locally because I work in a git repository and need to commit my saves. On Windows I used Aptana and PDT. I'd save my files, upload via Aptana, then commit my work with git.
I need to get a workflow going on my Linux machine now. If you know a better way to do this, let me know; however, my real question is: is there a plugin that allows gedit to upload files instead of working remotely?
git was designed for distributed development and works well as a mechanism for deploying code to a web server.
On your Linux PC, git clone your git repository URL. Edit and commit locally, and then git push the changes to the git repository. Then, if you have shell access on the server, use git pull to copy the changes to your server.
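In shell terms, assuming you have SSH access to the web server (the URLs and paths below are placeholders), the round trip is roughly:
# on the Linux PC: get a working copy, edit, commit, publish
git clone ssh://gitserver/path/to/repo.git ~/work/repo
cd ~/work/repo
# ...edit files...
git commit -am 'describe the change'
git push origin master

# on the web server (or via ssh from the PC): pull the published changes
ssh user@webserver 'cd /var/www/site && git pull origin master'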
To sync over FTP instead, you could set up a branch, ftpbranch, that corresponds to what is on the server, and then each time you want to sync ftpbranch with master:
filestoput=`git diff --name-only master ftpbranch`
Now upload the files:
for f in $filestoput; do curl --ftp-create-dirs -T "$f" "ftp://serverurl/$f"; done
Now update ftpbranch indicating these files have been copied to the server:
git checkout ftpbranch; git merge master; git checkout master
When using Linux, you can mount the FTP server onto a local folder; then opening and saving files from that folder will automatically download and upload them to the FTP server.
If you use Ubuntu, just click on Places > Connect to Server.... Choose FTP in the Service Type dropdown, fill in the required info, then don't forget to bookmark it.
After this, you can open the files directly in any text editor, not just gedit. I would recommend geany as a serious programming editor, because it has a lot of neat features, almost the same as Notepad++ on Windows.
But, since you are already using git, why not just use git push to upload your updates and git pull on the server to get them? I have long since stopped uploading manually to my server. Git does all the work for me, synchronizing things between servers. Any particular reason why you still need FTP?
