Linux (Red Hat): compare directories and copy over files that are different

I basically want rsync, but don't have the luxury of being able to install it.
But I need a way to deploy files from one server to another. I edit one or more files on one server and then need to copy every modified file to the other server by comparing which files differ (and I need to be able to exclude .htaccess files).
Does anyone know of an easy way to do this?
Thanks,
Scott

(I will assume that you have shell access to both servers)
You do not need to install rsync system-wide; you can install it in your home directory. First, get a copy of the rsync binary for your distribution:
You can extract it from the rsync RPM package using rpm2cpio and cpio
You can copy it from another RedHat installation
You can copy it from another Linux installation for the same platform - there is a strong possibility that it will work fine
Then you need to permanently modify the PATH environment variable so that the rsync command is found by your shell (see the sketch below). If you do that for your user account on both servers, you can use rsync normally without needing root privileges.
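A minimal sketch of that approach, assuming the rsync RPM was downloaded as rsync.rpm and that ~/bin is used as the private install location (both names are placeholders):
# Extract just the rsync binary from the RPM into the current directory
rpm2cpio rsync.rpm | cpio -idmv ./usr/bin/rsync
mkdir -p ~/bin && mv ./usr/bin/rsync ~/bin/
# Make the PATH change permanent for this user, then verify rsync is found
echo 'export PATH="$HOME/bin:$PATH"' >> ~/.bashrc
. ~/.bashrc
rsync --version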

If you can install rsync on even one of the servers, that's the minimum you need.
If not, the question is what tools do you currently have available? scp? sftp? ftp? ssh? telnet? find?
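If rsync really is unavailable but find, tar and ssh are, a hedged fallback sketch like the following could push only the files changed since the last deploy (the paths, hostname and timestamp file are placeholders, not from the question):
cd /var/www/site
# Send every regular file newer than the marker file, skipping .htaccess, as a tar stream over ssh
find . -type f -newer .last_deploy ! -name '.htaccess' -print0 \
  | tar --null -czf - -T - \
  | ssh user@otherserver 'tar -xzf - -C /var/www/site'
touch .last_deploy   # record this deploy for next time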

Related

debian packaging and package.rules files

I am working on moving machines from the RHEL world over to the Debian/Ubuntu world, and I am struggling a bit with a packaging problem. I am trying to build a package for Ubuntu 16.04.
I've got a very old pre-compiled application that can only listen through xinetd. I am creating a binary-only package similar to what this person was doing: I need my Debian rules file to simply copy files to their target. I just need to copy pre-compiled files into directories.
I have no problem getting files into /opt and /var/log; however, I have been struggling to get dpkg to copy the needed setup file into /etc/xinetd.d/.
So I have a debian/package.install file something like this:
opt/oldapplication-3.10/* opt/oldapplication-3.10/
var/log/* var/log/
etc/xinetd.d/oldapplication /etc/xinetd.d
The xinetd setup file never makes it into /etc/xinetd.d, and looking at the dpkg install with debugging enabled doesn't give me any hints. The file is definitely in the tarball; it simply never gets moved.
Looking through the different dh helper applications, I can't see anything that fits, and Google does nothing to illuminate the problem.
Do I have to simply move the file over in a postinst script? Is that the only way to solve this, or is there a more "Debian" way to do it by creating a file in the package's debian directory? Is there a more generic setup I should be doing to put files into /etc?
Thanks.
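For comparison only, a heavily hedged sketch of the postinst fallback the question mentions; the source path and the reload command are assumptions, not taken from the actual package:
#!/bin/sh
# debian/postinst (sketch): copy the xinetd config into place if the .install route fails,
# then ask xinetd to re-read its configuration. Paths here are illustrative.
set -e
if [ ! -e /etc/xinetd.d/oldapplication ]; then
    cp /opt/oldapplication-3.10/oldapplication.xinetd /etc/xinetd.d/oldapplication
fi
service xinetd reload || true
#DEBHELPER#
exit 0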

How to transfer a home-dir structure from one Linux PC to another Linux PC

How can I transfer, via tar, the whole /home tree with all the file attributes from one Linux PC to another? I know I could write it to a tar file, but I don't have much space on the target. Is there a way to pipe it with the tar command?
I would use rsync and ssh
Set up ssh
You need to install openssh-client on the receiving computer and openssh-server on the sending computer. See your distribution's documentation for help setting this up.
rsync files
Go to the second (receiving) computer and rsync the folder from the first PC:
rsync -av ipofpc1:/home/ /home/
There are a variety of ways, but in this case I'd recommend using rsync.
First you need to install an SSH server on the target computer if you don't already have one running. For example, on Debian/Ubuntu/Mint you'd do apt-get install openssh-server.
Then install rsync on the computer with the source folder. For example : apt-get install rsync.
And finally you can use rsync like this:
rsync -razh /home/ user@YOUR_OTHER_COMPUTER:/destination/path
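Since the question specifically asks about piping with tar, here is a rough sketch of that approach as well (the hostname is a placeholder, and it assumes root login over ssh on the target so that file ownership is preserved):
# Stream /home straight over ssh with no intermediate file; -p preserves permissions,
# and extracting as root preserves ownership of the other users' files
tar -C / -cpf - home | ssh root@targetpc 'tar -C / -xpf -'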

Linux unzip preserve case?

Working on a web site. A number of third party javascript libraries use mixed-case in their files and folders.
I am working on a windows system.
When ready to upload from my local windows XAMPP environment to my linux hosting, I use 7zip to create a zip file of my site. I use 7zip's -xr! feature to skip certain directories like my .git repository.
I FTP the resulting .zip file to my server and use the server's "unzip" function to explode it. All my files are there but they are all changed to lowercase!
This kills the website as the third party libraries that are mixed-case are no longer found.
I've tried unzip -C but that did not seem to do anything.
I also looked in the archive prior to uploading, and on Windows all the file-name cases are preserved.
I tried using GNU32's Windows tar, but the --exclude option is not letting me skip the .git directories.
I need some help in the form of:
How to use unzip in Linux such that it preserves case (googled until hairless, but no love found...)
How to use tar on windows such that it excludes particular directories
How to use something else to achieve my goal. I honestly don't care what it is... I'm downloading CYGWIN right now to see if it'll help at all. I may end up installing Linux in a virtual box just to try tar-gz from a virtual machine actually running linux but would REALLY rather avoid that hassle every time I want to pack up a pretty simple archive.
Zip works fine for packing, but unpacking is not kosher.
Use tar's --exclude-vcs option:
--exclude-vcs
exclude version control system directories
Example:
tar --exclude-vcs -czf foo.tar.gz foo
or for a *.tar.bz2 archive
tar --exclude-vcs -cjf foo.tar.bz2 foo
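If you also need to skip directories other than version-control metadata, tar's generic --exclude option takes a pattern, for example (the excluded directory name is just an illustration):
tar --exclude='foo/cache' --exclude-vcs -czf foo.tar.gz foo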
Try unzip -U file.zip; this might work if you have an old version of unzip. Otherwise, post the output of unzip -v and unzip -l file.zip.

Export SVN repository over FTP to a remote server

I'm using following command to export my repository to a local path:
svn export --force svn://localhost/repo_name /share/Web/projects/project_name
Is there any reasonably easy way (Linux newbie here) to do the same over the FTP protocol, to export the repository to a remote server?
The last parameter of svn export AFAIK has to be a local path, and AFAIK this command does not support giving destination paths in the form of URLs, like for example:
ftp://user:pass@server/path/
So, I think some script has to be involved here to do the job.
I have asked some people about that and was advised that the easiest way is to export the repository to a local path, transfer it to the FTP server, and then purge the local path. Unfortunately, I failed at the first step (exporting to a local path! :) So, the supporting question is whether it can be done on the fly, or whether it really has to be split into two steps: export + FTP transfer?
Someone also advised me to set up a local SVN client on the remote server and do a simple checkout / update from my repository. But that is a solution only if everything else fails, as I want to extract the pure repository structure, without the SVN metadata files I would get by going that way.
BTW: I'm using a QNAP TS-210, a simple NAS device with a very limited Linux on board, so many command-line tools, as well as any GUI, are not available to me.
EDIT: This is the second question in my "chain". Even if you help me succeed here, I won't be able to automate this job (as I intend to) without your help on the question "SVN: Force svn daemon to run under different user". Can someone also take a look there, please? Thank you!
Well, if you're using Linux, you should be able to mount an ftpfs. I believe there was a module in the Linux kernel for this. Then I think you would also need FUSE.
Basically, if you can mount an ftpfs, you can write your svn export directly to the mounted folder.
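For instance, a hedged sketch using curlftpfs, one common FUSE-based FTP client (the host, credentials and mount point are placeholders):
mkdir -p /mnt/remote-ftp
curlftpfs ftp://user:pass@server/path /mnt/remote-ftp        # mount the FTP share via FUSE
svn export --force svn://localhost/repo_name /mnt/remote-ftp/project_name
fusermount -u /mnt/remote-ftp                                # unmount when finished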
Not sure about FTP, but SSH would be a lot easier and should have better compression. An example of sending your repo over SSH may look like this:
svnadmin dump /path/to/repository | ssh -C username@servername 'svnadmin -q load /path/to/repository/on/server'
The URL where I found that info is Martin Ankerl's site.
[update]
Based on the comment from @trejder on the question, to do an export over SSH, my recommendation would be as follows:
First svn export to a folder locally, then use the following command:
cd && tar czvf - src | ssh example.com 'tar xzf -'
where src is the folder you exported to, and example.com is the server.
This will take the files in the source folder, tar and gzip them, send them over SSH, and extract them on the remote machine (into the remote user's home directory, since that is where the remote command runs).
I wrote this a while back - maybe it would be of some use here: exup

Run time installation directory of debian package contents

I have a Debian package that I built that contains a tarball of the files, a control file, and a postinst file. It's built using dpkg-deb, and it installs properly using dpkg.
The modification I would like to make is to have the installation directory of the files determined at runtime, based on an environment variable that will be set when dpkg -i is run on the deb file. I echo out the environment variable in the postinst script and I can see that it's set properly.
My questions:
1) Is it possible to dynamically determine the installation directory at runtime?
2) If it's possible, how would I go about it? I have read about the rules file and the mypackage.install files, but I don't know whether either of these would allow me to accomplish this.
I could hack it by copying the files to the target location in the postinst script, but I would prefer to do it the right way if possible.
Thanks in advance!
So this is what I found out about this problem over the past couple of weeks.
With prepackaged binaries, you can't build a Debian package with a destination directory dynamically determined at runtime. I believe this might be possible for a package built from source, where you can set the install directory with configure, but since these are embedded Ubuntu machines without make, I didn't pursue that option. I did work out a non-traditional method (a hack) for installing that did work: since Debian packages simply contain a tarball laid out relative to /, build your package relative to a directory under /tmp. In the postinst script you can then decide where to copy the files from that staging area into a permanent location.
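A minimal sketch of that postinst step, assuming a staging directory of /tmp/myapp-staging and an environment variable MYAPP_PREFIX (both names are made up for illustration):
#!/bin/sh
# postinst (sketch): relocate the staged files into a directory chosen at install time
set -e
TARGET="${MYAPP_PREFIX:-/opt/myapp}"      # assumed variable, with a fixed fallback
mkdir -p "$TARGET"
cp -a /tmp/myapp-staging/. "$TARGET"/     # copy everything, preserving attributes
exit 0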
I expected that after a reboot and the automatic deletion of the subdirectory under /tmp, dpkg might no longer know that the package existed. This wasn't a problem: when I ran 'dpkg -l myapp' it showed as still installed, and updating the package using dpkg/apt-get also worked without a hitch.
What I did find is that if you attempt to remove the package using 'dpkg -r myapp', dpkg tries to remove /tmp, which isn't good. However, /tmp isn't easily removed, so it never succeeded. Plus, in our situation we never remove packages; we simply upgrade them.
I eventually had to abandon the universal package because code differences in the sources meant I had to recompile per platform, but otherwise I would have left it this way, and it did work.
I tried using --instdir to change the install directory of the package, and it does relocate the files, but dpkg fails since its own files can't be found relative to the new instdir. Using --instdir is sort of like a chroot. I also tried --admindir and --root in various combinations to see if I could keep the dpkg system relative to / but relocate the installed files, but they didn't work either. I guess rpm has a relocate option that does this, but dpkg on Ubuntu does not.
You can also write a script that runs dpkg-deb six times with a different environment each time, generating six different packages. When you make a modification, you simply run your script, all six packages get generated, and you can install them on your machines, avoiding the postinst hacking!
Why not install to a standard location and simply use a postinst script to create symbolic links to the desired location? This is much cleaner and shouldn't break anything in dpkg -i.
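A hedged sketch of that symlink approach, again with made-up names (the fixed install location /opt/myapp and the MYAPP_PREFIX variable are assumptions):
#!/bin/sh
# postinst (sketch): the package installs to a fixed path; expose it at the desired location
set -e
DESIRED="${MYAPP_PREFIX:-/srv/myapp}"     # assumed environment variable, with a default
mkdir -p "$(dirname "$DESIRED")"
ln -sfn /opt/myapp "$DESIRED"             # point the desired location at the real files
exit 0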
