How to copy files faster from a Linux host machine to a Windows Server client using Ansible?

My aim is to copy from machine A (Ubuntu) to remote server B (Windows Server 2012) using Ansible's copy mechanism. I can ping the Windows server and can even copy a small folder from Ubuntu to the server, but when the folder gets big it takes very long to copy and sometimes fails entirely. I am using the following:
- name: Copy file
  win_copy:
    src: /service/test.zip
    dest: D:/test/test.zip
test.zip is around 300 MB, so win_copy is not solving my purpose. Could you suggest a better option in this case?

I've had this problem and just wrote a PowerShell script to download the file directly to a known location on the target. I deploy the script to the target using win_copy or win_template (if you need to do substitutions) and then call it using win_command.
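A minimal sketch of that approach, assuming the file is reachable over HTTP from the Windows host (for example, served from the Ubuntu machine with python3 -m http.server or nginx); the URL, paths and script name here are placeholders:

# get-file.ps1 - pull the archive straight down to the target
$ProgressPreference = 'SilentlyContinue'   # suppressing the progress bar speeds up large downloads
$url  = 'http://ubuntu-host:8000/service/test.zip'   # placeholder source URL
$dest = 'D:\test\test.zip'
Invoke-WebRequest -Uri $url -OutFile $dest

Deploy get-file.ps1 with win_copy, then run it with win_command, e.g. powershell.exe -ExecutionPolicy Bypass -File C:\temp\get-file.ps1.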

Related

copy only new files to remote host after write is finished

I have a Java application which is writing images to a directory.
I want to copy these images with an external utility (not from the Java application directly).
I can use scp and/or rsync or something else (not sure which to pick) to copy at, say, a 5-second interval (probably via cron).
But the main problem is knowing when the write has finished.
For example, if the application is still writing when the scheduled copy fires, the copied image will be corrupted.
So which utility should I go for, and how can I make sure that the copied image is intact?
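The thread doesn't give an answer, but one common technique (my suggestion, not from the original posts) is to let the kernel signal when the writer closes the file: inotifywait from inotify-tools blocks until a close_write event, after which the image is complete and safe to ship with rsync. A sketch, with placeholder paths:

#!/bin/sh
# Copy each image only after the writer has closed it; close_write fires
# when the last write handle on the file is closed.
WATCH_DIR=/var/images           # placeholder
DEST=user@backup:/srv/images/   # placeholder
inotifywait -m -e close_write --format '%w%f' "$WATCH_DIR" |
while read -r file; do
    rsync -a "$file" "$DEST"
done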

How to keep a history of files changed on an rsync server

How can I instruct the rsync server to keep a copy of the old versions of the files that were updated?
Background info:
I have a simple RSYNC server running on Linux which I am using as a backup of a large file system (many TB). Let's call it the backup server.
On the source server, we run daily:
$ rsync -avzc /local/folder user@backup_server::remote_folder
In theory, no files should change on the source server; we should only receive new files. Nonetheless, some updates might be legitimate (very, very seldom). If rsync detects a change, it overwrites the old version of the file on the backup server with the new one. Now, here is the problem: if the change was a mistake, I lose the data and have no way to recover it.
Ideally, I'd like the rsync server to keep a backup of the replaced files. Is there a way to configure that?
My backups are local to the same machine (but on a different drive, mounted at /backup/).
I use --backup-dir=/backup/backups-`date +%F`/, but then it starts nesting things rather than leaving a flat set of backups-yyyy-mm-dd/ directories in the /backup/ folder.
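One way around the nesting (my reading, not confirmed in the thread) is to keep the dated backup directories outside the tree rsync is mirroring, so later runs never treat earlier backups as content to copy. A sketch with placeholder paths:

#!/bin/sh
# Mirror into /backup/current/ and divert replaced files into a dated
# directory that lives OUTSIDE the mirrored tree.
DATED="/backup/history/backups-$(date +%F)"
rsync -avzc --backup --backup-dir="$DATED" /local/folder/ /backup/current/

If the backup directory has to stay inside /backup/, adding --exclude 'backups-*' should stop it from being swept up on the next run.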
If someone has a similar issue, there is an easy solution:
run a simple cron job that changes the access rights on the destination computer.

Using SAS X Command with PCFILES server

I have some SAS code that writes out to a specific sheet in an Excel workbook. The other sheets have formulas that reference this sheet, so the workbook is basically a template that gets populated when my code is run.
I want to be able to run the code multiple times and end up with a different version of the template each time. I'm thinking the easiest way to achieve this is to write out to the template and then use the x command or something similar to copy the workbook and rename it.
SAS is on a Linux server and I use a PC Files Server to write to Excel. How do I set up the x command to copy the file and change its name on the remote server?
Sorry for the late answer, but I just encountered this myself and can provide a solution.
Can you access files on the SAS server from Windows (via an SMB share or similar)? That is the easiest way to do this. If you can't, it is typically very easy to set up Samba on Linux.
First, store the Excel template on the SAS server where it can be seen from your Windows computer (the one with the PC Files Server on it).
Let's say the file is at '/home/files/template.xlsx' on the SAS server, and that directory is shared and accessible from your Windows server as '\\linuxservername\files\template.xlsx' (or '\\192.168.1.5\files\template.xlsx' if you are using IP addresses).
Now you just have to use the SMB path for the PCFILES statement and the local Linux path for the x command. Something like:
x 'cp /home/files/template.xlsx /home/files/output.xlsx';
libname output PCFILES
    path='\\linuxservername\files\template.xlsx'
    server='PCFilesServerName'
    port=1234;
Note that when feeding a path to PCFILES you use the network address (since your Windows box has to be able to read and write it), and when you issue the copy command you use the local address.
You may also be able to use the network path for everything, depending on your system configuration. When I tried it, I could not make that work since the Unix server did not like it (it shares the folder over SMB, but it couldn't access it from its own command line).
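Putting the pieces together, a sketch of the full sequence under the same assumed paths and server name (the sheet and source dataset names are placeholders); clearing the libref before the copy releases SAS's lock on the workbook:

libname xl PCFILES
    path='\\linuxservername\files\template.xlsx'
    server='PCFilesServerName'
    port=1234;

/* populate the data sheet; DataSheet and work.results are placeholders */
data xl.DataSheet;
    set work.results;
run;

libname xl clear;   /* release the workbook before copying it */

/* copy under a dated name using the local Linux path */
x "cp /home/files/template.xlsx /home/files/output_%sysfunc(today(),yymmddn8.).xlsx";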

Setting up these Tortoise SVN commit hooks

I'd like to set up a commit hook that will subsequently upload source files from a Windows environment to a Linux server, which is not the same as the Linux server running SVN.
I'm familiar with setting up client side hooks, but not sure what the script should be like.
I'm not really sure of the easiest way to go about this. I'm thinking a Windows script that runs a copy command of some sort. My entire group would use it, so the script would have to live on a Windows network share. Ideas?
not sure what the script should be like
A client-side hook (like a server-side one) is any program that can be executed on the host in question. The single difference between the two types is where the program runs: TortoiseSVN's client-side hooks run on the developer's machine, against the working copy.
Your script must be a non-interactive set of operations that performs the needed work (ssh or ftp to the target host, upload the files). I can't see any problem here, except one: FTPing a bundle of random files is always a big headache.
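As a sketch (not from the original answer), TortoiseSVN's post-commit client hook can be pointed at a batch file like the one below. It assumes PuTTY's pscp.exe on PATH with key authentication already set up; the host, user and paths are placeholders. TortoiseSVN passes arguments to the hook (affected paths, revision and so on; see its documentation for the exact order), which this sketch ignores in favour of mirroring the whole working copy:

@echo off
rem post-commit.bat - upload the working copy after each commit
rem (placeholder host, user and paths; pscp -batch disables prompts)
pscp -r -batch "C:\work\myproject\src" deploy@linuxserver:/var/www/staging/
if errorlevel 1 echo Upload failed.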

How to do version control via ftp?

I have a web dev. client using a shared host that doesn't allow shell access, and thus no access to SVN, Git, etc. I've tried to convince him to move to one of the many cheap options that allow it, but he won't do it. If I use version control on my staging server, are there any tools that will allow me to replicate the changes to production via ftp? Locally I have both mac & windows, the staging server is linux, so something that works on any of those platforms....
Using your Linux staging server, you could keep a separate checked-out copy that you use specifically for that host, and then use a utility to mirror that directory to the host server.
LFTP is useful for this kind of thing. It's available for most Linux distributions and includes a 'mirror' function:
Mirror specified source directory to local target directory. If target directory ends with a slash, the source base name is appended to target directory name. Source and/or target can be URLs pointing to directories.
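For example (a sketch; the credentials, host and paths are placeholders), the mirror can be run non-interactively in reverse mode to push the checkout up to the host:

# push the local checked-out copy to the FTP host in one shot
lftp -u deployuser,'secret' ftp://ftp.example.com \
     -e "mirror -R /srv/staging/checkout /public_html; quit"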
Some kind of FTP mirror software is what you need. I haven't tested it, but a quick search gave me this Java application. You could run that over your up-to-date checked-out repository.
A good tool for keeping an SVN repo and an FTP copy in sync is svn2web. May I suggest creating a separate branch for the production copy and doing merges to that branch for uploading to the production server.
You probably need to write a batch file that can do the following (a sketch follows the list):
Export the SVN repository
Upload the exported files to your Linux server via FTP
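A sketch of such a script run from the staging side (repository URL, credentials and paths are placeholders):

#!/bin/sh
# Export a clean, .svn-free copy of the repository, then mirror it up.
svn export --force http://svnserver/repo/trunk /tmp/site-export
lftp -u deployuser,'secret' ftp://ftp.example.com \
     -e "mirror -R --only-newer /tmp/site-export /public_html; quit"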
Short of finding or implementing some FUSE-based copy-on-write file system that supports immutable versions, I'd just find another (more developer-friendly) host. As far as I know, no FTP server supports this natively, nor can I think of any elegant means of putting it in place with script hackery.
I could be wrong.
This question (and answer) really helped me just now as I implemented version control via gitolite on a separate server and lftp.
Here’s what I did:
Set up gitolite on my ubuntu staging server
created base repo (i.e. foo.git) on staging server
cloned foo.git into working directory on staging server
cloned foo.git into working directory on local development machine
Developed locally
Pushed changes to foo.git repo on staging server
On staging server, logged into working directory, and pulled in changes from foo.git
lftp-ed into the shared host (like you mention above)
Once in the shared host, ran:
mirror -R --only-newer --delete --parallel=10 /source/directory/ /target/directory
Notes on the mirror command options:
-R - this pushes the source/directory to the target/directory (without it, mirror pulls from target to source; think reverse).
--only-newer - without this option, even if you only changed one file, the mirror command will send all the files in the source directory over to the target directory; with it, only the changed (newer) files are transferred over the wire.
--delete - deletes files that are no longer in the source directory but still in the target directory. One of my pushes involved deleting expired assets; without this option, the same files would have stayed put on my shared host after executing the mirror command.
--parallel=10 - transfers 10 files at once (instead of 1 by default). This made the process much faster.
While this is what worked for me, I'm sure there are ways to improve on it. I was grateful for this question and thought I'd share my experience.
Rsync will do this, though note that it doesn't speak plain FTP itself: it needs either SSH access or an rsync daemon on the remote end. You probably already have it installed if you're on a Unix-like system.
