Native rsync library?

Do there exist any good rsync libraries that implement:
The rsync algorithm and
The rsync protocol
Such that one could use the library to build the rsync tool itself? (I want my application to be compatible with a normal rsync server or normal rsync over ssh.)

For the rsync algorithm:
rsync bundles zlib as part of its source code. Inside it, the Adler-32 checksum and the inflate and deflate (compression/decompression) algorithms are implemented.
Also check this, though I haven't used it myself:
http://librsync.sourceforge.net/
For the rsync protocol:
I highly doubt there is a library for the rsync protocol; as far as I know, none exists.
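For the algorithm side, the key trick is the weak rolling checksum that lets rsync slide a block-sized window over a file in O(1) per byte. A minimal sketch of that checksum (based on the scheme described in the rsync paper; this is illustrative only, not librsync's actual API):

```python
def weak_checksum(block: bytes, mod: int = 1 << 16) -> int:
    """Compute the two-part weak checksum of one block:
    a = plain byte sum, b = position-weighted byte sum."""
    a = sum(block) % mod
    b = sum((len(block) - i) * byte for i, byte in enumerate(block)) % mod
    return (b << 16) | a

def roll(checksum: int, old: int, new: int, blocklen: int,
         mod: int = 1 << 16) -> int:
    """Slide the checksum window one byte in O(1): drop `old`
    (the byte leaving the window) and take in `new`."""
    a = checksum & 0xFFFF
    b = checksum >> 16
    a = (a - old + new) % mod
    b = (b - blocklen * old + a) % mod
    return (b << 16) | a

data = b"hello world!"
n = 4
c = weak_checksum(data[:n])
c = roll(c, data[0], data[n], n)       # window moves from [0:4] to [1:5]
assert c == weak_checksum(data[1:n + 1])
```

The rolling update is what makes searching for matching blocks at every byte offset affordable; the strong (MD4/MD5) hash is only computed when the weak checksum matches.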

pack the nodejs and javascript code to one execute file

I want to pack Node.js (including modules installed via npm) and my JavaScript code into one executable file for different platforms (Windows, OS X, Linux).
Is it possible, or is there any solution?
Your comments are welcome.
From my understanding, you can't really create a single executable file for multiple platforms. Each platform has its own packaging format for binary executables. What you can do is create an x.tar.gz file and expand it on your target platform. I haven't done it myself, but theoretically it's possible. Here is an example (assuming you're using GNU tar on all your platforms):
To pack it, do:
tar cvzf nodeproject.tar.gz nodeproject
To expand, do:
tar xvzf nodeproject.tar.gz

access video file information using standard Unix utilities

I want to access video file information (specifically, the video horizontal and vertical dimensions) in the Bash terminal of Scientific Linux on a system over which I do not have root privileges. The setup is conservative and does not feature the modern utilities, such as exiftool and avprobe, that would be used to do this. What would be a way of accessing this information using standard Unix utilities or some other means likely to work on a conservative Linux setup? To be specific, I am looking for something such as the following:
<utility> video1.mp4
1280x720
Thanks for any ideas!
How to build and run typical open source software from source without root
Even if you don't have root, provided that:
you can at least use the compiler and related tools
you can download source code
you don't need too many strange libraries
then download the source code for your tool of choice and install it into $HOME/opt/somedir.
For example, for avprobe you could probably download the last stable source release, then build it like this:
tar xzf libav.....gz
cd libav.....
./configure --prefix=$HOME/opt/libav
make
make install
then run it as
$HOME/opt/libav/bin/avprobe
You may or may not need to tweak the value of LD_LIBRARY_PATH or various other things.
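For example, if the freshly built avprobe fails to start because it cannot find its shared libraries, prepending the private lib directory to the loader path usually helps (a sketch; the paths assume the --prefix used above):

```shell
# prepend the private lib dir; keep any existing LD_LIBRARY_PATH
prefix="$HOME/opt/libav"
export LD_LIBRARY_PATH="$prefix/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
# then invoke the locally installed binary, e.g.:
#   "$prefix/bin/avprobe" video1.mp4
```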

Linux unzip preserve case?

Working on a web site. A number of third party javascript libraries use mixed-case in their files and folders.
I am working on a windows system.
When ready to upload from my local windows XAMPP environment to my linux hosting, I use 7zip to create a zip file of my site. I use 7zip's -xr! feature to skip certain directories like my .git repository.
I FTP the resulting .zip file to my server and use the server's "unzip" function to explode it. All my files are there but they are all changed to lowercase!
This kills the website as the third party libraries that are mixed-case are no longer found.
I've tried unzip -C but that did not seem to do anything.
I also looked in the archive prior to uploading, and on Windows all the file name cases are preserved.
I tried using GNU32's Windows tar, but the --exclude function is not allowing me to skip the .git directories.
I need some help in the form of:
How to use unzip in Linux such that it preserves case (googled until hairless, but no love found...)
How to use tar on windows such that it excludes particular directories
How to use something else to achieve my goal. I honestly don't care what it is... I'm downloading CYGWIN right now to see if it'll help at all. I may end up installing Linux in a virtual box just to try tar-gz from a virtual machine actually running linux but would REALLY rather avoid that hassle every time I want to pack up a pretty simple archive.
Zip works fine for packing, but unpacking is not kosher.
Use tar's --exclude-vcs option:
--exclude-vcs
exclude version control system directories
Example:
tar --exclude-vcs czf foo.tar.gz foo
or for a *.tar.bz2 archive
tar --exclude-vcs cjf foo.tar.bz2 foo
Try unzip -U file.zip; this might work if you have an old version of unzip. Otherwise, post the output of unzip -v and unzip -l file.zip.
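If the server's unzip keeps lowercasing names no matter which flags you try, one workaround (a sketch, assuming a Python interpreter is available on the host) is to unpack with Python's standard zipfile module, which writes entry names exactly as stored in the archive:

```python
import zipfile

def extract_preserving_case(archive: str, dest: str = ".") -> None:
    """Extract a zip archive; entry names are written exactly as
    stored, so mixed-case file and directory names survive."""
    with zipfile.ZipFile(archive) as zf:
        zf.extractall(dest)

# one-liner equivalent from a shell:
#   python -c "import zipfile; zipfile.ZipFile('site.zip').extractall()"
```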

Export SVN repository over FTP to a remote server

I'm using the following command to export my repository to a local path:
svn export --force svn://localhost/repo_name /share/Web/projects/project_name
Is there any quite easy (Linux newbie here) way to do the same over the FTP protocol, to export the repository to a remote server?
The last parameter of svn export AFAIK has to be a local path, and AFAIK this command does not support giving paths in the form of URLs, like for example:
ftp://user:pass@server:path/
So I think some script should be employed here to do the job.
I have asked some people about that and was advised that the easiest way is to export the repository to a local path, transfer it to an FTP server and then purge the local path. Unfortunately, I failed after the first step (export to a local path! :) So the supplementary question is whether it can be done on the fly, or whether it really has to be split into two steps: export + FTP transfer.
Someone also advised me to set up a local SVN client on the remote server and do a simple checkout / update from my repository. But this is a solution only if everything else fails, as I want to extract the pure repository structure, without the SVN files I would get by going that way.
BTW: I'm using a QNAP TS-210, a simple NAS device, with a very limited Linux on board. So many command-line commands, as well as GUI tools, are not available to me.
EDIT: This is the second question in my "chain". Even if you help me to succeed here, I won't be able to automate this job (as I'm willing to) without your help on the question "SVN: Force svn daemon to run under different user". Can someone also take a look there, please? Thank you!
Well, if you're using Linux, you should be able to mount an ftpfs. I believe there was a module in the Linux kernel for this. Then I think you would also need FUSE.
Basically, if you can mount an ftpfs, you can write your svn export directly to the mounted folder.
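If mounting an ftpfs is not an option on the NAS, the two steps (local export, then FTP transfer) can also be scripted wherever a Python interpreter is available. A sketch using the standard ftplib module; the host, credentials, and paths in the usage comment are hypothetical:

```python
import os
from ftplib import FTP

def upload_tree(ftp: FTP, local_root: str, remote_root: str) -> None:
    """Recreate local_root's directory tree on the server and
    upload every file with STOR."""
    for dirpath, dirnames, filenames in os.walk(local_root):
        rel = os.path.relpath(dirpath, local_root)
        remote_dir = (remote_root if rel == "."
                      else remote_root + "/" + rel.replace(os.sep, "/"))
        try:
            ftp.mkd(remote_dir)
        except Exception:
            pass  # directory may already exist
        for name in filenames:
            with open(os.path.join(dirpath, name), "rb") as fh:
                ftp.storbinary("STOR %s/%s" % (remote_dir, name), fh)

# usage (hypothetical server and credentials):
# ftp = FTP("server")
# ftp.login("user", "pass")
# upload_tree(ftp, "/share/Web/projects/project_name", "/public_html/project_name")
# ftp.quit()
```

Run svn export to a temporary folder first, call upload_tree on it, then remove the folder.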
Not sure about FTP, but SSH would be a lot easier and should have better compression. An example of sending your repo over SSH may look like:
svnadmin dump /path/to/repository |ssh -C username@servername 'svnadmin -q load /path/to/repository/on/server'
The URL where I found that info was Martin Ankerl's site.
[update]
Based on the comment from @trejder on the question, to do an export over SSH, my recommendation would be as follows:
svn export to a folder locally, then use the following command:
cd && tar czv src | ssh example.com 'tar xz'
where src is the folder you exported to, and example.com is the server.
This will take the files in the source folder, tar and gzip them, and send them over SSH; on the remote end, it extracts the files directly onto the machine.
I wrote this a while back - maybe it would be of some use here: exup

Customize Gnome Archive Manager 7z commands

Archive Manager + Nautilus is a very useful combination for any work with archives.
If you install the p7zip-full package, Archive Manager can work with 7z archives.
But Archive Manager uses its default settings for compressing, which is very bad.
Classical example with javadoc:
Download it from http://www.oracle.com/technetwork/java/javase/downloads/index.html
unzip jdk-6u23-docs.zip
mv docs javadoc
7z a -t7z -m0=lzma -ms=on javadoc.7z javadoc
du -chb javadoc.7z
24791075 javadoc.7z
But from man 7z and from LzmaLib.h we know that the best compression is -mx=9 -mfb=273 -md=64m.
Let's try:
7z a -t7z -m0=lzma -mx=9 -mfb=273 -md=64m -ms=on javadoc.7z javadoc
du -chb javadoc.7z
21308619 javadoc.7z
This is really better!
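For scale, the two archive sizes above work out to roughly a 14% reduction:

```python
default = 24791075  # bytes with Archive Manager's default settings
tuned = 21308619    # bytes with -mx=9 -mfb=273 -md=64m
saving = (default - tuned) / default
print("{:.1%}".format(saving))  # about 14% smaller
```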
Question:
How to make Archive manager to use custom 7z command as default?
You'll get a faster answer at Super User for questions like this one.
Looking at the program, I discovered that it was File Roller and that the compression parameters were in an XML file. The manual mentioned nothing about configuring the compression level. Finally, I found this information with Google (at the bottom of the page):
Veikk0 wrote on the 24 Jul 10 at 20:17
In my opinion this should get more attention. Creating archives can be frustrating and difficult at the moment, mostly because to change the compression level you have to:
Open gconf-editor (Alt+F2 or from a terminal).
Navigate to /apps/file-roller/general
Manually edit the key called compression_level to very_fast, fast, normal or maximum.
Create your archive with file-roller.
Repeat if you want to create another archive with a different compression level.
Furthermore, there's a bug for this: Bug 450019 - compression level
On Trisquel 6.0/Ubuntu 12.04, it's dconf-editor, and the schema is org.gnome.FileRoller.General.
The best compression with 7-zip can be achieved with
7zr a -mx=9 OUTPUT.7z INPUT
which produces slightly smaller files than the "maximum" compression level of File Roller, because File Roller uses the -m0=lzma2 parameter, which is no longer beneficial as of 7-zip version 9.20.
