I need to install small programs I do not fully trust.
Therefore I would like to monitor the filesystem for changes, to see whether the installer places files it is not supposed to or edits existing ones.
Since I want to monitor all folders and files, I thought of using something similar to rsync - but is there an alternative that only watches for changes?
Would this approach guarantee that I catch everything the software changes? Or are there some kind of "registry entries" or configuration changes I could miss?
I would suggest you use some kind of sandbox (probably the most straightforward way nowadays is to use Docker).
You could use Git to track all the changes made inside the sandbox/container:
Initialize a git repo in the root dir
Add all files and commit as the base version
Execute the install script you do not trust
Running git status will then show you all the changes that were made during installation.
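A minimal sketch of those steps, assuming a stock debian image and an installer named install.sh (both placeholders):

    # start a throwaway container and get a shell inside it
    docker run -it --rm debian bash

    # inside the container: baseline the filesystem with git
    apt-get update && apt-get install -y git
    cd /
    printf 'proc/\nsys/\ndev/\nrun/\ntmp/\n' > .gitignore   # skip virtual filesystems
    git init
    git config --global user.email you@example.com && git config --global user.name you
    git add -A
    git commit -m "base system before install"

    # run the untrusted installer, then diff against the baseline
    ./install.sh
    git status

Keep in mind that git only sees the container's filesystem, so anything the installer does outside of it (network calls, for instance) is out of scope.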
I would like to clean my local maven repository, but keep the last y snapshot versions for each artifact.
I found this script but it works by date.
I think I could adapt it to count files instead, but I'd like to know if such a thing could be done directly with a Linux find command.
Any ideas?
If you have an in-house repository manager it's probably simpler to clear your local repository and let it populate itself again over time.
If you do not have a repository manager, get one, even if it is just for yourself. I use Nexus, but there are alternatives. Nexus lets you decide how many snapshots to keep and/or for how long. In the end it's going to be easier to manage artifact lifetimes with Nexus than to devise complicated scripts.
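That said, if you really want the find-based approach you asked about, here's an untested sketch (GNU find assumed, standard ~/.m2 layout, no spaces in paths) that keeps the newest N snapshot version directories per artifact:

    N=2   # snapshots to keep per artifact
    find ~/.m2/repository -type d -name '*-SNAPSHOT' -printf '%h\n' | sort -u |
    while read -r artifact; do
        # list snapshot dirs newest-first, drop the first N, delete the rest
        ls -1dt "$artifact"/*-SNAPSHOT | tail -n +"$((N + 1))" | xargs -r rm -rf
    done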
I am searching for a solution to automatically merge files on upload.
To be more precise: we work in small groups on web development, working remotely on the same folder on our Debian server. The problem is that up to three people often need to write to the same PHP file, and at the moment we try to coordinate when each person is allowed to work on it.
So my idea was to find a Subversion-like solution that merges automatically every time we save a file via sshfs.
You should use version control. Here are some options. Which one you should use depends on a variety of factors.
Mercurial
Git
Subversion
You can then have the server your site is on pull from the repository.
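If you pick Git, one common way to do that pull step (a sketch; the paths and branch name are placeholders) is a post-receive hook in a bare repository on the server:

    #!/bin/sh
    # hooks/post-receive in the bare repo the team pushes to (make it executable):
    # check out the latest master into the web root on every push
    GIT_WORK_TREE=/var/www/site git checkout -f master

Subversion and Mercurial have equivalent hook mechanisms if you go with one of those instead.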
My Java project depends on a couple of third-party libraries, some of which are under active development that I would like to track.
I would like to be notified (by email, desktop popup, or otherwise) when changes are committed to the remote SVN repo, so I can examine their impact etc.
I looked at svnmailer, but it seems to require the repo to be local (I think?).
I also found some Windows tools that do the job, but I am running a Linux desktop, so no go there.
Worst case, I can write a cron script to poll for remote changes using the command-line tools, but I would prefer an existing tool.
Sounds like a good use for a continuous integration server. Tools like CruiseControl or Hudson are designed for this use case - the whole point of them is to check your source control regularly, retrieve any changes, build the project, and notify someone. In this case it sounds like you don't even need to build the project, just send the email.
If you don't already have a CI server this might seem like overkill, but I bet once you've got one set up you'll find yourself using it again.
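If you do end up with the cron fallback you mentioned, a minimal poller could look like this sketch (the URL and email address are placeholders):

    #!/bin/sh
    # compare the remote HEAD revision to the last one we saw,
    # and mail the new log entries if there are any
    URL=http://svn.example.com/repo/trunk
    STATE=$HOME/.svnwatch-rev
    LAST=$(cat "$STATE" 2>/dev/null || echo 0)
    HEAD=$(svn info "$URL" | awk '/^Revision:/ {print $2}')
    if [ -n "$HEAD" ] && [ "$HEAD" -gt "$LAST" ]; then
        svn log -r "$((LAST + 1)):$HEAD" "$URL" |
            mail -s "svn changes in $URL" you@example.com
        echo "$HEAD" > "$STATE"
    fi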
I have 3 Linux machines, and want some way to keep the dotfiles in their home directories in sync. Some files, like .vimrc, are the same across all 3 machines, and some are unique to each machine.
I've used SVN before, but all the buzz about DVCSs makes me think I should try one - is there a particular one that would work best with this? Or should I stick with SVN?
I've had this problem for years, and I don't think version control is necessarily the right way to go. I've had good success with the Unison file synchronizer, which is designed for the express purpose of maintaining consistent home directories on two machines. I'm currently managing seven replicas with Unison; the details are a bit tricky, but it is a great tool, and if you start with two machines you will be extremely pleased.
The key difference between Unison and a VCS is that Unison is willing to delay dealing with conflicts that have to be merged. Plus it gets all the defaults right. And it is fast: I use it daily, over a DSL line, to synchronize about 40GB of data.
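For reference, a basic two-machine invocation looks roughly like this (the hostname and paths are placeholders; a real setup would use a profile file with many ignore rules):

    # sync $HOME with the same directory on another machine over ssh,
    # skipping caches; unison prompts before propagating anything non-obvious
    unison /home/you ssh://otherhost//home/you -ignore 'Name .cache'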
Any DVCS would likely work fine. My favorite is Bazaar. It would be easiest to keep your config files in .config, version that, and then symlink as appropriate.
A benefit of DVCS is that you can version the per-machine config files as well, without interfering with versioning global configs.
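For example (the file names here are just an illustration):

    # version ~/.config with bzr, then symlink individual dotfiles into $HOME
    cd ~/.config
    bzr init && bzr add && bzr commit -m "initial import"
    ln -s ~/.config/bashrc ~/.bashrc
    ln -s ~/.config/vimrc ~/.vimrc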
I've had the same problem, and I built a tool on top of Subversion that adds permission, ownership, and SELinux-context tracking, keeps the .svn directories out of the actually versioned trees, and adds a concept of layers: you can, for example, track all your development-related config and check it out only on the machines you use for development.
This has helped me organize my settings much better across the 50+ machines I log into.
Here's the project page. It's still a little rough around the edges, but we also use it at work to version system configuration for our 60+ servers.
In general, any version control system that uses some sort of metadata files to track things is going to cause you pain when you actually use it.
Version control software isn't really great for home directories. Worse, some software doesn't really like the .svn folders or starts to interpret their contents. You could of course try to fix this with some very complex mirroring setup, but that's hard.
Here's a Mozilla developer who's tried to do this: Version controlling my home dir; there are a couple of suggestions in the comments.
git's or Mercurial's cheap branching would work great for this situation. I started with Mercurial because it is simpler, but have subsequently moved to git.
One way to handle this very flexibly is to keep a build directory under revision control, rather than trying to svn your actual home directory (which has its own issues).
So inside it you keep a structure like:
/home/you/code/dotfiles
/home/you/code/dotfiles/dotbashrc
/home/you/code/dotfiles/dotemacs
...
/home/you/code/dotfiles/makefile
The makefile can then contain logic for specializing files (or not) - see the sketch below.
This might be heavier than you need, but if your actual setup is complex (I've done this across 3 or 4 different Unices at a time), then it's worth doing something like this.
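Here is the specializing logic sketched as a shell script for brevity (a makefile version would do the same thing per target; the host-specific fragment names like dotbashrc.myhost are an assumption):

    # install shared dotfiles, appending a host-specific fragment if one exists
    host=$(hostname -s)
    for f in dotbashrc dotemacs; do
        target="$HOME/.${f#dot}"            # dotbashrc -> ~/.bashrc
        cp "$f" "$target"
        # append a fragment such as dotbashrc.myhost, if present
        [ -f "$f.$host" ] && cat "$f.$host" >> "$target"
    done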
I use git for this. So far, I have been able to keep the home directories on several machines synchronized, with no need for branching and merging. Instead, I use git rebase. Conflicts so far have been few and far between and easy to resolve.
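A sketch of that rebase-based sync cycle (assuming a remote named origin and a branch named master):

    # pull in what the other machines pushed, replaying local commits on top
    git fetch origin
    git rebase origin/master
    # resolve any conflicts, then publish
    git push origin master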
I keep files that need different contents on each machine out of revision control by listing them in .gitignore.
I keep configuration files for the following tools in git:
various shells
emacs and applications, e.g.
gnus
BBDB
emacs-w3m
mutt
screen
various utilities and scripts
I keep notes and such in a subdirectory which has its own git repository.
I would suggest looking into etckeeper if you haven't already. It's designed for versioning configuration files in /etc using a version control system:
etckeeper is a collection of tools to let /etc be stored in a git, mercurial, darcs, or bzr repository. It hooks into apt (and other package managers including yum and pacman-g2) to automatically commit changes made to /etc during package upgrades. It tracks file metadata that revision control systems do not normally support, but that is important for /etc, such as the permissions of /etc/shadow. It's quite modular and configurable, while also being simple to use if you understand the basics of working with revision control.
Although it's designed for /etc, I think it would probably also work well (perhaps with some adaptation) for home directories, since the basic needs are the same.
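Getting started is short; on a Debian-style system it's roughly this (a sketch, the package manager aside):

    # install etckeeper, put /etc under version control, take a first snapshot
    sudo apt-get install etckeeper
    sudo etckeeper init
    sudo etckeeper commit "initial import of /etc"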
I know this is an old thread, but I found it while searching for dotfiles.
My current system uses Subversion. The key thing I did was check out the working copy into ~/.svnhome/ (in hindsight I should have called it .dotfiles or something more generic). I then create symlinks into home for the files I actually use on each computer. For example, my .procmail and .spamassassin folders are only needed on the mail server, so I don't link those on my home server.
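In practice the per-machine setup is just a checkout plus the links that machine needs (the repo URL here is a placeholder):

    # one-time setup per machine
    svn checkout http://svn.example.com/dotfiles ~/.svnhome
    ln -s ~/.svnhome/.vimrc ~/.vimrc
    ln -s ~/.svnhome/.procmail ~/.procmail    # only on the mail server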
The only file with differences between machines is .bashrc, which has some extra lines on my Mac for MacPorts. So at the bottom of .bashrc I have it check whether .bashrc_local exists and source that.
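The stanza at the bottom of .bashrc amounts to something like:

    # pull in machine-specific settings if present (e.g. MacPorts paths)
    if [ -f "$HOME/.bashrc_local" ]; then
        . "$HOME/.bashrc_local"
    fi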
This is the last remaining thing I have using Subversion (everything else is using git, aside from work). The benefit of svn is that it's not a DVCS, so I don't have to worry about accidentally committing on one server and forgetting to push.
I have considered moving it to git so I could create branches. Using the above example, I would have a branch for my main server to which I would add the .procmail and .spamassassin folders, without having those in the master branch. But the current system has worked fine for years (since before git even existed), and I don't have any particular motivation to change it now.