Where do I put a post-commit hook script? - linux

I've just thrown together the following shell script:
cd /home/firefli/webprojects/project1
svn checkout file:///home/firefli/svn/project1/trunk .
rm -rf /home/firefli/public_html/project1
svn export . /home/firefli/public_html/project1
It does work when I do a commit and then run the script manually, but I still have a couple of questions.
Can I run a bash script, or does it have to be C? (I've seen lots of C examples)
Where do I put it to make it execute post-commit?

There is a hooks directory inside your Subversion repository. It should contain a number of templates that you can modify and use.
Your script can happily be a bash script; the provided templates use /bin/sh.
Just remove the .tmpl extension and you're good to go.
The Subversion documentation on repository hooks provides more detail.
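For example, a minimal post-commit hook along these lines would call a deployment script like the one above (the script path /home/firefli/bin/deploy-project1.sh is just a placeholder; Subversion passes the repository path and the new revision number as the first two arguments and runs the hook with an empty environment, so absolute paths are needed):
#!/bin/sh
# hooks/post-commit inside the repository
REPOS="$1"   # repository path, supplied by Subversion
REV="$2"     # revision that was just committed
# Call the deployment script with an absolute path and capture its output.
/home/firefli/bin/deploy-project1.sh "$REPOS" "$REV" >> /home/firefli/post-commit.log 2>&1
Remember to make the hook executable (chmod +x post-commit).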

Related

Alias Multiple Commands in Linux

I am managing a website using git. One of the requirements for the git repository is that bare = true. It uses a post-receive hook to manage pushes from my local computer. The problem is that sometimes I would like to make changes to a WordPress directory on my website using the wp-admin view online. So then I would just ssh into the directory and run git --work-tree="BLAH" add . and git --work-tree="BLAH" commit -m "BLAH". Is there a way to set up an alias, like alias git="git --work-tree=\"BLAH\"" and have that work for all git commands?
There are times when aliases are a great tool. Then there are times when things get complicated enough that a shell script is better.
To create a single command that executes other commands, just create a file (maybe call it git-add-all) and put the following in it:
#! /bin/bash
git --work-tree="BLAH" add .
git --work-tree="BLAH" commit -m "BLAH"
Then you can run the script by simply doing:
bash git-add-all
Even better, make the script executable:
chmod +x git-add-all
Then you can use it like any command:
./git-add-all
Advanced tips:
To be able to run the script from any directory, you can copy/move the file to one of the directories in your $PATH, for example /usr/local/bin. Then you can simply run git-add-all instead of ./git-add-all.
Even better is to create your own personal scripts directory and include it in $PATH. I personally use ~/bin. To add the directory to $PATH you just need to add the following to .bashrc or .profile:
export PATH=/home/username/bin:$PATH
or if you're doing this for the root user:
export PATH=/root/bin:$PATH
In case anyone is curious how I solved it (thanks to shellter's comment), I wrote a bash script that prompts the user for input, like so:
#!/bin/bash
# Wrap every command with the --work-tree option for this repository.
# "$@" passes all of the words through, so multi-word commands work too.
function fix {
git --work-tree="PATH_TO_WORKING_TREE" "$@"
}
echo -n "git "
read -e INPUT
until [ "$INPUT" = "quit" ]; do
# INPUT is deliberately unquoted so the line splits into separate arguments.
fix $INPUT
echo -n "git "
read -e INPUT
done
Running it:
user#server [repo.git] $ git-fix
git status
# On branch master
nothing to commit (working directory clean)
git quit
There is a .bashrc file in Linux. You can edit it to create aliases for your favorite and frequently used commands.
To create an alias permanently add the alias to your .bashrc file
gedit ~/.bashrc
The alias should look like:
alias al='cmd'
You can read more about aliases in the Bash manual.
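Applied to the question above, a sketch of such an alias might look like this (the path is just a placeholder, and note that an alias is only expanded when it is the first word of an interactive command):
alias gitw='git --work-tree="/path/to/working/tree"'
gitw status
gitw add .
gitw commit -m "changes made through wp-admin"
Shadowing git itself with alias git='git --work-tree="..."' also works, since bash does not re-expand an alias inside its own definition.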

Unix Bash script causing git clone to not work

I have a bash script (script1.sh) where I perform a git clone.
Then, from that repository, I run another script (script2.sh), which runs fine.
script2.sh runs just fine, but the git repo is nonexistent; the folder just isn't there. If I run the git clone on the command line, it clones just fine.
Why is my script not cloning my repo correctly?
How I run the first script:
sudo bash script1.sh
script1.sh
cd /home/ubuntu
git clone http://mygit-thing.com/myrepository.git localfolder
#run script from the repo
sudo bash localfolder/script2.sh
script2.sh
~ Some unrelated unix commands
Notes: I looked in the /home/ubuntu folder and could not find it. It's not a "hidden" folder either.
This could solve your problem; replace your script1.sh with:
home=/home/ubuntu
folder=$home/localfolder
git clone http://mygit-thing.com/myrepository.git $folder
#run script from the repo
bash $folder/script2.sh
If it does not work, it may be that you are not allowed to write to $home because of your current user permissions, or because your filesystem is read-only. You can check that by running mount without options; it lists all mounted filesystems.
Another point: sudoing from inside a script is not recommended. Currently you are basically running sudo inside a sudo. If you want to be sure the right user is executing your script, it is better to check the current user id than to nest sudo calls.
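For example, instead of nesting sudo, script1.sh could simply check at the top that it is being run as root (a minimal sketch):
#!/bin/bash
# Abort early unless the script was started as root (e.g. via sudo).
if [ "$(id -u)" -ne 0 ]; then
echo "Please run this script as root: sudo bash script1.sh" >&2
exit 1
fi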
As @jibe suggested, you're calling bash through sudo, which may run in a different environment. Provide the full path to the local git repository.
I had the same issue while writing a blue-green deployment script; I resolved it by calling the script with bash.
#!/bin/bash
bash /home/ubuntu/deployment-guru/green-deployment.sh
green-deployment.sh
#!/bin/sh
cd /home/ubuntu/my_api && git pull origin blue-green-jenkins-integration
Before calling it with bash, I was facing the following issue:
Permission denied (publickey).
fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.

Linux SVN post-commit doesn't work?

I have read and tried a lot of blog and forum posts from Stack Overflow and other pages, but there was no solution.
SVN Version : 1.6.11
Linux Version : Linux 2.6.32-358.23.2.el6.x86_64 x86_64
I've created a script which should be executed after an svn commit.
I've renamed the file post-commit.tmpl to post-commit.
I use absolute paths and all files (the script, post-commit, log, ...) are in mode 777.
In the script and in the post-commit the PATH is set.
When I commit something in my project, entries are written to my debug.log.
echo "START">>/svn/test/debug.log
sudo echo /svn/test/hookScripts/generateDocumentation.sh "$1" "$2">>./svn/test/error.log
echo "END">>/svn/test/debug.log
The START and END entries appear in the debug.log file, but the script isn't executed.
I tried some of the suggestions from this link, but they don't work.
The problem was: I was trying to run /svn/test/hookScripts/generateDocumentation.sh from within the hook, but the line only echoed the command instead of executing it. To actually execute it, I had to wrap it in $(...).
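For reference, a corrected hook body might look like this (same paths as above; the relative ./svn path is replaced with an absolute one and the script is executed instead of echoed):
#!/bin/sh
echo "START">>/svn/test/debug.log
# Execute the script directly and pass the repository path and revision
# that Subversion hands to the hook; capture its output in error.log.
/svn/test/hookScripts/generateDocumentation.sh "$1" "$2">>/svn/test/error.log 2>&1
echo "END">>/svn/test/debug.log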

How to track changes of my Linux distro with git?

I am experimenting with some Linux configuration and I want to track my changes. Of course, I don't want to put my whole OS under version control.
Is there a way (with git, mercurial or any VCS) to track the changes without storing the whole OS?
This is what I imagine:
I do a kind of git init -> all hashes of all files are stored, but not the content of the files
I make some changes to my file system -> git detect that the hash of this file has changed
I commit -> the content of the file is stored (or even better the original file and the diff are stored! I know, that is impossible... )
Possible? Impossible? Work-arounds?
EDIT: What I care about is just to minimize the size of the repository and to have a repository containing only my changes. Having all files in my repository is not relevant for me. For example, if I push to GitHub I just want it to contain only the files that have changed.
Take a look at etckeeper, it will probably do the job.
What you want is git update-index --info-only or ... --index-info. From the man page: "--info-only is used to register files without placing them in the object database. This is useful for status-only repositories." --index-info is its industrial-scale cousin.
Do that with the files you want to track, write-tree to write the index structure into the object db, commit-tree that, and update-ref to update a branch.
To get the object name, use git hash-object filename.
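A rough sketch of that plumbing sequence, assuming you only want to record the names and hashes of a couple of files in the current repository (the file names are placeholders):
# Register paths and hashes only; no blobs are written to the object database.
git update-index --add --info-only somefile.conf anotherfile.conf
# Build a tree from the index; --missing-ok tolerates the absent blobs.
tree=$(git write-tree --missing-ok)
# Create a commit for that tree and point a branch at it.
commit=$(git commit-tree -m "status-only snapshot" "$tree")
git update-ref refs/heads/status-only "$commit"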
Here is what we do...
su -
cd /etc
echo "*.cache" > .gitignore
git init
chmod 700 .git
cd /etc; git add . && git add -u && git commit -m "Daily Commit"
Then setup crontab:
su -
crontab -e
# Put the following in:
0 3 * * * cd /etc; git add . && git add -u && git commit -m "Daily Commit"
Now you will have a nightly commit of all changes in /etc
If you want to track more than /etc in one repo, then you could simply do it at the root of your filesystem, except add the proper ignore paths to your /.gitignore. I am unclear on the effects of having git within git, so you might want to be extra careful in that case.
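If you do go to the root of the filesystem, a rough example of the kind of ignore paths you would put in /.gitignore (the exact list depends on your system) is:
# pseudo and volatile filesystems that should never be versioned
/proc/
/sys/
/dev/
/run/
/tmp/
# large or constantly changing data
/var/cache/
/var/tmp/
/home/*/.cache/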
I know this question is old, but I thought this might help someone. Inspired by @Jonathon's comment on the "How to record concrete modification of specific files" question, I have created a shell script that enables you to monitor all the changes made to a specific file, while keeping the full change history. The script depends on the inotifywait and git packages being installed.
You can find the script here
https://github.com/hisham-hassan/linux-file-monitor
Usage: file-monitor.sh [-f|--file] <absolute-file-path> [-m|--monitor|-h|--history]
file-monitor.sh --help
-f,--file <absolute-file-path> Adding a file to the monitored files list. The <absolute-file-path>
is the absolute file path of the file we need to act on.
PLEASE NOTE: A relative file path could cause issues in the script,
please make sure to use the absolute path of the file. Also try to
avoid symlinks, as they have not been tested.
example: file-monitor.sh -f /absolute/path/to/file/test.txt -m
-m, --monitor Monitoring all the changes to the file. The monitoring will keep
happening as long as the script is running; you may need to run it
in the background.
example: file-monitor.sh -f /absolute/path/to/file/test.txt -m
-h, --history showing the full history of the file.
To exit, press "q"
example: file-monitor.sh -f /absolute/path/to/file/test.txt -h
--uninstall uninstalls the script from the bin directory,
and removes the monitoring history.
--install Adds the script to the bin directory, and creates
the directories and files needed for monitoring.
--help Prints this help message.

Local Git as autosave

I would like a local Git repository in my home directory that implements an autosave to the repository every five minutes.
I have two Questions:
Is this a sane thing to do?
How does one go about writing a script that implements this functionality for a specified set of directories in the home directory on Linux?
The aim is to capture the history of all the important files in my home directory automatically, without any input from me, so I can use it whenever I screw up.
Sanity is all relative!
I guess it depends on why you are backing up. If it's for hardware failure, then this won't work, because the repository is in the same folder (/home/), so if the folder goes, the repo goes, unless of course you are pushing it to a storage repo on another machine somewhere as the actual backup.
We do use git to store important things, especially research papers and PDF's, so we can easily share them.
You would write a cron job that runs a script every so often. Basically you would write a simple bash script that does a git commit -a -m "commit message" periodically in your folder. The tricky part is doing the git add on the new files that were created so they are tracked. You will likely need to do a git status and parse the output from it in your script to find the new files, then git add that list. Python may be the easiest way to do that. Then you register that with cron.
Google is your friend here, there are plenty of examples on how to register scripts with cron.
Write a shell script that would enter each directory you want and run
git add .
git commit -m "new change"
git push
and then use cron to run the script every 5 minutes.
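For example, assuming the script is saved as /home/username/bin/autosave.sh (a placeholder path) and made executable, the crontab entry for every five minutes would be:
*/5 * * * * /home/username/bin/autosave.sh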
Write a shell script to do the following:
1) git status --untracked-files=no (this lists the tracked files that have been modified)
2) Iterate through the file list from step 1 and do git add <file>
3) git commit -m "latest change <date:time>"
Schedule this script in cron.
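A minimal sketch of such a script, assuming the repositories to autosave are listed in the REPOS variable (a placeholder) and that tracked file names contain no spaces:
#!/bin/bash
# Commit modified tracked files in each listed repository.
REPOS="$HOME/projects $HOME/notes"
for repo in $REPOS; do
cd "$repo" || continue
# --porcelain gives stable, script-friendly output; untracked files are hidden.
git status --porcelain --untracked-files=no | awk '{print $2}' | xargs -r git add --
# Commit only if something was actually staged.
if ! git diff --cached --quiet; then
git commit -m "latest change $(date '+%Y-%m-%d %H:%M:%S')"
fi
done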

Resources