How to remove untracked files from a Perforce working tree? - perforce

What's the equivalent of "git clean" with Perforce?
git-clean - Remove untracked files from the working tree
Cleans the working tree by recursively removing files that are not under version control, starting from the current directory.
-x Don’t use the standard ignore rules read from .gitignore (per
directory) and $GIT_DIR/info/exclude, but do still use the ignore
rules given with -e options. This allows removing all untracked files,
including build products. This can be used (possibly in conjunction
with git reset) to create a pristine working directory to test a clean
build.
-X Remove only files ignored by Git. This may be useful to rebuild
everything from scratch, but keep manually created files.

Using P4V, you can right-click on a folder and select "Reconcile Offline Work...". In the middle panel ("Local files not in depot"), all the files that Perforce can find in the folder tree that are not in the depot will be listed. You can select some or all of these, right-click, and select "Delete Local File".
This isn't a script, but it's much easier than most other solutions on Windows.

Try (for Unix) from your top-level:
# Find all files and filter for those that are unknown by Perforce
find . -type f | p4 -x - fstat 2>&1 > /dev/null | sed 's/ -.*$//' > /tmp/list
### manually check /tmp/list for files you didn't mean to delete
# Go ahead and remove the unwanted files.
xargs rm < /tmp/list
Or, for a clean -f kind of approach, pipe directly to xargs rm instead of first staging the file list in /tmp/list. Be aware that xargs rm will mishandle filenames containing spaces, so review the list before deleting anything you care about.

The new p4 clean command should do the trick.

There are subtle differences in behaviour between p4 clean "Restore workspace files to match the state of corresponding depot files
" and git clean "Remove untracked files from the working tree", so read the manuals carefully.
In particular, vanilla p4 clean will revert changes you've made to files outside any pending changelist. Thus it's prudent to test the outcome with the 'dry run' option p4 clean -n.
As I understand:
p4 clean -a emulates git clean "Remove untracked files from the working tree."
p4 clean -a -I emulates git clean -x "Remove all untracked files, including build products."
To my frustration, there's no exact analogue of git clean -X "Remove only files ignored by Git", but you can approximate it working with p4 reconcile:
p4 reconcile -a && p4 clean -a -I. Untracked files that match the ignore list will be deleted, others will be moved to the default changelist.

There is no equivalent. Perforce has no command to remove files that are not under its control. You can see them in P4V, on the Workspace tab (they have plain white icons rather than the lined icons with the green dot) and delete them manually. If you want to automate the process, the easiest thing to do would be to remove the files from your workspace, delete everything in the directory, then sync it back up. A batch file to do it would look something like this:
p4 sync "//depot/someFolder/...#0"
erase C:\projects\someFolder\*.* /s /q /f
rd C:\projects\someFolder /s /q
p4 sync -f "//depot/someFolder/..."
The first line is optional if you use the force switches on the erase and sync commands.
That solution has its drawbacks however. If you're currently working on any of the files, you obviously don't want to wipe them out. Also, a full sync can take quite a while if there is a huge amount of data in the directory tree you wish to clean.
A better way to do it would be to have your clean utility (I think we've grown beyond a batch file at this point) grab the list of files under version control using the p4 files command. Then iterate through all the files in the directory, deleting those that don't appear on the list.
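That "delete everything not on Perforce's list" idea can be sketched in shell. This is a hypothetical sketch, not a finished utility: in real use the tracked list would come from something like p4 files //depot/someFolder/... with the depot prefix stripped off; here the function just takes any newline-separated list of workspace-relative paths (GNU findutils/coreutils assumed):

```shell
#!/bin/sh
# Delete local files that are not on the tracked list.
# $1 = workspace directory, $2 = file listing tracked paths (one per line,
# relative to the workspace root, e.g. produced from `p4 files` output).
clean_untracked() {
    workdir=$1; tracked=$2
    ( cd "$workdir" || exit 1
      # List what is actually on disk, relative to the workspace root.
      find . -type f -printf '%P\n' | sort > /tmp/local.$$
      sort "$tracked" > /tmp/tracked.$$
      # comm -13: lines only in the local listing, i.e. untracked files.
      comm -13 /tmp/tracked.$$ /tmp/local.$$ | xargs -r -d '\n' rm --
      rm -f /tmp/local.$$ /tmp/tracked.$$ )
}
```

Using -d '\n' with xargs keeps filenames containing spaces intact, which the simpler pipe-to-rm approaches above do not.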

You can't, frustratingly. There's no analogue of svn clean or git clean. A shame; these are very useful commands.
Here's an appeal on Perforce's website for the developers to add a clean command. https://perforce.ideas.aha.io/ideas/P4V-I-7

There's an easy GUI solution, as well as examples like the ones already mentioned here:
How to find untracked files in a Perforce tree? (analogue of svn status)

On Linux, you can do it without a temporary file:
find . -type f | p4 -x - have 2>&1 >/dev/null | grep "not on client" | cut -d " " -f1 | xargs -I{} echo rm {}
This prints the rm commands as a dry run; drop the echo to actually delete the files. Note that cut -d " " -f1 truncates paths containing spaces, so double-check the output first.

In Windows:
@ECHO OFF
SETLOCAL EnableDelayedExpansion
IF "%1"=="" (
SET proceed=N
ECHO Are you sure you wish to nuke all files under:
ECHO %CD%
SET /P proceed="Y/N: "
ECHO !proceed!
IF /I "!proceed!"=="Y" (
ECHO Nuking %cd%
FOR /F "tokens=1 delims=-#" %%G IN ('p4 reconcile -nlad -f ...') do (
DEL /F "%%G"
)
)
) ELSE IF /I "%1"=="-n" (
ECHO Files to be deleted 1>&2
FOR /F "tokens=1 delims=-#" %%G IN ('p4 reconcile -nlad -f ...') do (
ECHO "%%G"
)
) ELSE GOTO :USAGE
GOTO :STOP
:USAGE
ECHO Usage: nuke [-n]
ECHO -n will only write the files to be deleted but will not actually delete them
:STOP

This one works much better:
@ECHO OFF
SETLOCAL EnableDelayedExpansion
IF "%1"=="" (
SET proceed=N
ECHO Are you sure you wish to nuke all files under:
ECHO %CD%
SET /P proceed="Y/N: "
ECHO Looking for files to be deleted. This may take some time depending on the depth of your tree. 1>&2
IF /I "!proceed!"=="Y" (
ECHO Nuking %cd%
FOR /F "tokens=1 delims=#" %%G IN ('p4 reconcile -nla -f ...') DO (
DEL /F %%G
)
)
) ELSE IF /I "%1"=="-n" (
ECHO Looking for files to be deleted. This may take some time depending on the depth of your tree. 1>&2
ECHO Files to be deleted 1>&2
FOR /F "tokens=1 delims=#" %%G IN ('p4 reconcile -nla -f ...') DO (
ECHO %%G
)
) ELSE GOTO :USAGE
GOTO :STOP
:USAGE
ECHO Usage: nuke [-n]
ECHO -n will only write the files to be deleted but will not actually delete them
:STOP

I use the following commands when my changelist has no changes:
Run 'p4 reconcile', which reconciles all changes made outside of Perforce (the -a flag can be used if you only need untracked files).
Run 'p4 revert path\...,v', which removes the service files (those with a ,v suffix, added by reconcile) from the changelist but leaves them in the file system.
Run 'p4 revert -w path\...', which removes all other files from the changelist and from disk.

Related

Bash script to automatically zip different subfolders and store the zipped files into a special folder with dated names

My directory structure is thus:
\myproject\
    src\
    include\
    zipdir\
    .vscode\
    auto7zip.bat
There are specific files from within .vscode\ subfolder that I would like to archive, and not the entire folder.
On Windows, the auto7zip.bat file has the following content:
@ECHO OFF
SETLOCAL EnableExtensions EnableDelayedExpansion
Set TODAY=%date%
ECHO %TODAY%
set path="c:\Program Files\7-Zip";%path%
set projectdir=%CD%
set "zipdirsrc=%projectdir%\src"
set "zipdirinc=%projectdir%\include"
set "zipdirothers=%projectdir%\.vscode"
set "movedir=%projectdir%\zipdir"
pushd %zipdirsrc%
ECHO Zipping all contents of %zipdirsrc% and moving archive src to %movedir%
7z a -tzip "%movedir%\src_%TODAY%.zip" -r "%zipdirsrc%\*.*" -mx5
ECHO SRC Task Complete ...
pushd %zipdirinc%
ECHO Zipping all contents of %zipdirinc% and moving archive include to %movedir%
7z a -tzip "%movedir%\include_%TODAY%.zip" -r "%zipdirinc%\*.*" -mx5
ECHO INCLUDE Task Complete ...
pushd %zipdirothers%
ECHO Zipping all other contents of %zipdirothers% and moving archive others to %movedir%
7z a -tzip "%movedir%\others_%TODAY%.zip" -r "%zipdirothers%\*.json" "%zipdirothers%\Makefile" "%zipdirothers%\Makefile*.mk" "%zipdirothers%\windows.vcxproj" "%zipdirothers%\nbproject\" -mx5
ECHO OTHERS Task Complete ...
EXIT /B 0
This batch file, if run today (July 2nd, 2021) on Windows with 7-Zip installed, creates include_02-07-2021.zip which is the zipped version of include\, src_02-07-2021.zip which is the zipped version of src\, and others_02-07-2021.zip which is the zipped version of the specific files from the .vscode\ directory. These zipped files are created automatically inside the zipdir\ directory.
Is there an equivalent bash script that when run from within \myproject\ accomplishes the same thing on linux distributions, in particular, ubuntu?
This should do what you need.
#!/bin/bash
today=$(date +'%d-%m-%Y')
echo $today
movedir=../zipdir
function backup() {
if [[ -n $2 ]]; then
newname="$2"
else
newname="$1"
fi
cd "$1"
[[ -z $custom ]] && custom="./*"
echo "Zipping all contents of ./$1 and moving archive $newname to $movedir"
zip "$movedir/${newname}_$today.zip" -r $custom -9
unset custom
cd ..
}
unset custom
backup src
backup include
custom='./*.json ./Makefile ./Makefile*.mk ./windows.vcxproj ./nbproject/'
backup .vscode others
Update 1:
Removed irrelevant part of answer
Updated script to exclude root directory from archive by changing into relevant directory (since request was clarified by OP).
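The date stamp the script produces lines up with the file names in the question; a quick reproducible check (GNU date, with a fixed date passed via -d purely for illustration):

```shell
# Fixed date supplied with -d so the output is reproducible:
today=$(date -d '2021-07-02' +'%d-%m-%Y')
echo "src_${today}.zip"   # src_02-07-2021.zip
```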

How do I write this in a batch file

I am trying to convert a shell script into a batch file but I am stuck at this place.
I am not quite sure how to convert this line into my batch file.
$(echo $(cd $(dirname $0)/../..; pwd))
I tried
echo %(cd %(dirname %0%)%/.../..; pwd)%
but I am not sure if I am right!
You can't do command chaining like this in Batch. If I see that correctly, you want to get the absolute path of the directory two levels above the directory of the currently executed script. You might do this:
@ECHO OFF
CD /D "%~dp0..\.."
ECHO %CD%
Which changes the current directory. To avoid that, use:
@ECHO OFF
PUSHD "%~dp0..\.."
ECHO %CD%
POPD
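For comparison, the shell fragment from the question is usually written in this simpler form (same effect: the absolute path of the directory two levels above the script's own directory):

```shell
# Resolve the directory two levels above this script's directory:
script_dir=$(cd "$(dirname "$0")/../.." && pwd)
echo "$script_dir"
```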

rsync with --remove-sent-files option and open files

Every minute I need to copy recorded files from 3 servers to one data storage. I don't need to save original files - data processing is outside of all of them.
But when I use the option --remove-sent-files, rsync sends and removes unfinished (not yet closed) files.
I've tried to prevent sending these open files with lsof and --exclude-from, but it seems that rsync does not understand full paths in the exclude list:
--exclude-from=FILE read exclude >>patterns<< from FILE
lsof | grep /projects/recordings/.\\+\\.\\S\\+ -o | sort | uniq
/projects/recordings/<uid>/<path>/2012-07-16 13:24:32.646970-<id>.WAV
So, the script looks like:
# get open files in src dir and put them into rsync.exclude file
lsof | grep /projects/recordings/.\\+\\.\\S\\+ -o | sort | uniq > /tmp/rsync.exclude
# sync without these files
/usr/bin/rsync -raz --progress --size-only --remove-sent-files --exclude-from=/tmp/rsync.exclude /projects/recordings/ site.com:/var/www/storage/recordings/
# change owner
ssh storage@site.com chown -hR storage:storage /var/www/storage/recordings
So, maybe I should try another tool? Or why does rsync ignore the excludes?
I'm not sure if this helps you, but here's my solution to only rsync files which are not currently being written to. I use it for tshark captures, writing to a new file every N seconds with the -a flag (e.g. tshark -i eth0 -a duration:30 -w /foo/bar/caps). Watch out for that tricky rsync, the order of the include and exclude is important, and if we want sub-directories we need to include "*/".
save_path=/foo/bar/
delay_between_syncs=30
while true;
do
sleep $delay_between_syncs
# Calculate which files are currently open (i.e. the ones currently being written to)
# and avoid uploading it. This is to ensure that when we process files on the server, they
# are complete.
echo "" > /tmp/include_list.txt
for i in `find $save_path/ -type f`
do
op=`fuser $i`
if [ "$op" == "" ]
then
#echo [+] $i is good for upload, will add it list.
c=`echo $i | sed 's/.*\///g'`
echo $c >> /tmp/include_list.txt
fi
done
echo [+] Syncing...
rsync -rzt --include-from=/tmp/include_list.txt --include="*/" --exclude \* $save_path user@server:/home/backup/foo/
echo [+] Sunk...
done
rsync the files, then remove the ones that have been rsync'd by capturing the list of transferred files, and then removing only the transferred files that are not currently open. Rsync figures out what files to transfer when it gets to the directory, so your solution was bound to fail later even if it worked at first, when a newly opened file (since rsync started) was not in the exclude list.
An alternate approach would be to do a
find dir -type f -name pattern -mmin +10 | xargs -I{} rsync -aP {} dest:/path/to/backups
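The -mmin +10 age filter in that alternative is easy to verify locally before wiring it to rsync. A throwaway check (GNU find and touch assumed; file names are made up):

```shell
tmp=$(mktemp -d)
touch -d '20 minutes ago' "$tmp/old.wav"   # closed recording, safe to sync
touch "$tmp/new.wav"                       # still being written
find "$tmp" -type f -mmin +10              # lists only old.wav
```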

Retaining file permissions with Git

I want to version control my web server as described in Version control for my web server, by creating a git repo out of my /var/www directory. My hope was that I would then be able to push web content from our dev server to github, pull it to our production server, and spend the rest of the day at the pool.
Apparently a kink in my plan is that Git won't respect file permissions (I haven't tried it, only reading about it now.) I guess this makes sense in that different boxes are liable to have different user/group setups. But if I wanted to force permissions to propagate, knowing my servers are configured the same, do I have any options? Or is there an easier way to approach what I'm trying to do?
Git is Version Control System, created for software development, so from the whole set of modes and permissions it stores only executable bit (for ordinary files) and symlink bit. If you want to store full permissions, you need third party tool, like git-cache-meta (mentioned by VonC), or Metastore (used by etckeeper). Or you can use IsiSetup, which IIRC uses git as backend.
See Interfaces, frontends, and tools page on Git Wiki.
The git-cache-meta mentioned in SO question "git - how to recover the file permissions git thinks the file should be?" (and the git FAQ) is the more straightforward approach.
The idea is to store in a .git_cache_meta file the permissions of the files and directories.
It is a separate file not versioned directly in the Git repo.
That is why the usage for it is:
$ git bundle create mybundle.bdl master; git-cache-meta --store
$ scp mybundle.bdl .git_cache_meta machine2:
#then on machine2:
$ git init; git pull mybundle.bdl master; git-cache-meta --apply
So you:
bundle your repo and save the associated file permissions.
copy those two files on the remote server
restore the repo there, and apply the permission
This is quite late but might help some others. I do what you want to do by adding two git hooks to my repository.
.git/hooks/pre-commit:
#!/bin/bash
#
# A hook script called by "git commit" with no arguments. The hook should
# exit with non-zero status after issuing an appropriate message if it wants
# to stop the commit.
SELF_DIR=`git rev-parse --show-toplevel`
DATABASE=$SELF_DIR/.permissions
# Clear the permissions database file
> $DATABASE
echo -n "Backing-up permissions..."
IFS_OLD=$IFS; IFS=$'\n'
for FILE in `git ls-files --full-name`
do
# Save the permissions of all the files in the index
echo $FILE";"`stat -c "%a;%U;%G" $FILE` >> $DATABASE
done
for DIRECTORY in `git ls-files --full-name | xargs -n 1 dirname | uniq`
do
# Save the permissions of all the directories in the index
echo $DIRECTORY";"`stat -c "%a;%U;%G" $DIRECTORY` >> $DATABASE
done
IFS=$IFS_OLD
# Add the permissions database file to the index
git add $DATABASE -f
echo "OK"
.git/hooks/post-checkout:
#!/bin/bash
SELF_DIR=`git rev-parse --show-toplevel`
DATABASE=$SELF_DIR/.permissions
echo -n "Restoring permissions..."
IFS_OLD=$IFS; IFS=$'\n'
while read -r LINE || [[ -n "$LINE" ]];
do
ITEM=`echo $LINE | cut -d ";" -f 1`
PERMISSIONS=`echo $LINE | cut -d ";" -f 2`
USER=`echo $LINE | cut -d ";" -f 3`
GROUP=`echo $LINE | cut -d ";" -f 4`
# Set the file/directory permissions
chmod $PERMISSIONS $ITEM
# Set the file/directory owner and groups
chown $USER:$GROUP $ITEM
done < $DATABASE
IFS=$IFS_OLD
echo "OK"
exit 0
The first hook is called when you "commit" and will read the ownership and permissions for all the files in the repository and store them in a file in the root of the repository called .permissions and then add the .permissions file to the commit.
The second hook is called when you "checkout" and will go through the list of files in the .permissions file and restore the ownership and permissions of those files.
You might need to do the commit and checkout using sudo.
Make sure the pre-commit and post-checkout scripts have execution permission.
We can improve on the other answers by changing the format of the .permissions file to be executable chmod statements, and to make use of the -printf parameter to find. Here is the simpler .git/hooks/pre-commit file:
#!/usr/bin/env bash
echo -n "Backing-up file permissions... "
cd "$(git rev-parse --show-toplevel)"
find . -printf 'chmod %m "%p"\n' > .permissions
git add .permissions
echo done.
...and here is the simplified .git/hooks/post-checkout file:
#!/usr/bin/env bash
echo -n "Restoring file permissions... "
cd "$(git rev-parse --show-toplevel)"
. .permissions
echo "done."
Remember that other tools might have already configured these scripts, so you may need to merge them together. For example, here's a post-checkout script that also includes the git-lfs commands:
#!/usr/bin/env bash
echo -n "Restoring file permissions... "
cd "$(git rev-parse --show-toplevel)"
. .permissions
echo "done."
command -v git-lfs >/dev/null 2>&1 || { echo >&2 "\nThis repository is configured for Git LFS but 'git-lfs' was not found on your path. If you no longer wish to use Git LFS, remove this hook by deleting .git/hooks/post-checkout.\n"; exit 2; }
git lfs post-checkout "$@"
In case you are coming into this right now, I've just been through it today and can summarize where this stands. If you did not try this yet, some details here might help.
I think @Omid Ariyan's approach is the best way. Add the pre-commit and post-checkout scripts. DON'T forget to name them exactly the way Omid does and DON'T forget to make them executable. If you forget either of those, they have no effect and you run "git commit" over and over wondering why nothing happens :) Also, if you cut and paste out of the web browser, be careful that the quotation marks and ticks are not altered.
If you run the pre-commit script once (by running a git commit), then the file .permissions will be created. You can add it to the repository and I think it is unnecessary to add it over and over at the end of the pre-commit script. But it does not hurt, I think (hope).
There are a few little issues about the directory name and the existence of spaces in the file names in Omid's scripts. The spaces were a problem here and I had some trouble with the IFS fix. For the record, this pre-commit script did work correctly for me:
#!/bin/bash
SELF_DIR=`git rev-parse --show-toplevel`
DATABASE=$SELF_DIR/.permissions
# Clear the permissions database file
> $DATABASE
echo -n "Backing-up file permissions..."
IFSold=$IFS
IFS=$'\n'
for FILE in `git ls-files`
do
# Save the permissions of all the files in the index
echo $FILE";"`stat -c "%a;%U;%G" $FILE` >> $DATABASE
done
IFS=${IFSold}
# Add the permissions database file to the index
git add $DATABASE
echo "OK"
Now, what do we get out of this?
The .permissions file is in the top level of the git repo. It has one line per file, here is the top of my example:
$ cat .permissions
.gitignore;660;pauljohn;pauljohn
05.WhatToReport/05.WhatToReport.doc;664;pauljohn;pauljohn
05.WhatToReport/05.WhatToReport.pdf;664;pauljohn;pauljohn
As you can see, we have
filepath;perms;owner;group
In the comments about this approach, one of the posters complains that it only works with the same username; that is technically true, but it is very easy to fix. Note the post-checkout script has two action pieces,
# Set the file permissions
chmod $PERMISSIONS $FILE
# Set the file owner and groups
chown $USER:$GROUP $FILE
So I am only keeping the first one; that's all I need. My user name on the web server is indeed different, but more importantly you can't run chown unless you are root. You can run chgrp, however. It is plain enough how to put that to use.
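Keeping only the chmod piece, the restore loop shrinks to something like the sketch below (the sample .permissions line is made up; splitting with IFS=';' on the read also avoids the repeated cut calls):

```shell
tmp=$(mktemp -d)
touch "$tmp/a"
chmod 600 "$tmp/a"
# One made-up database line in the filepath;perms;owner;group format:
printf 'a;664;someuser;somegroup\n' > "$tmp/.permissions"
( cd "$tmp" &&
  while IFS=';' read -r FILE PERMS OWNER GROUP; do
      chmod "$PERMS" "$FILE"     # permissions only; the chown is skipped
  done < .permissions )
stat -c %a "$tmp/a"              # 664
```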
In the first answer in this post, the one that is most widely accepted, the suggestion is to use git-cache-meta, a script that does the same work that the pre/post hook scripts here are doing (parsing output from git ls-files). These scripts are easier for me to understand; the git-cache-meta code is rather more elaborate. It is possible to keep git-cache-meta in the path and write pre-commit and post-checkout scripts that use it.
Spaces in file names are a problem with both of Omid's scripts. In the post-checkout script, you'll know you have the spaces in file names if you see errors like this
$ git checkout -- upload.sh
Restoring file permissions...chmod: cannot access '04.StartingValuesInLISREL/Open': No such file or directory
chmod: cannot access 'Notebook.onetoc2': No such file or directory
chown: cannot access '04.StartingValuesInLISREL/Open': No such file or directory
chown: cannot access 'Notebook.onetoc2': No such file or directory
I'm checking on solutions for that. Here's something that seems to work, but I've only tested in one case
#!/bin/bash
SELF_DIR=`git rev-parse --show-toplevel`
DATABASE=$SELF_DIR/.permissions
echo -n "Restoring file permissions..."
IFSold=${IFS}
IFS=$'\n'
while read -r LINE || [[ -n "$LINE" ]];
do
FILE=`echo $LINE | cut -d ";" -f 1`
PERMISSIONS=`echo $LINE | cut -d ";" -f 2`
USER=`echo $LINE | cut -d ";" -f 3`
GROUP=`echo $LINE | cut -d ";" -f 4`
# Set the file permissions
chmod $PERMISSIONS $FILE
# Set the file owner and groups
chown $USER:$GROUP $FILE
done < $DATABASE
IFS=${IFSold}
echo "OK"
exit 0
Since the permissions information comes one line at a time, I set IFS to $'\n', so only line breaks are seen as separators.
I read that it is VERY IMPORTANT to set the IFS environment variable back the way it was! You can see why a shell session might go badly if you leave newline as the only separator.
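A minimal illustration of why the IFS value matters here, using made-up names containing spaces:

```shell
# Two "file names", each containing a space:
list="one file.txt
two file.txt"
set -- $list
echo $#          # 4 with the default IFS (splits on spaces too)
old_ifs=$IFS
IFS='
'
set -- $list
echo $#          # 2 with newline-only IFS
IFS=$old_ifs     # restore, as the answer stresses
```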
In pre-commit/post-checkout an option would be to use "mtree" (FreeBSD), or "fmtree" (Ubuntu) utility which "compares a file hierarchy against a specification, creates a specification for a file hierarchy, or modifies a specification."
The default set are flags, gid, link, mode, nlink, size, time, type, and uid. This can be fitted to the specific purpose with -k switch.
I am running on FreeBSD 11.1, the freebsd jail virtualization concept makes the operating system optimal. The current version of Git I am using is 2.15.1, I also prefer to run everything on shell scripts. With that in mind I modified the suggestions above as followed:
git push: .git/hooks/pre-commit
#! /bin/sh -
#
# A hook script called by "git commit" with no arguments. The hook should
# exit with non-zero status after issuing an appropriate message if it wants
# to stop the commit.
SELF_DIR=$(git rev-parse --show-toplevel);
DATABASE=$SELF_DIR/.permissions;
# Clear the permissions database file
> $DATABASE;
printf "Backing-up file permissions...\n";
OLDIFS=$IFS;
IFS=$'\n';
for FILE in $(git ls-files);
do
# Save the permissions of all the files in the index
printf "%s;%s\n" $FILE $(stat -f "%Lp;%u;%g" $FILE) >> $DATABASE;
done
IFS=$OLDIFS;
# Add the permissions database file to the index
git add $DATABASE;
printf "OK\n";
git pull: .git/hooks/post-merge
#! /bin/sh -
SELF_DIR=$(git rev-parse --show-toplevel);
DATABASE=$SELF_DIR/.permissions;
printf "Restoring file permissions...\n";
OLDIFS=$IFS;
IFS=$'\n';
while read -r LINE || [ -n "$LINE" ];
do
FILE=$(printf "%s" $LINE | cut -d ";" -f 1);
PERMISSIONS=$(printf "%s" $LINE | cut -d ";" -f 2);
USER=$(printf "%s" $LINE | cut -d ";" -f 3);
GROUP=$(printf "%s" $LINE | cut -d ";" -f 4);
# Set the file permissions
chmod $PERMISSIONS $FILE;
# Set the file owner and groups
chown $USER:$GROUP $FILE;
done < $DATABASE
IFS=$OLDIFS
pritnf "OK\n";
exit 0;
If for some reason you need to recreate the script the .permissions file output should have the following format:
.gitignore;644;0;0
For a .gitignore file with 644 permissions given to root:wheel
Notice I had to make a few changes to the stat options.
Enjoy,
One addition to @Omid Ariyan's answer is permissions on directories. Add this after the for loop's done in his pre-commit script.
for DIR in $(find ./ -mindepth 1 -type d -not -path "./.git" -not -path "./.git/*" | sed 's#^\./##')
do
# Save the permissions of all the directories in the index
echo $DIR";"`stat -c "%a;%U;%G" $DIR` >> $DATABASE
done
This will save directory permissions as well.
Another option is git-store-meta. As the author described in this superuser answer:
git-store-meta is a perl script which integrates the nice features of git-cache-meta, metastore, setgitperms, and mtimestore.
An improved version of Tammer Saleh's answer (https://stackoverflow.com/users/9932792/tammer-saleh):
It only updates the permissions on changed files.
It handles symlinks.
It ignores empty directories (git cannot track them).
.git/hooks/pre-commit:
#!/usr/bin/env bash
echo -n "Backing-up file permissions... "
cd "$(git rev-parse --show-toplevel)"
find . -type d ! -empty -printf 'X="%p"; chmod %m "$X"; chown %U:%G "$X"\n' > .permissions
find . -type f -printf 'X="%p"; chmod %m "$X"; chown %U:%G "$X"\n' >> .permissions
find . -type l -printf 'chown -h %U:%G "%p"\n' >> .permissions
git add .permissions
echo done.
.git/hooks/post-merge:
#!/usr/bin/env bash
echo -n "Restoring file permissions... "
cd "$(git rev-parse --show-toplevel)"
git diff -U0 .permissions | grep '^\+' | grep -Ev '^\+\+\+' | cut -c 2- | /usr/bin/bash
echo "done."

cmake source and out-of-source navigation

cmake advises using out-of-source builds. While in general I like the idea, I find it inconvenient to navigate from an out-of-source subdirectory to the corresponding source directory. I frequently need to be in the source to perform some actions on the code (e.g. grep, svn commands, etc.).
Is there an easy way in the shell to navigate from an out-of-source subdirectory to the corresponding source directory?
Thanks
Dima
I prefer to keep it simple and have the source checkouts in a src/ directory, and the corresponding builds in a build/ directory. Then I can use
function cs() {
cd "${PWD/build/src}"
}
function cb() {
cd "${PWD/src/build}"
}
Cf. also KDE's TechBase for another approach.
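For example, with a made-up src/build layout, the ${PWD/src/build} substitution swaps the first occurrence of src and lands in the matching build subdirectory:

```shell
# Hypothetical layout: /tmp/demo/src/module and /tmp/demo/build/module
mkdir -p /tmp/demo/src/module /tmp/demo/build/module
cd /tmp/demo/src/module
cd "${PWD/src/build}"   # this is what cb does
pwd                     # /tmp/demo/build/module
```

Note the substitution is purely textual, so a path component like mysrc/ would also match; keeping the directories named exactly src/ and build/ avoids surprises.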
I think that the most straightforward and convenient way to do this is simply open more than one Shell session, i.e. tabs. For example, KDE4 Konsole supports tabs and you can navigate through them with Shift + ArrowLeft or ArrowRight. IMHO very comfortable plus it preserves history better.
Have you tried the pushd and popd builtins?
When in the /source/directory
pushd /to/build/directory
Work there
popd ## Back to source directory
You can even stack that deeper...
OK, this is what I found. The idea is to use the CMakeCache.txt file, which lives in the out-of-source tree. We go up the tree looking for the cache file. Once it is found, we extract the source directory, which is stored in the CMAKE_HOME_DIRECTORY variable.
function cdoos2src () {
path=$(pwd)
while [[ $path != "/" ]]; do
if [[ -f $path/CMakeCache.txt ]]; then
break;
fi
# go up one directory
path=$(dirname $path)
done
if [[ $path == "/" ]]; then
echo "This is not svn out-of-source directory - CmakeCache.txt not found"
return
fi
oos_dir=$(pwd)
code_dir=$(sed -n '/CMAKE_HOME_DIRECTORY/s/.*=//p' $path/CMakeCache.txt)
echo "Code dir [$code_dir]"
code_sub_dir=$(echo $(pwd) | sed -n 's#^'$path'##p')
echo "Code sub dir [$code_sub_dir]"
cd $code_dir/$code_sub_dir
}
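The sed extraction at the heart of the function can be exercised against a minimal fake cache file (the path and value here are made up):

```shell
# Fake cache file standing in for a real CMake build tree:
tmp=$(mktemp -d)
printf 'CMAKE_HOME_DIRECTORY:INTERNAL=/home/dima/project\n' > "$tmp/CMakeCache.txt"
sed -n '/CMAKE_HOME_DIRECTORY/s/.*=//p' "$tmp/CMakeCache.txt"   # /home/dima/project
```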
