How to copy between folders and the parent folder without the complete path - Linux

This is a basic question, but I am struggling to find a decent solution, and it is preventing me from automating my script.
I have the following path.
/home/hassan/Dyna/ProjectSimulation
In ProjectSimulation I have 3 folders:
friction, time, force
for example:
/home/hassan/Dyna/ProjectSimulation/friction
Now I have a file friction1.txt in this friction folder and I want to copy it to ProjectSimulation.
Is it possible to avoid the complete path and just reference one level up?
Also, if I have to copy friction1.txt to the folder force, is there any way to avoid the complete path?
I mean, I have a subroutine, but it is path dependent: whenever I run it, I have to run it in the same folder and then copy my results, so I can run only one instance of my simulation.
Experts please guide me.
PS: This is part of a 600-line shell script.

This comes across as so basic that I must have misunderstood something in your question.
If you want to refer to a parent directory, .. is the way to do that. So, if you want to copy friction1.txt to two places you just do
cp friction1.txt ..
cp friction1.txt ../force
All you need to take care of is making sure that CWD is
/home/hassan/Dyna/ProjectSimulation/friction
so that the references point at the right place.
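For example, in a script you might make that explicit (a sketch using the paths from the question):
cd /home/hassan/Dyna/ProjectSimulation/friction || exit
cp friction1.txt ..         # one level up, into ProjectSimulation
cp friction1.txt ../force   # into the sibling folder force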

You can temporarily change the current directory to ProjectSimulation, copy the file (cp friction/friction1.txt .), then change back to the original directory (so the rest of the script works as before).
Alternatively, you can use dirname to get the name of the parent directory and use that.
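For illustration, a minimal sketch of that approach (orig_dir is just a name chosen here to hold the starting directory):
orig_dir=$PWD                                   # remember where the script was running
cd /home/hassan/Dyna/ProjectSimulation || exit
cp friction/friction1.txt .                     # copy into ProjectSimulation
cp friction/friction1.txt force                 # copy into the force subfolder
cd "$orig_dir" || exit                          # change back so the rest of the script works as before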

Change to the root directory of your known directory structure, do the copy operations with relative paths, then change back to the directory you came from.
Your friends are:
cd
cd -
or better:
pushd
popd
(see man bash)
I.e.
pushd /home/hassan/Dyna/ProjectSimulation
cp friction/friction1.txt .
cp friction/friction1.txt force
popd


Execute a bash script without typing ./

I feel like I'm missing something very basic so apologies if this question is obtuse. I've been struggling with this problem for as long as I've been using the bash shell.
Say I have a structure like this:
├── bin
│   └── command (executable)
This will execute:
$ bin/command
Then I symlink bin/command to the project root:
$ ln -s bin/command c
like so:
├── c (symlink to bin/command)
├── bin
│   └── command (executable)
I can't do the following (errors with -bash: c: command not found)
$ c
I must instead do:
$ ./c
What's going on here? Is it possible to execute a command from the current directory without preceding it with ./ and without using a system-wide alias? It would be very convenient to give distributed executables and utility scripts one-letter, folder-specific shortcuts on a per-project basis.
It's not a matter of bash not allowing execution from the current directory; rather, you haven't added the current directory to your list of directories to execute from.
export PATH=".:$PATH"
$ c
$
This can be a security risk, however, because if the directory contains files which you don't trust or don't know where they came from, a file in the current directory could be confused with a system command.
For example, say the current directory is called "foo" and your colleague asks you to go into "foo" and set the permissions of "bar" to 755. As root, you run "chmod 755 bar".
You assume chmod really is chmod, but if there is a file named chmod in the current directory and your colleague put it there, then chmod is really a program he wrote, and you are running it as root. Perhaps that "chmod" resets the root password on the box, or does something else dangerous.
Therefore, the standard is to limit command executions which don't specify a directory to a set of explicitly trusted directories.
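As an illustration of the risk (the directory and file names here are made up), a file in the current directory can shadow a system command when . comes first in PATH:
mkdir /tmp/evil && cd /tmp/evil
printf '#!/bin/sh\necho "not the real ls"\n' > ls   # plant a fake ls
chmod +x ls
PATH=".:$PATH" ls                                   # runs ./ls, not /bin/ls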
Beware that the accepted answer introduces a serious vulnerability!
You might add the current directory to your PATH, but not at the beginning of it; that would be a very risky setting.
There are still possible vulnerabilities when the current directory is at the end, but far fewer, so this is what I would suggest:
PATH="$PATH":.
Here, the current directory is only searched after every directory already present in the PATH has been explored, so the risk of an existing command being shadowed by a hostile one is gone. There is still a risk of an uninstalled command or a typo being exploited, but it is much lower. Just make sure the dot is always at the end of the PATH when you add new directories to it.
You could add . to your PATH. (See kamituel's answer for details)
Also, there is ~/.local/bin for user-specific binaries on many distros.
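For instance (a sketch; c is just the one-letter shortcut from the question, and the paths are examples):
mkdir -p ~/.local/bin
ln -s "$PWD/bin/command" ~/.local/bin/c   # or copy the executable there
export PATH="$HOME/.local/bin:$PATH"      # many distros add this by default
c                                         # now runs without the leading ./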
What you can do is add the current dir (.) to the $PATH:
export PATH=.:$PATH
But this can pose a security issue, so be aware of that. See this ServerFault answer on why it's not such a good idea, especially for the root account.

How to go to another directory sharing the same parent directory?

I am a beginner in Linux and am wondering about a shorter way to go to a directory that shares the same parent directory. Let me elaborate.
dir1
- dir11
- dir12
- dir13
- dir14
Directory dir1 has sub-directories dir11, ..., dir14. I am at directory dir13 and want to go to dir12. What is the most direct way to do this?
I can do
cd ..
cd dir12/
But I am wondering whether I can do this in single step. Any ideas?
cd doesn't only take a single directory name to change to; you can provide a complete path to the directory you want to go to.
You can simply type cd ../dir12
You can also go all nuts with cd ../dir12/../dir13/../dir14/..
Also try the following: type cd ../ and press Tab twice. This will list all directories in the path you have given.
There are also shortcuts the shell expands for cd. For example, ~ is your home directory: cd ~ or just cd will change into your $HOME.
See man cd (in a terminal) for more.

cd command: how to go back an unknown number of levels from the current subdirectory to a particular parent directory (unix and dos)

OK, so I am trying to resolve a URI in an XML catalog, and I want to go back from a particular sub-directory to a parent directory that is an unknown number of levels above.
eg:
file:///D:/Sahil/WorkSpaces1/Cartridges1/Project1/ParticularFolder/Level1/Level2/<so-many-levels>/CurrentFolder
I want to go back from "CurrentFolder" to "ParticularFolder" without typing in the full FilePath.
I want to achieve this because I work on multiple projects which all have "ParticularFolder" in them, so the code inside the sub-directories of this folder should dynamically have access to all other files in other sub-directories inside this parent folder. I do not want to specify separate full file paths for my various projects and make the code too rigid.
Is it possible? Please mention how to achieve this on Windows, Unix, and Linux.
In UNIX/Linux/OS X/etc.:
while [ "$(basename $PWD)" != "ParticularFolder" ]; do cd ..; done

How can I preserve aliases when copying folders on the command line in OSX?

I'm trying to write a personal backup command-line utility on OSX. Let's say I have two folders:
foo/bar/
foo/baz/
foo/bar contains, among other things, OSX aliases to files in foo/baz:
foo/bar/file_alias# -> foo/baz/file
I want to copy both foo/bar and foo/baz to an external hard drive, but for various reasons I do not just want to copy the entire folder foo. I can't figure out a way to copy these folders separately and make the aliases come out right in the end:
cp -r foo/bar /external_hd/foo/bar follows the aliases, replacing them with the original files.
cp -R foo/bar /external_hd/foo/bar preserves the aliases, but they (not surprisingly) continue to point to the original files (e.g. foo/baz/file, not external_hd/foo/baz/file).
rsync -avE foo/bar /external_hd/foo/bar (see this question) seems to do the same thing as cp -R.
Is there any way to accomplish this without copying the entire parent folder foo?
I know of no way to automatically copy folders and relink symbolic links to a new destination without some manual intervention. If you know the new paths, it's quite simple to script, though.
For your specific example, the following should do the trick to relink:
cd /external_hd/foo
find . -type l | while IFS= read -r x; do y=$(readlink "$x" | sed 's|/foo|/external_hd/foo|'); ln -sf "$y" "$x"; done
rsync will get you close. The command:
rsync -avHER --safe-links foo/{bar,baz} /external_hd/
will copy the two folders, preserve "safe" relative symlinks between them, and ignore "unsafe" symlinks, i.e. those that may reference files outside the copied tree. Change it to:
rsync -avHER --copy-unsafe-links foo/{bar,baz} /external_hd/
and "safe" relative symlinks are preserve and "unsafe" symlinks are replaced by their destination.
If you only have "safe" relative symlinks the first option will do, the second option may do if some extra copying is OK.
However, the definition of "safe" is over-restrictive. Any absolute symlink is "unsafe" even if its target is within the copied tree. Furthermore even a relative link which goes too far towards the root, or maybe is just too complicated, is also "unsafe".
If you need to fix this it should be possible, as the above options show rsync is pretty close to what you need and the source code is available from Apple's Open Source site. Examine the code around the options --links, --copy-links, --copy-unsafe-links & unsafe-links and you may find fixing the definition of "safe" is fairly easy (and you can re-write the symlinks to use the shortest possible relative path at the same time).
HTH

Change working directory while looping over folders

Currently I am trying to run MRI software (TBSS) on imaging files (scan.nii.gz) on the Linux command line.
The scans are all stored in separate folders for different participants, and the file names are identical, so:
/home/scans/participant1/scan.nii.gz
/home/scans/participant2/scan.nii.gz
/home/scans/participant3/scan.nii.gz
What this software does is create the result of the analysis in the current working directory. Since the scans all have the same file name, the results get overwritten all the time.
I would like to loop through all the participant folders, make each one my working directory, and then execute the tbss command, which is simply tbss_1_preproc scan.nii.gz. That way, the output will be stored in the current working directory, which is the participant directory.
Is there any sensible way of doing this in Linux?
Thanks so much!
Try it in Bash. The code below is untested, but it should give you a clue:
#!/bin/bash
find . -name scan.nii.gz | while IFS= read -r line
do
  # Run in a subshell so the cd only affects this iteration and the
  # next relative path from find still resolves from the start directory
  (
    cd "$(dirname "$line")" || exit
    tbss_1_preproc "$(basename "$line")"
  )
done
Put it in a file and make it executable. Copy it to your scans folder and execute it.
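If your version of find supports -execdir (GNU and BSD find both do), a one-line alternative is possible; this is a sketch, assuming tbss_1_preproc accepts the ./-prefixed name that find substitutes for {}:
# -execdir runs the command from the directory containing each match
find . -name scan.nii.gz -execdir tbss_1_preproc {} \;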
