Symlink multiple files to an existing folder - linux

I have this command:
ln -sf src/* lang/golang/src/genericc/
I want to symlink all the files in src to the existing genericc directory, but when I run the above command I get broken symlinks in the destination. Anyone know how to do this?

Symlinks created with relative paths (i.e. where the source path doesn't start with "/") get resolved relative to the directory the link is in. That means a link to "src/foo.c" in the lang/golang/src/genericc/ directory would try to resolve to lang/golang/src/genericc/src/foo.c which probably doesn't exist.
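A quick way to see the breakage, assuming a hypothetical file foo.c in src/:
ln -sf src/foo.c lang/golang/src/genericc/
ls -l lang/golang/src/genericc/foo.c    # shows: foo.c -> src/foo.c
cat lang/golang/src/genericc/foo.c      # fails: resolves to lang/golang/src/genericc/src/foo.c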
Solution: either use an absolute path to the source files, like this:
ln -sf /path/to/src/* lang/golang/src/genericc/
or cd into the target directory first, so that the relative paths work the same way during creation as they will during resolution:
cd lang/golang/src/genericc
ln -sf ../../../../src/* ./

First of all, you can try ln -s $PATH_TO_SRC/* $PATH_TO_TARGET/.
However, with many files you might hit an "Argument list too long" error.
Then you can use:
find $PATH_TO_SRC/ -type f -name "*.jpg" -exec cp -s {} . \;
If you use ln -s with find or a bash loop and the source paths are relative, you can end up with broken links. Instead, we can use cp -s to create the symlinks safely.
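As a hedged illustration of why cp -s is safer here (paths hypothetical): GNU cp documents that with -s, source file names must be absolute unless the destination files are in the current directory, so it errors out rather than leaving a broken link behind.
cd "$PATH_TO_TARGET"
cp -s "$PATH_TO_SRC"/pic.jpg .    # works when $PATH_TO_SRC is absolute
cp -s relative/dir/pic.jpg elsewhere/    # cp refuses rather than create an unresolvable link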

With the -r (--relative) option, GNU ln computes the relative path from each link's location to its target, so the links point at the actual files wherever they are:
ln -srf src/* lang/golang/src/genericc/
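For example, with the question's layout and GNU coreutils 8.16 or later (where --relative was added), the stored link text would be the computed relative path:
ln -srf src/foo.c lang/golang/src/genericc/
ls -l lang/golang/src/genericc/foo.c    # foo.c -> ../../../../src/foo.c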

Related

How to force "ln --symbolic" as --force does not work?

I am trying to do an ln --symbolic link for all the .* files.
The problem is that it works the first time, but the second time it fails, and it looks like --force does not work.
This is the code:
ln --symbolic --relative --force ./websites/web1es/.* ./websites/webtable/
This is the error:
ln: ./websites/webtable/.: cannot overwrite directory
ln: './websites/web1es/..' and './websites/webtable/..' are the same file
Does anyone have any idea?
Thanks a lot in advance for any clue!
When you specify .* in the shell, the glob includes the special entries . and .. (the current and parent directories). If you specify a directory as the last argument, all the input files are linked into the destination directory under the same names they have in the source directory.
As a result, your script is linking ./websites/web1es/.. to ./websites/webtable/... Unfortunately, the latter exists and is the parent directory of both directories, so deleting it is not possible. Moreover, as ln is telling you, the source and destination are the same file (or, in this case, directory), so even if ln could delete the destination, you'd experience data loss by doing so, so it's refusing.
Your solution should be to avoid handling . and ... For example, you could write this:
find ./websites/web1es -mindepth 1 -maxdepth 1 -name '.*' -print0 | \
xargs -0 -I {} ln -srf {} ./websites/webtable
find does not enumerate . and .. here.
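A one-command alternative sketch, assuming GNU find and ln:
find ./websites/web1es -mindepth 1 -maxdepth 1 -name '.*' -exec ln -srf -t ./websites/webtable {} +
The -exec ... + form batches many files into few ln invocations, and -t supplies the destination directory up front so {} can come last.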

Run script on every level of directory

I have a script called summarize.sh which produces a summary of the files/dirs inside a directory. I would like to have it run recursively down the whole tree from the top. What's a good way to do this?
I have tried to loop it with a for loop:
for dir in */; do
    cd $dir
    ./summarize.sh
    cd ..
done
however it returns ./summarize.sh: No such file or directory
Is it because I am not moving the script as I run it? I am not very familiar with unix directories.
You can recursively list files using find . -type f and make your script take the file of interest as its first argument, so you can do find . -type f -exec myScript.sh {} \;
If you want directories only, use find . -type d instead, or if you want both use just find . without restriction.
You can additionally filter by name, e.g. find . -name '*.py'.
Finally, if you do not want to recurse down the directory structure, i.e. only summarize the top level, you can use the -maxdepth 1 option, so something like find . -maxdepth 1 -type d -exec myScript.sh {} \; (note that GNU find warns if -maxdepth comes after tests like -type).
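Note that find passes each directory to the script as an argument, whereas the question's summarize.sh appears to operate on the current directory. If the script cannot be changed, one bridging sketch (the script path is a placeholder):
find . -type d -exec bash -c 'cd "$1" && /path/to/summarize.sh' _ {} \;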
The issue is that you are changing to a different directory with the cd command while your summarize.sh script is not located in these directories. One possible solution is to use an absolute path instead of a relative one. For example, change:
./summarize.sh
to something like:
/path/to/file/summarize.sh
Alternatively, given the example code, you can use a relative path pointing to the parent directory like this:
../summarize.sh
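Applied to the loop from the question, the absolute-path fix looks like this (a sketch; the path is captured once, before any cd):
script=$PWD/summarize.sh
for dir in */; do
    cd -- "$dir"
    "$script"
    cd ..
done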
Try this code if you are running Bash 4.0 or later:
#! /bin/bash -p
shopt -s nullglob # Globs expand to nothing when they match nothing
shopt -s globstar # Enable ** to expand over the directory hierarchy
summarizer_path=$PWD/summarize.sh
for dir in **/ ; do
    cd -- "$dir"
    "$summarizer_path"
    cd - >/dev/null
done
shopt -s nullglob avoids an error in case there are no directories under the current one.
The summarizer_path variable is set to an absolute path for the summarize.sh program. That is necessary to allow it to be run from directories other than the current one. (./summarize.sh only works while summarize.sh is in the current directory.)
Use cd -- ... to avoid problems if any directory name begins with '-'.
cd - >/dev/null returns to the previous directory and discards the path that cd - prints to stdout.
Shellcheck issues several warnings about the code above, all to do with the use of cd. I'd fix them for "real" code.

link files within directory, with simple command similar to cp

Where my question originated:
When running cp source/files.all destination/, all the files within source will now also exist in destination
Question:
What if I didn't want to duplicate the data from source into destination, but simply link it (with an absolute path)? Usually, I would run something like:
for f in $(ls source/); do ln -s "$(pwd)/source/${f}" "$(pwd)/destination"; done
Is there a simple command/tool that I can use (e.g. ln -a source/files.all destination/) which would create a soft link to all files in a directory, while automatically adding the absolute path as a prefix? ln -r is close to what I need, but it stores relative paths, and I want absolute ones.
I would use find "$PWD/source" -exec ln -s {} destination \;. The absolute path used as the first argument to find will cause {} to be replaced by an absolute path to the source file for each command.
GNU ln supports the -t option to specify the destination directory, allowing you to use a more efficient invocation of find:
find "$PWD/source" -exec ln -s -t destination {} +
The -exec ... + form requires {} to be the last argument in the command; -t lets you move the destination argument up to accommodate that requirement.
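One caveat: find also emits the starting point "$PWD/source" itself, so a link to the directory would be created inside destination as well. If only the top-level files are wanted, a hedged refinement:
find "$PWD/source" -mindepth 1 -maxdepth 1 -exec ln -s -t destination {} +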
So I eventually sort of found a simple way to do this:
Simply run ln -s $(pwd -P)/source/* destination/

linux cp: how to have it follow links but not stop if a link target doesn't exist

I want to recursively copy a dir and have the targets of the links copied, but I do not want the cp to stop if a target of a link does not exist.
For example, I run this command:
cp -fprL /path/to/src_dir /path/to_dest_dir
But the first time it hits a symlink whose target doesn't exist, it exits:
cp: cannot stat `/path/to/non-existent/file': No such file or directory
Is there some way to get cp to silently skip these and continue on?
With the standard GNU toolchain, no, there's no way.
You could instead copy your files, keeping symlinks as symlinks, then use find -follow -type l -delete to delete the broken symlinks, and then copy again, this time following symlinks. (With -follow, a link whose target exists reports the target's type, so only dangling links still test as -type l.)
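A minimal sketch of that two-pass idea, with hypothetical paths and a staging directory:
cp -fpr /path/to/src_dir /tmp/staging    # pass 1: keep symlinks as symlinks
find /tmp/staging -follow -type l -delete    # drop links whose targets do not resolve
cp -fprL /tmp/staging /path/to/dest_dir    # pass 2: follow the surviving links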
Of course, you could also write a small Python (or similar) program to do the copy for you. Alternatively, find all files in the original tree that are not broken symlinks and hand them to cp, using sed to splice in the target path:
find -type d|sed 's/^\(.*\)/"\/target\/\1"/g'|xargs mkdir -p
find -follow -not -type l -not -type d|sed 's/^\(.*\)/"\1" "\/target\/\1"/g'|xargs -n2 cp
In the second command, sed duplicates each found file path, prefixing the copy with the target directory; xargs -n2 then passes each pair to cp as source and destination.
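For illustration, a single found path comes out of the sed stage like this:
printf '%s\n' './a/b.txt' | sed 's/^\(.*\)/"\1" "\/target\/\1"/g'
# prints: "./a/b.txt" "/target/./a/b.txt"
so xargs -n2 runs cp "./a/b.txt" "/target/./a/b.txt".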

CentOS: Copy directory to another directory

I'm working with a CentOS server. I have a folder named test located in /home/server/folder/test. I need to copy the directory test to /home/server/. How can I do it?
cp -r /home/server/folder/test /home/server/
To copy all files, including hidden files, use:
cp -r /home/server/folder/test/. /home/server/
As I understand it, you want to recursively copy the test directory into the /home/server/ path...
This can be done as:
cp -rf /home/server/folder/test/* /home/server/
Hope this helps
This works for me.
cp -r /home/server/folder/test/. /home/server
To copy a directory, use the following command:
cp -r source Destination
For example
cp -r /home/hasan /opt
To copy a single file, use the command without -r:
cp /home/file /home/hasan/
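For reference, the three variants above behave slightly differently (assuming /home/server exists):
cp -r /home/server/folder/test /home/server/    # creates /home/server/test
cp -r /home/server/folder/test/. /home/server/    # copies the contents, hidden files included, directly into /home/server
cp -r /home/server/folder/test/* /home/server/    # copies the contents, but the glob skips hidden files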
