LINUX: How to softlink specific files in all subdirectories

I want to create soft links (ln -s) in folder2 to all the files whose names contain *foo* and that live in some or all of the subdirectories of folder1.
I've tried it with for, with find, with find -exec ln, and with combinations of them, but all I get is either a broken link named *foo* or a link to everything inside folder1.

With globstar you can do it in one stroke:
shopt -s globstar
cd folder2
ln -s ../folder1/**/*foo* .
cd is needed for relative links (the same applies to the answers below that use find). If you want absolute links, do
shopt -s globstar
ln -s /where/is/it/folder1/**/*foo* folder2/

If you're in the folder where you want to create the symbolic link, just use ln -s <target file>; the symlink gets the same name as the target file.
If you need to do this for several files, use a for loop.
Example:
$ mkdir folder1 folder2
$ cd folder1
$ touch foo foobar foofoobar foobarfoo bar barfoo barbar
$ ls
bar barbar barfoo foo foobar foobarfoo foofoobar
$ cd ../folder2
$ for i in ../folder1/*foo*; do ln -s "$i"; done
$ ls -l
total 0
lrwxrwxrwx 1 abc abc 17 oct. 26 11:57 barfoo -> ../folder1/barfoo
lrwxrwxrwx 1 abc abc 14 oct. 26 11:57 foo -> ../folder1/foo
lrwxrwxrwx 1 abc abc 17 oct. 26 11:57 foobar -> ../folder1/foobar
lrwxrwxrwx 1 abc abc 20 oct. 26 11:57 foobarfoo -> ../folder1/foobarfoo
lrwxrwxrwx 1 abc abc 20 oct. 26 11:57 foofoobar -> ../folder1/foofoobar

Try this (searching with an absolute path, so the links do not end up broken):
for fileName in $(find "$PWD/folder1" -name '*foo*')
do
    name1=$(basename "$fileName")
    ln -sf "$fileName" folder2/"$name1"
done

Check this (again with an absolute search path and the pattern quoted, so the shell does not expand it first):
for file in $(find "$PWD/folder1" -name '*foo*')
do
    ln -s "$file" folder2/
done
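If the file names may contain spaces, a find-based variant that avoids word splitting altogether is safer. A minimal sketch, assuming folder1 and folder2 sit in the current directory and GNU ln is available (for -t); the absolute search path keeps the links from breaking:
find "$PWD/folder1" -type f -name '*foo*' -exec ln -s -t folder2/ {} +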

Related

replacement on xargs variable returns empty string

I need to search for XML files inside a directory tree and create links to them in another directory (staging_ojs_pootle), naming these links with the file path (replacing slashes with dots).
The bash command is not working; I got stuck on the replacement part. It seems like the variable from xargs, named 'file', is not accessible inside the replacement code (${file/\//.}):
find directory/ -name '*.xml' | xargs -I 'file' echo "ln" file staging_ojs_pootle/${file/\//.}
The replacement inside ${} gives me an empty string.
I tried sed as well, but my regular expressions kept replacing either all of the slashes or just the last one :/
find directory/ -name '*.xml' | xargs -I 'file' echo "ln" file staging_ojs_pootle/file |sed -e '/^ln/s/\(staging_ojs_pootle.*\)[\/]\(.*\)/\1.\2/g'
That happens because ${file/\//.} is expanded by your shell before xargs ever runs; 'file' is only a literal replacement token for xargs, not a shell variable, so expanding the unset variable file yields an empty string. One way around it is to let sed build the command. Try this:
$ find directory/ -name '*.xml' |sed -r 'h;s|/|.|g;G;s|([^\n]+)\n(.+)|ln \2 staging_ojs_pootle/\1|e'
For example:
$ mkdir -p /tmp/test
$ touch /tmp/test/{1,2,3,4}.xml
# use /tmp/test as staging_ojs_pootle
$ find /tmp/test -name '*.xml' |sed -r 'h;s|/|.|g;G;s|([^\n]+)\n(.+)|ln \2 /tmp/test/\1|e'
$ ls -al /tmp/test
total 8
drwxr-xr-x. 2 root root 4096 Jun 15 13:09 .
drwxrwxrwt. 9 root root 4096 Jun 15 11:45 ..
-rw-r--r--. 2 root root 0 Jun 15 11:45 1.xml
-rw-r--r--. 2 root root 0 Jun 15 11:45 2.xml
-rw-r--r--. 2 root root 0 Jun 15 11:45 3.xml
-rw-r--r--. 2 root root 0 Jun 15 11:45 4.xml
-rw-r--r--. 2 root root 0 Jun 15 11:45 .tmp.test.1.xml
-rw-r--r--. 2 root root 0 Jun 15 11:45 .tmp.test.2.xml
-rw-r--r--. 2 root root 0 Jun 15 11:45 .tmp.test.3.xml
-rw-r--r--. 2 root root 0 Jun 15 11:45 .tmp.test.4.xml
# if we do NOT use the e modifier of the s command, we can see the final commands instead of executing them
$ find /tmp/test -name '*.xml' |sed -r 'h;s|/|.|g;G;s|([^\n]+)\n(.+)|ln \2 /tmp/test/\1|'
ln /tmp/test/1.xml /tmp/test/.tmp.test.1.xml
ln /tmp/test/2.xml /tmp/test/.tmp.test.2.xml
ln /tmp/test/3.xml /tmp/test/.tmp.test.3.xml
ln /tmp/test/4.xml /tmp/test/.tmp.test.4.xml
Explanation:
For each xml file, use h to keep the original filename in the hold space.
Then use s|/|.|g to substitute every / with . in the filename.
Use G to append the hold space to the pattern space, so the pattern space becomes CHANGED_FILENAME\nORIGIN_FILENAME.
Use s|([^\n]+)\n(.+)|ln \2 staging_ojs_pootle/\1|e to assemble the command from CHANGED_FILENAME and ORIGIN_FILENAME; the e modifier of the s command then executes the assembled command, which does the actual work.
Hope this helps!
If you can be sure that the names of your XML files do not contain any word-splitting characters, you can use something like:
find directory -name "*.xml" | sed 'p;s/\//./g' | xargs -n2 echo ln
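Another option that stays entirely in bash, where the parameter expansion actually has a value to work on, is a while read loop. A sketch, assuming the directory/ and staging_ojs_pootle/ layout from the question and file names without embedded newlines; note the double slash in ${file//\//.}, which replaces every slash rather than only the first:
find directory/ -name '*.xml' | while IFS= read -r file; do
    ln "$file" "staging_ojs_pootle/${file//\//.}"
done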

Why can I remove file without user permission?

I made small test to check how permissions work:
test@comp ~/Documents $ touch test1
test@comp ~/Documents $ ls -l
-rw-r--r-- 1 test test 0 Jul 24 22:14 test1
test@comp ~/Documents $ chmod 044 test1
test@comp ~/Documents $ ls -l
----r--r-- 1 test test 0 Jul 24 22:14 test1
test@comp ~/Documents $ cat test1
cat: test1: Permission denied
test@comp ~/Documents $ rm test1
rm: remove write-protected regular empty file ‘test1’? y
test@comp ~/Documents $ ls -l
total 0
My question is: when I have no owner permissions on the file, why can't I read it, and yet I can still remove it?
In order to remove a file, one needs write permission on the directory that contains the file.
For more information: http://linuxcommand.org/lts0070.php
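A quick way to see this for yourself (a sketch; testdir and test1 are made-up names): deletion is governed by the directory's write bit, not by the file's own permission bits.
mkdir testdir && touch testdir/test1
chmod a-w testdir        # directory not writable: its entries cannot be deleted
rm -f testdir/test1      # fails with "Permission denied"
chmod u+w testdir        # make the directory writable again
chmod 000 testdir/test1  # remove every permission on the file itself
rm -f testdir/test1      # succeeds, because the directory is writable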

List all files with file count as one of the output columns in $BASH?

Is there a way to show files, similarly to ls -al, that would also show the file count of the directories listed? Sort of like ls -al with ls -1 | wc -l as the final column. I've tried switching arguments around, and have pretty much given up on a pipe because I hit syntax errors whenever I try to manipulate the results much. Separately they work fine, so I feel like I'm missing something obvious. A way to modify ls so it also shows the file count of the directories it lists seems like it should exist, at least. Does anyone know of a way to get this to work?
Directories
ls -al | awk '/^d/{d++}{print}END{print "Directories: "d}'
All files
ls -al | awk '{print}END{print "Files:" NR}'
I think something like this would be closer to what you want
> mkdir testdir && cd testdir && touch a && ln -s a b && mkdir c && touch c/{1..10}
> shopt -s dotglob; for i in *; do [[ -d $i ]] && paste <(ls -ld "$i") <(find "$i" -mindepth 1 | wc -l) || ls -l "$i"; done
-rw-rw-r-- 1 user user 0 Jul 8 00:04 a
lrwxrwxrwx 1 user user 1 Jul 8 00:04 b -> a
drwxrwxr-x 2 user user 4096 Jul 8 00:04 c/ 10
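If you want that as a reusable command, the same idea can be wrapped in a small function (a sketch; lsc is a made-up name):
# list a directory like ls -l, appending an entry count for each subdirectory
lsc() {
    local dir=${1:-.} entry
    (
        shopt -s dotglob nullglob
        cd "$dir" || exit
        for entry in *; do
            if [[ -d $entry && ! -L $entry ]]; then
                paste <(ls -ld "$entry") <(find "$entry" -mindepth 1 | wc -l)
            else
                ls -ld "$entry"
            fi
        done
    )
}
Then lsc or lsc /some/dir prints the usual long listing with the count as a trailing column for directories.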

Chmod recursively

I have an archive, created by someone else, and after I download and extract it I want to automatically change the permissions on a branch of the extracted file tree so that I gain read access. (I can't change how the archive is created.)
I've looked into this thread: chmod: How to recursively add execute permissions only to files which already have execute permission, as well as some others, but no joy.
The directories originally come with assorted, all wrong, permission flags; they may appear as:
drwx------
d---r-x---
drwxrwxr-x
dr--r-xr--
Those are just the few I've discovered so far, but there could be more.
find errors out when it tries to look into a directory with no x permission, and so doesn't pass it to chmod. What I've been doing so far is manually changing the permissions on the parent directory, then going into the child directories and doing the same for them, and so on. But this is a lot of manual labour. Isn't there some way to do it automatically?
I.e. how I am doing it now:
do:
$ chmod -R +x .
$ chmod -R +r .
until I get no errors, then
$ find . -type f -exec chmod -x {} +
But there must be a better way.
You can use chmod with the X mode letter (the capital X) to set the executable flag only on directories (and on files that already have an execute bit set).
In the example below the executable flag is first cleared and then set again for all directories, recursively:
~$ mkdir foo
~$ mkdir foo/bar
~$ mkdir foo/baz
~$ touch foo/x
~$ touch foo/y
~$ chmod -R go-X foo
~$ ls -l foo
total 8
drwxrw-r-- 2 wq wq 4096 Nov 14 15:31 bar
drwxrw-r-- 2 wq wq 4096 Nov 14 15:31 baz
-rw-rw-r-- 1 wq wq 0 Nov 14 15:31 x
-rw-rw-r-- 1 wq wq 0 Nov 14 15:31 y
~$ chmod -R go+X foo
~$ ls -l foo
total 8
drwxrwxr-x 2 wq wq 4096 Nov 14 15:31 bar
drwxrwxr-x 2 wq wq 4096 Nov 14 15:31 baz
-rw-rw-r-- 1 wq wq 0 Nov 14 15:31 x
-rw-rw-r-- 1 wq wq 0 Nov 14 15:31 y
A bit of explanation:
chmod -x foo - clear the eXecutable flag for foo
chmod +x foo - set the eXecutable flag for foo
chmod go+x foo - same as above, but set the flag only for Group and Other users; don't touch the User (owner) permission
chmod go+X foo - same as above, but apply it only to directories (and to files that already have an execute bit); plain files are left untouched
chmod -R go+X foo - same as above, but do this Recursively for foo and everything below it
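Applied to the original problem (read access everywhere, execute only where it belongs), one recursive pass with X usually does the whole job. A minimal sketch, assuming the extracted archive lives in ./extracted and you own all the files; if chmod cannot enter a directory that starts out without execute permission, run it a second time, or fix the directories first with find as in the next answer:
# the owner gets read/write on everything, and execute only on directories
# (and on files that already carry an execute bit)
chmod -R u+rwX ./extracted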
You need read access, in addition to execute access, to list a directory. If you only have execute access, then you can find out the names of entries in the directory, but no other information (not even types, so you don't know which of the entries are subdirectories). This works for me:
find . -type d -exec chmod +rx {} \;
Try changing all the permissions at the same time:
chmod -R +xr .
To give the owner full access and everyone else read and execute permission:
chmod -R 0755 .
To make everything wide open:
chmod -R 0777 .
Adding executable permissions, recursively, to all files (not folders) with an .sh extension within the current folder:
find . -name '*.sh' -type f | xargs chmod +x
* Notice the pipe (|)
Give 0777 to all files and directories starting from the current path:
chmod -R 0777 ./

Unable to remove everything else in a folder except FileA

How can I remove everything else in a folder except FileA, even hidden files?
I use Ubuntu.
I tried the following unsuccessfully
rm [^fileA]
find . -not -name fileA -exec rm {} \;
Note that this will only delete files, not folders. Believe me, you don't want to delete folders like that.
Use extglob. Assuming that FileA is not hidden (i.e. its name does not begin with a .), you can do:
shopt -s extglob # Enable extglob
rm !(FileA) .* # Remove all non-hidden files not named FileA, plus all hidden files
If instead FileA is a hidden file, this won't work, since the !(pattern) construct only expands to the non-hidden entries that do not match pattern.
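If hidden files should be caught by the same pattern, enabling dotglob makes * (and therefore !(pattern)) match dotfiles as well, while . and .. are still skipped. A sketch, assuming FileA is the only name to keep:
shopt -s extglob dotglob
rm -- !(FileA)    # without -r, subdirectories produce an error and are left in place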
You also could do it interactively,
rm -i * .*
The * is for all files (except hidden files).
The .* is for all hidden files
gene@vmware:/tmp/test$ ls -al
total 8
drwxr-xr-x 2 gene gene 4096 2009-03-11 12:51 .
drwxrwxrwt 12 root root 4096 2009-03-11 12:51 ..
-rw-r--r-- 1 gene gene 0 2009-03-11 12:51 fileA
-rw-r--r-- 1 gene gene 0 2009-03-11 12:51 .fileB
gene@vmware:/tmp/test$ rm -i * .*
rm: remove regular empty file `fileA'? n
rm: cannot remove directory `.'
rm: cannot remove directory `..'
rm: remove regular empty file `.fileB'? y
gene@vmware:/tmp/test$ ls -al
total 8
drwxr-xr-x 2 gene gene 4096 2009-03-11 12:51 .
drwxrwxrwt 12 root root 4096 2009-03-11 12:51 ..
-rw-r--r-- 1 gene gene 0 2009-03-11 12:51 fileA
For multiple files, the following will remove all files apart from those that have FileA or FileB in the name.
for file in *
do
    if ! echo "$file" | grep -qE 'FileA|FileB'; then
        rm -- "$file"
    fi
done
It's more useful in a long list of files. If it's only a short list, I'd go with CoverosGene's response.
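With extglob from the earlier answer, the same multi-name case works without a loop. A sketch, assuming you want to keep anything whose name contains FileA or FileB:
shopt -s extglob
rm -- !(*FileA*|*FileB*)    # everything else in the current directory is removed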
Most ways of doing this based on parsing the directory list are likely to be error prone.
If you have write access to the parent directory, and your necessary file is in sub-directory foo, how about:
% mkdir bar
% mv foo/fileA bar
% rm -rf foo
% mv bar foo
i.e. get your essential file(s) the hell out of the way first!
