Get ONLY sym links to a file - linux

I looked into "symbolic link: find all files that link to this file" (https://stackoverflow.com/questions/6184849/symbolic-link-find-all-files-that-link-to-this-file), but it didn't seem to solve the problem.
If I do find -L -samefile path/to/file
the result contains hard links as well as sym links.
I've been trying to come up with a solution to fetch ONLY sym links, but can't seem to figure it out.
I've been trying to combine -samefile and -type l, but that got me nowhere.
man find
says you can combine some options into an expression, but I failed to do it properly.
Any help greatly appreciated!

Ok, I completely misread the question at first.
To find only symlinks to a certain file, I think it is still a good approach to combine multiple commands.
So you know the file you want to link to, let's call it targetfile.txt. We have our directory structure like this:
$ ls -laR
.:
total 24
drwxrwxr-x 4 telorb telorb 4096 Mar 28 09:51 .
drwxrwxr-x 57 telorb telorb 4096 Mar 28 09:49 ..
-rw-rw-r-- 1 telorb telorb 21 Mar 28 09:51 another_file.txt
drwxrwxr-x 2 telorb telorb 4096 Mar 28 09:52 folder1
drwxrwxr-x 2 telorb telorb 4096 Mar 28 09:53 folder2
-rw-rw-r-- 3 telorb telorb 28 Mar 28 09:52 targetfile.txt
./folder1:
total 12
drwxrwxr-x 2 telorb telorb 4096 Mar 28 09:52 .
drwxrwxr-x 4 telorb telorb 4096 Mar 28 09:51 ..
-rw-rw-r-- 3 telorb telorb 28 Mar 28 09:52 hardlink
lrwxrwxrwx 1 telorb telorb 17 Mar 28 09:49 symlink1 -> ../targetfile.txt
./folder2:
total 12
drwxrwxr-x 2 telorb telorb 4096 Mar 28 09:57 .
drwxrwxr-x 4 telorb telorb 4096 Mar 28 09:51 ..
-rw-rw-r-- 3 telorb telorb 28 Mar 28 09:52 hardlink2
lrwxrwxrwx 1 telorb telorb 17 Mar 28 09:57 symlink2_to_targetfile -> ../targetfile.txt
lrwxrwxrwx 1 telorb telorb 19 Mar 28 09:53 symlink_to_anotherfile -> ../another_file.txt
The file folder1/hardlink is a hard link to targetfile.txt; folder1/symlink1 is a symbolic link we are interested in, and the same goes for folder2/symlink2_to_targetfile. There is also another symlink pointing to a different file, which we are not interested in.
The approach I would take is first use find . -type l to get symbolic links recursively from specified folder (and we still have full path information).
Then pipe that to xargs and ls -l to see which file each link points to, and finally grep for targetfile.txt to drop the links that do not point to our desired file. The command in full:
find . -type l | xargs -I % ls -l % | grep targetfile.txt
lrwxrwxrwx 1 telorb telorb 17 Mar 28 09:57 ./folder2/symlink2_to_targetfile -> ../targetfile.txt
lrwxrwxrwx 1 telorb telorb 17 Mar 28 09:49 ./folder1/symlink1 -> ../targetfile.txt
The xargs -I % ls -l % part sometimes confuses people. With -I % you are telling xargs that the % sign marks the places where the input it receives should be substituted. So it effectively runs ls -l output_of_find_command.
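Note that the grep step matches any path whose ls -l line contains the string targetfile.txt, not only links resolving to that file. With GNU find, a tighter alternative (a sketch, not part of the answer above; all file names are made up) is to combine -L with -samefile and -xtype l:

```shell
# Illustrative layout in a throwaway directory
tmp=$(mktemp -d) && cd "$tmp"
echo data > targetfile.txt
ln targetfile.txt hardlink        # hard link: same inode, not wanted
ln -s targetfile.txt symlink1     # the symlink we want to find
ln -s /etc other_symlink          # unrelated symlink

# -L stats the link target, so -samefile matches links to the file;
# -xtype l then keeps only entries that are themselves symlinks.
find -L . -samefile targetfile.txt -xtype l
# → ./symlink1
```

This avoids false positives from unrelated paths that merely contain the target's name.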


How to combine multiple tar files into a single tar file

According to the GNU documentation, to add one or more archives to the end of another archive, I can use the ‘--concatenate’ operation.
But in my testing, I found that I can't add more than one file at a time.
# ls -al
total 724
drwxr-xr-x. 3 root root 60 Oct 14 17:40 .
dr-xr-xr-x. 32 root root 4096 Oct 14 16:28 ..
-rw-r--r--. 1 root root 245760 Oct 14 18:07 1.tar
-rw-r--r--. 1 root root 245760 Oct 14 18:07 2.tar
-rw-r--r--. 1 root root 245760 Oct 14 18:07 3.tar
# tar tvf 1.tar
-rw-r--r-- root/root 238525 2021-10-14 17:28 1.txt
# tar tvf 2.tar
-rw-r--r-- root/root 238525 2021-10-14 17:29 2.txt
# tar tvf 3.tar
-rw-r--r-- root/root 238525 2021-10-14 17:29 3.txt
It appears that it only picked up the first parameter and ignored the rest:
# tar -A -f 1.tar 2.tar 3.tar
# tar tvf 1.tar
-rw-r--r-- root/root 238525 2021-10-14 17:28 1.txt
-rw-r--r-- root/root 238525 2021-10-14 17:29 2.txt
As described in an excellent and comprehensive Super User answer, this is a known bug in GNU tar (reported in August 2008).
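A common workaround (a self-contained sketch; the archive names and contents are illustrative) is to call --concatenate once per archive in a loop:

```shell
# Build three demo archives, then merge 2.tar and 3.tar into 1.tar
tmp=$(mktemp -d) && cd "$tmp"
for n in 1 2 3; do echo "$n" > "$n.txt"; tar cf "$n.tar" "$n.txt"; done

# One archive per invocation sidesteps the multi-argument bug
for archive in 2.tar 3.tar; do
    tar --concatenate --file=1.tar "$archive"
done

tar tf 1.tar
# → 1.txt
#   2.txt
#   3.txt
```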

Set the permissions of all files copied in a folder the same

I would like to create a folder (in Linux) that can be used as a cloud-like storage location, where all files copied there automatically get g+rw permissions (without the need to chmod them), such that they are readable and writable by people belonging to that specific group.
You can use the command setfacl, e.g.:
setfacl -d -m g::rwx test/
It sets a default ACL on the test/ folder, so every file created there afterwards gets group rwx permissions (masked by the file's creation mode).
$ touch test/test
$ ls -la test/
total 48
drwxr-xr-x 2 manu manu 4096 Jan 28 08:39 .
drwxrwxrwt 20 root root 40960 Jan 28 08:39 ..
-rw-r--r-- 1 manu manu 0 Jan 28 08:39 test
$ setfacl -d -m g::rwx test/
$ ls -la test/
total 48
drwxr-xr-x+ 2 manu manu 4096 Jan 28 08:39 .
drwxrwxrwt 20 root root 40960 Jan 28 08:39 ..
-rw-r--r-- 1 manu manu 0 Jan 28 08:39 test
$ touch test/test2
$ ls -la test/
total 48
drwxr-xr-x+ 2 manu manu 4096 Jan 28 08:40 .
drwxrwxrwt 20 root root 40960 Jan 28 08:39 ..
-rw-r--r-- 1 manu manu 0 Jan 28 08:39 test
-rw-rw-r-- 1 manu manu 0 Jan 28 08:40 test2

How do I grep the contents of files returned by ls and grep?

How do I grep on the files returned from an ls and grep command?
e.g.
# ls -alrth /app/splunk_export/*HSS* | grep 'Nov 24 11:*'
-rw-r--r-- 1 root root 63K Nov 24 11:17 /app/splunk_export/A20171124.1000+1300-1100+1300_HSS01HAM_CGP.csv
-rw-r--r-- 1 root root 40K Nov 24 11:17 /app/splunk_export/A20171124.1000+1300-1100+1300_HSS01HAM_USCDB.csv
-rw-r--r-- 1 root root 138K Nov 24 11:17 /app/splunk_export/A20171124.1000+1300-1100+1300_HSS01HAM.csv
-rw-r--r-- 1 root root 167K Nov 24 11:17 /app/splunk_export/A20171124.1000+1300-1100+1300_HSS01KPR_FE.csv
-rw-r--r-- 1 root root 71K Nov 24 11:17 /app/splunk_export/A20171124.1000+1300-1100+1300_HSS01KPR_USCDB.csv
-rw-r--r-- 1 root root 63K Nov 24 11:17 /app/splunk_export/A20171124.1000+1300-1100+1300_HSS01KPR.csv
-rw-r--r-- 1 root root 25K Nov 24 11:17 /app/splunk_export/A20171124.1030+1300-1100+1300_HSS01HAM_CGP.csv
-rw-r--r-- 1 root root 75K Nov 24 11:17 /app/splunk_export/A20171124.1030+1300-1100+1300_HSS01HAM.csv
-rw-r--r-- 1 root root 90K Nov 24 11:17 /app/splunk_export/A20171124.1030+1300-1100+1300_HSS01KPR_FE.csv
-rw-r--r-- 1 root root 28K Nov 24 11:17 /app/splunk_export/A20171124.1030+1300-1100+1300_HSS01KPR.csv
-rw-r--r-- 1 root root 15K Nov 24 11:17 /app/splunk_export/A20171124.1045+1300-1100+1300_HSS01HAM.csv
-rw-r--r-- 1 root root 140K Nov 24 11:17 /app/splunk_export/A20171124.1045+1300-1100+1300_HSS01KPR_FE.csv
-rw-r--r-- 1 root root 15K Nov 24 11:34 /app/splunk_export/A20171124.1100+1300-1115+1300_HSS01HAM.csv
-rw-r--r-- 1 root root 140K Nov 24 11:34 /app/splunk_export/A20171124.1100+1300-1115+1300_HSS01KPR_FE.csv
-rw-r--r-- 1 root root 25K Nov 24 11:34 /app/splunk_export/A20171124.1100+1300-1130+1300_HSS01HAM_CGP.csv
-rw-r--r-- 1 root root 75K Nov 24 11:34 /app/splunk_export/A20171124.1100+1300-1130+1300_HSS01HAM.csv
-rw-r--r-- 1 root root 91K Nov 24 11:34 /app/splunk_export/A20171124.1100+1300-1130+1300_HSS01KPR_FE.csv
-rw-r--r-- 1 root root 28K Nov 24 11:34 /app/splunk_export/A20171124.1100+1300-1130+1300_HSS01KPR.csv
-rw-r--r-- 1 root root 15K Nov 24 11:34 /app/splunk_export/A20171124.1115+1300-1130+1300_HSS01HAM.csv
-rw-r--r-- 1 root root 139K Nov 24 11:34 /app/splunk_export/A20171124.1115+1300-1130+1300_HSS01KPR_FE.csv
I would like to search the above files for the following string 1693701622
I have tried using xargs, but need some guidance.
# ls -alrth /app/splunk_export/*HSS* | grep 'Nov 24 11:*' | xargs grep -l 1693701622
grep: invalid option -- '-'
Usage: grep [OPTION]... PATTERN [FILE]...
Try `grep --help' for more information.
NOTE: possible duplicate here but I think mine is slightly different
You are not extracting the file name, so the whole line (with the leading dashes) is picked up by xargs; that is why you get the error.
Use awk to do the filtering. It works better than grep here because it handles runs of spaces gracefully:
ls -alrth | awk 'match($6$7$8, /Nov2411:.*/) { print $9 }' | xargs grep -l 1693701622
In general, it is not a good idea to parse the output of ls. See this post for why.
For your requirement, it might be better to use find to pick up the files based on their timestamp and then pass them to xargs grep ....
See this related post:
Recursively find all files newer than a given time
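A self-contained sketch of that find-based approach (the file names, timestamps, and search string below are illustrative stand-ins for the real /app/splunk_export data; GNU find is assumed for -newermt):

```shell
# Fabricate two files with known timestamps and contents
tmp=$(mktemp -d)
printf '1693701622\n' > "$tmp/A_HSS01HAM.csv"
printf 'other\n'      > "$tmp/A_HSS01KPR.csv"
touch -d '2017-11-24 11:17' "$tmp/A_HSS01HAM.csv"
touch -d '2017-11-24 11:34' "$tmp/A_HSS01KPR.csv"

# Select by modification window with find, then grep the contents;
# -print0/-0 keeps odd file names safe.
find "$tmp" -name '*HSS*' \
     -newermt '2017-11-24 11:00' ! -newermt '2017-11-24 12:00' \
     -print0 | xargs -0 grep -l 1693701622
```

Only the file actually containing 1693701622 is printed, with no ls parsing involved.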

How to get the name of the executables files in bash with ls

I am trying to get the names of the executable files using ls -l.
I tried to get the lines of ls -l which contain an x using grep -w x, but the result is not right: some executable files are missing (the .sh ones).
I just need the names of the executable files, not the path, but I don't know how ...
user#user-K53TA:~/Bureau$ ls -l
total 52
-rwxrwxrwx 1 user user 64 oct. 6 21:07 a.sh
-rw-rw-r-- 1 user user 11 sept. 29 21:51 e.txt
-rwxrwxrwx 1 user user 140 sept. 29 23:42 hi.sh
drwxrwxr-x 8 user user 4096 juil. 30 20:47 nerdtree-master
-rw-rw-r-- 1 user user 492 oct. 6 21:07 okk.txt
-rw-rw-r-- 1 user user 1543 oct. 6 21:07 ok.txt
-rw-rw-r-- 1 user user 119 sept. 29 23:27 oo.txt
-rwxrwxr-x 1 user user 8672 sept. 29 21:20 prog
-rw-rw-rw- 1 user user 405 sept. 29 21:23 prog.c
-rw-rw-r-- 1 user user 0 sept. 29 21:58 rev
drwxrwxr-x 3 user user 4096 sept. 29 20:51 sublime
user#user-K53TA:~/Bureau$ ls -l | grep -w x
drwxrwxr-x 8 user user 4096 juil. 30 20:47 nerdtree-master
-rwxrwxr-x 1 user user 8672 sept. 29 21:20 prog
drwxrwxr-x 3 user user 4096 sept. 29 20:51 sublime
Don't parse ls. This can be done with find.
find . -type f -perm /a+x
This finds files with any of the executable bits set: user, group, or other.
Use find instead:
find -executable
find -maxdepth 1 -type f -executable
find -maxdepth 1 -type f -executable -ls
One can use a for loop with glob expansion for discovering and manipulating file names. Observe:
#!/bin/sh
for i in *
do
    # Only print discoveries that are regular, executable files
    [ -f "$i" ] && [ -x "$i" ] && printf '%s\n' "$i"
done
Since the accepted answer uses no ls at all, here is one that does:
ls -l | grep '^-..x'
(The leading - keeps regular files only; the fourth column is the owner's execute bit, so this misses files executable only by group or others.)
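Since the asker wants bare names rather than paths, GNU find's -printf can strip the leading ./ (a sketch; the sample files are made up):

```shell
tmp=$(mktemp -d) && cd "$tmp"
touch a.sh e.txt && chmod +x a.sh

# %f prints only the basename of each match
find . -maxdepth 1 -type f -executable -printf '%f\n'
# → a.sh
```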

Script to remove all directories older than x days but keep certain ones

I'm trying to write a bash script to remove all directories and their files but keep certain ones.
drwxr-xr-x 20 ubuntu admin 4096 Jan 21 17:58 .
drwxr-xr-x 8 ubuntu admin 4096 Nov 21 16:45 ..
drwxr-xr-x 11 ubuntu admin 4096 Jan 9 13:09 1763
drwxr-xr-x 11 ubuntu admin 4096 Jan 16 16:46 1817
drwxr-xr-x 11 ubuntu admin 4096 Jan 16 17:39 1821
drwxr-xr-x 11 ubuntu admin 4096 Jan 19 10:15 1823
drwxr-xr-x 11 ubuntu admin 4096 Jan 19 11:57 1826
drwxr-xr-x 11 ubuntu admin 4096 Jan 19 14:55 1827
drwxr-xr-x 11 ubuntu admin 4096 Jan 19 21:34 1828
drwxr-xr-x 11 ubuntu admin 4096 Jan 20 13:29 1833
drwxr-xr-x 11 ubuntu admin 4096 Jan 20 16:13 1834
drwxr-xr-x 11 ubuntu admin 4096 Jan 21 10:06 1838
drwxr-xr-x 11 ubuntu admin 4096 Jan 21 12:51 1842
drwxr-xr-x 11 ubuntu admin 4096 Jan 21 15:20 1845
drwxr-xr-x 11 ubuntu admin 4096 Jan 22 13:00 1848
drwxr-xr-x 11 ubuntu admin 4096 Nov 24 16:34 217
drwxr-xr-x 11 ubuntu admin 4096 Dec 2 20:44 219
drwxr-xr-x 11 ubuntu admin 4096 Dec 15 16:42 221
drwxr-xr-x 11 ubuntu admin 4096 Dec 16 12:04 225
drwxr-xr-x 2 ubuntu admin 4096 Jan 20 16:10 app-conf
lrwxrwxrwx 1 ubuntu admin 19 Jan 21 17:58 latest -> /opt/qudiniapp/1848
In the example above we'd want to clear out all non sym-linked folders except the app-conf folder.
The plan is to have this triggered by my ansible deployment script before deployment so we can keep our server from filling up with builds.
Provided all the directories to be deleted consist only of numbers, this would be one way to solve it:
cd /tempdir
rm -rf $(find . -type d -name "[0-9]*" | grep -v "$(basename "$(readlink latest)")")
As this is a housekeeping job, you should create a cron job that regularly deletes old directories. The find command would then also check, for example, whether the last modification time is beyond a number of days:
rm -rf $(find . -type d -mtime +20 -name "[0-9]*" | grep -v "$(basename "$(readlink latest)")")
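A self-contained sketch of that keep-list idea (directory names and the 20-day cutoff are illustrative), using null-delimited names so unusual directory names survive the pipeline:

```shell
# Fabricate a releases directory: three old numbered builds plus app-conf
tmp=$(mktemp -d) && cd "$tmp"
mkdir 217 219 1848 app-conf
ln -s "$PWD/1848" latest
touch -d '30 days ago' 217 219 1848   # make the numbered dirs look old

# Delete old numbered dirs, but never the one "latest" points at
find . -maxdepth 1 -type d -name '[0-9]*' -mtime +20 \
    ! -name "$(basename "$(readlink latest)")" -print0 | xargs -0 rm -rf

ls   # remaining: 1848, app-conf, latest
```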
bash script:
#!/bin/bash
find /your/path -type d ! \( -path '*app-conf*' -prune \) -mtime +2 -delete
per man find
-P Never follow symbolic links. This is the default behaviour. When find examines or prints information about a file, and the file is a symbolic link, the information used shall be taken from the properties of the symbolic link itself.
-mtime n File's data was last modified n*24 hours ago. See the comments for -atime to understand how rounding affects the interpretation of file modification times.
This is what I use in my Ansible deployments; I hope it is helpful, as it does almost exactly what you need.
I always remove the oldest release on each deployment if there are >= 5 builds in the "{{ releases_path }}" directory. "{{ releases_path }}" contains directories whose names are (long) Git commit hashes.
- name: Find oldest release to remove
  shell: >-
    [[ $(find "{{ releases_path | quote }}" -maxdepth 1 -mindepth 1 -type d | wc -l) -ge 6 ]]
    && IFS= read -r -d $'\0' line < <(find "{{ releases_path | quote }}" -maxdepth 1 -mindepth 1 -type d -printf "%T@ %p\0" 2>/dev/null | sort -z -n);
    file="${line#* }"; echo "$file";
  args:
    executable: /bin/bash
    chdir: "{{ releases_path }}"
  register: releasetoremove
  changed_when: "releasetoremove.stdout != ''"
- debug: var=releasetoremove
- name: Remove oldest release
  file: path={{ releasetoremove.stdout }} state=absent
  when: releasetoremove|changed
This is what I always have in the releases directory on each server (the last 5 are always kept):
$ ls -lt | cut -c 28-
62 Jan 22 17:42 current -> /srv/releases/2a7b80c82fb1dd658a3356fed7bba9718bc50527
4096 Jan 22 17:41 2a7b80c82fb1dd658a3356fed7bba9718bc50527
4096 Jan 22 15:22 73b1252ab4060833e43849e2e32f57fea6c6cd9b
4096 Jan 22 14:47 9df7f1097909aea69916695194ac41938a0c2e9a
4096 Jan 22 14:16 f6a2862d70f7f26ef75b67168a30fb9ef2202555
4096 Jan 22 13:49 fa89eefc5b2505e153b2e59ed02a23889400c4bf