How can I get the owner of every file in a directory in Linux?

I need to check if root is the owner of every file in a particular directory. I can do
stat --format=%u /directory/name/here
to get the owner of the directory itself, but not the files in it.
My other idea was to do
ls -lL | grep "^-\|^d" | cut -d ' ' -f 2
but that doesn't work if the last byte in the permissions is a space and not a '.'.
This is also CentOS if that matters.

you can use find:
find /tmp -type f -printf '%u\n' | sort -u
lightdm
root
tiago
If you need the UID in numeric form, as stat prints it:
find /tmp -type f -printf '%U\n' | sort -u
0
1000
104

You're asking two different questions.
I need to check if root is the owner of every file in a particular directory
To find any files that are not owned by root, you can do:
find /yourdir ! -user root
If it returns any filenames at all, then root is not the owner of every file in the particular directory.
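That check drops neatly into a small helper function. A sketch (the check_owner name is mine, not from the answer):

```shell
# check_owner DIR USER -> exit status 0 iff every file under DIR is owned by USER.
# -print -quit makes find stop at the first offending file, so large trees bail out early.
check_owner() {
    [ -z "$(find "$1" ! -user "$2" -print -quit)" ]
}
```

For example: check_owner /yourdir root && echo "all owned by root"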
How can I get the owner of every file in a directory in Linux?
To print every file in the directory with username:
find /yourdir -printf '%u %p\n'
And if the final step would be to chown the files not owned by root, you can simply do chown -R root /yourdir, since there's no harm in chowning root's files to root.

Try
find /your/dir/ -type f -exec stat --format='%u %n' '{}' \;
I added %n to display the file name.
Read find(1) for more about find's options. You may want -maxdepth 1 to avoid descending into subdirectories of /your/dir/.

for F in /directory/*; do stat --format='%u' "$F"; done
And optionally add dotglob option to match files beginning with . as well:
shopt -s dotglob
for F in /directory/*; do stat --format='%u' "$F"; done
* --format is equivalent to -c.

Related

Delete file if it is owned by specific user

There is a file that is sometimes not owned by root
I want my Perl script on Linux to check whether a file is owned by root and, if it is, delete it.
Currently what I have is unlink("$File_Path/File_Name");
but this just deletes the file. I want it to check whether it's owned by root first and delete it only then; otherwise ignore it.
Can you please guide me on how to achieve this? I am out of ideas.
The documentation for stat shows that the fifth element in the returned list is "numeric user ID of file's owner". The superuser account on *nix must have uid of 0, so
if ( (stat $fqn)[4] == 0 ) {
unlink $fqn or die "Error with unlink($fqn): $!";
}
If you're doing this to a bunch of files in a folder somewhere, you might be better off by just one of these:
find /folder/somewhere/ -type f -user root -exec rm {} \;
find /folder/somewhere/ -type f -user root -exec rm -i {} \; #interactive y/n each file
find /folder/somewhere/ -type f -user root -print0 | xargs -r0 rm
You might also need sudo in front of find. Careful though, this is a kind of command that can do a lot of harm...

List all user directories and look for specific file

I'm working on a script which will check for a specific file in ~/ of all users with a home directory.
I tried to ls /home and cd into each user's home directory, but I get a "too many arguments" error.
username=$(ls /home)
cd /home/$username
cat file.json
I expect the output of the json file, but it doesn't print anything even when the user has one.
Edit:
Now I need to extract the usernames of users that have file.json. I tried to do this with grep, but it didn't work.
files=$(find /home -name tilde.json -print)
echo "$files" >> jsons.txt
cat jsons.txt | grep /*/
This will find and list all files called file.json under the /home directory:
find /home -name file.json -print
You may want to redirect errors to /dev/null in the event you don't have access to all users' home dirs.
If you want to print out the contents of all these files, try:
find /home -name file.json -print -exec cat {} \;
To limit the search to only the directories under /home (i.e. not /home itself, and no sub directories in the user home), use:
find /home -mindepth 2 -maxdepth 2 -type f -name file.json -print -exec cat {} \;
I also added the -type flag there to limit the search to files and exclude any dirs that may happen to share the name.
This'll do:
cat /home/*/file.json
It'll print
cat: '/home/*/file.json': No such file or directory
on standard error if it can't find any.
What about:
cd /home
find . -maxdepth 1 -name file.json -exec cat {} \;
Suppose /home contains user1 and user2.
Then your cd command is invoked as
cd /home/user1 user2
That's not what you wanted, and isn't valid syntax for cd, which accepts only a single argument. You probably wanted a for loop instead.
If you can't predict such expansions, set -x enables tracing, which may provide insight into what commands are actually run, and would show your problem here. set +x to turn it off again.
Finally, note that not all users' home directories are necessarily under /home. You might want to use getent or similar to find all users' home directories.
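A sketch of those suggestions: a for loop over /home, plus a getent variant for home directories that live elsewhere (file.json is the name from the question; both loops are illustrative):

```shell
#!/bin/sh
# Loop over each directory under /home instead of handing cd several arguments:
for home in /home/*/; do
    if [ -f "$home/file.json" ]; then
        cat "$home/file.json"
    fi
done

# Home directories outside /home can be picked up from the passwd database
# (field 6 of each passwd entry is the home directory):
getent passwd | cut -d: -f6 | while read -r home; do
    if [ -f "$home/file.json" ]; then
        cat "$home/file.json"
    fi
done
```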

Display the Files and Folder details under a parent directory in Linux

I have to display all files and folders details under a parent directory.
I am using the command is 'find'. For example,
find /usr/local
/usr/local/bin
It displays only the file name. I have to display the file name together with details about the file, like below; that is, I need to add the following information to the result set above.
-rw-rw-r-- 1 hduser hduser 213 Jan 22 11:51
How to do it?
Thanks in advance.
There's the convenient action -ls:
find /usr/local -ls
If you need some other than the default -ls output format, the action -printf is appropriate; with that you can freely define the format, e. g.:
find /usr/local -printf "%i,%k,%M,%n,%u,%g,%s,%t,%p\n"
Cf. man find: Print File Information.
You can use the command below to list it nicely, in order and block-wise:
find . -type d | xargs ls -ltr
For your case:
find /usr/local -type d | xargs ls -ltr
Try sudo find /usr/local -name "filename" -depth -exec ls -l {} \;

How do I exclude a folder when performing file operations i.e. cp, mv, rm and chown etc. in Linux

How do you exclude a folder when performing file operations i.e. cp etc.
I would currently use the wild card * to apply file operation to all, but I need to exclude one single folder.
The command I'm actually wanting to use is chown to change the owner of all the files in a directory but I need to exclude one sub directory.
If you're using bash and enable extglob via shopt -s extglob then you can use !(<pattern>) to exclude the given pattern.
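A minimal, safe-to-run sketch of that extglob approach, using a throwaway directory (the names keep, other and file1 are placeholders):

```shell
#!/bin/bash
shopt -s extglob                 # enable !(pattern) and related extended globs

dir=$(mktemp -d)                 # throwaway directory for the demo
mkdir "$dir/keep" "$dir/other"
touch "$dir/file1"

# !(keep) expands to every entry in $dir except "keep"
echo "$dir"/!(keep)

# For the real task you would run something like:
#   chown -R owner "$dir"/!(keep)
rm -r "$dir"
```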
find dir_to_start -name dir_to_exclude -prune -o -print0 | xargs -0 chown owner
find dir_to_start -not -name "file_to_exclude" -print0 | xargs -0 chown owner
for file in *; do
if [ $file != "file_I_dont_want_to_chown" ]
then
chown -R Camsoft $file
fi
done
Combine multiple small sharp tools of unix:
To exclude the folder "foo"
% ls -d * | grep -v foo | xargs -d "\n" chown -R Camsoft
For this situation I would recommend using find. You can specify paths to exclude using -not -iwholename 'PATH'. Then, using -exec, you run the command you want:
find . -not -iwholename './var/foo*' -exec chown www-data '{}' \;
Although this probably covers your situation, I have also seen scripts set the immutable flag. Make sure you remove the flag when you're done; use trap for this in case the script is killed early (note: run from a script, the trap code runs when the bash session exits). A lot of trouble in my opinion, but it's good in some situations.
cd /var
trap 'chattr -R -i foo > /dev/null 2>&1' 0
chattr -R +i foo
chown -R www-data *
Another option might be to temporarily remove permissions on that file/folder.
In Unix you need 'x' permission on a directory to enter it.
Edit: obviously this isn't going to work if you are backing up a live production database, but for excluding your 'interesting images' collection when copying documents to a USB key it's reasonable.

How can I generate a list of files with their absolute path in Linux?

I am writing a shell script that takes file paths as input.
For this reason, I need to generate recursive file listings with full paths. For example, the file bar has the path:
/home/ken/foo/bar
but, as far as I can see, both ls and find only give relative path listings:
./foo/bar (from the folder ken)
It seems like an obvious requirement, but I can't see anything in the find or ls man pages.
How can I generate a list of files in the shell including their absolute paths?
If you give find an absolute path to start with, it will print absolute paths. For instance, to find all .htaccess files in the current directory:
find "$(pwd)" -name .htaccess
or if your shell expands $PWD to the current directory:
find "$PWD" -name .htaccess
find simply prepends the path it was given to a relative path to the file from that path.
Greg Hewgill also suggested using pwd -P if you want to resolve symlinks in your current directory.
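That pwd -P suggestion in one line, following the same pattern as above:

```shell
# pwd -P prints the physical working directory (all symlinks resolved),
# so every path find prints is free of symlinked components:
find "$(pwd -P)" -name .htaccess
```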
readlink -f filename
gives the full absolute path, but if the file is a symlink, you'll get the final resolved name.
Use this for dirs (the / after ** is needed in bash to limit it to directories):
ls -d -1 "$PWD/"**/
this for files and directories directly under the current directory, whose names contain a .:
ls -d -1 "$PWD/"*.*
this for everything:
ls -d -1 "$PWD/"**/*
Taken from here
http://www.zsh.org/mla/users/2002/msg00033.html
In bash, ** is recursive if you enable shopt -s globstar.
You can use
find $PWD
in bash
ls -d "$PWD/"*
This looks only in the current directory. It quotes "$PWD" in case it contains spaces.
Command: ls -1 -d "$PWD/"*
This will give the absolute paths of the file like below.
[root@kubenode1 ssl]# ls -1 -d "$PWD/"*
/etc/kubernetes/folder/file-test-config.txt
/etc/kubernetes/folder/file-test.txt
/etc/kubernetes/folder/file-client.txt
Try this:
find "$PWD"/
You get list of absolute paths in working directory.
You can do
ls -1 |xargs realpath
If you need to specify an absolute path or relative path You can do that as well
ls -1 $FILEPATH |xargs realpath
$PWD is a good option, as Matthew said above. If you want find to print only files, you can also add the -type f option to search only normal files. Other options are "d" for directories only, etc. So in your case it would be (if I want to search only for files with the .c extension):
find $PWD -type f -name "*.c"
or if you want all files:
find $PWD -type f
Note: You can't make an alias of the above command with double quotes, because $PWD would then be expanded when the alias is defined by bash rather than when it is run.
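If the alias won't cooperate, a shell function sidesteps the issue, since its body is evaluated each time you call it (the name ff is my own, arbitrary choice):

```shell
# $PWD here is read at call time, so the function follows you around:
ff() { find "$PWD" -type f "$@"; }
```

Then ff -name "*.c" behaves like the command above, from whatever directory you are in.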
If you give the find command an absolute path, it will spit out the results with absolute paths. So, from the ken directory, if you were to type:
find /home/ken/foo/ -name bar -print
(instead of the relative path find . -name bar -print)
You should get:
/home/ken/foo/bar
Therefore, if you want an ls -l and have it return the absolute path, you can just tell the find command to execute an ls -l on whatever it finds.
find /home/ken/foo -name bar -exec ls -l {} \;
NOTE: There is a space between {} and ;
You'll get something like this:
-rw-r--r-- 1 ken admin 181 Jan 27 15:49 /home/ken/foo/bar
If you aren't sure where the file is, you can always change the search location. As long as the search path starts with "/", you will get an absolute path in return. If you are searching a location (like /) where you are going to get a lot of permission denied errors, then I would recommend redirecting standard error so you can actually see the find results:
find / -name bar -exec ls -l {} \; 2> /dev/null
(2> is the syntax for the Bourne and Bash shells, but it will not work with the C shell. It may work in other shells too, but I only know for sure that it works in Bourne and Bash.)
Just an alternative to
ls -d "$PWD/"*
to point out that the * is shell expansion, so
echo "$PWD/"*
would do the same (the drawback being that you cannot use -1 to separate entries with newlines rather than spaces).
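If one entry per line is what you want from that expansion, printf gets around the -1 limitation (a small variation, not from the original answer):

```shell
# printf repeats its format for every argument, so each
# expanded path lands on its own line:
printf '%s\n' "$PWD"/*
```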
fd
Using fd (alternative to find), use the following syntax:
fd . foo -a
Where . is the search pattern and foo is the root directory.
E.g. to list all files in etc recursively, run: fd . /etc -a.
-a, --absolute-path Show absolute instead of relative paths
If you need list of all files in current as well as sub-directories
find $PWD -type f
If you need list of all files only in current directory
find $PWD -maxdepth 1 -type f
You might want to try this.
for name in /home/ken/foo/bar/*
do
echo "$name"
done
You can get the absolute paths using a simple for loop and echo, without find.
Find jar files recursively and print their absolute paths:
find . -name "*.jar" -exec readlink -f {} \;
/opt/tool/dev/maven_repo/com/oracle/ojdbc/ojdbc8-19.3.0.0.jar
/opt/tool/dev/maven_repo/com/oracle/ojdbc/ons-19.3.0.0.jar
/opt/tool/dev/maven_repo/com/oracle/ojdbc/oraclepki-19.3.0.0.jar
/opt/tool/dev/maven_repo/com/oracle/ojdbc/osdt_cert-19.3.0.0.jar
/opt/tool/dev/maven_repo/com/oracle/ojdbc/osdt_core-19.3.0.0.jar
/opt/tool/dev/maven_repo/com/oracle/ojdbc/simplefan-19.3.0.0.jar
/opt/tool/dev/maven_repo/com/oracle/ojdbc/ucp-19.3.0.0.jar
This works best if you want a dynamic solution that works well in a function
lfp ()
{
    ls -1 "$1" | xargs -I{} echo "$(realpath "$1")/{}"
}
lspwd() { for i in "$@"; do ls -d -1 "$PWD/$i"; done; }
Here's an example that prints out a list without an extra period and that also demonstrates how to search for a file match. Hope this helps:
find . -type f -name "extr*" -exec echo `pwd`/{} \; | sed "s|\./||"
This worked for me. But it didn't list in alphabetical order.
find "$(pwd)" -maxdepth 1
This command lists alphabetically and includes hidden files too:
ls -d -1 "$PWD/".*; ls -d -1 "$PWD/"*;
stat
Absolute path of a single file:
stat -c %n "$PWD"/foo/bar
This will give the canonical path (will resolve symlinks): realpath FILENAME
If you want canonical path to the symlink itself, then: realpath -s FILENAME
Most if not all of the suggested methods produce paths that cannot be used directly in another terminal command if the path contains spaces. Ideally the results would have the spaces escaped with backslashes.
This works for me on macOS:
find / -iname "*SEARCH TERM spaces are okay*" -print 2>&1 | grep -v denied |grep -v permitted |sed -E 's/\ /\\ /g'
for p in <either relative or absolute path of the directory>/*; do
echo "$(realpath -s "$p")"
done
Recursive files can be listed in many ways in Linux. Here I am sharing one-liners to list files recursively, to empty every file (files only) under /var/log/, and to check which log files were written to most recently.
First:
find /var/log/ -type f #listing file recursively
Second:
for i in $(find $PWD -type f) ; do cat /dev/null > "$i" ; done #empty files recursively
Third use:
ls -ltr $(find /var/log/ -type f ) # listing file used in recent
Note: for directory location you can also pass $PWD instead of /var/log.
If you don't have symbolic links, you could try
tree -ifL 1 [DIR]
-i makes tree print filenames on each line, without the tree structure.
-f makes tree print the full path of each file.
-L 1 keeps tree from recursing.
Write one small function
lsf() {
    ls "$(pwd)/$1"
}
Then you can use like
lsf test.sh
It gives the full path, like:
/home/testuser/Downloads/test.sh
I used the following to list absolute path of files in a directory in a txt file:
find "$PWD" -wholename '*.JPG' >test.txt
find / -print will do this
ls -1 | awk -v path="$PWD/" '{print path $0}'