List all user directories and look for specific file - linux

I'm working on a script that checks for a specific file in the home directory (~/) of every user that has one.
I tried ls /home and then cd into each user's home directory, but it gives a "too many arguments" error.
username=$(ls /home)
cd /home/$username
cat file.json
I expect the output of the JSON file, but it doesn't print anything even when the user has a JSON file.
Edit:
Now I need to extract the usernames of the users that have the file file.json. I have tried to do this with grep, but it didn't work.
files=$(find /home -name tilde.json -print)
echo "$files" >> jsons.txt
cat jsons.txt | grep /*/

This will find and list all files called file.json under the /home directory:
find /home -name file.json -print
You may want to redirect errors to /dev/null in the event you don't have access to all users' home dirs.
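For example:
find /home -name file.json -print 2>/dev/null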
If you want to print out the contents of all these files, try:
find /home -name file.json -print -exec cat {} \;
To limit the search to only the directories directly under /home (i.e. not /home itself, and no subdirectories inside the user homes), use:
find /home -mindepth 2 -maxdepth 2 -type f -name file.json -print -exec cat {} \;
I also added the -type flag there to limit the search to files and exclude any dirs that may happen to share the name.

This'll do:
cat /home/*/file.json
It'll print
cat: '/home/*/file.json': No such file or directory
on standard error if it can't find any.
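If you just want to silence that message, one option is to send standard error to /dev/null:
cat /home/*/file.json 2>/dev/null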

What about:
cd /home
find . -maxdepth 2 -name file.json -exec cat {} \;

Suppose /home contains user1 and user2.
Then your cd command is invoked as
cd /home/user1 user2
That's not what you wanted, and isn't valid syntax for cd, which accepts only a single argument. You probably wanted a for loop instead.
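A minimal sketch of such a loop (assuming, as in your snippet, that the file is called file.json):
for dir in /home/*/; do
    # print the file if this user has one; hide the error otherwise
    cat "$dir/file.json" 2>/dev/null
done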
If you can't predict such expansions, set -x enables tracing, which may provide insight into what commands are actually run, and would show your problem here. set +x to turn it off again.
Finally, note that not all users' home directories are necessarily under /home. You might want to use getent or similar to find all user home directories.
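For example, a rough sketch using getent (the UID >= 1000 cut-off for regular accounts is an assumption; check /etc/login.defs on your system):
# field 3 of the passwd database is the UID, field 6 the home directory
getent passwd | awk -F: '$3 >= 1000 {print $6}' | while read -r home; do
    cat "$home/file.json" 2>/dev/null
done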


How to search (using find command) for directories and copy all the files and directory itself to another directory in linux?
Here is what I have so far:
find -type d -name "*.ABC" -exec {} /Desktop/NewFile \;
I get this as output:
find: './GAE/.ABC': Permission denied
Please Help, Thanks!
Your error above has nothing to do with file read permissions: you're trying to execute the directories you find! Avoid running commands as root or with sudo unless (1) you really need it and (2) you really know what you're doing. Quite often the people asking for root or sudo privileges are exactly the ones who should not have them.
That said... there are several ways to copy a directory tree under *nix. This is just one possible approach:
$ find <start> -type d -name \*.ABC -exec cp -av {} <target> \;
Where:
<start> is a directory name. It's used to tell find where to start its search (for example /usr/local or $HOME)
<target> is another directory name to define the final destination of your copied directories
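For example, assuming you want to search your home directory and copy the matches into ~/Desktop/NewFile (which must already exist):
find "$HOME" -type d -name \*.ABC -exec cp -av {} ~/Desktop/NewFile \;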
UPDATE
In case you want to search for multiple paths...
$ find <start> -type d \( -name \*.ABC -o -name \*.DEF \) -exec cp -av {} <target> \;
This should work:
find ./source_dir -name \*.png -print0 | xargs -0 cp -t path/to/destination

Get list of files that contain given text within directory given by pattern

I want to get a list of files that contain a given text within my file system. Furthermore, only files should be considered that are located in a directory given by a pattern.
So let's say I have a number of directories called myDir within my filesystem, as shown here:
/usr
    /myDir
/tmp
    /myDir
    /anotherDir
Now I want to get all the files within those directories that contain the text.
So basically I need to perform these steps:
loop over all directories named myDir on the whole file system
for every directory in that list, get the files that contain the search string
What I tried so far is find /etc /opt /tmp /usr /var -iname myDir -type d -exec ls -exec grep -l "SearchString" {} \;
However, this doesn't work, as the results of find are directories, which I may not use as input for grep. I assume I have to do one step in between the find and the grep, but I can't find out how to do this.
I think I got it and will show you a little script that achieves what I need:
for i in $(find / -type d -iname myDir); do
    for j in $(find "$i" -type f); do
        grep -l "SearchString" "$j"
    done
done
This will give me all the files that contain the SearchString and are located in any of the folders named myDir.
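A simpler variant that does the same in one pass, and also avoids the word splitting that the nested for loops suffer from when paths contain spaces (assumes a grep with -r and -l, which GNU and BSD grep both have):
find / -type d -iname myDir -exec grep -rl "SearchString" {} + 2>/dev/null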

create a list with content of multiple zip files in linux

I am trying to create a script for linux that will make a list with all files inside all zip files from a directory.
#! /bin/bash
for file in `find /home -iname "*.zip*" -type f`
do
unzip -l $(echo ${file}) >> /home/list.txt
done
It works, but only when there is no whitespace in the filenames.
What can I do to make it work?
You can use the find command to execute a command for each file it finds. Perhaps try something like:
find /home -iname "*.zip*" -type f -exec unzip -l {} \; > /home/list.txt
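If you prefer to keep the loop, a sketch that survives whitespace in filenames by using NUL-delimited output (GNU or BSD find and bash assumed):
find /home -iname "*.zip*" -type f -print0 |
while IFS= read -r -d '' file; do
    unzip -l "$file"
done > /home/list.txt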

Remove a bunch of directories from one location based on a list of directories in another location?

I have two directories in totally different places in the filesystem:
/path1/dir1/*
/path2/dir2/*
dir1 has a list of subdirectories and dir2 has a similar list of subdirectories, some of which are also in dir1
I'd like a command that can use a list of the subdirectories that are currently in dir1 and if they exist in dir2, delete them.
I was able to output a list of the subdirectory names using the find command and sed together like this:
find $PWD -maxdepth 1 -type d | sed -e 's|^/path1/dir1/||g' and that will output:
subdir1
subdir2
subdir3
but I don't know how to then feed that into a command to delete (recursively) those subdirectories from another location. Do I need to use awk or xargs or something?
Sounds like you want something like this:
cd /path1/dir1; find . -type d -maxdepth 1 -mindepth 1 -exec rm -rf /path2/dir2/{} \;
Replace the "rm -rf" with "echo" to see what directories it will delete before trying it :-)
The "-f" option prevents errors if the directory doesn't exist
Some versions of find (GNU?) also have "-execdir". You can use it like this:
find /path1/dir -type d -maxdepth 1 -mindepth 1 -execdir rm -rf /path2/dir2/{} \;
for dir in /path1/dir1/*/
do
    rm -rf /path2/dir2/"$(basename "$dir")"
done
You could also try using find to locate the dirs and piping to awk:
find /path1/dir1/ -maxdepth 1 -mindepth 1 -type d |awk 'BEGIN{FS="/"}{system("echo rm -rf /path2/dir2/"$NF);}'
remove the "echo" in the system() call when you are sure the command is behaving properly.

How to change all occurrences of a word in all files in a directory

I was in the process of creating a User class where one of the methods was get_privileges();.
After hours of slamming my head into the keyboard, I finally discovered that the previous coder, from whom I inherited this particular database, spelled the word "privileges" as "privelages" in the MySQL database, and thus it is also spelled that way everywhere in the hundreds of files that access these "privelages".
Is there a way in Linux (Ubuntu Server) that I can go through every place in the /var/www folder and replace "privelages" with "privileges", so that I don't have to deal with this typo and code around it?
A variation that takes into account subdirectories (untested):
find /var/www -type f -exec sed -i 's/privelages/privileges/g' {} \;
This will find all files (not directories, specified by -type f) under /var/www, and perform a sed command to replace "privelages" with "privileges" on each file it finds.
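A variant, if you'd rather not rewrite (and touch the mtime of) files that don't contain the word at all: let grep select the files first. The -Z and -0 pair assumes GNU grep and xargs, and keeps filenames with spaces safe:
grep -rlZ privelages /var/www | xargs -0 sed -i 's/privelages/privileges/g'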
Check this out: http://www.cyberciti.biz/faq/unix-linux-replace-string-words-in-many-files/
cd /var/www
sed -i 's/privelages/privileges/g' *
I generally use this short script, which will replace a string in all file contents, directory names, and filenames. To use it, copy the text below into a file called replace_string, run sudo chmod u+x replace_string, and then move it with sudo mv replace_string /usr/local/bin so that you can execute it from any directory.
NOTE: this only works on Linux (tested on Ubuntu) and fails on macOS. Also be careful with it, because it can mess up things like git files. I haven't tested it on binaries either.
#!/usr/bin/env bash
# This will replace all instances of a string in folder names, filenames,
# and within files. Sometimes you have to run it twice, if directory names change.
# Example usage:
# replace_string apple banana
echo $1
echo $2
find ./ -type f -exec sed -i -e "s/$1/$2/g" {} \; # rename within files
find ./ -type d -exec rename "s/$1/$2/g" {} \; # rename directories
find ./ -type f -exec rename "s/$1/$2/g" {} \; # rename files
