Linux: 'ls' all jpg files recursively in csv

I'm trying to find a way to list all jpg images in all subdirectories, in CSV format, without the directory name present.
ls -R -1 -m . | grep '.jpg'
The ls command outputs CSV fine on its own, but the grep command breaks the CSV format, making each file appear on a new line instead of comma separated.
I know I can use 'find' to list images, but it seems to output the files in a different order than 'ls', and I don't see a CSV output option for 'find'.
I need the images in each subdirectory on one comma-separated line.

I believe this does what you want. Each output line is a list of jpgs in a single directory, separated by commas.
ls -d */ | xargs -i{} sh -c 'cd {}; ls -m *.jpg'
If you want to know which line belongs to which directory, you can run it in two steps like this:
ls -d */ > dirs.txt
cat dirs.txt | xargs -i{} sh -c 'cd {}; ls -m *.jpg'
Each line in dirs.txt then corresponds to the matching line of output.
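If you also want each directory named on its own line, a small loop over find is one alternative. This is only a sketch: it recurses to any depth, and it will misbehave if a directory name contains a newline.
#!/usr/bin/env bash
# For every directory, print "dir: file1, file2, ..." for its jpg files.
find . -type d | while IFS= read -r dir; do
  # ls -m joins the names with commas; stderr is silenced for dirs with no jpgs
  files=$(cd "$dir" && ls -m -- *.jpg 2>/dev/null)
  [ -n "$files" ] && printf '%s: %s\n' "$dir" "$files"
done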

Related

Linux commands to get Latest file depending on file name

I am new to Linux. I have a folder with many files in it, and I need to get the latest file based on the file name. Example: I have 3 files RAT_20190111.txt RAT_20190212.txt RAT_20190321.txt. I need a Linux command to move the latest file, RAT_20190321.txt, to a specific directory.
If the file pattern remains the same, then you can try the command below:
mv $(ls RAT*|sort -r|head -1) /path/to/directory/
As pointed out by @wwn, there is no need to use sort: since the file names are lexicographically sortable, ls already sorts them, so the command becomes:
mv $(ls RAT*|tail -1) /path/to/directory
The following command also works.
ls -p | grep -v '/$' | sort | tail -n 1 | xargs -d '\n' -r mv -t /path/to/directory --
The command lists the directory contents one name per line (excluding subdirectories, which ls -p marks with a trailing slash), sorts them, takes the last file, and moves it to the required directory.
Hope it helps.
You could also use the command below:
cp "$(ls | tail -n 1)" /data...
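Another option that avoids parsing ls altogether is a bash glob, which already expands in sorted order. A sketch, assuming bash 4.3+ (for negative array indices):
#!/usr/bin/env bash
shopt -s nullglob            # expand to an empty list when nothing matches
files=(RAT_*.txt)            # glob results are already lexicographically sorted
if ((${#files[@]})); then
  mv -- "${files[-1]}" /path/to/directory/   # last element = latest date in the name
fi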

Modify ls output to display [+] in front of directories

I am looking for a way to modify the ls output so that every directory displays [+] in front of the directory name. Ideally this would be done via .bashrc.
me#computer[~]$ ls
[+]directory [+]directory
[+]directory file.png
file file.txt
readme
Currently I am just customizing the color output:
LS_COLORS=$LS_COLORS:'di=1;37;4' ; export LS_COLORS
This might help you, but it gives you only single-column output:
ls | sed -r "$(find -maxdepth 1 -type d | cut -d/ -f2 | sed "1 d; 2~1 { s:.*:s/^\\(&\\)$/[+]\\\\1/;:g}")"
It works by piping the output of ls through sed, and the sed script is dynamically built by a pipeline that converts a list of directories into a list of s/^dirname$/[+]dirname/; sed script lines.
Just try out all the parts individually to see how it works.
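To see how the script is built, run just the inner pipeline on its own. In a directory such as /etc (used in the example below), it would generate one substitution command per subdirectory, along these lines:
find -maxdepth 1 -type d | cut -d/ -f2 | sed "1 d; 2~1 { s:.*:s/^\\(&\\)$/[+]\\\\1/;:g}"
# produces one sed substitution per directory, e.g.:
# s/^\(acpi\)$/[+]\1/;
# s/^\(adobe\)$/[+]\1/;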
For example, when run in /etc, the output starts like this:
[+]acpi
adduser.conf
[+]adobe
[+]akonadi
aliases
aliases.db
You might want to alias the command in your bashrc.
And you might want to look into the tree command.
You can use:
ls -l : directories will start with d.
ls -p : a slash will be added to directory names, like dir/
ls -F : will also add a slash after dir names, plus other marks for other file types (*, etc.)
ls -d */ : as advised in the comments, will list only directory names with a slash at the end. Remove -d to also see subdirectory contents.
In terms of manipulating the ls output, you could do something like:
ls -l |awk '/^d/{print "[+]"$NF}; /^[^d]/{print $NF}' |column
You can also use find and avoid parsing ls, since, as has been noted, parsing ls can break if file names contain unusual characters such as newlines.
find used in this form produces output identical to the ls version above:
find . -maxdepth 1 -printf '%Y %f\n' |awk '/^d/{print "[+]"$NF}; /^[^d]/{print $NF}' |column
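If file names may contain spaces, $NF will truncate them at the last space; a tab separator avoids that. A sketch along the same lines, still assuming GNU find and no newlines in names (column will still split on spaces when laying out the display, so drop it if that matters):
find . -mindepth 1 -maxdepth 1 -printf '%Y\t%f\n' |
  awk -F'\t' '$1 == "d" {print "[+]" $2; next} {print $2}' | column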
You could also try a bash function:
#!/usr/bin/env bash
myls() {
  for i in *; do
    # directories get a [+] prefix
    [[ -d "${i}" ]] && {
      printf "%s\n" "[+] ${i}"
      continue;
    }
    printf "%s\n" "${i}"
  done
}
Source the script in your .bashrc file. Whenever you want to use this, just call myls in the directory.
Note that it does not give you colored output.

grep - limit number of files read

I have a directory with over 100,000 files. I want to know if the string "str1" exists as part of the content of any of these files.
The command:
grep -l 'str1' *
takes too long, as it reads all of the files.
How can I ask grep to stop reading any further files if it finds a match? Any one-liner?
Note: I have tried grep -l 'str1' * | head but the command takes just as much time as the previous one.
Passing 100,000 filenames as command arguments is going to cause a problem: it probably exceeds the shell's command-line length limit.
But you don't have to name all the files if you use the recursive option with just the name of the directory the files are in (which is . if you want to search files in the current directory):
grep -l -r 'str1' . | head -1
Use grep -m 1 so that grep stops after finding the first match in each file. It is extremely efficient for large text files.
grep -m 1 str1 * /dev/null | head -1
If only a single file matches, the /dev/null above ensures that grep still prints the file name in the output (grep omits the name when given just one file to search).
If you want to stop after finding the first match in any file:
for file in *; do
  if grep -q -m 1 str1 "$file"; then
    echo "$file"
    break
  fi
done
The for loop also saves you from the too many arguments issue when you have a directory with a large number of files.
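If you want a one-liner that also sidesteps the argument-list limit without recursing into subdirectories, a find pipeline is one option (a sketch; -print0 and -0 keep unusual file names intact):
find . -maxdepth 1 -type f -print0 | xargs -0 grep -l -m 1 -- str1 | head -n 1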

How do I use the pipe command to display attributes in a file?

I'm currently making a shell program and I want to display the total number of bytes in a specific file using a pipe. I know that the pipe takes whatever is on the left side and gives it to the right side as input (assuming you are in the directory the file is in).
I know that the command (wc -c) displays the number of bytes in a file, but I'm not sure how to pipe it. What I've tried was:
ls fileName.sh | wc -c
wc counts whatever it reads; here it reads the file name printed by ls, not the file's contents. Pass the file name as an argument instead:
wc -c fileName.sh
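If you specifically want the count to arrive through a pipe or redirection (which also makes wc print only the number, with no file name), feed the file's contents to wc's standard input:
wc -c < fileName.sh      # redirection: prints just the byte count
cat fileName.sh | wc -c  # pipe: same count, one extra process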
The wc program takes multiple arguments. You can do this to apply it to all entries in the current working directory:
wc -c $(ls)
Another approach is to use xargs to convert input to arguments:
ls | xargs wc -c
You may need a more complex line if you have spaces in your filenames. ls can output one file per line, and xargs can be told to split only on \n:
ls -1 | xargs -d '\n' wc -c
If you prefer to use find instead of ls (a more powerful tool), find's -print0 option pairs with xargs's -0 option.
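A sketch of that combination, restricted to regular files in the current directory:
find . -maxdepth 1 -type f -print0 | xargs -0 wc -c
The null byte cannot appear in a file name, so this handles spaces and even newlines safely.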

Find specific string in subdirectories and order top directories by modification date

I have a directory structure containing some files. I'm trying to find the names of the top-level directories that contain a file with a specific string in it.
I've got this:
grep -r abcdefg . | grep commit_id | sed -r 's/\.\/(.+)\/.*/\1/';
Which returns something like:
topDir1
topDir2
topDir3
I would like to be able to take this output and somehow feed it into this command:
ls -t | grep -e topDir1 -e topDir2 -e topDir3
which would return the output filtered by the first command and ordered by modification date.
I'm hoping for a one-liner. Or maybe there is a better way of doing it?
This should work as long as none of the directory names contain whitespace or wildcard characters (note that dirname reads its operands from arguments, not standard input, so the sed expression from the question is reused here to extract the directory names, with sort -u to de-duplicate them):
ls -td $(grep -r abcdefg . | grep commit_id | sed -r 's/\.\/(.+)\/.*/\1/' | sort -u)
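If the directory names might contain spaces, word splitting inside $( ) will break them apart; handing the names to ls via xargs with a newline delimiter is one workaround. A sketch, which still cannot cope with newlines in names ([^/]+ is used instead of .+ so only the top-level path component is captured):
grep -r abcdefg . | grep commit_id | sed -r 's/\.\/([^/]+)\/.*/\1/' | sort -u |
  xargs -d '\n' ls -td --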
