Grep for a file with a specific name [duplicate] - linux

This question already has answers here:
Exclude a string from wildcard search in a shell
(3 answers)
Closed 5 years ago.
In my repository I have several files, including two specific JAR files named as follows:
backend-0.0.1-SNAPSHOT.jar
backend-0.0.1-SNAPSHOT.jar.original
I need to get only the first one, and I can only match it by its base name, "backend"; the version is not static and can change.
So I have done this:
ls | grep 'backend'
But this gets me both of them, so I need to grep for files beginning with backend and ending in .jar.
How can I do this?

Don't use the output of ls for scripting. Use find instead:
find . -maxdepth 1 -type f -name 'backend*.jar'
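If the goal is to use the name in a script, the find output can be captured into a variable. A minimal sketch, assuming exactly one matching jar exists (the /tmp path and file names are hypothetical stand-ins for the repository):

```shell
#!/bin/sh
set -e
# Hypothetical setup mirroring the question's repository layout.
mkdir -p /tmp/jar_demo && cd /tmp/jar_demo
touch backend-0.0.1-SNAPSHOT.jar backend-0.0.1-SNAPSHOT.jar.original

# -name 'backend*.jar' requires the name to END in .jar,
# so the .jar.original file is not matched.
jar=$(find . -maxdepth 1 -type f -name 'backend*.jar')
jar=${jar#./}        # strip the leading ./ that find prepends
echo "$jar"          # prints: backend-0.0.1-SNAPSHOT.jar
```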

Or, without using grep:
ls backend*.jar

ls | grep '^backend.*\.jar$'
The $ anchor means nothing may follow the r in jar, and the backslash makes the dot match a literal period rather than any character.
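A quick check of the anchored pattern against both file names from the question:

```shell
#!/bin/sh
# The anchored, escaped pattern matches only the bare .jar file:
# .jar.original fails the $ anchor because text follows "jar".
printf 'backend-0.0.1-SNAPSHOT.jar\nbackend-0.0.1-SNAPSHOT.jar.original\n' \
  | grep '^backend.*\.jar$'
# prints only: backend-0.0.1-SNAPSHOT.jar
```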

How to write script displaying all subdirectories in the location [duplicate]

This question already has answers here:
List sub-directories with ls [closed]
(3 answers)
Closed 3 years ago.
How can I write a script that displays all subdirectories in a given location? I've got something like this:
ls -al | grep ^d
but it only works in the home directory
find(1) may be a better choice here:
find . -type d
which would list all directories from the current directory and all subdirectories.
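If only the immediate subdirectories are wanted (closer to what the ls -al | grep ^d attempt did), find's depth options can limit the recursion. A sketch with a hypothetical directory tree:

```shell
#!/bin/sh
# Hypothetical tree: two first-level dirs, one nested deeper.
mkdir -p /tmp/dirs_demo/a/inner /tmp/dirs_demo/b
cd /tmp/dirs_demo

# -mindepth 1 skips "." itself; -maxdepth 1 stops at the first level.
find . -mindepth 1 -maxdepth 1 -type d
# lists ./a and ./b, but not ./a/inner
```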

How can I search for specific file contents in all files in both current folder and all subfolders [duplicate]

This question already has answers here:
How do I recursively grep all directories and subdirectories?
(26 answers)
Closed 4 years ago.
Using bash in the macOS Terminal, how can I search for a particular text string in all *.txt files in the current folder plus all files in the subfolders inside the current folder?
grep -i "xxx" */*
xxx is the target text I am trying to find.
find . -type f -print | egrep ".txt$" | xargs grep "SearchPattern"
Explained:
find lists all file names in the current directory and below and sends them to ...
egrep, which picks out the file names that end in .txt and sends those to ...
xargs, which executes a grep command on each file to look for SearchPattern.
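One caveat: that pipeline breaks on file names containing spaces or newlines. A null-delimited variant avoids this, and where GNU grep is available, its --include filter does the whole job in one command. A sketch with a hypothetical tree (note the space in one file name):

```shell
#!/bin/sh
# Hypothetical tree with an awkward file name.
mkdir -p /tmp/grep_demo/sub
printf 'hello SearchPattern\n' > '/tmp/grep_demo/sub/my notes.txt'
printf 'nothing here\n'        > /tmp/grep_demo/other.txt
cd /tmp/grep_demo

# Null-delimited: safe for any file name.
find . -type f -name '*.txt' -print0 | xargs -0 grep -l 'SearchPattern'

# GNU grep equivalent in a single command:
grep -rl --include='*.txt' 'SearchPattern' .
```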

Error: "grep: Argument list too long" [duplicate]

This question already has answers here:
How can I grep while avoiding 'Too many arguments' [duplicate]
(5 answers)
Closed 7 years ago.
I am trying to run the following command, but I get an "Argument list too long" error. Can you help?
HOST# grep -rl 'pattern' /home/*/public_html/*
-bash: /bin/grep: Argument list too long
Is there a way to work around this error and grep for the pattern in all users' public_html directories? There are around 500+ users on the same server.
Use find
find /home/*/public_html -type f -exec grep -l 'pattern' {} +
The + modifier makes find group the filenames into manageable chunks.
Alternatively, you can do it with grep -r. The arguments to this should be the directory names, not filenames:
grep -rl 'pattern' /home/*/public_html
This will expand to only 500+ arguments (one per directory), not thousands of filenames, staying well under the argument-length limit.
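A small sketch of the difference between the two -exec terminators: + batches many files into one grep invocation, while \; spawns one grep per file (hypothetical directory in /tmp standing in for the public_html trees):

```shell
#!/bin/sh
# Hypothetical stand-in for the users' directories.
mkdir -p /tmp/exec_demo
printf 'pattern\n' > /tmp/exec_demo/a.txt
printf 'pattern\n' > /tmp/exec_demo/b.txt

# One grep process handles a whole batch of files:
find /tmp/exec_demo -type f -exec grep -l 'pattern' {} +

# One grep process per file -- same output, slower on large trees:
find /tmp/exec_demo -type f -exec grep -l 'pattern' {} \;
```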

Removing files in a sub directory based on modification date [duplicate]

This question already has answers here:
bash script to remove directories based on modified file date
(3 answers)
Closed 8 years ago.
I'm trying to remove old backup files from a subdirectory when the number of files exceeds a maximum, and I found this command to do that:
ls -t | sed -e '1,10d' | xargs -d '\n' rm
And my changes are as follows
ls -t subdirectory | sed -e '1,$f' | xargs -d '\n' rm
Obviously when I try running the script it gives me an error saying unknown commands: f
My only concern right now is that I'm passing the maximum number of files allowed as an argument, so I'm storing that in f, but I'm not sure how to use that variable in the command instead of hard-coding a specific number.
Can anyone give me any pointers? And is there anything else I'm doing wrong?
Thanks!
The title of your question says "based on modification date", so why not simply use find with its -mtime option?
find subdirectory -mtime +5 -exec rm -v {} \;
This will delete all files older than 5 days (-mtime takes a plain number of 24-hour periods, with no unit suffix).
The problem is that the file list you are passing to xargs does not contain the needed path information to delete the files. When called from the current directory, no path is needed, but if you call it with subdirectory, you need to then rm subdirectory/file from the current directory. Try it:
ls -t subdirectory # returns files with no path info
What you need to do is change to the subdirectory, run the removal pipeline, then change back. You also need double quotes so the shell expands $f (in single quotes, $ is sed's last-line address and f an unknown command, which is exactly the error you saw), plus sed's d (delete) command after the address range. In one line it could be done with:
pushd subdirectory &>/dev/null; ls -t | sed -e "1,${f}d" | xargs -d '\n' rm; popd
Other than doing it in a similar manner, you are probably better off writing a slightly longer, more flexible script that forms the list of files with the find command, to ensure the path information is retained.
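A sketch of such a script: it keeps the $max newest regular files in a directory and removes the rest, letting find supply each entry with its full path. The function name and arguments are hypothetical, and GNU find/sort/tail/sed/xargs are assumed (for -printf, -z, and -r):

```shell
#!/bin/sh
# keep_newest DIR MAX: delete all but the MAX newest regular files in DIR.
# Hypothetical helper; GNU userland assumed.
keep_newest() {
  dir=$1; max=$2
  find "$dir" -maxdepth 1 -type f -printf '%T@ %p\0' |
    sort -z -rn |                      # newest first, NUL-delimited
    tail -z -n +"$((max + 1))" |       # skip the $max newest entries
    sed -z 's/^[^ ]* //' |             # drop the timestamp, keep the path
    xargs -0 -r rm --                  # -r: do nothing if list is empty
}

# Usage: keep at most 10 backups in 'subdirectory'
# keep_newest subdirectory 10
```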

how to list full paths of folders inside a directory in linux? [duplicate]

This question already has answers here:
Show full path when using options
(8 answers)
Closed 8 years ago.
I have a folder /home/Documents/myFolder, and inside it there are lots of other folders. I want a file list.txt that contains all the paths of the folders, with content like this:
/home/Documents/myFolder/1234
/home/Documents/myFolder/asd2
/home/Documents/myFolder/asdawgf
/home/Documents/myFolder/dawt
.
.
.
I tried ls > /home/Documents/myFolder/list.txt, but it was not what I want:
it just prints the folder names, not the full paths.
Use find to list all directories (-type d), then sed the output to replace the relative ./ prefix with the full path:
find . -type d | sed -n 's:^\./:/home/Documents/myFolder/:'p > /home/Documents/myFolder/list.txt
You can also give find the absolute directory as its starting point; it prints paths relative to whatever you pass it, so an absolute start yields absolute paths (-mindepth 1 excludes the starting directory itself, -type d excludes files):
find /home/Documents/myFolder -mindepth 1 -type d > /home/Documents/myFolder/list.txt
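A quick demonstration of that behavior, using a hypothetical /tmp tree in place of /home/Documents/myFolder:

```shell
#!/bin/sh
# Hypothetical stand-in for /home/Documents/myFolder.
mkdir -p /tmp/paths_demo/1234 /tmp/paths_demo/asd2

# Absolute starting point -> absolute paths in the output.
# (-type d keeps list.txt itself out of the results.)
find /tmp/paths_demo -mindepth 1 -type d | sort > /tmp/paths_demo/list.txt
cat /tmp/paths_demo/list.txt
# /tmp/paths_demo/1234
# /tmp/paths_demo/asd2
```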
