Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 4 years ago.
I'm wondering if I can merge multiple ls results.
I have to combine three ls -l results: for two directories I need to list only the directories they contain, and for the third only the symbolic links. The merged list should finally be sorted by name or by date.
I did this in Java code, but I would like to do it with a single Linux command line if possible.
Thanks
I think you are better off doing this with find, while still using ls -l to produce the file information that sort will order. Like:
find /path/dir1 /path/dir2 /path/dir3 -mindepth 1 -maxdepth 1 -exec ls -ld --full-time {} + | sort -k 6,7
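The question also wants only directories from two of the paths and only symbolic links from the third, plus the option of sorting by name; a rough sketch of how that could look with -type filters, assuming GNU find/ls and using a hypothetical temporary file /tmp/merged.txt:
find /path/dir1 /path/dir2 -mindepth 1 -maxdepth 1 -type d -exec ls -ld --full-time {} + > /tmp/merged.txt    # directories only
find /path/dir3 -mindepth 1 -maxdepth 1 -type l -exec ls -ld --full-time {} + >> /tmp/merged.txt             # symbolic links only
sort -k 6,7 /tmp/merged.txt    # sort by date/time; use sort -k 9 to sort by name instead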
Closed. This question is not about programming or software development. It is not currently accepting answers.
Closed 6 days ago.
In a grep that I use to find some values in a log, I'm using
-exec grep -cHF "55=36" {} \; | grep -v ":0"
to show only the files whose match count is different from zero, so I get this output:
opt/route/file_1.log:7
I want to know how I can set a range for the numbers shown; for example, if grep finds only 7 matches it should show nothing, but if there are more than 50 matches (> 50), it should show the output.
I was told that maybe something like this could work:
grep -v ':[0-7]$'
but it doesn't seem to work for me.
Like this:
<INPUT> | tail -n+7
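If the goal is a numeric threshold (hide a count of 7 but show counts above 50), filtering on the count itself is another option; a minimal sketch, assuming the count is the last colon-separated field of lines like opt/route/file_1.log:7:
<INPUT> | awk -F: '$NF > 50'    # print only lines whose final field (the match count) exceeds 50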
Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 2 years ago.
I have one file named config.tar.gz. Inside this tar file there are a couple of files, and I need to get the size of one of them.
I am trying the following:
tar -vtf config.tar.gz | grep sgr.txt
Output:
-rw-r--r-- root/DIAGUSER 109568 2019-11-26 10:16:21 sgr.txt
From this I need to extract only the size, in human-readable format, similar to the output of "ls -sh sgr.txt".
You could try:
tar -ztvf file.tar.gz 'specific_file' | awk '{print $3}'
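That prints the size in plain bytes. To get something closer to ls -sh, the byte count could be piped through numfmt from GNU coreutils; a sketch, assuming numfmt is available and using the file names from the question:
tar -ztvf config.tar.gz sgr.txt | awk '{print $3}' | numfmt --to=iec    # e.g. 109568 becomes 107K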
Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 1 year ago.
I want to make a command that lists only those directories whose names consist of just two letters.
How can I do it?
? is a wildcard for one character. So, the following should work:
ls -d ??/
The -d prevents ls from listing the contents of the directories, the final / excludes files.
ls -F | grep -o "^.\{2\}/$"
ls -F appends an indicator character to each entry according to its file system object type (a trailing / for directories)
| grep -o keeps only the part matching the regular expression ^.\{2\}/$, which basically says 'match only folders with 2 characters in their name'
ls -d */ | awk 'length($0) == 3'
(two characters plus the trailing slash added by the */ glob). Note that this does not match hidden directories. choroba's answer is better, because it is usually a bad idea to parse the output of ls, but I like this for its readability.
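A find-based sketch that avoids parsing ls output (assuming GNU find for -mindepth/-maxdepth):
find . -mindepth 1 -maxdepth 1 -type d -name '??'    # directories whose names are exactly two characters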
Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 7 years ago.
I tried
ls */ | grep "\.txt$"
to find all the .txt files in the subdirectories, but it doesn't seem to work reliably.
The pattern you want can easily be matched with a single glob:
ls */*.txt
The ls isn't necessary; it just demonstrates that it works. You can also use
echo */*.txt
printf '%s\n' */*.txt
files=( */*.txt )
for f in */*.txt; do ....
The pattern itself (*/*.txt) will expand to the list of the matching files; what you can do with that list is fairly broad.
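If the .txt files can sit more than one directory level down, the */*.txt glob will not reach them; a short sketch using find, assuming a recursive search is acceptable:
find . -type f -name '*.txt'    # lists .txt files at any depth below the current directory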
Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 8 years ago.
Using UNIX/Linux commands, pipes ("|") and redirections (">", ">>"), make a listing of the 5 smallest files in the "/etc" directory whose names contain the string ".conf", sorted by increasing file size.
This will work:
ls -lS /etc | sort -k 5 -n | grep "\.conf" | head -n 5
First list the files with their details, then sort numerically on the 5th column (the size), then keep only the lines containing the string ".conf", and finally show just the first 5 lines.
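An alternative sketch that avoids parsing ls output, assuming GNU find and coreutils:
find /etc -maxdepth 1 -type f -name '*.conf*' -printf '%s %p\n' | sort -n | head -n 5    # size in bytes, then path, smallest first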