List multiple directories' contents and their paths - linux

I want to write a Linux script or command that will:
Look into multiple specific directories and list their contents.
For example
/test/dir1/abc/version1/program_name/
/test/dir1/abc/version2/program_name/
/test/dir1/abc/version3/program_name/
/test/dir1/bca/version1/program_name/
/test/dir1/bca/version2/program_name/
/test/dir1/bca/version3/program_name/
/test/dir1/cab/version1/program_name/
/test/dir1/cab/version2/program_name/
/test/dir1/cab/version3/program_name/
I can do a
ls -al /test/dir1/*/
and see their contents. But I just want to see what is inside version2 and version3.
For example:
ls -al /test/dir1/*/<version2 or version3>/*
and get a list like:
/test/dir1/abc/version2/program_name/
/test/dir1/abc/version3/program_name/
/test/dir1/bca/version2/program_name/
/test/dir1/bca/version3/program_name/
/test/dir1/cab/version2/program_name/
/test/dir1/cab/version3/program_name/
Not including version1. There are more directories than just version1, version2, and version3, which is why simply excluding version1 doesn't work.
Any help is really appreciated!

You want to combine a wildcard with a character class in your glob for this search. Try this:
ls -al /test/dir1/*/version[23]/*
It will search through all of the /test/dir1/* directories, and then look for subdirectories matching either 'version2' or 'version3'.
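If you need this inside a script rather than a single ls invocation, a minimal sketch using the same glob (assuming the /test/dir1 layout from the question):
for d in /test/dir1/*/version[23]/; do
    printf '== %s ==\n' "$d"    # print the directory path as a header
    ls -al "$d"                 # then list its contents
done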

You can use brace expansion in Bash:
ls -al /test/dir1/*/{version2,version3}/*
Note that, unlike a glob, brace expansion generates both paths whether or not they exist, so ls will report an error for any missing ones.
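Brace expansion happens before pathname expansion, so this is equivalent to passing both glob patterns to ls; you can preview exactly what ls will receive with echo:
echo /test/dir1/*/{version2,version3}/*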

Related

Using wildcard with 'ls'

I want to list all the files having names like these:
12.0.3.1_CA
12.0.3.2A_CA
12.0.3.2B_CA
I tried
ls -ltr 12.0.3.?*_CA
That worked, but it also wrongly matches files like:
12.0.3.2AA_CA
12.0.3.2A2_CA
If you want at most one more character after the '?', list both patterns:
ls -ltr 12.0.3.?_CA 12.0.3.??_CA
Each ? matches exactly one character, so the first pattern covers the one-character names and the second the two-character ones. A bracket expression [] matches a single character from the set it encloses; for example, [ac] matches 'a' or 'c', so 12.0.3.?[A-Z]_CA would require the second character to be an uppercase letter.
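If you are using bash, another option is an extended glob (a sketch, assuming extglob is available in your bash), where ?(x) matches zero or one occurrence of x:
shopt -s extglob            # enable extended globbing
ls -ltr 12.0.3.??(?)_CA     # one character, optionally one more, then _CA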

Like a vlookup but in bash to match filenames in a directory against a ref file and return full description

I am aware there isn't a special bash function to do this and we will have to build this with available tools -- e.g. sed, awk, grep, etc.
We dump files into a directory and while their filename looks random, they can be mapped to their full description. For example:
/tmp/abcxyz.csv
/tmp/efgwaz.csv
/tmp/mnostu.csv
In filemapping.dat, we have:
abcxyz, customer_records_abcxyz
efgwaz, routernodes_logs_efgwaz
mnostu, products_campaign_mnostu
We need to go through each of them in the directory recursively and rename the file with its full description. Final outcome:
/tmp/customer_records_abcxyz.csv
/tmp/routernodes_logs_efgwaz.csv
/tmp/products_campaign_mnostu.csv
I found something similar here, but I am not sure how to work it out at directory level, dealing with only one file as the lookup/reference file. Please help. Thanks!
I would try something like this:
sed 's/,/.csv/;s/$/.csv/' filemapping.dat | xargs -n2 mv
Either cd to tmp beforehand, or modify the sed command to include the path name.
The sed commands simply replace the comma and the line end with the string ".csv".
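A minimal alternative sketch using a read loop, assuming the files sit directly in /tmp as in the example (no recursion needed), which also skips entries whose file is missing:
while IFS=', ' read -r key desc; do
    # move only when the mapping line is complete and the file exists
    [ -n "$desc" ] && [ -e "/tmp/$key.csv" ] && mv "/tmp/$key.csv" "/tmp/$desc.csv"
done < filemapping.dat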

Korn shell get latest file matching pattern

I need help writing a Korn shell script for the below.
The script has to be written in dir ..../script.
The files below are in dir ..../files.
There are 2 file patterns:
xxx892_1.txt
xxx367_8.txt
xxx356_9.txt
yyy736_9.txt
yyy635_7.txt
I need to get the latest files (last created) matching the patterns
xxx and yyy, i.e. from the above, xxx356_9.txt and yyy635_7.txt, and FTP them over.
Please help with this. Thanks.
If by latest you mean timestamp, you can do something like this:
ls -t xxx* | head -1 # this will give you the most recently modified xxx file
ls -t yyy* | head -1 # and the most recently modified yyy file
The above will give you the file names, which you can use for FTP.
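Putting it together as a Korn shell script (a sketch: remotehost, myuser, and mypass are placeholders, the two directories are assumed to be siblings, and note that ls -t sorts by modification time, not creation time):
#!/bin/ksh
cd ../files || exit 1                 # files live in ..../files, script in ..../script
latest_xxx=$(ls -t xxx* | head -1)    # most recently modified xxx file
latest_yyy=$(ls -t yyy* | head -1)    # most recently modified yyy file
ftp -n remotehost <<EOF
user myuser mypass
put $latest_xxx
put $latest_yyy
bye
EOF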

Gsutil wildcard search

I have to search for 2 files from a list of files.
The 2 files are in the format googlea_1234_20151208.txt and googleb_7654_20151208.txt, so I am thinking of basing my search on the keywords googlea, googleb, and 20151208.
Using gsutil, I can find the individual files.
The command gsutil ls gs://bucketid/*googlea_1234_20151208* gives me the first file, and gsutil ls gs://bucketid/*googleb_7654_20151208* gives me the second.
I am looking for a single command which will give me both files.
gsutil ls gs://bucketid/*google*20151208*
Assuming that gsutil is just passing the ls args to a real ls type processor, try
gsutil ls gs://bucketid/*google[ab]_*20151208*
The [ab] is known as a character class. Rather than using ? to match any single character, [ab] says: match a single character if it is a or b.
IHTH
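One detail worth noting: gsutil does its own wildcard matching, including [] character classes, so it is safest to quote the URL to stop the local shell from trying to expand the pattern against local file names first:
gsutil ls 'gs://bucketid/*google[ab]_*20151208*'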

Recursive grep with --include giving incorrect results for current folder

I have created a test directory structure:
t1.html
t2.php
b/t1.html
b/t2.php
c/t1.html
c/t2.php
All files contain the string "HELLO".
The following commands are run from the root folder above:
> grep -r "HELLO" *
b/t1.html:HELLO
b/t2.php:HELLO
c/t1.html:HELLO
c/t2.php:HELLO
t1.html:HELLO
t2.php:HELLO
> grep -r --include=*.html "HELLO" *
b/t1.html:HELLO
c/t1.html:HELLO
t2.php:HELLO
Why is it including the correct .html files from the sub-directories, but the .php file from the current directory?
If I pop up a level to the directory above my whole structure, then it gives the following result:
grep -r --include=*.html "HELLO" *
a/t1.html:HELLO
a/c/t1.html:HELLO
a/b/t1.html:HELLO
This is what I expected when run from within my structure.
I assume I can achieve the goal using find+grep together, but I thought this was valid usage of grep.
Thanks for any help.
Andy
Use a dot instead of the asterisk, and quote the include pattern so the shell cannot touch it:
grep -r --include='*.html' HELLO .
The asterisk gets evaluated by the shell and replaced with the list of all the files in the current directory (whose names don't start with a dot). Those files are passed to grep explicitly rather than discovered during recursion, and handling of --include for explicitly named files has varied between grep versions, which is why t2.php showed up. With . as the starting point, every file is reached through the recursive walk and the --include filter applies uniformly.
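For the find+grep combination the question mentions, a minimal equivalent sketch:
find . -name '*.html' -exec grep -H HELLO {} +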
