shell: parse ls -Al output and get the last field (file or directory name), any solution (Linux)

I must parse the ls -Al output and get the file or directory name.
Sample ls -Al output:
drwxr-xr-x 12 s162103 studs 12 march 28 2012 personal domain
drwxr-xr-x 2 s162103 studs 3 march 28 22:32 public_html
drwxr-xr-x 7 s162103 studs 8 march 28 13:59 WebApplication1
I should use only ls -Al | <something>
for example:
ls -Al | awk '{print $8}'
but this doesn't work, because $8 is not the name when the directory name contains spaces; it is only part of the name. Maybe there is some utility that extracts the last field, or deletes everything before it? I need to find any solution. Please help!
EDITED: I know that parsing ls -Al is a bad idea, but I must parse it with exactly the construction above! There is no way to use something like this:
for f in *; do
somecommand "$f"
done

Don't parse ls -Al if all you need is the file name.
You can put all file names in an array:
files=( * )
or you can iterate over the files directly:
for f in *; do
echo "$f"
done
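If hidden files should be included too (as ls -A would show them), bash's dotglob option covers that. A minimal sketch, assuming bash:
shopt -s dotglob nullglob   # dotglob: let * match hidden files; nullglob: an empty dir expands to nothing
for f in *; do
    echo "$f"
done
shopt -u dotglob nullglob   # restore the default globbing behaviour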
If there is something specific from ls that you need, update your question to specify what you need.

How about this:
ls -Al | awk '{$1=$2=$3=$4=$5=$6=$7=$8=""; print $0}'
I know it's a cheap trick, but since you don't want to use anything other than ls -Al, I can't think of anything better...
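Note that blanking the first eight fields leaves eight leading spaces on every line, and the "total" line from ls is still printed. A variant of the same trick that handles both (a sketch; because assignment makes awk rebuild the line, runs of spaces inside a name collapse to single spaces):
ls -Al | awk 'NR>1{$1=$2=$3=$4=$5=$6=$7=$8=""; sub(/^ +/,""); print}'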

Based on @squiguy's request in the comments, I post my comment as an answer:
What about just this?
ls -1A
with a 1 (one, the number) instead of l (L, the letter). It will list only the names of the files.
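A related point: when ls writes to a pipe instead of a terminal, it already lists one entry per line, so the names can be read back directly. A sketch (file names containing newlines would still break this):
ls -A | while IFS= read -r name; do
    printf '%s\n' "$name"    # IFS= and -r preserve leading blanks and backslashes
done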

It's also worth noting that find can do what you're looking for:
Everything in this directory (including hidden files and . itself), roughly equivalent to ls -A:
find . -maxdepth 1
Recursively, similar to ls -R:
find .
Only directories in a given directory:
find /path/to/some/dir -maxdepth 1 -type d
md5sum every regular file:
find . -type f -exec md5sum {} \;
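And for the original problem of getting just the name: if GNU find is available, -printf can emit the bare name and skip the ls parsing entirely. A sketch:
find . -mindepth 1 -maxdepth 1 -printf '%f\n'   # %f prints the name without the leading ./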

Hope this awk one-liner works for you:
ls -Al | awk 'NR>1{for(i=9;i<NF;i++)printf "%s ",$i;print $i}'
In case you're interested in sed:
ls -Al | sed '1d;s/^\([^ ]* *\)\{8\}//'
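The same skip-eight-fields idea also works with plain cut, if runs of blanks are squeezed first. A sketch; note that the squeeze also collapses repeated blanks inside a name:
ls -Al | tail -n +2 | tr -s ' ' | cut -d ' ' -f 9-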

Related

Proper way to format timestamp in bash

So I need to create an output file for an outside contractor containing a list of filenames from a directory and creation dates in a specific format like this:
FileName YYYYmmDDHHMMSS
So far I've come up with:
find ./ -type f -printf " %f %a\n"
which returns:
FileName Fri Apr 21 18:21:15.0458585800 2017
Or:
ls -l | awk '{print $9" "$6" "$7" "$8}'
which returns:
FileName Apr 21 18:21
But neither is quite the output I need, as it must be purely numerical and include seconds.
Keeping in mind that the list of files could be very large, so efficiency is a priority, how can I get this output?
Something like this:
find ./ -type f -printf " %f %AY%Am%Ad%AH%AM%AS\n" |sed -e 's/\.[0-9]*$//'
(sed is needed to remove the fractional part after the seconds)
(Edit) with ls it will be:
ls -l --time-style=+%Y%m%d%H%M%S | awk '{print $7" "$6}'
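One caveat: %A in find's -printf is the access time. If what is wanted is the modification time instead, the same pattern works with %T. A sketch, assuming GNU find:
find ./ -type f -printf "%f %TY%Tm%Td%TH%TM%TS\n" | sed -e 's/\.[0-9]*$//'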

How to redirect out put of xargs when using sed

Since switching over to a better management system, I want to remove all the redundant logs at the top of each of our source files. In Notepad++ I was able to achieve this by using "replace in files" and replacing matches of \A(//.*\n)+ with blank. On Linux, however, I am having no such luck and need to resort to xargs and sed.
The sed expression I'm using is:
sed '1,/^[^\/]/{/^[^\/]/b; d}'
Ugly to be sure but it does seem to work.
The problem is that when I try to run that through xargs in order to feed it all the source files in our system, I am unable to redirect the output to 'stripped' files, which I then intend to copy over the originals.
I want something in the line of:
find . -name "*.com" -type f -print0 | xargs -0 -I file sed '1,/^[^\/]/{/^[^\/]/b; d}' "file" > "file.stripped"
However, I'm having grief passing the ">" through to the receiving shell, as I'm already using too many quote marks. I have tried all manner of escaping and shell "wrappers", but I just can't get it to play ball.
Anyone care to point me in the right direction?
Thanks,
Slarti.
I set up a similar scenario with a simpler sed expression, just as an example; see if it works for you:
I created 3 files with the string "abcd" inside each:
# ls -l
total 12
-rw-r--r-- 1 root root 5 Oct 6 09:05 test.aaaaa.com
-rw-r--r-- 1 root root 5 Oct 6 09:05 test2.aaaaa.com
-rw-r--r-- 1 root root 5 Oct 6 09:05 test3.aaaaa.com
# cat test*
abcd
abcd
abcd
Run the find command as you showed, but with the -exec option instead of xargs, replacing the sed expression with a silly one that simply changes every "a" to "b", and adding the -i option, which writes directly to the input file:
# find . -name "*.com" -type f -print0 -exec sed -i 's/a/b/g' {} \;
./test2.aaaaa.com./test3.aaaaa.com./test.aaaaa.com
# cat test*
bbcd
bbcd
bbcd
In your case it should look like this:
# find . -name "*.com" -type f -print0 -exec sed -i '1,/^[^\/]/{/^[^\/]/b; d}' {} \;
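As for the redirection you originally asked about: one way is to let a small sh -c script do the redirect, so each file gets a .stripped sibling. A sketch using the question's own sed expression (note that -print0 is only useful when piping to xargs -0; with -exec it merely prints the NUL-separated names, which is what produced the run-together line in the example above):
find . -name "*.com" -type f -exec sh -c 'sed "1,/^[^\/]/{/^[^\/]/b; d}" "$1" > "$1.stripped"' _ {} \;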

How to print file/directory details using ls and AWK?

I am trying to print the output as below using the command:
ls -l| awk '/777/{print $0}'
or
ls -l| awk '/0777/{print $0}'
But it does not print anything.
The output should be like:
drwxrwxrwx 2 sbcoper sbcprd 4096 Apr 20 2015 work
(I am using 0777 or 777 because of find . -type f -perm 0777)
You've got the right idea, but 777 is the octal representation of the permissions, and here you're looking at the output of ls, which shows permissions as a string like "drwxrwxrwx". So a simple modification of your command works nicely:
ls -l | awk '/rwxrwxrwx/{print $0}'
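One caveat: a file whose name happens to contain the string rwxrwxrwx would also match. Anchoring the test to the permissions field avoids that; a sketch:
ls -l | awk '$1 ~ /^.rwxrwxrwx/'   # test only the mode string, not the whole line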

Copy the three newest files under one directory (recursively) to another specified directory

I'm using bash.
Suppose I have a log file directory /var/myprogram/logs/.
Under this directory I have many sub-directories and sub-sub-directories that include different types of log files from my program.
I'd like to find the three newest files (modified most recently), whose name starts with 2010, under /var/myprogram/logs/, regardless of sub-directory and copy them to my home directory.
Here's what I would do manually
1. Go through each directory and do ls -lt 2010*
to see which files starting with 2010 are modified most recently.
2. Once I go through all directories, I'd know which three files are the newest. So I copy them manually to my home directory.
This is pretty tedious, so I wondered if maybe I could somehow pipe some commands together to do this in one step, preferably without using shell scripts?
I've been looking into find, ls, head, and awk that I might be able to use but haven't figured the right way to glue them together.
Let me know if I need to clarify. Thanks.
Here's how you can do it:
find -type f -name '2010*' -printf "%C@\t%P\n" | sort -r -k1,1 | head -3 | cut -f 2-
This outputs a list of files prefixed by their last change time, sorts them based on that value, takes the top 3 and removes the timestamp.
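To actually copy the three files to the home directory in one step, the same pipeline can feed cp. A sketch assuming GNU xargs and cp, using %T@ (modification time, which is what the question asks for) and file names without embedded newlines:
find . -type f -name '2010*' -printf '%T@\t%p\n' | sort -rn | head -3 | cut -f 2- | xargs -d '\n' cp -t ~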
Your answers feel very complicated; how about
for FILE in `find . -type d`; do ls -t -1 -F $FILE | grep -v "/" | grep "^2010" | head -n3 | xargs -I{} mv {} ~; done;
or laid out nicely
for FILE in `find . -type d`;
do
ls -t -1 -F $FILE | grep -v "/" | grep "^2010" | head -n3 | xargs -I{} mv {} ~;
done;
My "shortest" answer after quickly hacking it up.
for file in $(find . -iname '*.php' -mtime -1 | xargs ls -l | awk '{ print $6" "$7" "$8" "$9 }' | sort | sed -n '1,3p' | awk '{ print $4 }'); do cp $file ../; done
The main command stored in $() does the following:
Find all files recursively in the current directory matching (case-insensitively) the name *.php that have been modified in the last 24 hours.
Pipe to ls -l, which is required to be able to sort by modification date, so we can take the first three.
Extract the modification date and file name/path with awk
Sort these files based on datetime
With sed print only the first 3 files
With awk print only their name/path
Used in a for loop, the action copies them to the desired location.
Or use @Hasturkun's variant, which popped up as a response while I was editing this post :)

Recursively traverse Samba shares?

With bash on Linux, how would I write a command to recursively traverse mounted shares and run commands on each file to get the file type, size, permissions, etc., and then output all of this to a file?
A mounted CIFS share looks like a regular directory tree in the Linux shell, so the command you need is generic.
From the base directory,
find . -type f -exec ls -lsrt {} \; > file.txt
OK, this does not give you the file-type detail; that can be done by running file on each file with another -exec.
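Combining the two into a single report is straightforward; a sketch that appends the file(1) output after each -ls line:
find . -type f -ls -exec file {} \; > file.txt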
mount -v | grep smbfs | awk '{print $3}' | xargs ls -lsR
which you can redirect to a file.
mount -v | awk '/smbfs/{
    cmd = "ls -lsR " $3             # build a recursive listing command for this mount point
    while ((cmd | getline d) > 0){  # read its output line by line
        print d "->file " $3
    }
    close(cmd)                      # close the pipe before the next mount
}'
find $(mount -t smbfs | awk '{print $3}') -mount -type f -ls -execdir file {} \;
...
33597911 4 -rw-rw-r-- 2 peter peter 5 Dec 6 00:09 ./test.d\ ir/base
./base: ASCII text
3662 4 -rw-rw-r-- 2 peter peter 4 Dec 6 02:26 ./test.txt...
./test.txt...: ASCII text
3661 0 -rw-rw-r-- 2 peter peter 0 Dec 6 02:45 ./foo.txt
./foo.txt: empty
...
If you used -exec file {} +, it would run file once with multiple arguments, but then the output wouldn't be nicely interleaved with find's -ls output. (GNU find's -execdir {} + currently behaves the same as -execdir {} \;, due to a bug workaround.) Use -exec file {} \; if you want the full path in the file output as well as in the -ls output above it.
find -ls output is not quite the same as ls -l, since it includes the inode and the number of blocks as the first two fields.
