I want to cut out the column that contains the file sizes. I use ls -l to view information about the files in the current location. I save the information to a text file with ls -l > info.txt, and now I want to cut out the size column. I do cat info.txt | cut -d' ' -f6 but it doesn't work. What am I doing wrong?
This is what info.txt contains:
-rwx------ 1 s10891 Domain Users 2188 May 4 14:51 info.txt
-rwx------ 1 s10891 Domain Users 1640 Mar 10 07:43 code.txt
-rwx------ 1 s10891 Domain Users 68 Mar 4 11:59 sorted.txt
drwx------ 1 s10891 Domain Users 0 Jan 11 09:48 PPJ
drwx------ 1 s10891 Domain Users 0 Sep 7 2012 public_html
drwx------ 1 s10891 Domain Users 0 Apr 15 11:16 RBD
drwx------ 1 s10891 Domain Users 0 Jan 7 09:45 RDB
drwx------ 1 s10891 Domain Users 0 Apr 15 12:00 SOP
drwx------ 1 s10891 Domain Users 0 Apr 8 12:53 sop_ex3
-rwx------ 1 s10891 Domain Users 122 Feb 25 11:48 sop_info.txt
drwx------ 1 s10891 Domain Users 0 Jan 14 09:41 WindowsFormsApplicati
drwx------ 1 s10891 Domain Users 0 Jan 14 09:41 WindowsFormsApplicati
drwx------ 1 s10891 Domain Users 0 Jan 14 09:41 WindowsFormsApplicati
and I want to get this:
2188
1640
68
0
0
0
0
0
0
122
0
0
0
cut splits on a single literal delimiter character, so every space in a run of spaces starts a new (empty) field; for -f 6 to work, your data would have to have exactly one space between columns. Not sure if your 3rd-to-last line is a typo or an exact reproduction of your output, but it illustrates the problem with cut perfectly.
For this case, most people will use an awk solution:
ls -l | awk '{print $6}'
will produce the output you have listed.
The beauty of awk is that field $6 is determined by the value of awk's FS variable (the field separator), which defaults to "runs of whitespace" (spaces or tabs). That is not an exact description, but it is close enough for your needs.
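To see the difference concretely, here is a minimal comparison on a single sample line, padded with double spaces the way ls -l aligns its columns (the line itself is adapted from the asker's listing):

```shell
line='-rwx------ 1 s10891 Domain Users  2188 May  4 14:51 info.txt'

# cut treats every single space as a separator, so the run of two
# spaces before the size yields an empty field and -f6 misses it
printf '%s\n' "$line" | cut -d' ' -f6     # prints an empty line

# awk's default FS collapses runs of blanks, so $6 is the size
printf '%s\n' "$line" | awk '{print $6}'  # prints 2188
```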
You don't need to parse ls: stat -c %s * will give you a column of sizes. If you want the filename too: stat -c "%s %n" *
With sed you can squeeze the repeated spaces first:
ls -l | sed -E 's/ +/ /g'
Here " +" matches any run of one or more spaces, so after the substitution every column is separated by a single space and cut -d' ' can find the right field.
I am trying to count the size of all files and subdirectories starting from ./ using a one-liner:
ls -laR | grep -v "\.\." | awk '{total += $5} END {print total}'
but this counts the size of subdirectories twice, because the output of ls -laR | grep -v "\.\." is:
.:
total 32
drwxr-xr-x 3 root root 4096 Nov 29 22:59 .
-rw-r--r-- 1 root root 55 Nov 29 02:19 131
-rw-r--r-- 1 root root 50 Nov 29 01:28 abc
-rw-r--r-- 1 root root 1000 Nov 29 01:27 access.log
drwxr-xr-x 2 root root 4096 Nov 29 22:24 asd
-rwx------ 1 root root 458 Nov 29 02:54 oneliners.sh
-rwx------ 1 root root 2136 Nov 29 17:56 regexp.sh.skript
./asd:
total 32
drwxr-xr-x 2 root root 4096 Nov 29 22:24 .
-rw-r--r-- 1 root root 21298 Nov 29 22:26 asd
so it counts the directory asd twice: once in the listing of directory .: as:
drwxr-xr-x 2 root root 4096 Nov 29 22:24 asd
and a second time in the listing of directory ./asd: as:
drwxr-xr-x 2 root root 4096 Nov 29 22:24 .
I expect this will happen for every subdirectory. Is there a way to remove them once from the ls output? Using grep -v '^d' removes all directories, so they won't be counted at all. I know I can do it simply by using du -sb, but I need it to be done with a fancy one-liner.
ls -FlaR |grep -v '\s\.\{1,\}/$' |awk '{total += $5} END {print total}'
includes the size of folders inside '.', but not the size of '.' itself. Comparing with du, the answer is quite different, as du is about the space on disk (it relates to blocks).
The answer I get using your awk script is closer to what the OS reports; if you subtract the directory sizes you get a match, which suggests that Mac OS X uses a method similar to
ls -FlaR |grep -v '^d.*/$' |awk '{total += $5} END {print total}'
for calculating the size of the content of a folder.
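For what it's worth, the double counting disappears if directories are never listed in the first place; a sketch using GNU find's -printf, which prints one size per regular file:

```shell
# sum the apparent sizes of regular files only; directories are
# never listed, so nothing is counted twice
find . -type f -printf '%s\n' | awk '{total += $1} END {print total + 0}'
```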
I have two files in my home folder named "?" and "?;?" (without the double quotes). How can I delete them? I've tried escaping the characters, but it doesn't work.
Use single or double quotes to avoid wildcard expansion. A ? is a wildcard that tells the shell to match any single character. By placing it in quotes you are telling the shell not to perform wildcard expansion.
rm '?' '?;?'
rm "?" "?;?"
This will remove the two files named "?" and "?;?"
You can also use a backslash to quote the individual characters that have special meaning to the shell, so you could do this
rm \? \?\;\?
Notice you have to quote the '?' to prevent pathname expansion and you have to quote the ';' so the shell doesn't interpret that as separating commands.
If you leave out the quotes, then the shell parses it differently. Here's an experiment I ran.
$ for i in {1..4}; do for j in {a..c}; do touch "$i;$j" $j '?' '?;?';done;done
$ ls
1;a 1;b 1;c 2;a 2;b 2;c 3;a 3;b 3;c 4;a 4;b 4;c ? ?;? a b c
$ rm ? ?;?
rm: cannot remove `?': No such file or directory
rm: cannot remove `a': No such file or directory
rm: cannot remove `b': No such file or directory
rm: cannot remove `c': No such file or directory
bash: ?: command not found
$ rm `echo "?" "?;?"`
rm: cannot remove `?': No such file or directory
$
What happened here is the shell did pathname expansion, so
rm ? ?;?
became
rm ? a b c ? a b c;? a b c
The rm command removed files a b c ? then complained that the following files were not found (they had already been deleted). The semicolon separated commands, so it then tried to invoke the '?' command passing arguments "a" "b" "c" ... but there is no '?' command - the file named '?' had just been deleted, and it wasn't executable anyway - so the shell complains that the "?" command is not found.
If you want to remove all files matching "?" and "?;?" you need to trick the shell into expanding those, which I did like this
rm `echo "?" "?;?"`
This was expanded by the shell in two steps, first it runs echo "?" "?;?" which results in two strings, "?" and "?;?", then it does pathname expansion using those strings to produce the arguments for rm, which results in
rm ? 1;a 1;b 1;c 2;a 2;b 2;c 3;a 3;b 3;c 4;a 4;b 4;c ?;?
Notice that the wildcard expansion for '?' didn't produce any matching files this time (they had already been previously deleted), so the shell passes '?' as an argument to rm, which successfully removes all files passed as arguments except for '?' so it complains about that.
Here's another experiment
$ for i in {1..4}; do for j in {a..c}; do touch "$i;$j" $j '?' '?;?';done;done
$ ls
1;a 1;b 1;c 2;a 2;b 2;c 3;a 3;b 3;c 4;a 4;b 4;c ? ?;? a b c
$ rm "?" "?;?"
$ ls
1;a 1;b 1;c 2;a 2;b 2;c 3;a 3;b 3;c 4;a 4;b 4;c a b c
$ rm `echo "?" "?;?"`
$ ls
$
For more information consult the man page on globbing
man 7 glob
Wildcard Matching
A string is a wildcard pattern if it contains one of the characters '?', '*' or '['. Globbing is the operation that
expands a wildcard pattern into the list of pathnames matching the pattern. Matching is defined by:
A '?' (not between brackets) matches any single character.
A '*' (not between brackets) matches any string, including the empty string.
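Those two rules are easy to verify in an empty scratch directory:

```shell
cd "$(mktemp -d)"   # work somewhere empty
touch a ab abc

echo ?    # ? matches exactly one character: a
echo ??   # two single-character wildcards: ab
echo *    # * matches any string: a ab abc
```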
Note that ls can report a question mark for arbitrary non-printable characters, so there's a chance that what you've got as a file name does not contain a question mark.
You can spot this with the ls -b command, or with ls | cat.
As a convoluted example, complete with remedy, I created a script convolvulus like this:
set -x
mkdir convoluted &&
(
cd convoluted
cp /dev/null "$(ls -la | sed 1d)"
ls
ls -b
ls | cat
ls -la | cat
cp /dev/null $'\n'
cp /dev/null $'\n;\n'
ls -als | cat
ls -lab
ls
ls | cat
rm $'\n' $'\n;\n' d*
ls -a
)
rm -fr convoluted
When run, it yielded:
$ bash convolvulus 2>&1 | so
+ mkdir convoluted
+ cd convoluted
++ ls -la
++ sed 1d
+ cp /dev/null 'drwxr-xr-x 2 jleffler staff 68 Mar 9 11:58 .
drwxr-xr-x 240 jleffler staff 8160 Mar 9 11:58 ..'
+ ls
drwxr-xr-x 2 jleffler staff 68 Mar 9 11:58 .
drwxr-xr-x 240 jleffler staff 8160 Mar 9 11:58 ..
+ ls -b
drwxr-xr-x 2 jleffler staff 68 Mar 9 11:58 .\ndrwxr-xr-x 240 jleffler staff 8160 Mar 9 11:58 ..
+ ls
+ cat
drwxr-xr-x 2 jleffler staff 68 Mar 9 11:58 .
drwxr-xr-x 240 jleffler staff 8160 Mar 9 11:58 ..
+ ls -la
+ cat
total 0
drwxr-xr-x 3 jleffler staff 102 Mar 9 11:58 .
drwxr-xr-x 240 jleffler staff 8160 Mar 9 11:58 ..
-rw-r--r-- 1 jleffler staff 0 Mar 9 11:58 drwxr-xr-x 2 jleffler staff 68 Mar 9 11:58 .
drwxr-xr-x 240 jleffler staff 8160 Mar 9 11:58 ..
+ cp /dev/null '
'
+ cp /dev/null '
;
'
+ ls -als
+ cat
total 0
0 -rw-r--r-- 1 jleffler staff 0 Mar 9 11:58
0 -rw-r--r-- 1 jleffler staff 0 Mar 9 11:58
;
0 drwxr-xr-x 5 jleffler staff 170 Mar 9 11:58 .
0 drwxr-xr-x 240 jleffler staff 8160 Mar 9 11:58 ..
0 -rw-r--r-- 1 jleffler staff 0 Mar 9 11:58 drwxr-xr-x 2 jleffler staff 68 Mar 9 11:58 .
drwxr-xr-x 240 jleffler staff 8160 Mar 9 11:58 ..
+ ls -lab
total 0
-rw-r--r-- 1 jleffler staff 0 Mar 9 11:58 \n
-rw-r--r-- 1 jleffler staff 0 Mar 9 11:58 \n;\n
drwxr-xr-x 5 jleffler staff 170 Mar 9 11:58 .
drwxr-xr-x 240 jleffler staff 8160 Mar 9 11:58 ..
-rw-r--r-- 1 jleffler staff 0 Mar 9 11:58 drwxr-xr-x 2 jleffler staff 68 Mar 9 11:58 .\ndrwxr-xr-x 240 jleffler staff 8160 Mar 9 11:58 ..
+ ls
;
drwxr-xr-x 2 jleffler staff 68 Mar 9 11:58 .
drwxr-xr-x 240 jleffler staff 8160 Mar 9 11:58 ..
+ ls
+ cat
;
drwxr-xr-x 2 jleffler staff 68 Mar 9 11:58 .
drwxr-xr-x 240 jleffler staff 8160 Mar 9 11:58 ..
+ rm '
' '
;
' 'drwxr-xr-x 2 jleffler staff 68 Mar 9 11:58 .
drwxr-xr-x 240 jleffler staff 8160 Mar 9 11:58 ..'
+ ls -a
.
..
+ rm -fr convoluted
$
Have fun!
The -b option to ls works for GNU ls and for Mac OS X and BSD ls (but is not defined by POSIX).
The $'\n' notation is Bash ANSI-C quoting.
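As a small illustration of that notation, a file whose name is a single newline can be created and removed the same way (scratch-directory sketch; bash and GNU ls assumed):

```shell
cd "$(mktemp -d)"
touch $'\n'   # a file named by a single newline character
ls -b         # GNU ls renders the name as \n
rm $'\n'      # the same $'...' quoting removes it again
```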
Use quotes around the file names:
$ ls
? ?;?
$ rm '?'
$ ls
?;?
$ rm "?;?"
$ ls
$
How do I generate a manifest containing all files except a particular file name in a folder, like the following?
Actual requirement:
4
issue1425.tgz 3096209598
issue1426.TGZ 3096209591
issue1427.ZIP 3096209592
issue1428.zip 3096209593
=>
total number of files: 4
file name: issue1425.tgz
size of file: 123333
....
I'm doing it like this:
ls i2*.* |wc -l >>manifest.txt
vdir i2*.* >>manifest.txt
The output in "manifest.txt" from this is:
4
-rwxr-Sr-t 1 root root 3096209598 2013-03-28 05:46 issue1425.tgz
-rwxrw-r-- 1 root root 3096209591 2013-03-20 06:46 issue1426.TGZ
-rwxr-Sr-t 1 root root 3096209592 2013-03-28 07:46 issue1427.ZIP
-rwxrw-r-- 1 root root 3096209593 2013-03-20 08:46 issue1428.zip
Does anyone have a solution to get my exact requirement?
Edit 2: @jarnbjo, your command gives me the wrong output. See the actual sizes of the files; it reports them incorrectly:
root@aim-deb:/mnt/arch1/batchfiles/siva/20130328094916142/received# vdir
total 108816
drwxrwxrwx 5 1330 sno 4096 2013-03-20 00:30 i23321367
-rw-rw-r-- 1 1330 sno 39934457 2013-03-20 03:20 i23321367.tgz
drwxrwxrwx 5 1330 sno 4096 2013-03-20 00:33 i23321376
-rw-rw-r-- 1 1330 sno 36030069 2013-03-20 03:20 i23321376.tgz
drwxrwxrwx 5 1330 sno 4096 2013-03-20 00:34 i23321436
-rw-rw-r-- 1 1330 sno 35310600 2013-03-20 03:20 i23321436.tgz
-rw-r--r-- 1 root root 69 2013-03-29 00:57 manifest_QAG.txt
-rw-rw-r-- 1 1330 sno 75 2013-03-20 03:20 manifest.txt
root@aim-deb:/mnt/arch1/batchfiles/siva/20130328094916142/received# ls -1s --block-size=1 i*.* $dir | awk '{print $2"\t"$1}'
i23321367.tgz 39981056
i23321376.tgz 36073472
i23321436.tgz 35352576
root@aim-deb:/mnt/arch1/batchfiles/siva/20130328094916142/received#
Answer: @jarnbjo, thanks, I finally worked out why it's happening: --block-size gives me the size the file occupies on disk, not the actual size of the file. Here I want the actual size, so I can use vdir i*.* $dir | awk '{print $8"\t"$5}'
If you just want the files and their sizes, try this:
ls -1hs $dir
h stands for human-readable
s stands for size
1 (one) just shows the names of the files, one on each line.
And if you want it to be first name, then size, use:
ls -1hs $dir | awk '{print $2" "$1}'
Edit
From comments, you want it to have size in bytes, so you can do it like:
ls -1s --block-size=1 $dir
Further edit
If you don't want the directory to be displayed, you have to cd $dir first. There are other ways, but this seems to be the cleanest.
cd $dir; ls -1s --block-size=1 is*.* | awk '{print $2"\t"$1}'
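If the goal is the verbose manifest format from the question ("total number of files", then a name and size per file), here is a sketch that avoids parsing ls entirely, using GNU stat for the apparent size in bytes (the issue* names are taken from the question):

```shell
files=(issue*.*)                               # the files to list
{
  echo "total number of files: ${#files[@]}"
  for f in "${files[@]}"; do
    echo "file name: $f"
    echo "size of file: $(stat -c %s "$f")"    # apparent size in bytes
  done
} > manifest.txt
```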
I have a directory that multiple processes log to, and I want to tail the latest file of a selected process.
In ~/.bashrc I have added the following:
function __taillog {
tail -f $(find $1 -maxdepth 1 -type f -printf "%T# %p\n" | sort -n | tail -n 1 | cut -d' ' -f 2-)
}
alias taillog='__taillog'
Taken from: https://superuser.com/questions/117596/how-to-tail-the-latest-file-in-a-directory
An example of the log file directory
-rw-r--r-- 1 genesys genesys 2284 Mar 19 16:34 gdalog.20130319_163436_906.log
-rw-r--r-- 1 genesys genesys 131072 Mar 19 16:34 gdalog.20130319_163436_906.snapshot.log
-rw-r--r-- 1 genesys genesys 10517 Mar 19 16:54 lcalog.20130319_163332_719.log
-rw-r--r-- 1 genesys genesys 131072 Mar 19 16:54 lcalog.20130319_163332_719.snapshot.log
-rw-r--r-- 1 genesys genesys 3792 Mar 19 16:37 StatServer_TLSTest.20130319_163700_703.log
-rw-r--r-- 1 genesys genesys 160562 Mar 19 16:52 StatServer_TLSTest.20130319_163712_045.log
-rw-r--r-- 1 genesys genesys 49730 Mar 19 16:54 StatServer_TLSTest.20130319_165217_402.log
-rw-r--r-- 1 genesys genesys 53960 Mar 20 09:55 StatServer_TLSTest.20130319_165423_702.log
-rw-r--r-- 1 genesys genesys 131072 Mar 20 09:56 StatServer_TLSTest.20130319_165423_702.snapshot.log
So to tail all the StatServer logs, the command would be
taillog /home/user/logs/StatServer*
and it would tail the latest file for that application in the given path
The issue is that tail displays some of the file's output but does not show any updates when the log file is appended to. If the following command is run directly, the log is tailed correctly:
tail -f $(find /home/user/logs/StatServer* -maxdepth 1 -type f -printf "%T# %p\n" | sort -n | tail -n 1 | cut -d' ' -f 2-)
Somehow, adding this command as a bash function and then calling it from an alias causes it not to operate as desired.
Any suggestions on a better way are welcome.
I believe you should be running this command:
taillog /home/user/logs
When you say /home/user/logs/this_app* you're passing all the files that match the pattern as arguments to taillog, but only the first argument, i.e. $1, is used, so the command eventually translates to tail -f $1.
Instead, $1 should be the directory where find should look for the files at that directory level (i.e. /home/user/logs in your case); find's results are then piped to sort, tail and cut.
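One way to keep the convenience and fix the argument handling is to pass the directory and the name prefix separately; a sketch along the lines of the original function (GNU find assumed; __taillog is kept as the function name from the question):

```shell
function __taillog {
    # $1 = log directory, $2 = optional file-name prefix
    local newest
    newest=$(find "$1" -maxdepth 1 -type f -name "${2:-}*" \
                  -printf '%T@ %p\n' | sort -n | tail -n 1 | cut -d' ' -f2-)
    tail -f "$newest"
}
alias taillog='__taillog'
# usage: taillog /home/user/logs StatServer
```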
I didn't have any problems running your taillog function on linux/bash. Perhaps the log output is being buffered, so changes aren't being written right away? You might try turning off the [log]buffering option for this StatServer.
Using the tcsh shell on FreeBSD, is there a way to recursively list all files and directories, including the owner, the group and the relative path to each file?
ls -alR comes close, but it does not show the relative path in front of every file; it shows the path at the top of each grouping, i.e.
owner% ls -alR
total 0
drwxr-xr-x 3 owner group 102 Feb 1 10:50 .
drwx------+ 27 owner group 918 Feb 1 10:49 ..
drwxr-xr-x 5 owner group 170 Feb 1 10:50 subfolder
./subfolder:
total 16
drwxr-xr-x 5 owner group 170 Feb 1 10:50 .
drwxr-xr-x 3 owner group 102 Feb 1 10:50 ..
-rw-r--r-- 1 owner group 0 Feb 1 10:50 file1
-rw-r--r-- 1 owner group 0 Feb 1 10:50 file2
What I would like is output like:
owner group ./relative/path/to/file
The accepted answer to this question shows the relative path to a file, but does not show the owner and group.
How about this:
find . -exec ls -dl \{\} \; | awk '{print $3, $4, $9}'
Use tree. Few linux distributions install it by default (in these dark days of only GUIs :-), but it's always available in the standard repositories. It should be available for *BSD also, see http://mama.indstate.edu/users/ice/tree/
Use:
tree -p -u -g -f -i
or
tree -p -u -g -f
or check the man page for many other useful arguments.
Works in Linux Debian:
find $PWD -type f
find comes close:
find . -printf "%u %g %p\n"
There is also "%P", which removes the prefix from the filename, if you want the paths to be relative to the specified directory.
Note that this is GNU find, I don't know if the BSD find also supports -printf.
You've already got an answer that works, but for reference you should be able to do this on the BSDs (I've tested it on a mac) :
find . -ls
If you fancy using Perl don't use it as a wrapper around shell commands. Doing it in native Perl is faster, more portable, and more resilient. Plus it avoids ad-hoc regexes.
use File::Find;
use File::stat;
find (\&myList, ".");
sub myList {
my $st = lstat($_) or die "No $_: $!";
print scalar getpwuid($st->uid), " ",
scalar getgrgid($st->gid), " ",
$File::Find::name, "\n";
}
Simple way I found was this:
ls -lthr /path_to_directory/*
" * " - represents levels.
Ajiths-MBP:test ajith$ ls -lthr *
test2:
total 0
-rw-r--r-- 1 ajith staff 0B Oct 17 18:22 test2.txt
test3:
total 0
-rw-r--r-- 1 ajith staff 0B Oct 17 18:22 test3.txt
test1:
total 0
-rw-r--r-- 1 ajith staff 0B Oct 17 18:21 test1.txt
drwxr-xr-x 3 ajith staff 96B Oct 17 18:22 test1_sub_dir
Ajiths-MBP:test ajith$ ls -lthr */*
-rw-r--r-- 1 ajith staff 0B Oct 17 18:21 test1/test1.txt
-rw-r--r-- 1 ajith staff 0B Oct 17 18:22 test2/test2.txt
-rw-r--r-- 1 ajith staff 0B Oct 17 18:22 test3/test3.txt
test1/test1_sub_dir:
total 0
-rw-r--r-- 1 ajith staff 0B Oct 17 18:22 test1_sub_file.txt
Use a shell script. Or a Perl script. Example Perl script (because it's easier for me to do):
#!/usr/bin/perl
use strict;
use warnings;
foreach (`find . -name \*`) {
chomp;
my $ls = `ls -ld $_`;
# an incomprehensible string of characters because it's Perl
my ($owner, $group) = $ls =~ /\S+\s+\S+\s+(\S+)\s+(\S+)/;
printf("%-10s %-10s %s\n", $owner, $group, $_);
}
Perhaps a bit more verbose than the other answers, but should do the trick, and should save you having to remember what to type. (Code untested.)