How to get the total size of certain folders in bash?

How can I get the total size of certain folders?
For example, I have a list of folder names in users.txt:
# cat users.txt
user1
user2
user3
All of these folders are located in /home/.
I have tried to execute:
# for i in `cat users.txt`; do du -shc /home/$i/; done
3.9M /home/user1/
3.9M total
141M /home/user2/
141M total
75M /home/user3/
75M total
but I need the total size of all of these folders.

du -shc $(sed 's#^#/home/#' users.txt)
That uses the contents of users.txt, prepended with /home/, as the arguments to du, so it will sum them for you.
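If any of the names in users.txt could contain spaces, a word-splitting-safe variant (a minimal sketch of the same idea) is to build the argument list in a loop:
# Read users.txt line by line so names with spaces survive intact
dirs=()
while IFS= read -r user; do
    dirs+=("/home/$user")
done < users.txt
du -shc "${dirs[@]}"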

You can pass the output of du to tail -n 1, or execute
du -s directory
To get only the size, you can do
du -s directory | cut -f1
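Applied to the question's list, if only the grand total is wanted, the two ideas combine (assuming the same users.txt as above):
# Keep only the last line of the -c output, i.e. the grand total
du -shc $(sed 's#^#/home/#' users.txt) | tail -n 1
# Or just the size column of that total
du -sc $(sed 's#^#/home/#' users.txt) | tail -n 1 | cut -f1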

Related

Sort files in a directory by their text character length and copy to another directory

I'm trying to find the smallest file by character length inside of a directory and, once it is found, I want to rename it and copy it to another directory.
For example, I have two files in the directory ~/Files: cars.txt and rabbits.txt.
Text in cars.txt:
I like red cars that are big.
Text in rabbits.txt:
I like rabbits.
So far I know how to get the character length of a single file with the command wc -m filename, but I don't know how to do it for all the files and sort them in order. I know rabbits.txt is smaller in character length, but how do I compare both of them?
You could sort the files by character count, then select the name of the first one:
file=$(wc -m ~/Files/* 2>/dev/null | sort -n | head -n 1 | awk '{print $2}')
echo "$file"
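To finish the rename-and-copy part of the task, a possible follow-up (a sketch; the destination ~/Other and the new name smallest.txt are placeholder assumptions, and, like the pipeline above, it assumes file names without spaces):
mkdir -p ~/Other                 # make sure the destination exists
cp "$file" ~/Other/smallest.txt  # copy the smallest file under its new name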

How can I print the total number of file descriptors with index 24

How can I print the total number of file descriptors with index 24 for all the running processes?
I tried something:
ls /proc/*/fd 2> errors.txt > stdout.txt | grep "^24" stdout.txt | wc -l
This solution returns 0 every time.
I should mention that my task asks me to write a one-liner to solve it.
ls /proc/*/fd 2>/dev/null | grep -c '^24$'
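An equivalent alternative (a sketch, assuming a Linux /proc filesystem) is to glob for the fd entry directly and count the matches:
# Each existing /proc/<pid>/fd/24 is one process holding descriptor 24
ls -d /proc/*/fd/24 2>/dev/null | wc -l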

Formatting Diff output in Shell Script

I'm currently using diff -q directory1 directory2 to output the files in each directory that are different and printing them to a table in HTML.
Current output: "Files directory1/file1 and directory2/file2 differ"
What I want: "file1 has changed"
I do not want to use comm or sort the files because other applications are pulling from the files and are sensitive to ordering. Any idea on how to get this done?
You need to grep the diff output for files that differ, then use awk to print the file name in your new format:
diff -rq dir1 dir2 | grep "differ" | awk '{print $2 " has changed"}'
Will this work?
diff -q $file1 $file2 | awk '{print $2 " has changed"}'
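Note that in both answers $2 is the full path from the diff line (e.g. directory1/file1), while the desired output is just file1. One way to strip the leading directory (a sketch of the same pipeline) is:
# Split the path on "/" and keep only the last component
diff -rq dir1 dir2 | grep 'differ$' | awk '{n = split($2, a, "/"); print a[n] " has changed"}'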

Meaning of command ls -lt | wc -l

My friend just passed me this command to count the number of files in a directory:
$ ls -lt | wc -l
Can someone please help me flesh out the meaning of this command? I know that ls lists all the files, but what does -lt mean?
Also, I get a different count if I use ls | wc -l with no -lt option. Why is that the case?
You'll want to get familiar with the "man (manual) pages":
$ man ls
In this case you'll see:
-l (The lowercase letter ``ell''.) List in long format. (See below.) If
the output is to a terminal, a total sum for all the file sizes is
output on a line before the long listing.
-t Sort by time modified (most recently modified first) before sorting the
operands by lexicographical order.
Another way you can see the effect of the options is to run ls without piping to the wc command. Compare
$ ls
with
$ ls -l
and
$ ls -lt
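As for the different counts: with -l, ls prints an extra "total" line before the listing (the size summary mentioned in the man page excerpt above), so ls -lt | wc -l is usually one more than ls | wc -l. For example, in a directory with three files:
$ ls | wc -l
3
$ ls -lt | wc -l
4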

Space Issue in Linux

How do I identify and address a disk-space issue in Linux? When I run
df -h
I get details of the space available on mounted devices. The drive allocated to me is almost full:
Filesystem               Size  Used  Avail  Use%  Mounted
/dev/mapper/rootvg-home  248M  236M      0  100%  /home
You can see usage is 100% and availability is 0. How do I find unwanted files in /home? I do a lot of grep, sed, and awk work; sed creates some temp files, but those are zero bytes. Apart from that, is there any other way to identify what is using the space, so that I can free some of it?
Thanks in advance. If I don't make sense, leave a comment and I will address it ASAP.
This finds all files over 20,000 KB (roughly 20 MB) in size and presents their names and sizes in a human-readable format:
find / -type f -size +20000k -exec ls -lh {} \; | awk '{ print $9 ": " $5 }'
You can do
du -sh *
for human-readable sizes, or
du -s * | sort -n
for sizes sorted numerically, and then recursively check big directories for unwanted files.
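With GNU coreutils, a variant that keeps human-readable sizes yet still sorts correctly (a sketch; adjust the path as needed) is:
# -h on both du and sort understands suffixes like K, M and G
du -sh /home/* 2>/dev/null | sort -h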
