Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. If you believe the question would be on-topic on another Stack Exchange site, you can leave a comment to explain where the question may be able to be answered.
Closed 7 years ago.
I have a folder that contains many subfolders, one per WordPress site.
Within that folder I need to find each site's "uploads" subfolder and tar it, naming the archive after the site.
Can anyone help me out?
Does this do the trick?
find /var/www -name uploads -a -type d | awk -F '/' '{ system("tar -czvf " $4 ".tar.gz " $0) }'
The find command lists all the directories named uploads under /var/www.
That's piped to awk, which splits each path on slashes and runs tar. For a path like /var/www/site/uploads the first field is empty (the path starts with a slash), so the fourth field is the site name, used as the archive name; the whole path is the target for the tar.
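If you'd rather avoid awk calling out to the shell, a plain loop does the same job. This is a sketch assuming the layout /var/www/&lt;site&gt;/uploads; the WWW_ROOT variable and the archive_uploads function name are my own additions:

```shell
archive_uploads() {
  # For every <site>/uploads directory under WWW_ROOT (default /var/www),
  # build <site>.tar.gz in the current directory.
  for d in "${WWW_ROOT:-/var/www}"/*/uploads; do
    [ -d "$d" ] || continue          # glob matched nothing; skip
    site=$(basename "$(dirname "$d")")
    # -C enters the site directory so the archive contains just "uploads/"
    tar -czf "$site.tar.gz" -C "$(dirname "$d")" uploads
  done
}

archive_uploads
```

The -C flag keeps absolute paths out of the archives, so each one extracts as a bare uploads/ tree.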
This works for me, though it puts every site's uploads into a single archive: tar -cvf thisstuff.tar */uploads/*
Be nice. I'm learning Linux and can't find this answer (I've searched)
I'm trying to move all the .txt files in Folder1 to my Documents directory.
~$ ls
Desktop Documents Downloads file1.txt Music Public Templates test user0files.txt user-files.txt Videos
~$ cd ~/Documents
~/Documents$ ls
Folder1 Folder2 test1.txt test2.txt
~/Documents$ cd ~/Documents/Folder1
~/Documents/Folder1$ ls
bale.txt, ball.txt, bowl.txt, foldernew
~/Documents/Folder1$ mv *.txt ~/Documents
mv: cannot stat '*.txt': No such file or directory
From here I tried moving foldernew by name to ~/Documents and it worked. Can someone explain what I am doing incorrectly?
Thanks so much!!
It looks like you have a bunch of files whose names literally end in ".txt," (note the trailing comma), so the glob *.txt doesn't match them.
Rename the files to remove the comma and try again.
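A quick way to do the renaming in bulk, assuming the names really do end in a comma (the strip_trailing_comma function name is my own):

```shell
strip_trailing_comma() {
  # Rename every "*.txt," file in the current directory to "*.txt".
  for f in *.txt,; do
    [ -e "$f" ] || continue       # glob matched nothing; skip
    mv -- "$f" "${f%,}"           # ${f%,} drops a trailing comma
  done
}

strip_trailing_comma
```

After that, mv *.txt ~/Documents should work as expected.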
I have website folders in my /home directory on CentOS 7. I want to copy a robots.txt and a favicon.ico file into all the website directories.
The websites' directory structure is as follows:
/home/domain.com/public_html
/home/domain2.com/public_html
I want a command that copies robots.txt and favicon.ico from /root/robots.txt and /root/favicon.ico into every website's public_html directory, overwriting any file already present at the destination.
Many thanks.
You can use find too.
find /home -type d -name public_html -exec cp /root/robots.txt /root/favicon.ico {} \;
Just use a simple for loop that processes each directory.
for dest in /home/domain*.com/public_html
do
cp /root/robots.txt /root/favicon.ico "$dest"
done
What I'm doing is:
tar -czf etc.tar.gz /etc /usr/local/etc
And when I extract it I will have two directories:
1) etc
2) usr
What I want is an archive that contains only a single etc directory after extraction, with the contents of both source directories merged into it.
Thanks.
Is there any other way than creating a temporary directory with the merged files from /etc and /usr/local/etc and then removing it?
cd /
tar -cf /path/to/etc.tar etc/
cd /usr/local
tar -rf /path/to/etc.tar etc/   # -r appends to the existing archive
cd /path/to
gzip etc.tar
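With GNU tar you can also do it in one command: -C changes the working directory between member arguments, so both trees get stored under the same "etc/" prefix and merge on extraction. A sketch, where pack_merged is a helper name of my own:

```shell
# pack_merged OUT ROOT1 ROOT2: archive ROOT1/etc and ROOT2/etc so that
# both extract into a single top-level etc/ directory.
pack_merged() {
  tar -czf "$1" -C "$2" etc -C "$3" etc
}

# For the question's paths:
# pack_merged /path/to/etc.tar.gz / /usr/local
```

Extracting the result creates one etc/ directory containing the files from both sources.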
Is there any command in Linux to calculate the SHA1 hash of a directory that contains files and subdirectories (which in turn contain more files and directories)?
tar cf - $DIRECTORY|sha1sum
Deficiencies/advantages (depending on your perspective):
- $DIRECTORY must be exactly the same in both cases (so you should use relative paths).
- This takes into account file modification dates, not just file contents.
I think you should be able to use this
find . -type f -exec sha1sum {} \;
Just replace the "." with your directory.
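If you want a single, content-only hash for the whole tree, you can combine the per-file approach with a sort for a stable order and hash the result. A sketch; dir_sha1 is a helper name of my own, and unlike the tar pipe it ignores timestamps:

```shell
# dir_sha1 DIR: one SHA1 for a whole tree, derived from file contents
# and relative paths only (sorted so the order is deterministic).
dir_sha1() {
  (cd "$1" && find . -type f -exec sha1sum {} + | sort | sha1sum)
}
```

Two trees with identical contents produce the same hash even if their modification times differ.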
File by file you mean?
$ cd my_folder
$ sha1sum *
d73c8369c7808f7e96561b4c18d68233678f354f xxx.txt
5941a4f547f69b4b6271a351242ce41b3e440795 yyy.txt
Or of all the files together?
$ cat my_folder/* | sha1sum
7713154076812602f6f737cf5ad5924813182298
I'm looking for a Linux command to go through all the directories on my server and find all .htaccess files. The output would be a list of those files with their full path, creation/modification date, and file size.
find / -name ".htaccess" -print
Replace / with the folder you would like to start searching from, in case you don't want to search the whole file system.
Wikipedia has an article about find, which also links to its man page.
It's easy with the find command.
find / -name .htaccess -exec ls -l {} \;
This will print the name, and the file details according to ls -l. Note that this is starting the search under /, which may take a long time. You might want to specify a different folder to search.
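If you're on GNU find, -printf can produce the path, date, and size directly, without spawning ls per file. A sketch (the list_htaccess function name is mine); note that this prints modification time, since most Linux filesystems and tools don't portably expose a creation date:

```shell
# list_htaccess ROOT: print full path, modification time, and size of
# every .htaccess file under ROOT, one per line.
list_htaccess() {
  find "$1" -name .htaccess -type f \
       -printf '%p\t%TY-%Tm-%Td %TH:%TM\t%s bytes\n'
}

# e.g. list_htaccess /   (may take a while on the whole filesystem)
```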
Another simple way to achieve this:
locate .htaccess
find -name .htaccess
Could be as simple as
ls -l $(locate .htaccess)
if updatedb has run recently.