How to use tail but ignore subdirectories in Linux [closed] - linux

I'm using tail on Ubuntu to fetch information from an Ubuntu device. I'm fairly new to Linux, so the command I use is
root@mydevice:/# tail /path/to/directory/*/*
so that I can fetch many files at once.
However, some of the subdirectories also contain subdirectories, so the output looks like this:
==> /path/to/directory/number_one/subdirectory <==
tail: error reading '/path/to/directory/number_one/subdirectory': Is a directory
==> /path/to/directory/number_one/data_one <==
23300000
==> /path/to/directory/number_one/data_two <==
23953
==> /path/to/directory/number_one/data_three <==
667
etc...
Is there a way to use tail while ignoring subdirectories so I don't get this error?
Thanks a lot in advance.

You can use find with -type f so it only matches regular files, not directories. -maxdepth 1 keeps it from recursing into the subdirectories.
find /path/to/directory/*/* -maxdepth 1 -type f -exec tail {} +
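If you'd rather avoid find, a minimal sketch of an alternative (assuming GNU coreutils tail for the -v flag, and the same path as in the question) is to loop over the glob and skip anything that is not a regular file:
# Skip directories; -v forces the "==> file <==" header even when tailing a single file.
for f in /path/to/directory/*/*; do
    [ -f "$f" ] && tail -v "$f"
done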

Related

How to change folders/files emblems recursively in linux? [closed]

I want to change my folder emblems recursively. I know that the command gvfs-set-attribute -t string ~/Desktop/ metadata::emblems [] changes the emblem of the Desktop folder only.
How can I change the emblems of all folders and files? I tried gvfs-set-attribute -t stringv ~/* metadata::emblems [] but it returns the error Error setting attribute: Setting attribute /home/taygun/Desktop not supported.
You could feed the command into find:
find ~/ -type d -exec gvfs-set-attribute -t stringv {} metadata::emblems [] \;
There are some known issues with gvfs-set-attribute and the default directory ~ on some distros (https://bugzilla.redhat.com/show_bug.cgi?id=1368676).
Consider upgrading to the latest version if you're not already on it.
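If you also want the emblem metadata applied to regular files and not just directories, a hedged, untested variant is to widen the -type filter (everything else is taken from the command above):
# Apply the attribute to directories and regular files alike.
find ~/ \( -type d -o -type f \) -exec gvfs-set-attribute -t stringv {} metadata::emblems [] \;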

Remove symbolic links only from a folder in tcsh [closed]

I need to remove a large number of symbolic links from a folder that has other files which I don't want to remove. Is there any easy way to remove only symbolic links?
You can use the find(1) command
find . -maxdepth 1 -type l -exec rm {} \;
-maxdepth 1 restricts the search to the current directory only, so subdirectories are not scanned.
-type l matches only symbolic links.
-exec runs rm on each match to delete it; the {} is replaced by find with the matched path, and the \; terminates the sub-command run by find.
See also the man page of rm(1) (and of ls(1), mv(1), cp(1), ln(1), stat(1) if you want to use them in a variant of that find command).
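If your find supports it (GNU find and modern BSD find do), a hedged alternative is -delete, which removes the matched links directly without spawning rm at all:
find . -maxdepth 1 -type l -delete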

tar every subfolder named X in a bash file [closed]

I have a folder that contains many subfolders holding my WordPress sites.
In that folder I need to pick up each site's "uploads" subfolder and tar it, naming the archive after its site.
Can anyone help me out?
Does this do the trick?
find /var/www -name uploads -a -type d | awk -F '/' '{ system("tar -czvf "$3".tar.gz "$0) }'
The find command lists all the directories named uploads under /var/www.
That's piped to awk, which splits each path on slashes and runs tar, using the third field as the archive name and the whole path as what gets archived.
This works for me: tar -cvf thisstuff.tar */uploads/*
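A hedged sketch of a bash loop that names each archive after its site, assuming the layout /var/www/<site>/.../uploads (adjust the cut field if your sites live at a different depth):
find /var/www -type d -name uploads | while read -r dir; do
    site=$(echo "$dir" | cut -d/ -f4)   # 4th /-separated field is the site directory
    tar -czvf "${site}_uploads.tar.gz" "$dir"
done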

How to delete all files except a specific file type in a folder using the command line [closed]

I have a lot of files in a specific folder.
I want to delete all files except the *.html file type in that folder.
Is there any way to do this from the command line? I am using Linux.
I'll assume that you are referring to the Linux command line; please update your question if not.
find ./folder/to/look/in -type f -not -iname '*.html' -exec rm {} \;
Here find walks ./folder/to/look/in, -type f restricts matches to regular files, -not -iname '*.html' excludes anything whose name ends in .html (case-insensitively), and -exec rm {} \; runs rm once for each remaining file.
Edit:
You can also make find batch the files into as few rm invocations as possible, instead of running rm once per file, by using + instead of \;:
find ./folder/to/look/in -type f -not -iname '*.html' -exec rm {} +
The + terminator collects as many matched paths as fit on one command line and runs rm once per batch.
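A different sketch, using bash's extglob instead of find (assumptions: bash is the shell, the files sit directly in the folder, and hidden dotfiles can be ignored since globs skip them by default; rm will refuse, with an error, to remove any subdirectories it hits):
shopt -s extglob                    # enable extended glob patterns like !(...)
rm ./folder/to/look/in/!(*.html)    # remove every entry whose name does not end in .html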

Hashing a Directory in Linux? [closed]

Is there any command in Linux to calculate the SHA1 hash of a directory which contains files and subdirectories (and those subdirectories in turn contain further files and directories)?
tar cf - "$DIRECTORY" | sha1sum
Deficiencies/advantages (depending on your perspective):
$DIRECTORY must be exactly the same in both cases (so you must use relative paths).
This takes into account file modification dates, not just file contents.
I think you should be able to use this
find . -type f -exec sha1sum {} \;
Just replace the "." with your directory.
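Note that this prints one hash line per file rather than a single digest for the whole tree. A hedged sketch that folds those lines into one digest (the sort makes the result independent of traversal order):
find . -type f -exec sha1sum {} \; | sort | sha1sum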
File by file you mean?
$ cd my_folder
$ sha1sum *
d73c8369c7808f7e96561b4c18d68233678f354f xxx.txt
5941a4f547f69b4b6271a351242ce41b3e440795 yyy.txt
Or of all the files together?
$ cat my_folder/* | sha1sum
7713154076812602f6f737cf5ad5924813182298
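Both of these only cover the top level of my_folder, while the question mentions nested directories. A hedged sketch, assuming GNU find and coreutils, that hashes every file in the tree and tolerates unusual filenames:
cd my_folder
find . -type f -print0 | sort -z | xargs -0 sha1sum | sha1sum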
