How can I export a recursive directory & file listing to a text file in Linux Bash shell with an SSH command?

Assume I'm in my pwd - /home/kparisi/
What command can I run to export all directories & files from this directory and all subdirectories within it to a text file?
I don't need the contents of the files, just their names & paths (and permissions if possible)
Thanks in advance.

Use find to get a listing of all the files in a directory and its subdirectories, then redirect it to a file with the > operator.
find > list.txt

find is a good utility to obtain (recursively) all the content of a directory. If you want the file (and directory) names with their permissions:
find /home/kparisi -printf "%M %p\n"
You can then use ssh to run this command on the remote server:
ssh kparisi@remote.com 'find /home/kparisi -printf "%M %p\n"'
And finally, if you want to store this in a file on your local server:
ssh kparisi@remote.com 'find /home/kparisi -printf "%M %p\n"' > file
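If you want to see what the -printf format produces before running it over ssh, you can try it locally on a small throwaway tree (the /tmp paths below are purely illustrative):

```shell
# Build a small sample tree to run find against
mkdir -p /tmp/printf-demo/sub
touch /tmp/printf-demo/top.txt /tmp/printf-demo/sub/nested.txt

# %M prints symbolic permissions (like ls -l), %p prints the path
find /tmp/printf-demo -printf "%M %p\n"
# Each file line looks like: -rw-r--r-- /tmp/printf-demo/top.txt
```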

I know this is an old question, but it may still be useful to anyone who reaches this page. I love to do this (assume we have the file "fsstruct.txt" to save the directory structure):
tree > fsstruct.txt
For me this format is simpler to read.
It is also easy to use other tree features:
tree -p > fsstruct.txt
prints the file type and permissions before each name, which makes it much easier, when reading plain text, to tell which entry is a file and which is a directory.
tree -h > fsstruct.txt
prints file sizes in a human-readable format, and the two options can be combined:
tree -ph > fsstruct.txt
Also, it is possible to send the output to a file with tree's own -o option: tree -ph . -o fsstruct.txt, or create HTML:
tree -Hph . -o tree.html

The command find > list.txt will do it. If you want a directory tree listing, tree -a > list.txt can be used.

Related

How to find all available links for a particular file in Linux

I have a file named testfile.txt in /home/rajeesh/Desktop/My/Assignments/. I need to search for all the available links to the above-mentioned file using some CLI command.
I tried:
find -L /home -inum $(ls -di testfile.txt | cut -d" " -f1)
I am running this command in the directory /home/rajeesh/Desktop/My/Assignments/, but the result contains only the link in the current directory (I have links to the file in the Desktop and My folders).
Could anybody help me with this, please?
Thanks in advance.
If using GNU find:
find -L /home -samefile /home/rajeesh/Desktop/My/Assignments/testfile.txt
will display all files existing in the /home directory tree that have the same inode as the given file (meaning they're hard links to the same underlying file) or are symbolic links pointing to the given file.
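A quick illustration with a throwaway file (the /tmp paths are just for the sketch):

```shell
mkdir -p /tmp/samefile-demo
echo data > /tmp/samefile-demo/original.txt
ln -f /tmp/samefile-demo/original.txt /tmp/samefile-demo/hardlink.txt  # hard link: same inode
ln -sf original.txt /tmp/samefile-demo/symlink.txt                     # symbolic link

# -L dereferences symlinks, so both kinds of link are reported
find -L /tmp/samefile-demo -samefile /tmp/samefile-demo/original.txt
# prints original.txt, hardlink.txt and symlink.txt
```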

How to list all files in all directories in Linux using a Linux command

I'm trying to read all the files available in all directories, along with the complete path, using a bash script. I have tried the
ls -R command, but it does not list the files properly.
My requirement is output that contains the complete path for each file.
Maybe you want the find utility, which is recursive by default.
find . -type f
The . means the current working directory.
You can use find command for listing all the files from a particular directory.
find /your_path/ -type f 2>/dev/null
/your_path/ : the path from which you execute the command; the output will contain the complete path of each file.
2>/dev/null : suppresses STDERR (e.g. "Permission denied" messages).
Right now you are trying to list all of the sub-directories using ls -R, which does not print complete paths, so it will not fulfill what you are trying to achieve.
See man ls:
-R, --recursive
list subdirectories recursively
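To see the difference between the two, a quick sketch on a throwaway tree (the /tmp paths are just for illustration):

```shell
mkdir -p /tmp/walk-demo/a/b
touch /tmp/walk-demo/top.txt /tmp/walk-demo/a/b/deep.txt

# ls -R prints names grouped under per-directory headers, not full paths:
ls -R /tmp/walk-demo

# find prints one complete path per line, which is what is wanted here:
find /tmp/walk-demo -type f
```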

How to extract name of all files contained in a folder into a .txt file?

I want to extract the names of all files contained in a folder into a .txt file. For example, when we type the ls command in a terminal, it lists all the file and folder names.
Can we store all these names in a .txt file?
You can redirect the output of the ls command to a file with > like so:
ls > files.txt
Note this will overwrite any previous contents of files.txt. To append, use >> instead like so:
ls >> files.txt
By using the > character, a command's output can be written to a named file. For example,
ls > list.txt
This will directly post the output of the ls command into the file list.txt. However, this will also overwrite anything that's in list.txt. To append to the end of a file instead of overwriting, use >>:
ls >> list.txt
This will leave the previous contents of list.txt intact, and add the output of ls to the end of the file.
If you really care about files and want to skip directories, I'd use find.
find . -mindepth 1 -maxdepth 1 -type f >> list.txt
Note that this list will also contain list.txt itself, since the shell creates it (through the redirection) before find is spawned in the current directory.
The advantage of using find instead of ls is that it will include "hidden" files (files whose names start with a .).
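A small sketch showing the hidden-file difference (illustrative /tmp paths):

```shell
mkdir -p /tmp/hidden-demo
touch /tmp/hidden-demo/visible.txt /tmp/hidden-demo/.hidden.txt

ls /tmp/hidden-demo                                    # omits .hidden.txt
find /tmp/hidden-demo -mindepth 1 -maxdepth 1 -type f  # lists both files
```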

How to create empty txt files in a directory reflecting files in another directory?

I need to do some testing and need the same file names as I have in directory /home/recordings in /home/testing folder.
For example, if I have a file recording01.mp4 in /home/recordings, I would want an empty file recording01.txt (or recording01.mp4) in /home/testing.
I understand I can use something like the following command:
for i in /home/recordings/*; do touch "$i"; done
I'm not sure how to specify the extension or the destination directory in this case.
A simple addition of /home/testing/ to the touch command will do it. Use basename to strip the source directory from each path, so there is no need to cd around:
for i in /home/recordings/*; do
    temp=$(basename "$i")
    touch "/home/testing/$temp"
done
This can be run from anywhere; it does not depend on the current directory.
You can also do this without a loop
find /home/recordings/ -type f -printf '/home/testing/%f\n' | xargs -n1 touch
Try this:
for i in /home/recordings/*; do touch "/home/testing/${i##*/}"; done
Here ${i##*/} strips the leading directory part (like basename), since $i expands to the full /home/recordings/... path. You need only use absolute paths and things will work fine. A bunch of 0-length files are created, their names corresponding to those in /home/recordings.
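Since the question also asks how to change the extension, here is a sketch that swaps the suffix as well, using throwaway /tmp directories standing in for /home/recordings and /home/testing:

```shell
# Stand-in directories for the ones in the question
mkdir -p /tmp/recordings /tmp/testing
touch /tmp/recordings/recording01.mp4 /tmp/recordings/recording02.mp4

for i in /tmp/recordings/*.mp4; do
    name=$(basename "$i")                  # e.g. recording01.mp4
    touch "/tmp/testing/${name%.mp4}.txt"  # swap the .mp4 suffix for .txt
done

ls /tmp/testing    # recording01.txt  recording02.txt
```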

Shell Script to Recursively Loop Through Directory and print location of important files

So I am trying to write a command-line or shell script that will recursively loop through a directory, all its files, and sub-directories, looking for certain files, and then print the location of these files to a text file.
I know that this is possible using BASH commands such as find, locate, exec, and >.
This is what I have so far. find <top-directory> -name '*.class' -exec locate {} > location.txt \;
This does not work, though. Can any Bash/shell scripting experts help me out, please?
Thank-you for reading this.
The default behavior of find (if you don't specify any other action) is to print the filename. So you can simply do:
find <top-directory> -name '*.class' > location.txt
Or if you want to be explicit about it:
find <top-directory> -name '*.class' -print > location.txt
You can save the redirection by using find's -fprint option:
find <top-directory> -name '*.class' -fprint location.txt
From the man page:
-fprint file
[...] print the full file name into file file. If file does not exist when find is run, it is created; if it does exist, it is truncated.
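The truncation behaviour is easy to check with a throwaway directory (illustrative paths):

```shell
mkdir -p /tmp/fprint-demo
touch /tmp/fprint-demo/Main.class

# Pre-fill the output file, then let -fprint overwrite it
echo "stale line" > /tmp/location.txt
find /tmp/fprint-demo -name '*.class' -fprint /tmp/location.txt

cat /tmp/location.txt   # only the Main.class path; "stale line" is gone
```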
A less preferred way to do it is to use ls:
ls -d $PWD/**/* | grep class
(in bash, the recursive ** glob must first be enabled with shopt -s globstar)
let's break it down:
ls -d # lists the directory (returns `.`)
ls -d $PWD # lists the directory - but this time $PWD provides the full path
ls -d $PWD/** # lists the directory with the full path and every file under it - an effect of the `/**` part
ls -d $PWD/**/* # same as the previous one, descending into the folders below (achieved by adding the `/*` at the end)
A better way of doing it:
After reading this due to a recommendation from Charles Duffy, it appears to be a bad idea to use either ls or find this way (the article also says: "find is just as bad as ls in this context"). The reason it's a bad idea is that you can't control the output of ls: for example, you can't configure ls to terminate filenames with NUL. The reason that matters is that Unix allows all kinds of weird characters in a filename (newline, pipe, etc.), which will "break" ls in ways you can't anticipate.
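The breakage is easy to demonstrate with a filename containing a newline (hypothetical /tmp paths); find's -print0 together with xargs -0 is the NUL-terminated alternative:

```shell
mkdir -p /tmp/weird-demo
touch /tmp/weird-demo/plain.class
touch $'/tmp/weird-demo/new\nline.class'   # bash quoting: embedded newline

# Newline-separated output miscounts: 2 files, but 3 lines
find /tmp/weird-demo -name '*.class' | wc -l

# NUL-separated output is unambiguous: exactly 2 NUL terminators
find /tmp/weird-demo -name '*.class' -print0 | tr -cd '\0' | wc -c
```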
Better to use a shell script for the task, and it's a pretty simple task too.
Create a file my_script.sh, edit the file to contain:
#!/bin/bash
shopt -s globstar   # let ** match recursively
for i in **/*; do
    echo "$PWD/$i"
done
Give it execute permissions (by running: chmod +x my_script.sh).
Run it from the same directory with:
./my_script.sh
and you're good to go!
