How to extract the names of all files contained in a folder into a .txt file? - linux

I want to extract the names of all files contained in a folder into a .txt file. For example, when we type the ls command in a terminal it lists all the file and folder names.
Can we store all these names in a .txt file?

You can redirect the output of the ls command to a file with > like so:
ls > files.txt
Note this will overwrite any previous contents of files.txt. To append, use >> instead like so:
ls >> files.txt

By using the > character, a command's output can be written to a named file. For example,
ls > list.txt
This will write the output of the ls command directly into the file list.txt. However, this will also overwrite anything that's in list.txt. To append to the end of a file instead of overwriting, use >>:
ls >> list.txt
This will leave the previous contents of list.txt intact, and add the output of ls to the end of the file.
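For example (the directory names here are only illustrative), running the two forms back to back shows the difference:
ls /etc > list.txt     # list.txt now holds only the /etc listing
ls /var >> list.txt    # the /var listing is appended after it
cat list.txt           # both listings are now in the file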

If you really care about files and want to skip directories, I'd use find.
find . -mindepth 1 -maxdepth 1 -type f >> list.txt
Note that this list will also contain list.txt, as the file is created by the redirection before find is spawned in the current directory.
The advantage of using find instead of ls is that it will include "hidden" files (files starting with a . in their name).
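If you'd rather not have the leading ./ that find prints before each name, GNU find can print just the basename with -printf "%f\n" (the same directive used in an answer further down; this assumes GNU find):
find . -mindepth 1 -maxdepth 1 -type f -printf "%f\n" >> list.txt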

Related

How to get a list of the filenames of a specific folder in shell script?

I am trying to get all the filenames of a specific folder into a text file, and I want only the names, not the relative paths. I tried:
ls -1 a/b/c>filenames.txt
and its output is
file-2021-08-18.txt
file2-2021-08-18.js
file3-2021-08-18.json
file4-2021-08-19.json
which is what I want, but I need only a specific day's files.
But when I do this:
ls -1 a/b/c/*2021-08-18*>filenames.txt
then the output is
a/b/c/file-2021-08-18.txt
a/b/c/file2-2021-08-18.js
a/b/c/file3-2021-08-18.json
I want only the filenames not the path of the files.
So, required output:
file-2021-08-18.txt
file2-2021-08-18.js
file3-2021-08-18.json
Is there any straightforward solution for this, or do I need to trim the output?
Thanks!!
When the argument to ls is a directory, it lists the filenames in the directory.
But when you use a wildcard, the shell expands the wildcard to all the filenames. So ls doesn't receive the directory as its argument, it receives all the filenames, and it lists them as given.
You can change to the directory and then list the matching files in the current directory:
(cd /a/b/c; ls *2021-08-18*) > filenames.txt
The parentheses make this run in a subshell, so the working directory of the original shell is unaffected.
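If you prefer not to change directories at all, GNU coreutils basename with the -a option strips the directory part from every expanded path in one call (a sketch, assuming the same a/b/c layout and date pattern from the question):
basename -a a/b/c/*2021-08-18* > filenames.txt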
With GNU find you may use the -printf option:
find a/b/c/ -type f -name "*2021-08-18*" -printf "%f\n" > filenames.txt
The directive %f picks out the file's name with any leading directories removed. Since -printf doesn't add a newline (\n) after the filename, we add one in order to match the required output.

Use a text file (containing file names) to copy files from current directory to new directory

I have created a file (search.txt) containing file names of .fasta files I want to copy from the current directory (which also contains many unwanted .fasta files). Is it possible to use this text file to find and copy the matching files in the current directory to a new location?
The search.txt file contains a list of names like this:
name_1
name_2
name_3
I tried to build the search term using find and grep, like this:
find . *.fasta | grep -f search.txt
which is returning output like this for each matching file:
./name_1.fasta
./name_2.fasta
./name_3.fasta
name_1.fasta
name_2.fasta
name_3.fasta
It's finding the correct files, but I'm not sure if this output is useful / can be used to copy these files?
To get only matching filenames from search.txt I would do this:
find . -type f -name '*.fasta' -print0 | grep -zf search.txt | xargs -r0 cp -t target-dir/
It finds all files with the extension .fasta, keeps only the ones matching a pattern in search.txt, and bulk-copies them to target-dir. Each filename is NUL-terminated, in case any filenames contain newlines.
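If the names in search.txt are plain filenames without spaces or newlines, a simpler loop does the same job without the NUL-byte plumbing (a sketch, assuming the .fasta files sit in the current directory as in the question):
while read -r name; do
    cp "./$name.fasta" target-dir/
done < search.txt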
Using Bash, you can read all the files from the list into an array:
$ mapfile -t files < search.txt
$ declare -p files
declare -a files=([0]="name_1" [1]="name_2" [2]="name_3")
Then, you can append the desired file extension to all array elements:
$ files=("${files[@]/%/.fasta}")
$ declare -p files
declare -a files=([0]="name_1.fasta" [1]="name_2.fasta" [2]="name_3.fasta")
And finally, move them to the desired location:
$ mv "${files[@]}" path/to/new/location
You don't actually need the intermediate step:
mapfile -t files < search.txt
mv "${files[@]/%/.fasta}" path/to/new/location
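If you want to copy rather than move (the question asks for a copy), cp accepts the same expansion:
cp "${files[@]/%/.fasta}" path/to/new/location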

Batch copy files from text list of file names in linux

I have a list of images in a text file in the following format:
abc.jpg
xyz.jpg
The list contains about a hundred images in various directories within a specific directory. I would like to create a shell script that finds and copies these files into a specified directory.
Can this script be adapted to what I need?
Copy list of file names from multiple directories
you do not need a script for this, a simple oneliner will do:
(assuming that your_file.txt contains, for every image, either the full path or the path relative to where you run the command)
cat your_file.txt | xargs -I{} find path_to_root_dir -name {} | xargs -I{} cp {} specific_directory/
xargs takes multiple lines of input and runs the command you give it once per line. With -I you specify a placeholder for where the content of the line is placed in the command (by default it is appended at the end).
So this takes every line from your file, searches for that file in all subdirectories of path_to_root_dir, and then copies it.
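An equivalent sketch that skips the double xargs and lets find do the copy itself (path_to_root_dir and specific_directory are placeholders as above; it assumes one image name per line):
while read -r img; do
    find path_to_root_dir -name "$img" -exec cp {} specific_directory/ \;
done < your_file.txt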

How can I export a recursive directory & file listing to a text file in Linux Bash shell with an SSH command?

Assume I'm in my pwd - /home/kparisi/
What command can I run to export all directories & files from this directory and all subdirectories within it to a text file?
I don't need the contents of the files, just their names & paths (and permissions if possible)
Thanks in advance.
Use find to get a listing of all the files in a directory and its subdirectories, then redirect it to a file with the > operator.
find > list.txt
find is a good utility to obtain (recursively) all the content of a directory. If you want the file (and directory) names with their permissions:
find /home/kparisi -printf "%M %p\n"
You can then use ssh to run this command on the remote server:
ssh kparisi@remote.com 'find /home/kparisi -printf "%M %p\n"'
And finally, if you want to store this in a file on your local server:
ssh kparisi@remote.com 'find /home/kparisi -printf "%M %p\n"' > file
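If you prefer octal permissions instead of the symbolic rwx form, GNU find's %m directive gives those (a small variation on the command above):
ssh kparisi@remote.com 'find /home/kparisi -printf "%m %p\n"' > file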
I know this is an old question, but in case somebody reaches this page, it may still be useful. I like to do this (assuming we want to save the directory structure to the file "fsstruct.txt"):
tree > fsstruct.txt
For me this format is simpler to read.
It is also easy to use other features of tree:
tree -p > fsstruct.txt
prints the file type and permissions before each name, which makes it easier to tell, when reading the plain text, which entry is a file and which is a directory.
tree -h > fsstruct.txt
or this:
tree -ph > fsstruct.txt
prints file sizes in a human-readable format.
Also, it is possible to send output to the file: tree -ph . -o fsstruct.txt or create HTML:
tree -Hph . -o tree.html
The command find > list.txt will do. If you want a directory tree listing, tree -a > list.txt can be used.

copy the content of many file to one file

I have many files and I want to copy the contents of these files into one file.
How do I do that using a Linux command?
Example:
folder1\text1.txt
folder1\text2.txt
folder1\text3.txt
folder1\text5.txt
folder1\text4.txt
folder1\text6.txt
etc
Copy the contents of all the files into folder1\text.txt.
Thanks.
You can do
cat folder1/text*.txt > folder1/text.txt
It will take all files matching the folder1/text*.txt pattern and put their contents in folder1/text.txt.
Note I used folder/text.txt, that is, forward slash. Backslash is not used in *NIX.
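One caveat: if folder1/text.txt already exists from a previous run, it matches the text*.txt pattern itself, so it can be fed back in as input (cat will typically complain that the input file is the output file). Writing to a name outside the pattern avoids that (all.txt here is just an example name):
cat folder1/text*.txt > folder1/all.txt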
you can use
find folder1 -name "text*.txt" -type f -exec cat {} + >> folder1/text.txt
When in the folder, type on the command line:
cat *.txt >> text.txt
