copy the content of many files to one file - linux

I have many files and I want to copy the content of these files into one file.
How can I do that using a Linux command?
Example:
folder1\text1.txt
folder1\text2.txt
folder1\text3.txt
folder1\text5.txt
folder1\text4.txt
folder1\text6.txt
etc
Copy the contents of all the files into folder1\text.txt.
Thanks

You can do
cat folder1/text*.txt > folder1/text.txt
It will get all files matching the folder1/text*.txt pattern and put their contents in folder1/text.txt.
Note I used folder/text.txt, that is, forward slash. Backslash is not used in *NIX.
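One caveat worth knowing: the output file folder1/text.txt itself matches the text*.txt glob, so on a second run it would be picked up as an input. A minimal sketch (with hypothetical sample files) that sidesteps this by writing to a name the glob cannot match:

```shell
# Create two sample input files (hypothetical names for illustration).
mkdir -p folder1
printf 'one\n' > folder1/text1.txt
printf 'two\n' > folder1/text2.txt
# text[0-9].txt cannot match all.txt, so the output file never
# gets swept up as an input on a later run.
cat folder1/text[0-9].txt > folder1/all.txt
```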

You can use
find folder1 -name "text*.txt" -type f -exec cat {} + >> folder1/text.txt
(Note that the output file folder1/text.txt itself matches the pattern, so create it outside folder1 or pick a non-matching name.)

When you are in the folder, type at the command line:
cat *.txt >> text.txt
(Be careful: text.txt itself matches *.txt, so if it already exists cat can end up reading the file it is appending to.)

Related

How to search for all the hidden files in my computer?

I want to find all the hidden files inside a directory in a Linux terminal.
I found that grep can search through files, but I need to search for hidden files:
grep -r search *
Any help would be appreciated. Thanks.
Try this in your terminal to show all the hidden files on your system:
find / -name ".*" 2> /dev/null
or see this guide for other approaches: https://devconnected.com/how-to-show-hidden-files-on-linux/
Simply use (with GNU grep)
grep -r search .
if you want to search contents of files in the current directory and its subdirectories recursively.
Note: It isn't clear if you want to search filenames or contents of files.
The proper solution:
find /dir -name '.*' -type f
If by "hidden file" you mean Linux file names that begin with . (which are often hidden by default), including directories starting with . whose contents might also be considered "hidden", then try this command:
find . -print | grep '/\.'
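The difference between the two answers shows up on a small tree (names here are invented for illustration): -name '.*' matches only entries whose own name starts with a dot, while the grep pipeline also catches ordinary files buried inside hidden directories:

```shell
# Build a tiny hypothetical tree with one hidden file, one visible
# file, and a visible file inside a hidden directory.
mkdir -p demo/.hiddendir
touch demo/.hidden demo/visible demo/.hiddendir/inner
# Matches only entries whose own basename starts with a dot:
find demo -name '.*' -type f
# Matches any path containing a hidden component, including
# demo/.hiddendir/inner:
find demo -print | grep '/\.'
```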

Find a zip file, print path and zip contents

I have a series of numbered sub-directories that may or may not contain zip files, and within those zip files are some single-line .txt files I need. Is it possible to use a combination of find and unzip -p to list the file path and the single-line contents on the same output line? I'd like to save the results to a .txt file and import it into Excel to work with.
From the main directory I can successfully find and output the single line:
find . -name 'file.zip' -exec unzip -p {} file.txt \;
How can I prefix the find output (i.e. the file path) to the output of this unzip command? Ideally, I'd like each line of the text file to resemble:
./path/to/file1.zip "Single line of file1.txt file"
./path/to/file2.zip "Single line of file2.txt file"
and so on. Can anyone provide some suggestions? I'm not very experienced with the Linux command line beyond simple commands.
Thank you.
Put all the code you want to execute into a shell script, then use find's -exec feature to call the shell script, i.e.
cat finder.bash
#!/bin/bash
printf '%s : ' "$1" # prints just the /path/to/file/file.zip
unzip -p "$1" file.txt
(Note: $1 is the first argument passed to the script; $# would be the argument count.) For now, get that working; you can make it generic later to pass file names other than file.txt.
Make the script executable
chmod 755 finder.bash
Call it from find. i.e.
find . -name 'file.zip' -exec /path/to/finder.bash {} \;
(I don't have an easy way to test this, so reply in comments with error msgs).
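The same path-prefix idea also works without a separate script by handing find an inline sh -c snippet. This sketch uses cat on a plain .txt file so it is easy to try; for the zip case you would swap the body for unzip -p "$1" file.txt:

```shell
# Hypothetical fixture: one small text file to stand in for the
# single-line file inside a zip.
mkdir -p zdemo
printf 'hello\n' > zdemo/a.txt
# For each match, print its path, a space, then its contents,
# all on one line (the file itself ends with a newline).
find zdemo -name '*.txt' -exec sh -c 'printf "%s " "$1"; cat "$1"' sh {} \;
```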

Rename file in Linux if file exist in a single command

I want to rename a file in Linux, if it exists, with a single command.
Suppose I want to find a test.text file and replace it with test.text.bak. Currently I run the following command
find / -name test.text
and, if it exists, then run
mv test.text test.text.bak
In this scenario I am executing two commands, but I want this to happen in a single command.
Thanks
Just:
mv test.text test.text.bak
If the file doesn't exist, nothing will be renamed.
To suppress the error message when no file exists, use this syntax:
mv test.text test.text.bak 2>/dev/null
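If silencing the error feels too blunt, an explicit existence test also fits on one line. A sketch using the question's test.text name (created here just for demonstration):

```shell
# Create a sample file so the rename has something to act on.
touch test.text
# Rename only if the file exists; a missing file is silently a no-op.
[ -f test.text ] && mv test.text test.text.bak
```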
If you want to find test.txt somewhere in a subdirectory of dir and move it, try
find dir -name test.txt -exec mv {} {}.bak \;
This will move all files matching the conditions. If you want to traverse from the current directory, use . as the directory instead of dir.
Technically, this will spawn a separate command in a separate process for each file matched by find, but it's "one command" in the sense that you are using find as the only command you are actually starting yourself. (Think of find as a crude programming language if you will.)
for FILE in `find . -name test.text 2>/dev/null`; do mv "$FILE" "$FILE.bak"; done
This will search for all the files named "test.text" in the current directory as well as in child directories and then rename each file to .bak. (Quoting "$FILE" keeps paths with spaces intact inside the loop, although the unquoted backtick expansion still splits on whitespace.)

linux command to concatenate multiple files with content separated by filenames?

I am looking for a command that will concatenate multiple files in a directory tree whose names match a pattern, such that the resulting file contains the contents of all the files, each separated by the name (path) of that file. I tried using find -exec and sed but couldn't get it to work. Please help.
More specifically, I have a directory containing many sub-directories, each holding a file named 'test.FAILED'. I want to concatenate all the test.FAILED files, separated by their paths, so that I can look at all of them at the same time.
for i in <pattern>
do
echo "$i"
cat "$i"
done > output
Using (gnu) find:
find . -name \*.FAILED -print -exec cat "{}" \;
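The find version can be tried on a tiny hypothetical tree; each file's path appears on its own line immediately before its contents:

```shell
# Two sub-directories, each with a one-line test.FAILED file
# (names invented for illustration).
mkdir -p fdemo/sub1 fdemo/sub2
printf 'first failure\n'  > fdemo/sub1/test.FAILED
printf 'second failure\n' > fdemo/sub2/test.FAILED
# -print emits the path, then -exec cat emits the contents,
# per file, so paths act as separators in the output.
find fdemo -name '*.FAILED' -print -exec cat {} \; > output
```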

How to build one file containing other files, selected by mask?

I need to put the contents of all *.as files in some specified folder into one big file.
How can I do it in Linux shell?
You mean cat *.as > onebigfile?
If you need all files in all subdirectories, the most robust way to do this is:
rm -f onebigfile
find . -name '*.as' -print0 | xargs -0 cat >> onebigfile
This:
- deletes onebigfile
- for each file found, appends it onto onebigfile (this is why we delete it in the previous step -- otherwise you could end up tacking onto some existing file)
A less robust but simpler solution:
cat `find -name '*.as'` > onebigfile
(The latter version doesn't handle very large numbers of files or files with weird filenames so well.)
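With a modern find, the xargs -0 pipeline can also be written with -exec ... {} +, which batches arguments onto each cat invocation the same way and sidesteps the weird-filename pitfalls. A sketch with invented sample files:

```shell
# Hypothetical fixture: two small .as files.
mkdir -p asdemo
printf 'alpha\n' > asdemo/a.as
printf 'beta\n'  > asdemo/b.as
# {} + passes as many paths as fit per cat invocation, like xargs,
# and > truncates so reruns do not double the contents.
find asdemo -name '*.as' -type f -exec cat {} + > onebigfile
```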
Not sure what you mean by "build", but are you looking for tar?