Delete all folders and files within a Linux directory except one folder and all of its contents

I have a directory structure as follows:
/usr/testing/member/
├── public
│   ├── folder1
│   │   └── file1
│   └── file2
├── folder3 (contains files and folders)
├── folder4 (contains files and folders)
└── several files
I want to keep the public folder and all of its contents (all further folders and files within it), but delete everything else under the directory /usr/testing/member/. The member folder itself should not be deleted either.
Is there any shell script or command that can be used to achieve this exactly as I stated?

Here's one way to do it:
(cd /usr/testing/member; find . -maxdepth 1 \( ! -name . -a ! -name public \) -exec echo rm -fr {} +)
That is: cd into /usr/testing/member, find all files and directories there without descending any further (-maxdepth 1), exclude the current directory (".") and anything named "public", and execute a command on what is found.
This will print what would be deleted.
Verify it looks good, and then drop the echo.
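To see it end to end, here is a self-contained sketch that rebuilds a layout like the one in the question under a scratch directory named member (all file names are invented) and runs the command, dry run first:

```shell
# recreate a layout like the one in the question
mkdir -p member/public/folder1 member/folder3 member/folder4
touch member/public/folder1/file1 member/public/file2 member/folder3/f member/file_a

# dry run: echo prints each rm invocation instead of executing it
(cd member && find . -maxdepth 1 \( ! -name . -a ! -name public \) -exec echo rm -fr {} +)

# real run: the same command with the echo dropped
(cd member && find . -maxdepth 1 \( ! -name . -a ! -name public \) -exec rm -fr {} +)

ls member    # only "public" remains
```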

I think the following will do the work:
$ cd /usr/testing/member/
$ rm -rf $(ls | grep -v "public")
Explanation:
We pass everything inside /usr/testing/member/ except public to rm, by making use of the -v (invert match) option of grep.
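One caveat worth knowing: a bare grep -v "public" also filters out any name that merely contains "public" (say, public_old), so those entries would survive the deletion. Anchoring the pattern with grep's -x (whole-line match) option avoids that; note that this whole ls-parsing approach still breaks on file names containing whitespace. A sketch with invented names under a scratch directory:

```shell
mkdir -p member2/public member2/public_old member2/folder3
touch member2/public/keep member2/stray

# -x makes grep match the whole line, so only "public" itself is spared
(cd member2 && rm -rf $(ls | grep -vx 'public'))

ls member2    # only "public" remains
```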


Copy everything except specific files

How can I copy everything (files and directories, even empty directories) from one directory to another, except files ending in ".php" and files named "config.yml"?
I need to do this with a single command.
I have tried this one:
find ./ -type f ! \( -name "*.php" -o -name "config.yml" \) -exec cp --parents -r -t /my/directory/ "{}" +
It works, but if a directory contains only ".php" files, the command skips it and does not copy the empty directory; I need the directory even if it ends up empty.
Sorry, I don't have enough reputation to post a comment, so I'll ask here: is the use of "find" a requirement?
If it is not, you can do it easily with the rsync command:
rsync -av --exclude=config.yml --exclude="*php" ORIGINFOLDER/ DESTFOLDER
Just change ORIGINFOLDER and DESTFOLDER to your folder names, and take a look at the man page for the meaning of the options.
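If rsync is unavailable and find must be used, a two-pass approach also satisfies the empty-directory requirement: replicate the directory tree first, then copy only the wanted files. This sketch uses invented names (origin, target); note that the embedded {} in mkdir's argument is honored by GNU find but not guaranteed by POSIX, and cp --parents is a GNU coreutils option:

```shell
# sample tree: "onlyphp" ends up empty after filtering, but should still be copied
mkdir -p origin/onlyphp origin/mixed target
touch origin/onlyphp/a.php origin/mixed/b.php origin/mixed/keep.txt origin/config.yml

# pass 1: replicate the directory structure, so empty directories survive
(cd origin && find . -type d -exec mkdir -p ../target/{} \;)

# pass 2: copy only the wanted files; cp --parents keeps the relative paths
(cd origin && find . -type f ! \( -name '*.php' -o -name 'config.yml' \) \
    -exec cp --parents -t ../target {} +)

ls target    # "onlyphp" exists even though all of its files were excluded
```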

Merge and compile SCSS files recursively using a shell script in Windows

So here's the situation. I currently have a project with the following folder structure:
-- public_html
   -- assets
      -- scss (contains SCSS files, which can be located inside subfolders)
      -- scripts
      -- vendor
         -- plugins (also contains some SCSS files)
      -- css
      -- also some other folders (no SCSS or CSS here though)
   -- dist (distribution folder)
   -- other folders too for HTML, PHP, etc.
When I want to upload the script to my remote server, I first process the complete folder structure in public_html into dist. I do this with a build script, build.sh, which you can see here:
function prepareForBuild(){
    echo "Updating dependencies...";
    bower install
    echo "Preparing 'dist' directory...";
    mkdir -p dist
    rm -rf dist/*
    # for the stackoverflow markup... */
}
function buildApp(){
    cd assets
    echo "Creating temp directory..."
    mkDir -p temp
    ## MERGING TAKES (OR SHOULD) PLACE HERE
    find "/" -name '*.scss' -exec cat {} \; > merge.scss
    cd ../
    pause
    # requirejs optimization
    echo "Optimizing files...";
    node r.js -o build/build.js
}
function cleanup(){
    echo "Cleaning up unnecessary files...";
    cd dist
    rm -f bower.json .gitignore .bowerrc README.md .DS_Store config.rb
    rm -f build.sh build.txt composer.json composer.lock
    rm -rf build dist scss .git .sass-cache .idea
}
function pause(){
    read -p "$*"
}
prepareForBuild
buildApp && cleanup
echo "Building finished!";
Problem:
However, when I run the script in the Git bash terminal it breaks at this line:
find "/" -name '*.scss' -exec cat {} \; > merge.scss
Edit: using "/" or "\" does not seem to affect the problem :(
Also, when run from the root (public_html) directory, the normal command prompt returns "File not found - *.scss" (and outputs an empty merge.scss file), while the Git bash terminal simply stops working.
I based this line on this answer, but I can't get it to work for me. I also tried something like this instead but this only returns more errors.
What am I missing here?
EDIT!
I thought the bash shell had simply stopped working, but it seems that was not the case. After a while it returns this:
What does this mean? Is the reference folder incorrect?
Objective:
As said I would like to recursively scan the assets folder for .scss files and merge them into one default.scss.
If this is too much to ask, another solution integrated with require.js (or simply with Node) would also work; as you can see in my script, I already use require.js for standard optimization.
Thanks for your time and patience!
If you want to scan for files under the assets directory, then do not specify the root directory in the find command. Omitting directories from the results is also a good idea.
find . -name "*.scss" -not -type d -exec cat {} \; > merge.scss
In case you want the merge.scss file to be created in the newly created temp directory, you need to make it the current directory first. Then you can do the file scan from the parent (assets) directory.
function buildApp(){
    cd assets
    echo "Creating temp directory..."
    mkdir -p temp
    ## MERGING TAKES (OR SHOULD) PLACE HERE
    cd temp
    find .. -name '*.scss' -not -type d -exec cat {} \; > merge.scss
    cd ../..
    pause
    # requirejs optimization
    echo "Optimizing files...";
    node r.js -o build/build.js
}
Also, does a pause command work for you in the bash shell? That is a Windows command unless you have something else.
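A minimal reproduction of that variant, with invented file names. One extra guard is worth adding: the output file merge.scss itself matches the '*.scss' pattern, so the scan excludes the temp directory to avoid find re-reading its own output:

```shell
# invented sample layout
mkdir -p proj/assets/scss/partials
printf 'body { color: red; }\n'   > proj/assets/scss/main.scss
printf '.btn { padding: 4px; }\n' > proj/assets/scss/partials/_btn.scss

(
  cd proj/assets
  mkdir -p temp && cd temp
  # scan from the parent, skipping temp itself so merge.scss is not re-read
  find .. -name '*.scss' -not -type d -not -path '*/temp/*' -exec cat {} \; > merge.scss
)

cat proj/assets/temp/merge.scss
```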
Your -exec option is missing the {} placeholder, which stands for the file matched on each find iteration. In addition, you can't use '\' as a root path; at best it would be '/'.
Nevertheless, you should ideally use a specific sub-path, both for efficiency and for access-permission reasons (I guess your user may not have permission to access every path under the root?).
At least, you can update your instruction to:
find '/' -type f -name "*.scss" -exec cat {} \; > merge.scss

Write a script to delete files if they exist in a different folder in Linux

I'm trying to write a script for Linux. I have some CSV files in two different folders (A and B); after some processing, copies of the rejected files are moved to a Bad folder.
I want the bad files that were copied to the Bad folder to be deleted from folders A and B.
Can you help me write this script for Linux?
Best
Let's say the name of the Bad folder is 'badFolder', and consider that 'A', 'B' and 'badFolder' are in the same directory.
Steps to delete the files from folders A and B:
Step 1: change the current directory to your 'badFolder':
cd badFolder
Step 2: delete the identical files:
find . -type f -exec rm -f ../A/{} \;
find . -type f -exec rm -f ../B/{} \;
The -type f argument tells find to look for files, not directories.
The -exec ... \; argument tells find that, for each file it finds in 'badFolder', it should run rm -f on the counterpart in the A (or B) subdirectory.
Because rm is given with the -f option, it will silently ignore files that don't exist.
Also, it will not prompt before deleting files. This is very handy when deleting a large number of files. However, be sure that you really want to delete the files before running this script.
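A runnable sketch of those steps, with invented file names. Note that ../A/{} relies on find replacing {} even when it is embedded in a larger argument; GNU find does this, but POSIX does not guarantee it:

```shell
# invented sample layout
mkdir -p A B badFolder
touch A/good.csv A/bad1.csv B/bad1.csv B/good.csv badFolder/bad1.csv

(
  cd badFolder
  # find prints "./bad1.csv", so rm removes "../A/./bad1.csv" etc.
  find . -type f -exec rm -f ../A/{} \;
  find . -type f -exec rm -f ../B/{} \;
)

ls A B    # only the good files remain
```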
#!/bin/bash
# Set the working folder in which you want to delete the files
Working_folder=/<Folder>/<path>
cd "$Working_folder"
# command to delete all listed files present in the folder
rm <filenames separated by spaces>
echo "files are deleted"
# if you want to delete all files, you can use a wildcard, e.g. rm *
# (note that rm *.* only matches names containing a dot)
# to delete a particular kind of file, say .csv files, use: rm *.csv
Set variables containing the paths of your A, B and BAD directories.
Then you can do something along the lines of
for file in $(ls "${PATH_TO_BAD}")
do
rm ${PATH_TO_A}/$file
rm ${PATH_TO_B}/$file
done
This iterates over the BAD directory, and any file it finds there is deleted from the A and B directories.

How to replace a file in several sub-folders

I have a series of directories containing a set of files. There is a new copy of one of these files, and I would like to replace all instances with it. How can I do this with the find command?
The latest file is in /var/www/html and is called update_user.php.
There are 125 directories, each with several other files including a copy of update_user.php. I want to replace these copies with the one in /var/www/html, excluding the master copy itself.
This should do the job:
find /path/to/old/files -type f -name update_user.php -exec cp /path/to/new/update_user.php {} \;
You should check whether the new file is inside /path/to/old/files; if so, first copy it outside and use that copy. Even if you don't, no harm is done: one cp will simply fail with a "same file" error.
You can use
cp -v to see what it does
cp -u to update only when source file is newer
echo cp to perform dry run
I would suggest first checking whether all destination files are identical:
find /path/to/old/files -type f -name update_user.php -exec md5sum {} \;|awk '{print $1}'|sort|uniq
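Putting it together in a scratch tree (the www/siteN paths are invented). Using -mindepth 2 is one way to keep find from matching the master copy at the top level:

```shell
# master copy plus three stale copies in subdirectories
mkdir -p www/site1 www/site2 www/site3
printf 'new\n' > www/update_user.php
for d in site1 site2 site3; do printf 'old\n' > "www/$d/update_user.php"; done

# overwrite every copy below www/ except the master at the top level
find www -mindepth 2 -type f -name update_user.php -exec cp www/update_user.php {} \;

# all copies should now hash identically (a single line of output)
md5sum www/*/update_user.php www/update_user.php | awk '{print $1}' | sort -u
```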

Remove only files and not the directory in linux

I want to know how to remove all the files in a directory, say directory1, which contains some 100 files. I just want to remove the files, not the directory itself.
I know that rm -r directory1 will remove directory1 completely. But I want to remove only the files inside.
Try this:
rm /path/to/directory1/*
By adding the -r option you can additionally remove contained directories and their contents recursively.
find /path/to/directory1 -type f | xargs rm -f
This recursively deletes all normal files in the directory.
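The pipe through xargs breaks on file names containing spaces or quotes. find's -delete action (supported by GNU and BSD find, though not plain POSIX) sidesteps the quoting problem entirely. A scratch-directory sketch:

```shell
mkdir -p directory1/sub
touch directory1/a.txt 'directory1/sub/b c.txt'   # note the space in the name

# -delete removes matched entries in place; -type f leaves directories alone
find directory1 -type f -delete

ls -R directory1    # the directories remain, the files are gone
```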
