I have a folder that I want to copy into a directory, overwriting the existing folder there.
Logic: I unzip the file, and it consists of 3 files [ test1.txt, test2.txt, test3.txt ]
Source Folder
- home/source/folder1
- test.zip
Target Folder
- home/target
When I use the unzip command, the file structure will be as follows:
- home/target/folder1
- test1.txt
- test2.txt
- test3.txt
- test.zip
Initially I was using this command to copy the source folder to the target folder:
cp -R home/source/folder1 home/target
However, when I trigger this a second time, it creates a subfolder inside folder1:
- home/target/folder1
-test1.txt
-test2.txt
-test3.txt
-test.zip
-folder1
I found a few threads suggesting the -T option:
cp -R -T home/source/folder1 home/target
However, this also results in the same folder structure:
- home/target/folder1
-test1.txt
-test2.txt
-test3.txt
-test.zip
-folder1
The last resort I will fall back to is a script that removes the folder first and then copies it over:
rm -r home/target/folder1
cp -R home/source/folder1 home/target
Point 1:
I was wondering if it is possible to replace or overwrite the whole folder itself.
e.g. after we have unzipped the files, the folder structure in the target folder will be:
- home/target/folder1
-test1.txt
-test2.txt
-test3.txt
-folder1
Using the cp copy, it would automatically replace the existing folder1 and its contents to match how it is reflected in the source folder:
Source Folder
- home/source/folder1
- test.zip
The approach below works as expected; however, I was wondering whether the approach in Point 1 is possible:
[ -d home/target/folder1 ] && echo "Workbook Directory Exists, removing twbx files content" && rm -r home/target/folder1
cp -R home/source/folder1 home/target
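If rsync is available, a single command can make the target folder an exact mirror of the source folder, replacing and deleting files as needed, which avoids the separate remove step. A minimal sketch, assuming rsync is installed (the trailing slashes matter: they tell rsync to sync the contents of folder1 rather than nest the folder itself):
# mirror source folder1 onto target folder1, deleting anything not in the source
rsync -a --delete home/source/folder1/ home/target/folder1/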
I have a few sub-directories with files inside each of them in /home/user/archived/myFiles that I'm trying to bundle into a single tar file. The issue is, it keeps bundling a full directory path instead of just everything in the myFiles folder.
When I untar the file, I just want all the bundled sub-directories/files inside to appear in the directory I extracted the file rather than having to go through a series of folders that get created.
Instead, when I currently untar the file, I get a "home" folder and I have to go through /home/user/archived/myFiles to reach all the files.
I tried using the -C flag, as suggested in Tar a directory, but don't store full absolute paths in the archive, where you pass the full directory path minus the last folder, followed by the name of the last folder that contains all the stuff you want bundled. But the tar command doesn't work; I get a "no such file or directory" error.
#!/bin/bash
archivedDir="/home/user/archived/myFiles"
tar -czvf "archived-files.tar.gz" "${archivedDir}"/*
rm -vrf "${archivedDir}"/*
# Attempt with -C flag
#tar -cvf "${archivedDir}/archived-files.tar.gz" -C "${archivedDir}" "/*"
So, for example, if I did an ls on /home/user/archived/myFiles and it listed two directories called folderOne and folderTwo, then after running this bash script, an ls on /home/user/archived/myFiles should show only archived-files.tar.gz.
If I extracted the tar file, then folderOne and folderTwo would appear.
As I explained already here, you should first change to this directory and then create the archive.
So change your script to something like:
archivedDir="/home/user/archived/myFiles"
cd "$archivedDir"
tar -czvf "../archived-files.tar.gz" *
This will create the archive in the parent directory, so you will not remove it with the next command.
The extraction should be something like:
archivedDir="/home/user/archived/myFiles"
cd "$archivedDir"
tar -xzvf "../archived-files.tar.gz"
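If you would rather not cd inside the script, the -C flag the question mentions can achieve the same result; the catch is that tar does not expand shell globs, so you pass "." (the directory's contents) instead of "/*". A sketch under that assumption; the archive lands in whatever directory you run this from, so it is not swept up by the later rm:
archivedDir="/home/user/archived/myFiles"
# -C makes tar operate from inside the directory; "." archives everything in it
tar -czvf archived-files.tar.gz -C "$archivedDir" .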
I need to run a command in Linux that copies the files (not folders) in ~/folder1/subfolder1 to ~/folder2/subfolder2, while deleting the initial contents/files of folder2.
The cp command copies files from one folder to another:
cp ~/folder1/* ~/folder2/
But how can I also delete the files that were initially in folder2, while copying only the files from folder1?
Also, is there an rsync command instead of cp that would copy only files and not subfolders?
I have tried with this:
rsync --delete-during folder1/* folder2/
But I got an error:
rsync: --delete does not work without -r or -d.
And I don't want to use the -r or -d flag, since that would mean the subfolders would get copied as well, and I only want to copy files.
You want to delete everything in folder2, then copy every file from folder1?
Try something like this:
rm folder2/* && cp folder1/* folder2/
(cp will not copy directories by default)
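If you specifically want rsync (for instance, to get copy and delete in one step), one way around the error is to satisfy the -r requirement but exclude every directory from the transfer. A sketch, assuming a reasonably recent rsync; note that subfolders already in folder2 are excluded too, so they are left in place rather than deleted:
# --exclude='*/' skips all directories, so only top-level files are transferred;
# --delete removes files in folder2 that are not in folder1
rsync -rv --delete --exclude='*/' folder1/ folder2/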
I am trying to do an incremental backup of a system we have, and am moving a lot of attachment files in many directories. I would like to merge my current attachments with my new attachments. Example:
Production Attachments Directory:
Dir A/ Dir B/ File 1, File 2
Attachments Backup Directory:
Dir A/ Dir B/ File 3, File 4
After Merge I would like to see:
Dir A/ Dir B/ File 1, File 2, File 3, File 4
The directory has hundreds of folders like this, so I would like to do it recursively. Is there a command in Linux/Unix that can do this? I have looked into commands like Union, but I am not sure if that will solve this problem.
I'm not 100% sure I understand your problem, because my solution seems too simple, but...
cp -R (recurse) may provide what you seek.
Take your backup copy first and create a duplicate:
cp -R [backup directory] merged_attachments
Then force and recurse the production copy (so you have the latest production version in case of a conflict):
cp -fR [production directory]/* merged_attachments
You will wind up with your final merged_attachments directory containing the combined structure of both source directories.
You could accomplish the same thing if you wanted to copy the backup into the production directory with:
cp -nR [backup directory]/* [production directory]
using the no-clobber flag so you don't overwrite production files with the backup versions.
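rsync can do the same merge in a single command and makes the conflict policy explicit. A sketch with placeholder paths, assuming the production version should win on conflict (the analogue of cp -n above):
# --ignore-existing leaves any file that already exists in production untouched
rsync -av --ignore-existing /path/to/backup/ /path/to/production/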
For example, I have some files in my project, whose structure could be like this:
Project/
-- file1.txt
-- file2.txt
-- build/
-- sub-folder/
I want to zip some files from my project, and I can do that using the zip command (the build folder and some other files are excluded from the zip):
zip -r build/project-04.05.2016 ./ -x *\build\*
After this, a new file is created:
build/project-04.05.2016.zip
In Mac Finder, when I double-click on this file, it becomes unzipped like this:
build/project-04.05.2016/
----------- file1.txt
----------- file2.txt
----------- subfolder/
I would like to somehow zip this archive so that, when it's unzipped, I get a folder "project" with the same content instead of "project-04.05.2016". I tried zipping it as "project" and renaming the file to "project-04.05.2016" afterwards, but when it's unzipped the results are the same. Maybe there's a way to first move the files to some temporary "project" folder and then zip them into "project-04.05.2016.zip"? Thanks in advance.
Here is my current solution. It's a small script and I'm not too satisfied with it, but it works. I will accept if someone answers with a more elegant solution. So, here it is:
#clear the contents of the previous build first
rm -r build/*
#copy required files to a temporary folder
rsync -av --exclude=build ./ build/project
#change current directory to build
cd build/
#zip files from build/project folder
zip -r "project-04.05.2016" project
#remove temporary folder
rm -r project/
#final zip file is at location:
#build/project-04.05.2016.zip
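One possible tidy-up is to stage the temporary "project" folder under mktemp instead of inside build/, so build/ never holds the intermediate copy and doesn't need to be wiped first. A sketch under the same exclusions as the script above (the dated name is hard-coded here just as in the original):
# stage the files under a throwaway "project" folder, zip it, then clean up
stage="$(mktemp -d)"
rsync -a --exclude=build ./ "$stage/project/"
(cd "$stage" && zip -r project-04.05.2016.zip project)
mv "$stage/project-04.05.2016.zip" build/
rm -rf "$stage"
Because the folder being zipped is literally named "project", the archive extracts to project/ regardless of the zip file's name.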
Every time I copy recursively, I always end up with the httpdocs folder itself instead of the files within it in public_html.
For example I might run something like:
cp -rpf /var/www/vhosts/website/httpdocs /var/www/vhosts/anotherwebsite/httpdocs
I always end up with /var/www/vhosts/anotherwebsite/httpdocs/httpdocs, when all I am trying to do is move a website from one user to another.
When you just want to "push" your local website without going offline for long, you can use a temporary dir.
TODAY=$(date +%y%m%d)
NEWCODE=/var/www/vhosts/anotherwebsite/docs_${TODAY}
OLDCODE=/var/www/vhosts/anotherwebsite/docs_old
rm -rf ${NEWCODE}
cp -rpf /var/www/vhosts/website/httpdocs ${NEWCODE} || exit 1
# some checks ?
cd /var/www/vhosts/anotherwebsite/ || exit 1
mv httpdocs ${OLDCODE} || exit 1
mv ${NEWCODE} httpdocs
Between the moves you will be unavailable. If that is a problem, you might want to make a work_in_progress.html file, rename it to httpdocs/index.html, remove all other files, and copy the new files after this (the correct index.html file last).
But this seems too fancy; stick with the short hiccup in the solution above.
What do you want when a file exists under anotherwebsite but not in the source website?
I think you want anotherwebsite to be an exact copy, so make sure the old files are removed first:
rm -r /var/www/vhosts/anotherwebsite/httpdocs
cp -rpf /var/www/vhosts/website/httpdocs /var/www/vhosts/anotherwebsite/httpdocs
Edit: anotherwebsite should stay available.
The general trick is using tar:
tar cf - * | ( cd /target; tar xf -)
In your case:
TARGET=/var/www/vhosts/anotherwebsite/httpdocs
cd /var/www/vhosts/website/httpdocs || exit 1
mkdir -p ${TARGET}
[ -d ${TARGET} ] || exit 1
tar cf - * | ( cd ${TARGET}; tar xf -)
I added "|| exit 1" to be sure you do not copy from the wrong dir and that ${TARGET} is a dir.
You still have 3 challenges:
1) how do you delete files you no longer use (yesterday_special_action.html)?
2) should you copy images first, for customers opening new pages before the images are copied?
3) what website do you have if the copy/tar fails partway through (disk full)?
I will post a new answer for solving these challenges.
You need to tell cp to treat the destination as the target itself, instead of as a directory into which to copy the folder you've listed as your source. Which means you need the -T option.
Alternatively you could use a source of /var/www/vhosts/website/httpdocs/* if that glob captures all the files you actually care about. That should work too.
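A minimal sketch of the -T approach, assuming GNU cp (-T / --no-target-directory is a GNU extension):
# copy the contents of the source httpdocs directly over the destination httpdocs
cp -rpfT /var/www/vhosts/website/httpdocs /var/www/vhosts/anotherwebsite/httpdocs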