I am trying to do an incremental backup of a system we have, and am moving a lot of attachment files in many directories. I would like to merge my current attachments with my new attachments. Example:
Production Attachments Directory:
Dir A/ Dir B/ File 1, File 2
Attachments Backup Directory:
Dir A/ Dir B/ File 3, File 4
After Merge I would like to see:
Dir A/ Dir B/ File 1, File 2, File 3, File 4
The directory has hundreds of folders like this, so I would like to do it recursively. Is there a command in Linux/Unix that can do this? I have looked into commands like Union, but I am not sure if that will solve this problem.
I'm not 100% sure I understand your problem because my solution seems too simple, but:
cp -R (recurse) may provide you what you seek.
Take your backup copy first and create a duplicate:
cp -R [backup directory] merged_attachments
Then force-copy the production tree recursively (so the latest production version wins in case of a conflict):
cp -fR [production directory]/* merged_attachments
You will wind up with your final merged_attachments directory containing the same structure as both source directories.
You could accomplish the same thing if you wanted to copy the backup into the production directory with:
cp -nR [backup directory]/* [production directory]
The no-clobber flag (-n) ensures you don't overwrite production files with the backup version.
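As a quick sanity check, the merge described above can be reproduced with throwaway directories (the names below are placeholders, not the real attachment paths):

```shell
# Recreate the example layout with throwaway directories.
mkdir -p production/DirA/DirB backup/DirA/DirB
touch production/DirA/DirB/File1 production/DirA/DirB/File2
touch backup/DirA/DirB/File3 backup/DirA/DirB/File4

# Copy the backup's contents into production; -n (no-clobber)
# keeps any production file that already exists at the same path.
cp -nR backup/. production/
```

Afterwards production/DirA/DirB holds File1 through File4, matching the merged layout in the question.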
I have a few sub-directories with files inside each of them in /home/user/archived/myFiles that I'm trying to bundle into a single tar file. The issue is, it keeps bundling a full directory path instead of just everything in the myFiles folder.
When I untar the file, I just want all the bundled sub-directories/files inside to appear in the directory I extracted the file rather than having to go through a series of folders that get created.
Instead, when I currently untar the file, I get a "home" folder and I have to go through /home/user/archived/myFiles to reach all the files.
I tried using the -C flag as suggested in Tar a directory, but don't store full absolute paths in the archive, where you pass the full directory path minus the last folder, and then the name of the last folder that contains everything you want bundled. But the tar command doesn't work; I get a "no such file or directory" error.
#!/bin/bash
archivedDir="/home/user/archived/myFiles"
tar -czvf "archived-files.tar.gz" "${archivedDir}"/*
rm -vrf "${archivedDir}"/*
# Attempt with -C flag
#tar -cvf "${archivedDir}/archived-files.tar.gz" -C "${archivedDir}" "/*"
So for example, if I did an ls on /home/user/archived/myFiles, and it listed two directories called folderOne and folderTwo, and I ran this bash script and did an ls on /home/user/archived/myFiles again, that directory should only contain archived-files.tar.gz.
If I extracted the tar file, then folderOne and folderTwo would appear.
As I explained here, you should first change into the directory and then create the archive.
So change your script to something like:
archivedDir="/home/user/archived/myFiles"
cd "$archivedDir"
tar -czvf "../archived-files.tar.gz" *
This will create the archive in the parent directory, so it will not be removed by the next command.
The extraction should be something like:
archivedDir="/home/user/archived/myFiles"
cd "$archivedDir"
tar -xzvf "../archived-files.tar.gz"
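Alternatively, tar's -C flag (which the question attempted) avoids the cd entirely; the trick is to archive `.` rather than `/*`. A sketch using a throwaway directory standing in for the real path:

```shell
# Throwaway directory standing in for /home/user/archived/myFiles.
archivedDir="$(mktemp -d)"
mkdir -p "$archivedDir/folderOne" "$archivedDir/folderTwo"
touch "$archivedDir/folderOne/a.txt"

# -C makes tar change into the directory first, so members are
# stored relative to it (./folderOne/...) instead of the full path.
tar -czf archived-files.tar.gz -C "$archivedDir" .

tar -tzf archived-files.tar.gz
```

Extracting this archive drops folderOne and folderTwo directly into the current directory, with no /home/user prefix.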
There exist two directories: a/ and b/.
I'd like to copy all the files (recursively) from a/ into b/.
However, I only want to copy over an a file if its content differs from the already-existing b file. If the corresponding b file does not exist, the a file should still be copied.
*By "corresponding file", I mean a file with the same name and relative path from its parent directory.
note:
The reason I don't want to overwrite a b file with the same exact contents, is because the b directory is being monitored by another program, and I don't want the file date to change causing the program to do more work than required.
I'm essentially looking for a way to perform a cp -rf a/ b/ while performing a diff check on each file. If the files are different, perform the copy; otherwise skip it.
I see that cp has an update flag:
-u, --update
copy only when the SOURCE file is newer than the destination file or when the
destination file is missing
but this will not work because I'm not concerned about newer files; I'm concerned about different file contents.
Any shell language will do.
I've been attempting to get this to work by injecting my diff check into a find command:
find a/ ??? -exec cp {} b \;
This doesn't seem like an uncommon thing to do between two directories, so I'm hoping there is an elegant command-line solution as opposed to my having to write a python script.
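For reference, the find-based idea above can be completed with a per-file cmp check; this is only a sketch with made-up demo files, not from the original thread:

```shell
# Demo trees: one identical file, one differing file.
mkdir -p a/sub b/sub
printf 'same' > a/keep.txt;     printf 'same' > b/keep.txt
printf 'new'  > a/sub/diff.txt; printf 'old'  > b/sub/diff.txt

# Copy a file only when b/ lacks it or the contents differ;
# cmp -s compares silently and fails on a mismatch or missing file.
(cd a && find . -type f | while IFS= read -r f; do
    if ! cmp -s "$f" "../b/$f"; then
        mkdir -p "../b/${f%/*}"
        cp "$f" "../b/$f"
    fi
done)
```

Identical files are never touched, so their timestamps survive; only b/sub/diff.txt is rewritten here.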
You can achieve this using rsync. By default rsync decides what to copy from file size and modification time; pass -c/--checksum to compare file contents instead, and leave out -t (implied by -a) so rsync doesn't touch the timestamps of files it skips:
rsync -rc --progress sourcefolder/ destinationfolder
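A minimal demonstration of the checksum-based copy, with made-up file names (this assumes rsync is installed):

```shell
# Demo trees: one identical file, one differing file.
mkdir -p src/sub dst/sub
printf 'same' > src/keep.txt;     printf 'same' > dst/keep.txt
printf 'new'  > src/sub/diff.txt; printf 'old'  > dst/sub/diff.txt

# -r recurses; -c transfers only when checksums differ, so
# dst/keep.txt (identical content) is left completely untouched.
rsync -rc src/ dst/
```

The trailing slash on src/ means "the contents of src", so the files land directly under dst/ rather than in dst/src/.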
For example, I have some files in my project; the structure could be like this:
Project/
-- file1.txt
-- file2.txt
-- build/
-- sub-folder/
I want to zip some files from my project, and I can do that with the zip command (the build folder and some other files are excluded from the zip):
zip -r build/project-04.05.2016 ./ -x *\build\*
After this, a new file is created:
build/project-04.05.2016.zip
In the Mac Finder, when I double-click this file it gets unzipped like this:
build/project-04.05.2016/
----------- file1.txt
----------- file2.txt
----------- subfolder/
I would like to somehow zip this archive so that when it's unzipped I get a folder named "project" with the same content, instead of "project-04.05.2016". I tried zipping it as "project" and renaming the file to "project-04.05.2016" afterwards, but when it's unzipped the results are the same. Maybe there's a way to first move the files to some temporary "project" folder and then zip them as "project-04.05.2016.zip"? Thanks in advance.
Here is my current solution. It's a small script and I'm not too satisfied with it, but it works. I will accept if someone answers with a more elegant solution. So, here it is:
#clear the contents of the previous build first
rm -r build/*
#copy required files to a temporary folder
rsync -av --exclude=build ./ build/project
#change current directory to build
cd build/
#zip files from build/project folder
zip -r "project-04.05.2016" project
#remove temporary folder
rm -r project/
#final zip file is at location:
#build/project-04.05.2016.zip
Let's say we have a zip file contains a directory named aq and in the current working directory we have files:
./
|- aq/a.txt
|- b.txt
When I use this command:
zip test.zip aq/*
the a.txt file will be zipped into the aq directory that's inside the zip file.
The question is: how can I then add b.txt to the aq directory inside test.zip, without first putting b.txt into the aq directory in the current working directory, as I did with a.txt?
Make a temp directory, e.g. /tmp/$$/aq/, symlink the file into it, and then do:
(cd /tmp/$$ && zip -r $ZIPDEST aq/)
i.e. zip from inside the temp dir (note that $ZIPDEST needs to be an absolute path). By default zip follows symbolic links, so it stores the file in the archive without making a copy.
This is pretty much how I construct complicated hierarchical zip files without copying everything to make the archive.
Tar has better options for renaming items as you're putting them into the archive, but you asked about zip.
I need to move the contents of a directory into an archive and I was delighted to find that the "-m" option does exactly that. (Of course, I'm using it with the "-T" option. :) )
But unfortunately, if the directory becomes empty after the zip operation, the directory itself is removed as well. I do not want this to happen, but can't find any option that prevents it.
Do you guys have any ideas how I can get this behavior?
Here's my actual (obfuscated) command I'm using in my shell script:
zip -qrTmy $archive_name $files_dir
Unless your directory contains subdirectories, it is not hard to recreate it after the zip/move:
zip -qrTmy $archive_name $files_dir
mkdir $files_dir
If the directory contains subdirectories, then we need to duplicate the directory structure under a temporary name, perform the zip/move, and rename the structure back. I'm still working out how to implement this idea; once I have a solution, I'll update this post.
UPDATE
If the directory contains sub directories:
find "$files_dir" -type d -exec mkdir "temp_{}" \;  # Duplicate the dir structure
zip -qrTmy "$archive_name" "$files_dir"
mv "temp_$files_dir" "$files_dir"                   # Restore the dir structure