I need to move the contents of a directory into an archive and I was delighted to find that the "-m" option does exactly that. (Of course, I'm using it with the "-T" option. :) )
But unfortunately, if the directory becomes empty after the zip operation, the directory itself is removed as well. I do not want this to happen, but I can't find any option that prevents it.
Does anyone have an idea how I can keep the (now empty) directory while still moving its contents into the archive?
Here's my actual (obfuscated) command I'm using in my shell script:
zip -qrTmy $archive_name $files_dir
Unless your directory contains subdirectories, it is not hard to recreate it after a zip/move:
zip -qrTmy $archive_name $files_dir
mkdir $files_dir
If the directory contains subdirectories, then we need to duplicate the directory structure under a temporary name, perform the zip/move, and rename the structure back. I'm still working out how to implement this idea. If I find a solution, I'll update this post.
UPDATE
If the directory contains subdirectories:
find $files_dir -type d -exec mkdir temp_{} \; # Duplicate the dir structure
zip -qrTmy $archive_name $files_dir
mv temp_$files_dir $files_dir # Restore the dir structure
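Alternatively, you can record the directory tree before zipping and recreate it afterwards. A rough sketch, assuming the same variables as above and paths without embedded newlines:
find "$files_dir" -type d > /tmp/dirs.$$   # Record the directory tree first
zip -qrTmy "$archive_name" "$files_dir"    # Move the contents into the archive
while IFS= read -r d; do mkdir -p "$d"; done < /tmp/dirs.$$   # Recreate the now-empty directories
rm -f /tmp/dirs.$$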
I am trying to do an incremental backup of a system we have, and am moving a lot of attachment files in many directories. I would like to merge my current attachments with my new attachments. Example:
Production Attachments Directory:
Dir A/ Dir B/ File 1, File 2
Attachments Backup Directory:
Dir A/ Dir B/ File 3, File 4
After Merge I would like to see:
Dir A/ Dir B/ File 1, File 2, File 3, File 4
The directory has hundreds of folders like this, so I would like to do it recursively. Is there a command in Linux/Unix that can do this? I have looked into commands like Union, but I am not sure if that will solve this problem.
I'm not 100% sure I understand your problem because my solution seems too simple, but..
cp -R (recursive copy) may provide what you seek.
Take your backup copy first and create a duplicate:
cp -R [backup directory] merged_attachments
Then force-copy the production directory recursively on top of it (so you end up with the latest production version in case of a conflict):
cp -fR [production directory]/* merged_attachments
You will wind up with a final merged_attachments directory containing the combined structure of both source directories.
You could accomplish the same thing, copying the backup into the production directory instead, with:
cp -nR [backup directory]/* [production directory]
The -n (no-clobber) flag ensures you don't overwrite production files with the backup versions.
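If rsync is available, a similar merge can be done in one step; a possible alternative, where --ignore-existing skips files already present in the destination (much like cp -n):
rsync -a --ignore-existing [backup directory]/ [production directory]/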
I have a bunch of files in separate folders, and all of the folders are in one directory.
/var/www/folder1/file1.txt
/var/www/folder1/file2.txt
/var/www/folder1/file3.txt
/var/www/folder2/file4.jpg
/var/www/folder2/file5.jpg
/var/www/folder2/file6.jpg
/var/www/folder3/file7.pdf
/var/www/folder3/file8.doc
/var/www/folder3/file9.gif
I need everything inside the folders under /var/www/ to be copied to another directory (say, /var/my-directory/), but not the folders themselves. Based on the example above, I need /var/my-directory/ to look as follows:
/var/my-directory/file1.txt
/var/my-directory/file2.txt
/var/my-directory/file3.txt
/var/my-directory/file4.jpg
/var/my-directory/file5.jpg
/var/my-directory/file6.jpg
/var/my-directory/file7.pdf
/var/my-directory/file8.doc
/var/my-directory/file9.gif
I can't seem to figure out the command to do this. I've tried the following:
sudo cp -R /var/www/./. /var/my-directory/
But, that still copies all of the folders.
Is there any way to do what I'm trying to do?
Use find.
find /var/www/ -type f -exec cp '{}' /var/my-directory/ \;
The trick is -type f, which selects only files (not directories).
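With GNU cp you can also batch the copies, which is faster when there are many files; a variation on the same idea, where -t names the target directory and the + terminator passes many files to each cp invocation:
find /var/www/ -type f -exec cp -t /var/my-directory/ {} +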
There exist two directories: a/ and b/.
I'd like to copy all the files (recursively) from a/ into b/.
However, I only want to copy a file from a/ if its content differs from the corresponding file already in b/. If the corresponding file in b/ does not exist, the a/ file should still be copied.
*By "corresponding file", I mean a file with the same name and relative path from its parent directory.
note:
The reason I don't want to overwrite a b file with the same exact contents, is because the b directory is being monitored by another program, and I don't want the file date to change causing the program to do more work than required.
I'm essentially looking for a way to perform a cp -rf a/ b/ while performing a diff check on each file. If the files are different, perform the copy; otherwise skip it.
I see that cp has an update flag:
-u, --update
copy only when the SOURCE file is newer than the destination file or when the
destination file is missing
but this will not work because I'm not concerned about newer files; I'm concerned about different file contents.
Any shell language will do.
I've been attempting to get this to work by injecting my diff check into a find command:
find a/ ??? -exec cp {} b \;
This doesn't seem like an uncommon thing to do between two directories, so I'm hoping there is an elegant command-line solution as opposed to me having to write a python script.
You can achieve this using rsync. By default it updates destination files only when the source file differs in size or modification time.
rsync -av --progress sourcefolder destinationfolder
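Since the concern here is file contents rather than timestamps, the -c/--checksum option makes rsync compare checksums instead, so files in b/ whose contents already match a/ are left untouched; a sketch:
rsync -rvc a/ b/   # -r recurse, -v verbose, -c compare by checksum rather than size/mtime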
I'm trying to remove all .html files from the directory generated and from all subfolders there but it needs to leave all other files and directories alone.
I tried going through folder by folder and running rm *.html, but this takes a long time as there are 20+ subfolders which also have subfolders. I tried looking through the man pages for rm but nothing obvious jumped out. I'm sure there's a way to do this in one shot but I don't know how. Any ideas?
I think this may work:
cd generated
find . -type f -iname "*.html" -delete
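If you want to check what would be removed before deleting anything, run the same find with -print first:
find . -type f -iname "*.html" -print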
I am trying to create a zip file and want to preserve most of the directory structure, but not the rootdir as defined from the command line. The command I'm using is:
zip -r out.zip /foo/bar/
I'd like it to recurse through bar and add all files with preserved directory structure (which it does). However, I do not want 'foo' to be the top level directory in the zip file created. I would like bar to be the top level directory.
Is there any easy way to go about this? I realize I could change directories before zipping to avoid the problem, but I'm looking for a solution that doesn't require this.
This should do it:
cd /foo/bar/
zip -r ../out.zip *
The archive will be in /foo/out.zip
I don't believe zip has a way to exclude the top level directory. I think your best bet would be to do something like:
pushd /foo; zip -r out.zip ./bar; popd;
But this is exactly the sort of answer you said you didn't want.
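A variation that avoids changing the caller's working directory is to run the cd in a subshell; cd sets $OLDPWD, so the archive lands in the directory you started from:
(cd /foo && zip -r "$OLDPWD/out.zip" bar)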
7z a -tzip out.zip -w foo/bar/.
If someone stumbles upon this and is not satisfied with the above solution, here follows a very simple workaround (in R) to avoid having the long subdirectory path inside the zip. It involves temporarily creating a folder in C:/ and simply deleting it after zipping:
ZipFiles <- list.files(".../ZipFiles", full.names = TRUE) # Insert your long subdirectory into .../
dir.create("C:/ZipFiles")                                 # Temporary short-path staging folder
dir.create(".../FolderToBeZipped")
file.copy(from = ZipFiles, to = "C:/ZipFiles")            # Stage the files under the short path
zip(".../FolderToBeZipped",                               # Archive to create (becomes .../FolderToBeZipped.zip)
    files = "C:/ZipFiles")
unlink("C:/ZipFiles", recursive = TRUE)                   # Remove the temporary folder
The result then is .../FolderToBeZipped.zip, with ZipFiles/ as its top-level folder.
The benefit is that you need not be within the subdirectory (or project) when executing the code.