How to winzip a folder excluding parent folders?

I am using this command to winzip a folder, but "bldforge_AOMS_DEV\WebSphere_AVOB" is included in the zip.
winzip32.exe -min -a -r -p C:\Build\AOM.zip m:\bldforge_AOMS_DEV\WebSphere_AVOB*
How can I create the zip without the "bldforge_AOMS_DEV\WebSphere_AVOB" folder prefix, with just all the files under bldforge_AOMS_DEV\WebSphere_AVOB\?
Thanks
Jirong

I'm no expert, but I think this might solve your problem:
winzip32.exe -min -a -r -p C:\Build\AOM.zip m:\bldforge_AOMS_DEV\WebSphere_AVOB\*
Just add a slash before your * selector.

Related


unzip -d option to extract file from jar to specific directory
Without any options, unzip extracts a specific file using the same directory structure it has inside the archive.
unzip someNiceOne.jar com/some/comp/some/dir/name.properties
This will unzip file 'name.properties' to 'com/some/comp/some/dir/'.
I need 'name.properties' to end up in the same directory as the jar itself, but the -d option doesn't seem to do that. Is there an alternative option?
unzip -d option help:
-d extract files into exdir
Using the -j option should do what you need.
-j junk paths (do not make directories)
In your case:
unzip -j someNiceOne.jar com/some/comp/some/dir/name.properties
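If the file should land in a directory other than the current one, -j can be combined with -d, which sets the extraction directory. A minimal sketch, assuming a hypothetical target path /dir/of/jar:
unzip -j someNiceOne.jar com/some/comp/some/dir/name.properties -d /dir/of/jar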

How can I download all the files from a remote directory to my local directory?

I want to download all the files in a specific directory of my site.
Let's say I have 3 files in my remote SFTP directory
www.site.com/files/phone/2017-09-19-20-39-15
a.txt
b.txt
c.txt
My goal is to create a local folder on my desktop containing ONLY those downloaded files, with no parent files or parent directories. I just want a clean copy of the files.
I've tried
wget -m --no-parent -l1 -nH -P ~/Desktop/phone/ www.site.com/files/phone/2017-09-19-20-39-15 --reject=index.html* -e robots=off
I got the files, but nested under ~/Desktop/phone/files/phone/2017-09-19-20-39-15/.
I want to get just a.txt, b.txt and c.txt directly inside ~/Desktop/phone/.
How do I tweak my wget command to get something like that?
Should I use anything else other than wget?
Ihue,
Taking a shell-programmatic perspective, I would recommend you try the following command line; note I also added the citation so you can see the original thread.
wget -r -P ~/Desktop/phone/ -A txt www.site.com/files/phone/2017-09-19-20-39-15 --reject=index.html* -e robots=off
-r enables recursive retrieval. See Recursive Download for more information.
-P sets the directory prefix where all files and directories are saved to.
-A sets a whitelist for retrieving only certain file types. Strings and patterns are accepted, and both can be used in a comma separated list. See Types of Files for more information.
Ref: @don-joey
https://askubuntu.com/questions/373047/i-used-wget-to-download-html-files-where-are-the-images-in-the-file-stored
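If you also want to flatten the directory structure, wget's -nd (--no-directories) flag saves every retrieved file directly into the prefix directory. A hedged variation of the command above, assuming the same URL; -np avoids ascending to the parent directory:
wget -r -l1 -nd -np -A txt -P ~/Desktop/phone/ www.site.com/files/phone/2017-09-19-20-39-15 --reject=index.html* -e robots=off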

How to copy files to the timestamp auto generated folder?

Hello, I am trying to copy all files from the Documents directory to a backup directory stamped with the current time. I create a folder called bk$(timestamp of the folder) and copy the files from the Documents directory into that newly created, uniquely named folder. This will run from crontab: every time the backup kicks in, it creates a new directory for that backup, uniquely identified by the folder's timestamp. For some reason I cannot get cp or cpio -mdp to work. Someone mentioned I could use the $PATH variable, which seems promising; if that is the solution, could someone help me make it work?
bkdest=home/user/backup/
bksource="/home/user/Documents/"
export PATH=/$bkdest:$PATH
mkdir /"$bkdest"bk.$(date +%Y_%m_%d_%H_%M_%S)
cp /"$bksource"* $PATH
My other approach, which I also tried to make work:
cp $bksource * ls | tail -l | $PATH
I could have gone with ctime, but unfortunately it does not reflect the folder creation date.
This was my approach for finding the latest created folder (not file):
find $HOME -type d -daystart ctime 0
If someone could please help me copy into that new folder, I would really appreciate it. Thank you!
Store the target name in a variable:
bkdest=/home/user/backup
bksource=/home/user/Documents
target=${bkdest}/bk.$(date +%Y_%m_%d_%H_%M_%S)
mkdir -p $target
cp ${bksource}/* ${target}/
Note I tidied up your use of variables a little.
Also, this won't copy subdirectories. For that you need to use cp -R. When I do backups I prefer to use rsync.
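If you do want subdirectories, permissions and symlinks preserved, a minimal rsync sketch using the same variables as above:
rsync -a ${bksource}/ ${target}/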
I did not fully understand your approach or what exactly you want to do, but here it goes.
CP Approach
You should not use cp for backups, rsync is far more suitable for this. But if for some reason you really need to use cp, you can use the following script.
#!/bin/bash
BKP_DIR=/tmp/bkp
BKP_SRC=/tmp/foo
SNAPSHOT=${BKP_DIR}/$(date +%F.%H-%M-%S.%N)
mkdir -p ${SNAPSHOT}
cp -r ${BKP_SRC}/* ${SNAPSHOT}
Rsync Approach
No big change here.
#!/bin/bash
BKP_DIR=/tmp/bkp
BKP_SRC=/tmp/foo
SNAPSHOT=${BKP_DIR}/$(date +%F.%H-%M-%S.%N)
rsync -a ${BKP_SRC}/ ${SNAPSHOT}/
Improved Rsync Approach (RECOMMENDED)
#!/bin/bash
BKP_DIR=/tmp/bkp
BKP_SRC=/tmp/foo
SNAPSHOT=${BKP_DIR}/$(date +%F.%H-%M-%S.%N)
LATEST=${BKP_DIR}/latest
rsync \
  --archive \
  --delete \
  --backup \
  --backup-dir=${SNAPSHOT} \
  --log-file=${BKP_DIR}/rsync.log \
  ${BKP_SRC}/ ${LATEST}/
EXPLAINING: --archive plus --delete will make sure that $LATEST is a perfect copy of $BKP_SRC: files that no longer exist in $BKP_SRC will be deleted from $LATEST. The --archive option also ensures that permissions and owners are maintained, symlinks are copied as symlinks, and more (see man rsync for more information).
The --backup plus --backup-dir options create a backup directory to hold differential files. In other words, all files that were deleted or modified since the last backup are put there, so you do not lose them when they are deleted from $LATEST.
--log-file is optional, but it is always good to keep logs for debugging purposes.
At the end you have an incremental backup.
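Since the original question mentions crontab, a minimal sketch of an entry that would run such a script nightly, assuming it is saved at a hypothetical /usr/local/bin/backup.sh and made executable:
# m h dom mon dow command: run the backup every day at 02:00
0 2 * * * /usr/local/bin/backup.sh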

How to duplicate a folder exactly

I am trying to copy a filesystem for a device I am programming for. After much time spent figuring out why the filesystem I was installing wasn't working, I found out that cp didn't get the job done. I used du -s to check the size of the original filesystem against the one I copied with cp -r; as it turns out, they differ by about 150 bytes.
Something is telling me that symbolic links or some sort of kernel objects aren't being copied correctly.
Is it possible to copy a folder/file system exactly? If so how would I go about it?
Try doing it the straightforward way:
cp -a src target
from man cp
-a, --archive
same as -dR --preserve=all
It preserves permissions, symlinks, and more.
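On the same theme, a small sketch with hypothetical paths: appending /. to the source makes cp -a copy the directory's contents, hidden files included, rather than the directory itself:
cp -a /path/to/src/. /path/to/dst/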
I tried all of the code here on my Linux box. The rsync approach proposed by @seanmcl turned out to be the right one, while the others failed to keep owners and/or some special files, or ran into permission errors. The exact command is:
$ sudo rsync -aczvAXHS --progress /var/www/html /var/www/backup
Just remember to use the bare directory name, without a trailing slash (/) or wildcard (/*) at the end of the source and target names, otherwise the hidden files directly under the source are not copied.
Another popular option is to use tar c source | (cd target && tar x). See this linuxdevcenter.com article.
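For illustration, an expanded sketch of that tar pipe with hypothetical paths; the p flag preserves permissions on extraction:
(cd /path/to/source && tar cf - .) | (cd /path/to/target && tar xpf -)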
The most accurate way I know of copying files is with cpio:
cd /path/to/source
find . -xdev -print0 | cpio -oa0V | (cd /path/to/target && cpio -imV)
Not really easy to use, but this is very precise, preserving timestamps, owners, permissions, special files.
Rsync is the best way to copy a file system. There are myriad arguments that let you control exactly what is copied.
This is what I do, for example to duplicate directory A -> B:
$ mkdir B
$ cd A
$ cp -a ./ ../B

How can I upload an entire folder, that contains other folders, using sftp on linux?

I have tried put -r directory/*, which only uploaded the files and not the folders, and gave me the error: Couldn't canonicalise.
Any help would be greatly appreciated.
For people actually wanting a direct answer to this question (instead of being told to use something other than sftp)...
put -r local/path/to/directoryName
The uploaded directory must already exist in the working directory on the server, so you might need to create it first.
mkdir directoryName
Here you can find a detailed explanation of how to copy a directory using scp. In your case, it would be something like:
$ scp -r foo your_username@remotehost.edu:/some/remote/directory/bar
This will copy the directory "foo" from the local host to the directory "bar" on the remote host.
Here -r means recursively copy entire directories.
You can also use rcp with similar syntax. The only difference between them is that scp uses secure shell and rcp uses remote shell.
BTW The "Couldn't canonicalise" error you mentioned appear when sftp server is unable to access the file/directory mentioned in the command.
UPDATE: For users who want to use put specifically, please refer to Ben Thielker's answer above.
sftp> mkdir source
sftp> put -r source
Uploading source/ to /home/myself/source
Entering source/
source/file1
source/file2
If you have issues using sftp, you can use ncftp instead.
To install it on CentOS:
yum install ncftp
To copy a whole directory recursively:
ncftpput -R -v -u username -P 21 ftp.server.dev /remote-path/ /localdirectory
Use scp instead. It uses SSH too and can easily handle recursion.
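A minimal sketch, assuming hypothetical user, host, and remote path:
scp -r ./localdirectory username@remotehost:/some/remote/path/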
