Excluding the '#recycle' directory from s3cmd upload - linux

I'm using s3cmd on a Synology NAS.
I built an exclusion file containing #recycle/* and pass it with the --exclude-from=/path/to/exc/file option, but it doesn't work.
I have already tried '#recycle/*', "#recycle/*" and "\#recycle/*", but s3cmd still tries to upload the contents of the '#recycle' folder.
I also get an error when trying to use both --exclude-from=/path/to/file and --exclude='#recycle/*' in the same command.
Any ideas?
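For reference, a minimal sketch of the setup being described, with a hypothetical source path and bucket name (the exclude-file path is the one from the question):

Contents of /path/to/exc/file:
#recycle/*

Command:
s3cmd sync /volume1/share/ s3://my-bucket/share/ --exclude-from=/path/to/exc/file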

Related

How can I download a folder from Google Drive or Dropbox using a command in Linux?

I am trying to download a folder in a Linux shell using a Dropbox or Google Drive link. The download works, but the result is not saved as a folder: after it is downloaded I cannot cd into it, and I get the message that the file is not a directory.
How can I download a folder and access it? I am also running this in a virtual machine.
I do not know which method you are using to download the directory. In order to download a directory, you need to recursively download all the files in it, or create a tar or zip archive of the directory.
You can consider using gdown.
Please also read the detailed explanation from the following post: wget/curl large file from google drive
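For example, a sketch using gdown (the folder URL below is a placeholder, and a reasonably recent gdown release is assumed):

pip install gdown
gdown --folder "https://drive.google.com/drive/folders/FOLDER_ID"

This downloads the folder contents into a local directory named after the Drive folder, which you can then cd into.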

Downloading from an S3 bucket fails while running s3cmd get from a cron job

I am running a script to download files from an S3 bucket, scheduled via cron. At times the script fails, but when I run it manually it always works.
Can anyone help me with this?
It appears that your requirement is to download all new files from Amazon S3, so that you have a local copy of all files (without downloading them repeatedly).
I would recommend using the AWS Command-Line Interface (CLI), which has an aws s3 sync command. This will synchronize the files from Amazon S3 to your local directory (or the other way). If something goes wrong, it will try to copy the files again on the next sync.
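A minimal sketch, assuming the AWS CLI is installed and configured with credentials, and using placeholder bucket and path names:

# one-off sync from the bucket to a local directory
aws s3 sync s3://my-bucket/incoming/ /home/user/s3-downloads/

# example crontab entry (adjust the aws path to whatever `which aws` reports);
# cron's minimal PATH and environment are a common reason a script fails only under cron
*/15 * * * * /usr/local/bin/aws s3 sync s3://my-bucket/incoming/ /home/user/s3-downloads/ >> /tmp/s3sync.log 2>&1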

Using tar -zcvf against a folder creates an empty compressed file

I am ssh'ed into an Acquia server trying to download some files. I need to backup these files for local development (to get user uploaded images mainly).
I am using the following command:
tar -zcvf ~/download/stage-files_3-19-2015_1344.tar.gz files/
I have read/write access to the download folder; I created that folder. I am in the parent folder of "files", and the permissions on that folder are 777.
I was able to run this the other day with no issues. So I am very confused as to why this is happening now.
Actually I just figured this darn thing out. Must have run out of disk space because once I removed a prior compressed backup of the files it started running just fine. Dang disk quotas. Sorry guys.
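If someone hits the same symptom, a quick way to confirm that disk space or a quota is the culprit before re-running tar (quota only works where the quota tools are installed):

df -h .              # free space on the filesystem holding the current directory
du -sh ~/download    # how much the existing backups already take up
quota -s             # per-user quota usage, where enforced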

Linux unzip preserve case?

Working on a web site. A number of third-party JavaScript libraries use mixed case in their file and folder names.
I am working on a Windows system.
When ready to upload from my local windows XAMPP environment to my linux hosting, I use 7zip to create a zip file of my site. I use 7zip's -xr! feature to skip certain directories like my .git repository.
I FTP the resulting .zip file to my server and use the server's "unzip" function to explode it. All my files are there but they are all changed to lowercase!
This kills the website as the third party libraries that are mixed-case are no longer found.
I've tried unzip -C but that did not seem to do anything.
I also looked inside the archive prior to uploading, and on Windows all the file name cases are preserved.
I tried using GNU32's Windows tar, but the --exclude function is not allowing me to skip the .git directories.
I need some help in the form of:
How to use unzip in Linux such that it preserves case (googled until hairless, but no love found...)
How to use tar on windows such that it excludes particular directories
How to use something else to achieve my goal. I honestly don't care what it is... I'm downloading CYGWIN right now to see if it'll help at all. I may end up installing Linux in a virtual box just to try tar-gz from a virtual machine actually running linux but would REALLY rather avoid that hassle every time I want to pack up a pretty simple archive.
Zip works fine for packing, but unpacking is not kosher.
Use tar's --exclude-vcs option:
--exclude-vcs
exclude version control system directories
Example:
tar --exclude-vcs -czf foo.tar.gz foo
or for a *.tar.bz2 archive
tar --exclude-vcs -cjf foo.tar.bz2 foo
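If your tar build lacks --exclude-vcs, the generic --exclude form in GNU tar should also skip the .git directories, since exclude patterns are unanchored by default and match at any depth:

tar --exclude=.git -czf foo.tar.gz foo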
Try unzip -U file.zip; this might work if you have an old version of unzip. Otherwise, post the output of unzip -v and unzip -l file.zip.

Uploading all files in the folder using ftp

I'm trying to copy a folder from my computer to my Android phone using ftp. After I log in, I try put * or put *.mp3, but it doesn't work.
I am using the command line on Ubuntu Linux.
You want mput rather than put.
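A minimal example session: prompt turns off the per-file confirmation that mput otherwise asks for, then mput expands the wildcard against your local files and uploads each match:

ftp> prompt
ftp> mput *.mp3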
