I am SSH'ed into an Acquia server trying to download some files. I need to back up these files for local development (mainly to get user-uploaded images).
I am using the following command:
tar -zcvf ~/download/stage-files_3-19-2015_1344.tar.gz files/
I have read/write access to the download folder. I created that folder. I am in the parent folder of "files". And permissions to that folder are 777.
I was able to run this the other day with no issues, so I am very confused as to why it is failing now.
Actually, I just figured this darn thing out. I must have run out of disk space, because once I removed a prior compressed backup of the files it started running just fine. Dang disk quotas. Sorry guys.
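For anyone who hits the same wall, a quick check of free space and of the size of the files directory before running tar would have flagged it; something along these lines (paths assumed to match the setup above):
df -h ~/download
du -sh files/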
I am trying to download a folder from a Dropbox or Google Drive link using a command in a Linux shell. The download works, but the result is not saved as a folder: when I then try to cd into it, I get a message that the file is not a directory.
How can I download a folder and access it? I am also executing this in a virtual machine.
I do not know which method you are using to download the directory. In order to download a directory, you need to either recursively download all the files in it or create a tar or zip archive of the directory.
You can consider using gdown.
Please also read the detailed explanation from the following post: wget/curl large file from google drive
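For a Google Drive folder, a rough sketch with gdown (assuming a reasonably recent version; the folder URL is a placeholder) would be:
pip install gdown
gdown --folder https://drive.google.com/drive/folders/FOLDER_ID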
I have a few Azure SMB File Shares mounted on a linux VM.
In one of those file shares I have two folders, one called download and another called loaded.
Files get dropped in download, they get processed and moved into loaded. But sometimes we have to move the files from loaded to download again from our laptops (running Windows). And when we do this, the files can't move back to loaded.
Essentially:
I mount file-share
I run mv /mnt/file-share/download/file.txt /mnt/file-share/loaded/file.txt
I drag and drop file.txt from loaded to download from my laptop
Up to here everything works. But when I try to run mv /mnt/file-share/download/file.txt /mnt/file-share/loaded/file.txt again, it returns:
mv: '/mnt/file-share/download/file.txt' and '/mnt/file-share/loaded/file.txt' are the same file
If I now unmount and mount file-share again, it works. So this makes me think that it's a caching issue.
So I tried mounting with cache=none but it still does the same thing.
Any suggestions?
Thank you!
Using the noserverino option while mounting the share fixed the issue.
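For reference, the option just gets appended to the CIFS mount options; a rough sketch (storage account, share name, and credentials file are placeholders) looks like:
sudo mount -t cifs //mystorageaccount.file.core.windows.net/file-share /mnt/file-share -o vers=3.0,credentials=/etc/smbcredentials/mystorageaccount.cred,dir_mode=0777,file_mode=0777,noserverino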
I have a zip file split into parts titled like so: file1.zip, file2.zip, file3.zip, etc.
How do I go about extracting these files together correctly? They should produce one output file.
Thanks for the help!
First, rename them to "file.zip", "file.z01", "file.z02", etc. as Info-ZIP expects them to be named, and then unzip the first file. Info-ZIP will iterate through the split files as expected.
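A minimal sketch of that renaming, assuming file1.zip is the primary segment and the others are its continuation parts:
mv file2.zip file.z01
mv file3.zip file.z02
mv file1.zip file.zip
unzip file.zip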
I found a way. I had to mount the remote machine's user home folder on my Ubuntu desktop PC and use File Roller Archive Manager, which is just listed as Archive Manager in Ubuntu 18.
Mount remote home folder on local machine...
Install sshfs
sudo apt install sshfs
Make a directory for the mount. Replace remote with whatever folder name you want
mkdir remote
Mount the remote file system locally, replacing linuxusername with the user account you want to log in as and xxx.xxx.xxx.xxx with the remote machine's IP address or hostname.
sudo sshfs -o allow_other linuxusername@xxx.xxx.xxx.xxx:/ remote
Now, in the mounted "remote" folder, you can see the contents of the whole Linux filesystem and navigate it in a file manager just like your local file system, limited of course by user privileges: you can only write to the home folder of the remote user account.
Using Archive Manager I opened up the .zip file of the spanned set (not the .z01, .z02, etc. files) and extracted it inside the "remote" folder. I saw no indication of extraction progress; the bar stayed at 0% until it was complete. Other X Window based archive applications might work too.
This is slow, about 3-5 megabytes per second on my LAN. I noticed Archive Manager uses 7z to extract, but I do not know how, as 7z is not supposed to support spanned sets.
Also, if your SSH server is Dropbear instead of OpenSSH's sshd, it will be unbearably slow for large files. I had to extract a 160 GB archive, and the source filesystem was FAT32, so I was not able to combine the spanned set into one zip file, as FAT32 has a 4 GB file size limit.
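When you are done, the mount can be released from the directory where you created the mount point with something like:
sudo umount remote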
Working on a web site. A number of third party javascript libraries use mixed-case in their files and folders.
I am working on a windows system.
When ready to upload from my local windows XAMPP environment to my linux hosting, I use 7zip to create a zip file of my site. I use 7zip's -xr! feature to skip certain directories like my .git repository.
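For reference, that 7-Zip invocation looks roughly like this (the archive name and paths are just illustrative):
7z a -tzip site.zip * -xr!.git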
I FTP the resulting .zip file to my server and use the server's "unzip" function to explode it. All my files are there but they are all changed to lowercase!
This kills the website as the third party libraries that are mixed-case are no longer found.
I've tried unzip -C but that did not seem to do anything.
I also looked in the archive prior to uploading, and on Windows all the file name cases are preserved.
Tried using GnuWin32's Windows tar, but the --exclude option is not allowing me to skip the .git directories.
I need some help in the form of:
How to use unzip in Linux such that it preserves case (googled until hairless, but no love found...)
How to use tar on windows such that it excludes particular directories
How to use something else to achieve my goal. I honestly don't care what it is... I'm downloading Cygwin right now to see if it'll help at all. I may end up installing Linux in a virtual box just to try tar-gz from a virtual machine actually running Linux, but I would REALLY rather avoid that hassle every time I want to pack up a pretty simple archive.
Zip works fine for packing, but unpacking is not kosher.
Use tar's --exclude-vcs option:
--exclude-vcs
exclude version control system directories
Example:
tar --exclude-vcs -czf foo.tar.gz foo
or for a *.tar.bz2 archive
tar --exclude-vcs -cjf foo.tar.bz2 foo
Try unzip -U file.zip; this might work if you have an old version of unzip. Otherwise, post the output of unzip -v and unzip -l file.zip.
I'm trying to connect the Liferay 6.0.6 document library (installed on Linux and running on Tomcat) to an externally mounted folder shared through WebDAV.
I'm using mount.davfs to mount the folders in Linux, but whenever I create a file in a mounted folder or add one via SFTP, it doesn't get the 777 permissions of its folder, and 95% of the time it simply ends up in the lost+found folder, leaving an empty file behind. So I can see files uploaded from the portal, but I can't upload files from my Linux machine.
But if I change the permissions of the file to 777 and then edit it again, or upload it again via SFTP to replace it, the file is there and can be seen/downloaded from the portal too!
Any idea why this is happening, and how I can get this share working with read/write access from both sides?
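For reference, a mount.davfs invocation for a setup like this might look roughly as follows (the server URL and mount point are placeholders, and the file_mode/dir_mode options are davfs2 mount options for forcing permissions on newly created files, not something confirmed in this setup):
sudo mount -t davfs https://example.com/webdav /mnt/webdav -o file_mode=0777,dir_mode=0777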