I have a 17 GB zip file on Amazon Cloud Drive. Is there a way to unzip/extract it without downloading it first?
No, there is no way to do this on the server end.
Related
I have a disk station with some large files (a site with an interactive file system where I can download files and archives). I also have a server with SSH access. Can I download a file directly to the server, without first downloading it to my local machine and then copying it to the server with scp?
Some people say to use
wget http://link-to-the-file
But I am not sure that there is a direct download link. Moreover, you have to specify the language of the archive before downloading (a confirmation step).
Note: I tried wget, but I can't tell whether there is a direct download link.
I am using youtube-dl to download transcripts. It works fine on my machine (local server), where I pass __dirname into the options params to write the files. But I want to use Google Cloud Functions, so how can I substitute __dirname with Cloud Storage?
Thank you !!
Uploading directly from youtube-dl is not possible. Uploading files into Google Cloud Storage is only possible for a file that is already on your disk.
You will need to download the file with whichever program you mention (as noted in the comments, you can download it to a temporary folder), upload the file to GCS, and then delete it from the temporary folder.
What can you do instead? You can, for example, run a script inside a Google Cloud instance that uses a gsutil command to upload the files into a bucket.
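Inside a Cloud Function itself, the same download-then-upload pattern looks roughly like the sketch below. This is a minimal sketch, not an official recipe: the bucket name, video URL, and function name are placeholders, and it assumes youtube_dl and google-cloud-storage are listed in requirements.txt. /tmp is the only writable path inside a Cloud Function.

import os
import youtube_dl
from google.cloud import storage

def download_transcript(request):
    # Write the subtitles into /tmp, the only writable directory
    # available to a Cloud Function.
    opts = {
        "skip_download": True,       # transcript only, not the video
        "writesubtitles": True,
        "outtmpl": "/tmp/%(id)s.%(ext)s",
    }
    with youtube_dl.YoutubeDL(opts) as ydl:
        # Placeholder URL; substitute the video you actually want.
        ydl.extract_info("https://www.youtube.com/watch?v=VIDEO_ID", download=True)

    # Upload what landed in /tmp to GCS, then clean up. Assumes /tmp
    # holds only the files we just wrote (true on a fresh instance).
    client = storage.Client()
    bucket = client.bucket("my-transcripts")  # hypothetical bucket name
    for name in os.listdir("/tmp"):
        path = os.path.join("/tmp", name)
        bucket.blob(name).upload_from_filename(path)
        os.remove(path)
    return "done"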
I have a folder I want to download from Google Cloud Console using the Linux Ubuntu command terminal. I have logged in to my SSH console, and so far I can only navigate to the folder, as follows.
cd /var/www/html/staging
Now I want to download all the files from that staging folder.
Sorry if I'm missing the point. Anyway, I came here looking for a way to download files from Google Cloud Console. I didn't have the ability to create an additional bucket as the author above suggested, but I accidentally noticed that there is a button for exactly what I needed.
Look for the kebab-style menu button. In the dropdown that appears you should find a Download button.
If you mean Cloud Shell, then I typically use the GCP storage tools (gsutil).
In summary, I transfer from Cloud Shell to GCP Cloud Storage, then from storage to my workstation.
First, have the Google Cloud SDK installed on your local system.
Make a bucket to transfer into: gsutil mb gs://MySweetBucket
From within Cloud Shell, move the file into the bucket: gsutil cp /path/to/file gs://MySweetBucket/
On your local system, pull the file down: gsutil cp gs://MySweetBucket/filename .
Done!
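If you'd rather script the last step than run gsutil by hand, a minimal sketch with the google-cloud-storage Python client could look like this. It assumes you are already authenticated (e.g. via gcloud auth application-default login) and that the bucket holds flat files with no folder prefixes; the bucket name is the placeholder from above, lowercased because real GCS bucket names cannot contain capitals.

from google.cloud import storage

client = storage.Client()
# Download every object in the placeholder bucket to the current directory.
for blob in client.list_blobs("mysweetbucket"):
    blob.download_to_filename(blob.name)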
I need Python code that can download files from an FTP server. I need a built-in multi-part download manager package that can help me retrieve files faster. I tried SmartDL, but the problem is that I don't know how to retrieve files from an FTP server with it. I also used add_basic_authentication to make sure I am passing the right credentials. Please help me with a solution.
I have no problem using any other solution/package that supports multi-part download.
P.S.: I need to save the downloaded files to object storage in the cloud. Each file may be around 300 MB, and I need to download 20 TB of data.
Thanks in anticipation.
Take a look at ftplib; it's a simple FTP library that will let you download files from an FTP server.
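A minimal sketch of a single-file download, assuming placeholder host, credentials, and paths (replace them with your own):

from ftplib import FTP

def fetch(host, user, password, remote_path, local_path):
    # Open a connection, authenticate, and stream the remote file to disk.
    with FTP(host) as ftp:
        ftp.login(user=user, passwd=password)
        with open(local_path, "wb") as f:
            ftp.retrbinary(f"RETR {remote_path}", f.write)

fetch("ftp.example.com", "user", "secret", "data/file.bin", "file.bin")

Note that ftplib transfers a file in a single stream. To download a file in parallel parts you would need to open several connections and resume each at a different offset via ftp.transfercmd("RETR ...", rest=offset), then stitch the chunks together yourself.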
I'm new to Node and want to upload a directory using node.js. Can anybody please help? Thank you.
If it's possible, how? And if not, why not?
You can upload only files.
However, there are workarounds. You can:
convert the folder to a zip file (client side); it really depends on your case
upload the zip file
then unzip it on the server (see the sketch below)
You can use Google to find out how to do each of these steps.
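For the last step, the server-side unzip is only a few lines in any language; here is a minimal sketch (in Python, since that is what the other questions here use; with Node you would reach for an unzip package instead). The paths are placeholders:

import zipfile

def extract_upload(zip_path, dest_dir):
    # Extract every entry of the uploaded archive into dest_dir.
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(dest_dir)

extract_upload("/uploads/folder.zip", "/var/www/uploads/folder")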
You can use any folder-uploading library like https://github.com/blueimp/jQuery-File-Upload or https://dropzonejs.com, which allow you to drag and drop a whole directory using HTML5 features.