Upload youtube-dl transcript into Google Cloud storage - node.js

I am using youtube-dl to download transcripts. It works fine on my machine (local server), where I pass __dirname into the options params so the files are saved to disk. But I want to use Google Cloud Functions, so how can I substitute __dirname with Cloud Storage?
Thank you!

Uploading directly from youtube-dl is not possible. You can only upload a file to Google Cloud Storage once it already exists on disk.
You will need to download the file with the program you mention (as mentioned in the comments, you can download it to a temporary folder), upload the file to GCS, and then delete it from the temporary folder.
What can you actually do? You can, for example, run a script inside a Google Cloud instance with a gsutil command to upload the files into a bucket.
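As a minimal sketch of that download-then-upload pattern, assuming the youtube-dl binary is available in the function's runtime and a hypothetical bucket named my-transcripts-bucket, an HTTP Cloud Function could write the subtitle files to /tmp and upload them from there:

// Sketch only: the youtube-dl binary, the bucket name, and the ?url= parameter are assumptions.
const { execFile } = require('child_process');
const { promisify } = require('util');
const fs = require('fs');
const os = require('os');
const path = require('path');
const { Storage } = require('@google-cloud/storage');

const execFileAsync = promisify(execFile);
const storage = new Storage();

exports.uploadTranscript = async (req, res) => {
  const videoUrl = req.query.url; // e.g. ?url=https://youtu.be/VIDEO_ID
  const outTemplate = path.join(os.tmpdir(), '%(id)s.%(ext)s');

  // /tmp is the only writable location in Cloud Functions, so download there first.
  await execFileAsync('youtube-dl', [
    '--write-auto-sub', '--skip-download', '-o', outTemplate, videoUrl,
  ]);

  // Upload every subtitle file youtube-dl produced, then clean up /tmp.
  const bucket = storage.bucket('my-transcripts-bucket'); // hypothetical bucket
  const files = fs.readdirSync(os.tmpdir()).filter((f) => f.endsWith('.vtt'));
  for (const file of files) {
    const localPath = path.join(os.tmpdir(), file);
    await bucket.upload(localPath, { destination: `transcripts/${file}` });
    fs.unlinkSync(localPath);
  }
  res.send(`Uploaded: ${files.join(', ')}`);
};

Note that /tmp in Cloud Functions is an in-memory filesystem, so deleting each file after the upload keeps the function's memory usage down.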

Related

Download Firebase Storage folder to a Cloud Functions Temporary Folder

I'm trying to download an entire folder from Firebase Storage so I can zip those files and upload them back to Firebase Storage while generating a download link.
I have read a lot of posts and code, but everything seemed to fall outside my scope.
Do you have a clear example of how to download a Firebase Storage folder to a Cloud Functions temporary folder, or at least some hints on how I could do it, keeping in mind that I am targeting a specific folder?
There is no bulk download operation provided by the SDK. The general pattern for downloading all files with some shared prefix using the node SDK for Cloud Storage will be:
list the files using getFiles at the prefix (folder) of interest
iterate them
download each one separately
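A minimal sketch of that pattern with firebase-admin, assuming the project's default bucket and a hypothetical prefix of myFolder/:

const path = require('path');
const os = require('os');
const admin = require('firebase-admin');

admin.initializeApp();

async function downloadFolder(prefix) {
  const bucket = admin.storage().bucket();           // default bucket of the project
  const [files] = await bucket.getFiles({ prefix }); // 1. list files at the prefix
  const localPaths = [];
  for (const file of files) {                        // 2. iterate them
    if (file.name.endsWith('/')) continue;           // skip "folder" placeholder objects
    const localPath = path.join(os.tmpdir(), path.basename(file.name));
    await file.download({ destination: localPath }); // 3. download each one separately
    localPaths.push(localPath);
  }
  return localPaths;
}

// downloadFolder('myFolder/').then((paths) => console.log(paths));

The returned local paths can then be zipped (for example with the archiver package) and the resulting archive uploaded back to the bucket.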

Where are files downloaded in Google App Engine?

I have a backend Nodejs application and I am fetching and streaming files in the background when a certain event happens in the client.
I have deployed the backend to Google App Engine.
The file downloading is working fine, but I am a bit confused about where the files are downloaded and stored. In the app I am creating a folder relative to the deployed app folder and storing them there with createWriteStream. I also init a git repository where the files are (using the simple-git npm module).
It seems the files are not accessible via the Cloud Shell, since I cannot find them there.
Can I, for example, create a storage bucket and use "normal" file operation commands there (and init the repo there)?
-Jani
To store downloaded data you want to use Cloud Storage; you can find a complete guide in the Using Cloud Storage documentation.
Under almost no circumstances do you want to download files onto the App Engine deployment itself: the instances don't have much memory to store data, and when the deployment scales up and down you are prone to losing it.
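As a minimal sketch, assuming a hypothetical bucket named my-app-files, you could stream each fetched file straight into Cloud Storage instead of writing it into a folder next to the deployed app:

const https = require('https');
const { Storage } = require('@google-cloud/storage');

const storage = new Storage();
const bucket = storage.bucket('my-app-files'); // hypothetical bucket name

// Stream a remote file directly into a Cloud Storage object.
function saveToBucket(fileUrl, destination) {
  return new Promise((resolve, reject) => {
    https.get(fileUrl, (response) => {
      response
        .pipe(bucket.file(destination).createWriteStream({ resumable: false }))
        .on('finish', resolve)
        .on('error', reject);
    }).on('error', reject);
  });
}

// saveToBucket('https://example.com/report.pdf', 'downloads/report.pdf');

This way nothing depends on the instance's local filesystem, so scaling events can no longer lose the files.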

How can I extract a tar.gz file in a Google Cloud Storage bucket from a Colab Notebook?

As the question states, I'm trying to figure out how I can extract a .tar.gz file that is stored in a GCS Bucket from a Google Colab notebook.
I am able to connect to my bucket via:
from google.colab import auth
auth.authenticate_user()
project_id = 'my-project'
!gcloud config set project {project_id}
However, when I try running a command such as:
!gsutil tar xvzf my-bucket/compressed-files.tar.gz
I get an error. I know that gsutil probably has limited functionality and maybe isn't meant to do what I'm trying to do, so is there a different way to do it?
Thanks!
Google Cloud Storage (GCS) does not natively support unpacking a tar archive. You will have to do this yourself, either on your local machine or from a Compute Engine VM, for instance.
You can create a Dataflow job from a template to decompress a file in your bucket.
The template is called Bulk Decompress Cloud Storage Files.
You have to specify the file location, the output location, a failure log, and a temp location.
This worked for me. I'm new to Colab and Python itself, so I'm not certain this is the solution.
!sudo tar -xvf my-bucket/compressed-files.tar.gz

How to download a folder to my local PC from Google Cloud console

I have a folder I want to download from Google Cloud Console using the Linux Ubuntu command terminal. I have logged in to my SSH console and so far I can only list the contents of my files as follows.
cd /var/www/html/staging
Now I want to download all the files from that staging folder.
Sorry if I'm missing the point. Anyway, I came here seeking a way to download files from the Google Cloud Console. I didn't have the ability to create an additional bucket as the author above suggested, but I accidentally noticed that there is a button for exactly what I needed.
Look for the kebab-style menu button. In the dropdown that appears, you should find a Download button.
If you mean Cloud Shell, then I typically use the gsutil tool suite.
In summary, I transfer from Cloud Shell to Cloud Storage, then from Storage to my workstation.
First, have the Google Cloud SDK installed on your system.
Make a bucket to transfer into with gsutil mb gs://MySweetBucket
From within Cloud Shell, copy the file into the bucket: gsutil cp /path/to/file gs://MySweetBucket/
On your local system, pull the file down: gsutil cp gs://MySweetBucket/filename .
Done!

How to extract files server side on Amazon Cloud Drive?

I have a 17 GB zip file on Amazon Cloud Drive. Is there a way to unzip / extract it without downloading it first?
No, there is no way to do this on the server end.
