Using the Google Cloud Storage JSON API (React Native and Cloud Functions) - Node.js

Currently, my app calls Google Cloud Storage from the client side, which means we store a lot of sensitive information on the client. For the past three days, I've been trying to figure out how to use the Google Cloud Storage JSON API, to no avail. Can anyone walk through the process, from beginning to end, of setting this up and uploading images to Google Cloud Storage from the server side using Node.js?

Make sure your IAM permissions and service account key are set up first; otherwise your app will crash when it tries to use Cloud Storage. I've also attached a link to an open-source GitHub repository that walks you through setting this up; it is called google-cloud-nodejs-client in case the link moves.
https://cloud.google.com/iam/docs/creating-managing-service-account-keys
https://github.com/aslamanver/google-cloud-nodejs-client
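A minimal sketch of the server-side piece, assuming an HTTP Cloud Function and the official @google-cloud/storage client; the bucket name, request fields, and object naming below are placeholders, not a fixed API:

```js
// index.js - HTTPS Cloud Function that accepts a base64-encoded image and
// writes it to Cloud Storage, so no service account key ever reaches the client.
const { Storage } = require('@google-cloud/storage');

// On Cloud Functions this picks up the runtime service account automatically;
// locally it falls back to GOOGLE_APPLICATION_CREDENTIALS.
const storage = new Storage();
const bucket = storage.bucket('my-app-uploads'); // placeholder bucket name

exports.uploadImage = async (req, res) => {
  try {
    const { filename, contentType, data } = req.body; // data: base64-encoded image
    if (!filename || !data) {
      return res.status(400).send('filename and data are required');
    }

    const file = bucket.file(`uploads/${Date.now()}-${filename}`);
    await file.save(Buffer.from(data, 'base64'), {
      contentType: contentType || 'image/jpeg',
      resumable: false, // fine for small images; use resumable uploads for large files
    });

    res.status(200).json({ path: file.name });
  } catch (err) {
    console.error(err);
    res.status(500).send('upload failed');
  }
};
```

The React Native side then only POSTs JSON to the function's URL, and the service account key stays on the server.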

Related

Best way to upload medium-sized videos to Azure Blob Storage?

I am currently on a student plan for Azure (gotta stay finessing as a college student lol) and am looking for the best way to upload videos to Azure Blob Storage. Currently, I am using an Azure Function API to upload the video, but I am encountering a "JavaScript heap out of memory" error when I try to multipart-parse big video files.
Ideally, I'd be able to quickly upload 3.5-minute music videos from mobile and desktop to Azure Blob Storage with this method.
Either a better way of uploading videos to Blob Storage from my front end or a solution for the JavaScript heap out of memory error would be a huge help.
Here's the link to that other post, if you are curious: How to fix JavaScript heap out of memory on multipart.Parse() for azure function api
Approaches:
Based on your issue, as a workaround I would suggest that you use Azure Media Services.
Media Services can be integrated with Azure CDN; refer to Media Services - Managing streaming endpoints.
All supported formats use HTTP to transport data and benefit from HTTP caching. In live streaming, the actual video/audio data is separated into fragments, which are cached in CDNs.
To start, I recommend that you use the Azure Storage SDK for Node.js; the SDK will handle the chunking for you. A few uploaders are linked below to check accordingly, and there is a short sketch of this approach at the end of this answer.
Upload a video to Azure Blob examples
Refer to the MS Docs and the SO thread by Gopi for uploading a video with the .mp4 extension to Azure Blob Storage using C#.
You can upload a video using Azure Functions directly, but to use Azure Functions you must create a back-end component written in .NET, Java, JavaScript, or Python.
You can use the Azure Storage REST API to upload files/video files using a storage account, as you mentioned. You will be able to get the desired result by using the Azure Storage REST API (MS Docs).
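As a sketch of the SDK route, assuming a Node.js back end, a connection string in AZURE_STORAGE_CONNECTION_STRING, and a container named videos (all placeholders): BlockBlobClient.uploadStream streams the file in blocks instead of buffering the whole video in memory, which is what usually triggers the heap error.

```js
// Upload a large video to Blob Storage without holding it all in memory.
const { BlobServiceClient } = require('@azure/storage-blob');
const fs = require('fs');

async function uploadVideo(localPath, blobName) {
  const service = BlobServiceClient.fromConnectionString(
    process.env.AZURE_STORAGE_CONNECTION_STRING
  );
  const container = service.getContainerClient('videos'); // placeholder container
  const blockBlob = container.getBlockBlobClient(blobName);

  // Stream in 4 MB blocks with up to 5 parallel block uploads; the SDK
  // assembles the blocks into a single blob at the end.
  await blockBlob.uploadStream(
    fs.createReadStream(localPath),
    4 * 1024 * 1024,
    5,
    { blobHTTPHeaders: { blobContentType: 'video/mp4' } }
  );
}
```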

Upload video to Blob Storage using Azure Function

In my app I would like my users to upload local videos, which I would want to store in Blob Storage, but I would like to achieve this by using Azure Functions. Is this possible? If so, I couldn't find any resource that points me in the right direction; if not, what would be the ideal way to achieve this? I am building the app using Flutter, for which we do not yet have the SDKs. Any help is appreciated.
There's no SDK available for Dart, so you'll have to build a back-end component running in .NET, Java, JavaScript, or Python in order to use Azure Functions.
But you can use the Azure Storage REST API to store the videos as blobs in a Storage Account.
Take a look at the official reference here. Using this, you'll be able to store the videos over HTTP.
Also, this tutorial might be useful.
The fella there is using the File Service instead of the Blob Service.
One point of interest is that you should keep some limitations in mind; it's not impossible to do what you want, but it is good to be aware of them:
There are limitations to the storage service. [...] You can only upload 4 MB “chunks” per upload, so if your files exceed 4 MB you have to split them into parts. If you are a good programmer you can make use of tasks and await to start multiple threads. Please consult the Azure limits documentation to see if any other restrictions apply.
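If you do go down the REST API route, a rough sketch of the block-by-block upload looks like the following. It assumes Node 18+ (for the global fetch) and a SAS URL for the target blob (blobSasUrl is a placeholder you would generate server-side); the 4 MB block size simply mirrors the limit quoted above, and newer service versions allow larger blocks.

```js
// Sketch: upload a video in 4 MB blocks (Put Block), then commit them (Put Block List).
const fs = require('fs');

async function uploadInBlocks(localPath, blobSasUrl) {
  const blockSize = 4 * 1024 * 1024; // the 4 MB "chunk" limit quoted above
  const blockIds = [];
  const stream = fs.createReadStream(localPath, { highWaterMark: blockSize });

  let index = 0;
  for await (const chunk of stream) {
    // Block IDs must be base64-encoded and the same length for every block.
    const blockId = Buffer.from(String(index).padStart(6, '0')).toString('base64');
    blockIds.push(blockId);

    await fetch(`${blobSasUrl}&comp=block&blockid=${encodeURIComponent(blockId)}`, {
      method: 'PUT',
      body: chunk,
    });
    index += 1;
  }

  // Commit the block list so the blob becomes readable as a single file.
  const blockList =
    '<?xml version="1.0" encoding="utf-8"?><BlockList>' +
    blockIds.map((id) => `<Latest>${id}</Latest>`).join('') +
    '</BlockList>';

  await fetch(`${blobSasUrl}&comp=blocklist`, {
    method: 'PUT',
    headers: { 'Content-Type': 'application/xml' },
    body: blockList,
  });
}
```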

Google Cloud Storage access token for Speech-to-Text

I can't figure out how to use files longer than one minute with the Speech-to-Text API. I have Google Cloud Storage, but I can't get the access token for it.
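As far as I know, audio longer than about a minute has to go through the asynchronous longRunningRecognize call with a gs:// URI rather than inline audio content. A minimal Node.js sketch, assuming the @google-cloud/speech client, a service account with access to the bucket, and placeholder bucket/encoding values:

```js
const speech = require('@google-cloud/speech');

// Uses GOOGLE_APPLICATION_CREDENTIALS (a service account key with access to
// the Speech-to-Text API and the bucket), so no manual access token is needed.
const client = new speech.SpeechClient();

async function transcribeLongAudio() {
  const [operation] = await client.longRunningRecognize({
    config: {
      encoding: 'LINEAR16',   // placeholder: match your audio format
      sampleRateHertz: 16000, // placeholder
      languageCode: 'en-US',
    },
    audio: { uri: 'gs://my-audio-bucket/recording.wav' }, // placeholder object
  });

  // Wait for the asynchronous job to finish (required for audio over 1 minute).
  const [response] = await operation.promise();
  return response.results.map((r) => r.alternatives[0].transcript).join('\n');
}
```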

How to serve images via a serving URL that are uploaded to Google Cloud Storage from Google Compute Engine

In the past I used the Images API for that, but the Blobstore and Images APIs are available only within the App Engine runtime environment. Now I use Google Compute Engine and I want to create a serving URL for the images uploaded to Google Cloud Storage. How do I do that? Is it possible for images to be served directly from Google Cloud Storage?
There are two possible solutions. The first is to use Signed URLs, which give time-limited resource access to anyone in possession of the URL.
The second option is to use request endpoints (for example: https://storage.googleapis.com/[BUCKET_NAME]/[OBJECT_NAME]). Something important to mention is that you need to grant the correct roles on the bucket in order to access it.
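A short sketch of the signed-URL option with the Node.js client, using placeholder bucket/object names; on a Compute Engine VM the default service account can sign V4 URLs as long as it has the Service Account Token Creator role (or you point the client at a key file):

```js
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

// Returns a time-limited URL that anyone can use to fetch the image.
async function getServingUrl(objectName) {
  const [url] = await storage
    .bucket('my-images-bucket') // placeholder bucket
    .file(objectName)
    .getSignedUrl({
      version: 'v4',
      action: 'read',
      expires: Date.now() + 15 * 60 * 1000, // valid for 15 minutes
    });
  return url;
}
```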

Programmatically transfer files from any CDN to a Google Cloud bucket

I would like to transfer a bunch of files that live on a CDN to my Google Cloud bucket. I have no control over the CDN and cannot access anything on it except the files I would like to transfer. Any idea if the Google Cloud Storage API has any support for this kind of action?
I found the answer myself: the Google Cloud Storage Transfer Service does support regular HTTP data sources, as you can see here:
https://cloud.google.com/storage-transfer/docs/reference/rest/v1/TransferSpec
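For reference, a transfer job created through that API takes a spec along these lines; everything here (project, bucket, list URL, dates, token source) is a placeholder. The listUrl points at a tab-separated file that starts with a TsvHttpData-1.0 header line and lists the public URL of each object to pull.

```js
// Sketch: create a one-time transfer job via the Storage Transfer Service REST API.
// ACCESS_TOKEN is a placeholder OAuth 2.0 token, e.g. from `gcloud auth print-access-token`.
const job = {
  description: 'Pull files from the CDN into my bucket',
  projectId: 'my-project-id',                // placeholder project
  status: 'ENABLED',
  transferSpec: {
    httpDataSource: { listUrl: 'https://example.com/files-to-transfer.tsv' }, // placeholder list
    gcsDataSink: { bucketName: 'my-destination-bucket' },                      // placeholder bucket
  },
  // Same start and end date makes the job run once.
  schedule: {
    scheduleStartDate: { year: 2024, month: 1, day: 1 },
    scheduleEndDate: { year: 2024, month: 1, day: 1 },
  },
};

await fetch('https://storagetransfer.googleapis.com/v1/transferJobs', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${process.env.ACCESS_TOKEN}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify(job),
});
```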
