Google Cloud Storage access token for Speech-to-Text

I can't figure out how to transcribe files longer than one minute with the Speech-to-Text API. I have Google Cloud Storage, but I can't get the access token for it.
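For reference, Speech-to-Text only accepts inline audio up to about one minute; longer files must live in a Cloud Storage bucket and be transcribed asynchronously from their gs:// URI. Below is a minimal Node.js sketch, assuming the @google-cloud/speech client library, which obtains access tokens automatically from the service-account key in GOOGLE_APPLICATION_CREDENTIALS, so no manual token handling is needed; the bucket and object names are placeholders.

```javascript
// Minimal sketch: asynchronous transcription of a long audio file stored
// in Google Cloud Storage. Assumes GOOGLE_APPLICATION_CREDENTIALS points
// at a service-account key; bucket/object names are placeholders.
const speech = require('@google-cloud/speech');
const client = new speech.SpeechClient();

async function transcribeLongAudio() {
  const [operation] = await client.longRunningRecognize({
    audio: { uri: 'gs://my-audio-bucket/recording.flac' }, // hypothetical object
    config: {
      encoding: 'FLAC',
      sampleRateHertz: 16000,
      languageCode: 'en-US',
    },
  });
  // Wait for the asynchronous job to finish, then join the transcript.
  const [response] = await operation.promise();
  const transcript = response.results
    .map((r) => r.alternatives[0].transcript)
    .join('\n');
  console.log(transcript);
}

transcribeLongAudio();
```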

Related

Best way to upload medium-sized videos to Azure Blob Storage?

I am currently on a student plan for Azure (gotta stay finessing as a college student lol) and am looking for the best way to upload videos to Azure Blob Storage. Currently, I am using an Azure Function API to upload the video, but I am encountering a "JavaScript heap out of memory" error when I try to multipart-parse big video files.
Ideally, I'd be able to quickly upload 3.5-minute music videos from mobile and desktop to Azure Blob Storage with this method.
Either a better way of uploading videos to Blob Storage from my front end or a solution for the JavaScript heap out of memory error would be an amazing help.
Here's the link to that other post, if you are curious: How to fix JavaScript heap out of memory on multipart.Parse() for azure function api
Approaches:
As a workaround for your issue, I would suggest using Azure Media Services.
Media Services can be integrated with Azure CDN; refer to Media Services - Managing streaming endpoints.
All supported formats use HTTP to transport data and benefit from HTTP caching. In live streaming, the actual video/audio data is separated into fragments, which are cached in CDNs.
To start, I recommend using the Azure Storage SDK for Node.js. The SDK will handle everything for you; a few uploaders are listed below, and a minimal upload sketch follows this list.
Examples of uploading a video to Azure Blob Storage:
Refer to the MS docs and the SO thread by Gopi for uploading a video with the .mp4 extension to Azure Blob Storage using C#.
You can upload a video using Azure Functions directly, but to use Azure Functions you must create a back-end component written in .NET, Java, JavaScript, or Python.
You can use the Azure Storage REST API to upload files/video files with a storage account, as you mentioned. You should be able to get the desired result by following the Azure Storage REST API docs (MSDoc).

Copy data from GCP to Azure Storage when the GCP bucket has Requester Pays

I am trying to copy data from GCP to Azure Storage, but the bucket in GCP has Requester Pays enabled. I tried the transfer using AzCopy and Azure Data Factory; at the end of the Azure configuration I can see the bucket in GCP, but when I hit the bucket I get a 400 Bad Request error because of the Requester Pays setting. What additional configuration do I need to copy the data? I already have the credentials for the GCP service account.
I don't think you will be able to use AzCopy. Looking at the GCP documentation on Requester Pays, you need to send the billing PROJECT_IDENTIFIER along with the request. There are several ways to do this. I would suggest looking at the REST API or the code samples.
https://cloud.google.com/storage/docs/using-requester-pays#code-samples_2
Azure Data Factory supports pulling data from a REST API, so you could use the GCP REST call with it. Just make sure you send the billing PROJECT_IDENTIFIER along with your REST call.
https://cloud.google.com/storage/docs/using-requester-pays#rest-access-requester-pays
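As an illustration, here is a minimal Node.js sketch of such a REST call against the GCS JSON API, billing the request to your own project with the userProject query parameter. The bucket, object, billing project, and token environment variable are all placeholders; the access token would come from your GCP service account (for example via gcloud auth print-access-token).

```javascript
// Minimal sketch: downloading an object from a Requester Pays bucket via
// the GCS JSON API. The userProject parameter names the project to bill;
// omitting it is what produces the 400 Bad Request. All names are placeholders.
const https = require('https');

const bucket = 'my-gcp-bucket';
const object = encodeURIComponent('path/to/file.csv');
const billingProject = 'my-billing-project'; // the PROJECT_IDENTIFIER to bill

const url =
  `https://storage.googleapis.com/storage/v1/b/${bucket}/o/${object}` +
  `?alt=media&userProject=${billingProject}`;

https.get(
  url,
  { headers: { Authorization: `Bearer ${process.env.GCP_ACCESS_TOKEN}` } },
  (res) => {
    console.log('status:', res.statusCode);
    res.pipe(process.stdout); // stream the object contents
  }
);
```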

Using the Google Cloud Storage JSON API (React Native and Cloud Functions)

Currently, my app calls Google Cloud Storage from the client side, which causes us to store a lot of sensitive information on the client side. For the past three days, I've been trying to figure out how to use the Google Cloud JSON API, to no avail. Can anyone walk through the process, from beginning to end, of setting that up and uploading images to Google Cloud Storage from the server side using Node.js?
Make sure your IAM permissions are set up; otherwise your app will crash when trying to use Cloud Storage. I've also attached a link to an open-source GitHub repository that will walk you through setting this up. It is called google-cloud-nodejs-client, in case the link moves.
https://cloud.google.com/iam/docs/creating-managing-service-account-keys
https://github.com/aslamanver/google-cloud-nodejs-client
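As a rough sketch of the server-side flow, assuming the @google-cloud/storage client library (which wraps the JSON API and authenticates with the service-account key from GOOGLE_APPLICATION_CREDENTIALS, so nothing sensitive ships to the client); bucket and object names are placeholders.

```javascript
// Minimal sketch: uploading an image to Cloud Storage from server-side
// Node.js (e.g. inside a Cloud Function), so credentials stay off the client.
// Bucket and object names are placeholders.
const { Storage } = require('@google-cloud/storage');
const storage = new Storage(); // picks up GOOGLE_APPLICATION_CREDENTIALS

async function uploadImage(imageBuffer, destName) {
  const file = storage.bucket('my-app-uploads').file(`images/${destName}`);
  // save() streams the buffer to the bucket through the JSON API.
  await file.save(imageBuffer, { contentType: 'image/png' });
  return `gs://my-app-uploads/images/${destName}`;
}
```

The React Native client would then POST the image to your Cloud Function, which calls uploadImage, keeping the service-account key server-side only.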

How to serve images uploaded to Google Cloud Storage from Google Compute Engine via a serving URL

In the past, I used the Images API for that, but the Blobstore and Images APIs are available only within the App Engine runtime environment. Now I use Google Compute Engine, and I want to create serving URLs for the images uploaded to Google Cloud Storage. How do I do that? Is it possible for images to be served directly from Google Cloud Storage?
There are two possible solutions. The first is to use Signed URLs, which give time-limited access to a resource to anyone in possession of the URL.
The second option is to use request endpoints (for example: https://storage.googleapis.com/[BUCKET_NAME]/[OBJECT_NAME]). Something important to mention is that you need to grant the correct roles on the bucket in order for objects to be accessible this way. A sketch of the signed-URL option follows.
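Here is a minimal sketch of the signed-URL option with the @google-cloud/storage Node.js client; the bucket and object names are placeholders, and the service account needs the ability to sign (a local key file, or the signBlob permission when running on Compute Engine without one).

```javascript
// Minimal sketch: generating a time-limited V4 signed URL for an object
// in Cloud Storage. Bucket/object names are placeholders.
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

async function getServingUrl() {
  const [url] = await storage
    .bucket('my-image-bucket')
    .file('images/photo.png')
    .getSignedUrl({
      version: 'v4',
      action: 'read',
      expires: Date.now() + 15 * 60 * 1000, // valid for 15 minutes
    });
  return url; // hand this to the client for direct download
}
```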

Programmatically transfer files from any CDN to a Google Cloud bucket

I would like to transfer a bunch of files that live on a CDN to my Google Cloud bucket. I have no control over the CDN and cannot access anything on it except the files I would like to transfer. Any idea if the Google Cloud Storage API has any support for this kind of action?
I found the answer myself: the Storage Transfer Service does support regular HTTP data sources, as you can see here:
https://cloud.google.com/storage-transfer/docs/reference/rest/v1/TransferSpec
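For illustration, here is a sketch of creating such a job from Node.js, assuming the @google-cloud/storage-transfer client library. The list URL must point to a TSV file in TsvHttpData-1.0 format enumerating the public CDN file URLs; the project, bucket, and URLs are placeholders.

```javascript
// Minimal sketch: a one-off Storage Transfer Service job that pulls files
// listed in a TsvHttpData-1.0 file into a GCS bucket. All names are placeholders.
const { StorageTransferServiceClient } = require('@google-cloud/storage-transfer');
const client = new StorageTransferServiceClient();

async function createHttpTransfer() {
  const today = new Date();
  const date = {
    year: today.getFullYear(),
    month: today.getMonth() + 1,
    day: today.getDate(),
  };
  const [job] = await client.createTransferJob({
    transferJob: {
      projectId: 'my-project',
      status: 'ENABLED',
      transferSpec: {
        httpDataSource: { listUrl: 'https://example.com/file-list.tsv' },
        gcsDataSink: { bucketName: 'my-destination-bucket' },
      },
      // Identical start and end dates make the job run exactly once.
      schedule: { scheduleStartDate: date, scheduleEndDate: date },
    },
  });
  console.log('Created transfer job:', job.name);
}

createHttpTransfer().catch(console.error);
```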
