How to serve images uploaded to Google Cloud Storage from Google Compute Engine via a serving URL - python-3.x

In the past I had used the Images API for that, but the Blobstore and Images APIs are available only within the App Engine runtime environment. Now I use Google Compute Engine and I want to create a serving URL for the images uploaded to Google Cloud Storage. How can I do that? Is it possible for images to be served directly from Google Cloud Storage?

There are two possible solutions. The first is to use Signed URLs, which give time-limited access to a resource to anyone in possession of the URL.
The second option is to use request endpoints (example: https://storage.googleapis.com/[BUCKET_NAME]/[OBJECT_NAME]). Something important to mention is that you need to grant the correct roles on the bucket in order to access it this way.
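For illustration, here is a minimal Python sketch of both approaches using the google-cloud-storage client library; the bucket and object names are placeholders:

    from datetime import timedelta
    from google.cloud import storage

    client = storage.Client()  # picks up service-account credentials from the environment
    bucket = client.bucket("my-bucket")          # placeholder bucket
    blob = bucket.blob("images/photo.jpg")       # placeholder object

    # Option 1: time-limited signed URL (works for private buckets; signing
    # requires a service-account key or the IAM signBlob permission)
    signed_url = blob.generate_signed_url(
        version="v4", expiration=timedelta(minutes=15), method="GET"
    )
    print(signed_url)

    # Option 2: plain request endpoint (requires a role such as
    # roles/storage.objectViewer granted on the bucket, e.g. to allUsers for public serving)
    print(f"https://storage.googleapis.com/{bucket.name}/{blob.name}")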

Related

Using Google Cloud Storage JSON Api (React-Native and Cloud Functions)

Currently, my app calls Google Cloud Storage from the client side, which forces us to store a lot of sensitive information on the client side. For the past three days, I've been trying to figure out how to use the Google Cloud Storage JSON API, to no avail. Can anyone walk through the process, from beginning to end, of setting that up and uploading images to Google Cloud Storage from the server side using Node.js?
Make sure your IAM permissions are set up; otherwise your app will crash when trying to use Cloud Storage. I've also attached a link to an open-source GitHub repository that will walk you through setting this up. It is called google-cloud-nodejs-client, in case the link moves.
https://cloud.google.com/iam/docs/creating-managing-service-account-keys
https://github.com/aslamanver/google-cloud-nodejs-client
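The question asks for Node.js, but the server-side pattern is the same in any language: point the client library at the service-account key and call the upload API. A hedged Python sketch of that pattern (key path, bucket and file names are placeholders):

    import os
    from google.cloud import storage

    # Service-account key created per the IAM docs above; path is a placeholder.
    os.environ.setdefault("GOOGLE_APPLICATION_CREDENTIALS", "/path/to/key.json")

    client = storage.Client()
    bucket = client.bucket("my-upload-bucket")      # placeholder bucket
    blob = bucket.blob("uploads/avatar.png")        # placeholder object name
    blob.upload_from_filename("avatar.png", content_type="image/png")
    print(f"Uploaded to gs://{bucket.name}/{blob.name}")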

Programmatically transfer files from any CDN to Google Cloud bucket

I would like to transfer a bunch of files that live on a CDN to my Google Cloud bucket. I have no control over the CDN and cannot access anything on it except the files I would like to transfer. Any idea if the Google Cloud Storage API has any support for this kind of action?
I found the answer myself: the Google Cloud Storage Transfer Service does support regular HTTP data sources, as you can see here:
https://cloud.google.com/storage-transfer/docs/reference/rest/v1/TransferSpec
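A sketch of what such a transfer job could look like from Python, using the Storage Transfer REST API via the Google API discovery client; the project, bucket, dates and list URL are placeholders. The list URL points at a TSV file enumerating the objects to pull from the CDN:

    import googleapiclient.discovery

    client = googleapiclient.discovery.build("storagetransfer", "v1")

    transfer_job = {
        "description": "Pull files from a CDN over HTTP",
        "projectId": "my-project",                     # placeholder project
        "status": "ENABLED",
        "schedule": {                                  # same start and end date = run once
            "scheduleStartDate": {"year": 2024, "month": 1, "day": 1},
            "scheduleEndDate": {"year": 2024, "month": 1, "day": 1},
        },
        "transferSpec": {
            "httpDataSource": {"listUrl": "https://example.com/url-list.tsv"},
            "gcsDataSink": {"bucketName": "my-destination-bucket"},
        },
    }

    response = client.transferJobs().create(body=transfer_job).execute()
    print(response["name"])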

Can I directly download an image to some cloud storage using a URL

I am working on a website where clients can upload images. I use an image handler API that ultimately generates a URL for the image I need. I'm considering using cloud services for storing images. Do I really need to first use the URL to download the image to my server and then upload it to the cloud? Is there any simpler way to put the image in the cloud?
The image URL is temporary, so I have to save the image somewhere.
I'm using Node.js with Express.
If your cloud is Amazon Web Services (AWS), you can upload directly from a client using HTTP POST; see this article with these examples.
You might need to create an identity with AWS Cognito (another link).
Note that if you do upload directly to the S3 bucket, you have to know the name of the file and possibly send that to the server. Otherwise your file will be "in the cloud" and no one will know its name, so it won't be retrieved : )
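The answer above describes a presigned client upload; as a hedged illustration (in Python with boto3 rather than the poster's Node.js stack, with placeholder bucket and key names), the server could hand the client a presigned POST like this:

    import boto3

    s3 = boto3.client("s3")
    post = s3.generate_presigned_post(
        Bucket="my-image-bucket",             # placeholder bucket
        Key="uploads/photo.jpg",              # placeholder key; the server now knows the name
        Fields={"Content-Type": "image/jpeg"},
        Conditions=[{"Content-Type": "image/jpeg"}],
        ExpiresIn=3600,
    )
    # Return post["url"] and post["fields"] to the browser, which submits them
    # together with the file as a multipart/form-data POST straight to S3.
    print(post["url"], post["fields"])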

Can I upload to the CDN server directly in Azure?

I have been exploring the features available in Azure and AWS, but many features are either not available or not clearly documented. For the CDN part, one of my comparison criteria is whether I can push/upload content to the CDN servers, as in Akamai.
I have seen the feedback program and found that custom origin is not available (link: http://feedback.azure.com/forums/169397-cdn/status/191761), but for this one I could not find any link. Does anyone have any idea?
No. Azure CDN currently does not support direct interaction (i.e. direct content upload, explicit or on-demand content expiration, etc.). It works as advertised: it serves files from an Azure Storage Account or an Azure Cloud Service.
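In other words, the workflow is to upload to the origin (for example a Storage Account container) and let the CDN pull and cache from it. A minimal Python sketch with the azure-storage-blob library, assuming placeholder connection-string, container and blob names:

    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string("<storage-connection-string>")
    blob = service.get_blob_client(container="cdn-content", blob="assets/logo.png")

    with open("logo.png", "rb") as data:
        blob.upload_blob(data, overwrite=True)  # the CDN endpoint then serves/caches this blob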

Windows Azure: how to set up front-end and back-end with a shared image folder

I'm trying to find the best setup for my website on Windows Azure.
I have a front-end and a back-end website made in ASP.NET MVC4.
Both websites must use the same shared images: the front-end for displaying them, the back-end for CRUD actions. The image files are stored in a folder in the front-end web application, and the URLs to those images are stored in a MySQL database.
Currently I have two Windows Azure websites, but I can't access the images from the back-end website because they are stored in a folder on the front-end application.
What's the best and cheapest setup for this type of application?
Two websites with shared blob storage?
A cloud service containing two web roles (front-end and back-end)?
... ?
Thanks
First, you should not use the web application's folder except for temporary operations. Since Azure means a multi-machine environment, the resource (image) won't be available to the requester if you use more than one instance (machine).
I would go with two blob containers (not two blob storage accounts).
We do not have IP-based restrictions on blobs yet, so as long as you don't share those addresses you will be fine. If you really need restrictions, you can use a Shared Access Policy; you can find more details in "Use a Stored Access Policy", and you should also review "Restrict Access to Containers and Blobs".
I think that using a shared blob storage account is the right direction.
Using a local folder is not a good idea: on web sites and cloud services these folders are not persistent and you may lose your files. Either way, this is not a scalable solution; if you add additional instances in the future, you will not have access to the files.
Using blob storage will give you a location that is accessible from both websites, and indeed from the client's browser directly.
You do not specify whether the images need to be accessed securely from the front end or not; if not, blob storage is particularly useful, as the images can be served directly from a public container on Azure Storage.
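If the images do need restricted access, a shared access signature gives a time-limited URL that either site can hand out. A hedged Python sketch with the current azure-storage-blob library (account, key, container and blob names are placeholders; the original answers predate this SDK):

    from datetime import datetime, timedelta, timezone
    from azure.storage.blob import BlobSasPermissions, generate_blob_sas

    sas = generate_blob_sas(
        account_name="myaccount",                  # placeholder storage account
        container_name="images",                   # shared container used by both sites
        blob_name="products/123.jpg",              # placeholder blob
        account_key="<account-key>",
        permission=BlobSasPermissions(read=True),
        expiry=datetime.now(timezone.utc) + timedelta(hours=1),
    )
    print(f"https://myaccount.blob.core.windows.net/images/products/123.jpg?{sas}")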
