How to keep the GCS key JSON file safe? - node.js

When I created a bucket, a key file was downloaded and I was told to keep it safe.
Now I cannot hide it behind .env, because in the following code you have to link the JSON file directly to gain access to the GCS bucket:
const path = require('path');
const {Storage} = require('@google-cloud/storage');

const storage = new Storage({
  // Path to the downloaded service account key file
  keyFilename: path.join(__dirname, '/<keyfilename>.json'),
  projectId: '<project ID>'
});
Now I am concerned that when I deploy my app on App Engine, this file may somehow be accessed by someone.
That would be a serious threat, because it gives direct access to my GCS bucket.
Should I be concerned about that file being accessed by anyone?

Instead of using the service account JSON file in App Engine, you can use the App Engine default service account to access GCS buckets or any other service in GCP. By default, the App Engine default service account has the Editor role on the project, so any user account with sufficient permissions to deploy changes to the Cloud project can also run code with read/write access to all resources within that project. However, you can change the service account's permissions through the Console:
- Open the Cloud Console.
- In the Members list, locate the ID of the App Engine default service account. It uses the member ID YOUR_PROJECT_ID@appspot.gserviceaccount.com.
- Use the dropdown menu to modify the roles assigned to the service account.
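With the roles set that way, the Storage client can be constructed without pointing at a key file at all; on App Engine it authenticates as the default service account through Application Default Credentials. A minimal sketch, with a placeholder bucket name:

const {Storage} = require('@google-cloud/storage');

// No keyFilename: on App Engine the client picks up the default service
// account automatically via Application Default Credentials.
const storage = new Storage();

async function listFiles() {
  // 'my-bucket' is a placeholder bucket name
  const [files] = await storage.bucket('my-bucket').getFiles();
  files.forEach(file => console.log(file.name));
}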

Related

How to put the Google Pub/Sub service key file path in a config file and not in an environment variable?

I was going through the documentation for Google Cloud Pub/Sub and found out that the key file has to be stored in an environment variable. https://cloud.google.com/pubsub/docs/quickstart-client-libraries I want to store it in a config.js file so that I don't have to play with environment variables again when I am deploying it on Cloud Run. How can I do that?
My answer isn't exactly what you might expect! In fact, if you run your container on Cloud Run, you don't need a service account key file.
Firstly, a key file isn't secure.
Secondly, you can do almost everything with ADC (Application Default Credentials).
There are some limitations; I wrote an article on this, and another article is under review that narrows these limitations further.
So, when you deploy your Cloud Run revision, use the --service-account parameter to specify the service account email to use, and that's all!
To really answer your question: if you have the key content in config.js, you can manually load the file content and pass it to the library:
const {auth} = require('google-auth-library');
const keys = JSON.parse("YOUR CONTENT");
const client = auth.fromJSON(keys);
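Alternatively, most @google-cloud client libraries accept the parsed key object directly through their credentials option, so you can keep everything in config.js and avoid the environment variable entirely. A small sketch, assuming a hypothetical config.js that exports the parsed key as serviceAccountKey:

const {PubSub} = require('@google-cloud/pubsub');
// Hypothetical config module exporting the parsed service account key
const config = require('./config');

const pubsub = new PubSub({
  projectId: config.serviceAccountKey.project_id,
  credentials: {
    client_email: config.serviceAccountKey.client_email,
    private_key: config.serviceAccountKey.private_key,
  },
});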
If you are running on your local Windows machine, you can go to the environment variables and create one named GOOGLE_APPLICATION_CREDENTIALS, set to the complete path of the service account key JSON file, e.g. C:/keyfolder/sakey.json.
Or you can use the command line given in the example in your link.
To get a service account key file, go to Service Accounts in the GCP console and create a service account. If you already have a service account, just download the key JSON file by clicking ... in the Actions column of Service Accounts.
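Once GOOGLE_APPLICATION_CREDENTIALS is set, the client libraries locate the key file on their own, so no path has to appear in your code. A minimal sketch:

// GOOGLE_APPLICATION_CREDENTIALS=C:/keyfolder/sakey.json is set outside the code
const {PubSub} = require('@google-cloud/pubsub');

// The constructor reads the environment variable and loads the key file automatically.
const pubsub = new PubSub();

async function listTopics() {
  const [topics] = await pubsub.getTopics();
  topics.forEach(topic => console.log(topic.name));
}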

Access issues with Google Cloud Functions read access to Google Cloud Firestore Collection in Default Database

I am trying to write a cloud function in python that would read a collection in Google Cloud Firestore (Native) [not the Realtime Database or Datastore].
I have created a Service Account that has below Roles for the project:
- Project Owner
- Firebase Admin
- Service Account User
- Cloud Functions Developer
- Project Editor
When running locally, I set the service account credentials in my environment via GOOGLE_APPLICATION_CREDENTIALS.
My cloud function is able to access Cloud Storage. I am only having issues with Cloud Firestore.
I have tried using both the Client Python SDK and the Admin SDK (Python). The Admin SDK seems to only be available for the realtime database as it requires a Database URL to connect.
I have tried running both from my dev machine and as a cloud function.
I also changed the Firestore access rules to below for unrestricted access:
service cloud.firestore {
  match /databases/{database}/documents {
    match /{document=**} {
      allow read, write: if true;
    }
  }
}
I am trying to run the same code as in the Google documentation:
from google.cloud import firestore

def process_storage_file(data, context):
    # Add a new document
    db = firestore.Client()
    doc_ref = db.collection(u'users').document(u'alovelace')
    doc_ref.set({
        u'first': u'Ada',
        u'last': u'Lovelace',
        u'born': 1815
    })
    # Then query for documents
    users_ref = db.collection(u'users')
    docs = users_ref.get()
    for doc in docs:
        print(u'{} => {}'.format(doc.id, doc.to_dict()))
I am not able to get the Cloud Function to connect to Google Cloud Firestore. I get the error:
line 3, in raise_from google.api_core.exceptions.PermissionDenied: 403 Missing or insufficient permissions.
Both the cloud function and Firestore are in the same GCP Project.
The service account you specified in the Cloud Function's configuration needs to have the Datastore User role.
First, check if you uploaded the service account's credential JSON file along with your code, and that the GOOGLE_APPLICATION_CREDENTIALS environment variable is also set in your Cloud Function's configuration page. (I know uploading credentials is a bad idea, but you need to put the JSON file somewhere, if you don't want to use the Compute Engine default service account.)
Second, you might want to grant the Cloud Datastore User role (or a similar one) to your service account, instead of Firebase Admin. It seems that the new Firestore can be accessed with the Cloud Datastore roles, rather than the Firebase ones.

Limit Azure Blob Access to WebApp

Situation:
We have a web app on Azure and Blob Storage. Via our web app we write data into the blob, and we currently read that data back out, returning it as responses in the web app.
What we're trying to do:
We are trying to find a way to restrict access to the blob so that only our web app can access it. Currently, setting up an IP address in the firewall settings works fine if we have a static IP (we often test by running the web app locally from our office, and that lets us read/write to the blob just fine). However, when we use the IP address of our web app (as read from the cross-domain page of the web app), we do not get the same access and get errors trying to read/write to the blob.
Question:
Is there a way to restrict access to the blob to the web app without having to set up a VPN on azure (too expensive)? I've seen people talk about using SAS to generate time valid links to blob content, and that makes sense for only allowing users to access content via our web-app (which would then deliver them the link), but that doesn't solve the problem of our web-app not being able to write to the blob when not publicly accessible.
Are we just misusing blobs, or is this a valid way to use them, but you have to do so via the VPN approach?
Another option would be to use Azure AD authentication combined with a managed identity on your App Service.
At the time of writing this feature is still in preview though.
I wrote on article on how to do this: https://joonasw.net/view/azure-ad-authentication-with-azure-storage-and-managed-service-identity.
The key parts:
- Enable Managed Identity
- Grant the generated service principal the necessary role on the storage account/blob container
- Change your code to use AAD access tokens acquired with the managed identity instead of an access key/SAS token
Acquiring the token using https://www.nuget.org/packages/Microsoft.Azure.Services.AppAuthentication/1.1.0-preview:
private async Task<string> GetAccessTokenAsync()
{
    var tokenProvider = new AzureServiceTokenProvider();
    return await tokenProvider.GetAccessTokenAsync("https://storage.azure.com/");
}
Reading a blob using the token:
private async Task<Stream> GetBlobWithSdk(string accessToken)
{
    var tokenCredential = new TokenCredential(accessToken);
    var storageCredentials = new StorageCredentials(tokenCredential);

    // Define the blob to read
    var blob = new CloudBlockBlob(
        new Uri($"https://{StorageAccountName}.blob.core.windows.net/{ContainerName}/{FileName}"),
        storageCredentials);

    // Open a data stream to the blob
    return await blob.OpenReadAsync();
}
SAS keys are the correct way to secure and grant access to your Blob Storage. Contrary to your belief, this will work with a private container. Here's a resource you may find helpful:
http://www.siddharthpandey.net/use-shared-access-signature-to-share-private-blob-in-azure/
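For illustration, here is a rough Node.js sketch of generating a short-lived, read-only SAS URL with the legacy azure-storage package; the account name, key, container and blob names are all placeholders:

const azure = require('azure-storage');

// Placeholders: substitute your own storage account name/key and blob coordinates
const blobService = azure.createBlobService('myaccount', '<ACCOUNT_KEY>');

const start = new Date();
const expiry = new Date(start.getTime() + 60 * 60 * 1000); // valid for one hour

const sasToken = blobService.generateSharedAccessSignature('mycontainer', 'myblob.jpg', {
  AccessPolicy: {
    Permissions: azure.BlobUtilities.SharedAccessPermissions.READ,
    Start: start,
    Expiry: expiry,
  },
});

// Full URL the web app can hand out without exposing the account key
const sasUrl = blobService.getUrl('mycontainer', 'myblob.jpg', sasToken);
console.log(sasUrl);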
Please also review Microsoft's guidelines on securing your Blob storage. This addresses many of the concerns you outline and is a must read for any Azure PaaS developer:
https://learn.microsoft.com/en-us/azure/storage/common/storage-security-guide

Google Cloud userproject access denied error

I want to upload a file from my web site (made with nodejs) to my Google Cloud Storage.
But I get this error:
starting-account-67n988tuygj7@sonproje-1533259273248.iam.gserviceaccount.com does not have storage.objects.create access to yiginlabilgi/sefer.jpg.
You have to set up the right permissions for the bucket. Add the service account starting-account-67n988tuygj7@sonproje-1533259273248.iam.gserviceaccount.com as a member to the bucket yiginlabilgi with the Storage Object Creator role.
Follow Adding a member to a bucket-level policy in order to achieve that.
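If you prefer to script the change instead of using the Console, the Node.js client can add the binding as well. A rough sketch, using the bucket and service account from the error above:

const {Storage} = require('@google-cloud/storage');
const storage = new Storage();

async function grantObjectCreator() {
  const bucket = storage.bucket('yiginlabilgi');

  // Read the current bucket-level IAM policy, append the binding, write it back.
  const [policy] = await bucket.iam.getPolicy();
  policy.bindings.push({
    role: 'roles/storage.objectCreator',
    members: [
      'serviceAccount:starting-account-67n988tuygj7@sonproje-1533259273248.iam.gserviceaccount.com',
    ],
  });
  await bucket.iam.setPolicy(policy);
}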

GCP App Engine Access to GCloud Storage without 'sharing publicly'

I would like to know how to grant a Google Cloud Platform App Engine project permissions to serve content from Google Cloud Storage without setting the Google Cloud Storage bucket permissions to 'share publicly'.
My App Engine project is running Node.js and uses Passport-SAML authentication to authenticate users before allowing them to view content, hence I do not want to set access on an individual user level via IAM. Images and videos are currently served from within a private folder of my app, which is only accessible once users are authenticated. I wish to move these assets to Google Cloud Storage and allow the app to read the files, whilst not providing global access. How should I go about doing this? I failed to find any documentation on it.
I think this might work for you: https://cloud.google.com/storage/docs/access-control/create-signed-urls-program
I can't seem to find the API doc for Node.js (Google is really messing around with their doc URLs). Here's some sample code:
bucket.upload(filename, options, function(err, file, apiResponse) {
  // Signed URL valid for 60 seconds from now
  var mil = Date.now() + 60000;
  var config = {
    action: 'read',
    expires: mil
  };
  file.getSignedUrl(config, function(err, url) {
    if (err) {
      return;
    }
    console.log(url);
  });
});
As stated in the official documentation:
By default, when you create a bucket for your project, your app has all the permissions required to read and write to it.
Whenever you create an App Engine application, there is a default bucket that comes with the following perks:
5GB of free storage.
Free quota for Cloud Storage I/O operations.
By default it is created automatically with your application, but in any case you can follow the same link I shared previously in order to create the bucket. Should you need more than those 5 GB of free storage, you can make it a paid bucket, and you will only be charged for the storage that exceeds the first 5 GB.
Then, you can make use of the Cloud Storage Client Libraries for Node.js and have a look at some nice samples (general samples here or even specific operations over files here) for working with the files inside your bucket.
UPDATE:
Here is a small working example of how to use the Cloud Storage client libraries to retrieve images from your private bucket without making them public, by means of authenticated requests. It works in a Cloud Function, so you should have no issues reproducing the same behavior in App Engine. It does not do exactly what you need, as it only displays the image from the bucket, without any integration inside an HTML file, but you should be able to build something from that (I am not too used to working with Node.js, unfortunately).
I hope this can be of some help too.
'use strict';

const {Storage} = require('@google-cloud/storage');
const storage = new Storage();

exports.imageServer = function imageServer(req, res) {
  // Stream a private object from the bucket straight into the HTTP response
  const file = storage.bucket('<YOUR_BUCKET>').file('<YOUR_IMAGE>');
  const readStream = file.createReadStream();
  res.setHeader('content-type', 'image/jpeg');
  readStream.pipe(res);
};
