GCP project creation via API doesn't enable Service Usage API - node.js

I'm trying to automate the entire process of project creation using the official Google SDK for Node.js. For project creation, I use the Resource Manager SDK:
const resource = new Resource();
const project = resource.project(projectName);
const [, operation,] = await project.create();
I also have to enable some services in order to use them in the process. When I run:
const client = new ServiceUsageClient();
const [operation] = await client.batchEnableServices({
  parent: `projects/${projectId}`,
  serviceIds: [
    "apigateway.googleapis.com",
    "servicecontrol.googleapis.com",
    "servicemanagement.googleapis.com",
  ]
});
I receive:
Service Usage API has not been used in project 1014682171642 before or it is disabled. Enable it by visiting https://console.developers.google.com/apis/api/serviceusage.googleapis.com/overview?project=1014682171642 then retry. If you enabled this API recently, wait a few minutes for the action to propagate to our systems and retry.
I find it suspicious that the Service Usage API isn't enabled by default when I create a project via the API. Obviously, it defeats the purpose of using the API if I have to enable something manually. When I create a project via the Cloud Console, the Service Usage API is enabled by default, so this issue affects only the API. Maybe there's some other way to enable the Service Usage API programmatically.
I would appreciate any form of help.

As described in GCP docs:
When you create a Cloud project using the Cloud Console or Cloud SDK, the following APIs and services are enabled by default...
In your case, you're creating a project with a Client Library. The doc could be clearer: when it mentions Cloud SDK, it actually means the gcloud CLI tool, not the Client Libraries.
To clarify, projects currently created with Client Libraries or with REST don't have any APIs enabled by default.
You can't call Service Usage to enable Service Usage on a project, as making the call would require Service Usage to already be enabled on the resource project.
My suggestion is to follow this flow:
Some process, using application project X (where Service Usage API is enabled), creates the new project Y.
The same process, using application project X, batch enables API services on project Y (a sketch of this flow follows below).
Or:
Automate the process of project creation in some sort of bash script and create the projects using gcloud projects create commands.
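For the first flow, here is a minimal Node.js sketch. It assumes key-x.json is a service account key from application project X (where the Service Usage API is already enabled) and project-y-id is the ID you want for project Y; both names are placeholders, and the service account needs permission to create projects and enable services:
const {Resource} = require('@google-cloud/resource-manager');
const {ServiceUsageClient} = require('@google-cloud/service-usage');

// Placeholder values: a key from project X and the desired ID of project Y.
const keyFilename = 'key-x.json';
const newProjectId = 'project-y-id';

async function createAndEnable() {
  // Create project Y while authenticated with project X's credentials.
  // Add parent: { type: "organization", id: "..." } if the project must live under an org.
  const resource = new Resource({keyFilename});
  const [, operation] = await resource.createProject(newProjectId, {name: newProjectId});
  await operation.promise(); // wait until project Y actually exists

  // Still authenticated via project X, enable APIs on project Y.
  const serviceUsage = new ServiceUsageClient({keyFilename});
  const [enableOperation] = await serviceUsage.batchEnableServices({
    parent: `projects/${newProjectId}`,
    serviceIds: [
      'serviceusage.googleapis.com',
      'servicecontrol.googleapis.com',
      'servicemanagement.googleapis.com',
    ],
  });
  await enableOperation.promise();
}

createAndEnable();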

I wrote a complete block of code, which worked for me. I apologize in advance if the code quality suffers (I probably butchered it) - I literally do not know any Node.js - I compiled it from your code and a couple of examples from the Internet.
const {Resource} = require('@google-cloud/resource-manager');
const {ServiceUsageClient} = require('@google-cloud/service-usage');

const projectId = '<YOUR PROJECT ID>';
const orgId = '<YOUR ORG ID>'; // I had to use an org for my project

const resource = new Resource();

async function create_project() {
  await resource
    .createProject(`${projectId}`, {
      name: `${projectId}`,
      parent: { type: "organization", id: `${orgId}` }
    })
    .then(data => {
      // data[1] is the long-running operation; wait for it to finish
      const operation = data[1];
      return operation.promise();
    })
    .then(data => {
      console.log("Project created successfully!");
      enable_apis();
    });
}

const client = new ServiceUsageClient();

async function enable_apis() {
  const [operation] = await client.batchEnableServices({
    parent: `projects/${projectId}`,
    serviceIds: [
      "serviceusage.googleapis.com",
      "servicecontrol.googleapis.com",
      "servicemanagement.googleapis.com",
    ]
  });
}

create_project();
This successfully creates the project and enables the three APIs. I would make sure the project is fully created before trying to enable APIs (this is just a theory).
Regarding the link you've mentioned earlier, I am going to speculate here, but I think what they meant by Cloud SDK is the gcloud CLI tool, which is part of the Cloud SDK.

Related

Is it possible to programmatically retrieve from service account credentials which APIs are enabled? (in Node.js, not the cloud environment)

I have a service account credentials json file with client_email and private_key.
Is it then possible to programmatically retrieve from the service account credentials which APIs are enabled? I don't mean a solution like going to console.cloud.google.com, but from within Node.js. Thanks!
You will need to know the Project ID as well. The answer from @wardenunleashed is for API Gateway. That does not cover which Google APIs are enabled.
APIs are enabled per project, so you must specify the project to query.
A service account JSON key file contains the Project ID for the project that owns the service account.
The private_key_id is also important. That ID is used to lookup the public key for validating private key signatures.
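If what you want is the list of enabled Google APIs themselves, a minimal sketch using the Service Usage client library could look like this (the key file path and project ID are placeholders, and the key needs permission to list services on that project):
const {ServiceUsageClient} = require('@google-cloud/service-usage');

// Placeholder: a service account key file; its project_id field tells you
// which project the key belongs to.
const client = new ServiceUsageClient({keyFilename: 'service-account.json'});

async function listEnabledApis(projectId) {
  const [services] = await client.listServices({
    parent: `projects/${projectId}`,
    filter: 'state:ENABLED', // only services that are currently enabled
  });
  for (const service of services) {
    console.log(service.config.name); // e.g. pubsub.googleapis.com
  }
}

listEnabledApis('my-project'); // placeholder project ID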
Google has an API Gateway Client Library for NodeJS with the desired capability:
const projectId = 'my-project';
const {ApiGatewayServiceClient} = require('@google-cloud/api-gateway');
const client = new ApiGatewayServiceClient();

async function listApis() {
  const [apis] = await client.listApis({
    parent: `projects/${projectId}/locations/global`,
  });
  for (const api of apis) {
    console.info(`name: ${api.name}`);
  }
}

listApis();

how do you create a google cloud project and service account using node js?

How do you programmatically provision a google cloud project, enable API's and create a service account using node js?
thanks
Chris
The answer lies within the REST API.
Assuming you’re using the Cloud Shell, first install the Node.JS Client Library by running:
npm install googleapis --save
To create a project you can use the Resource Manager API method 'projects.create', as shown in the Node.JS code example there. Replace my-project-id with the desired Project ID, and my-project-name with the desired Project name:
const {google} = require('googleapis');
var cloudResourceManager = google.cloudresourcemanager('v1');

authorize(function(authClient) {
  var request = {
    resource: {
      "projectId": "my-project-id", // TODO
      "name": "my-project-name" // TODO
    },
    auth: authClient,
  };
  cloudResourceManager.projects.create(request, function(err, response) {
    if (err) {
      console.error(err);
      return;
    }
    console.log(JSON.stringify(response, null, 2));
  });
});
Similarly you can use the Cloud IAM API ‘projects.serviceAccounts.create’ method to create Service Accounts. Replace my-project-id with the project ID the Service Account will be associated with, and my-service-account with the desired Service Account ID:
const {google} = require('googleapis');
var iam = google.iam('v1');

authorize(function(authClient) {
  var request = {
    name: 'projects/my-project-id', // TODO
    resource: {
      "accountId": "my-service-account" // TODO
    },
    auth: authClient,
  };
  iam.projects.serviceAccounts.create(request, function(err, response) {
    if (err) {
      console.error(err);
      return;
    }
    console.log(JSON.stringify(response, null, 2));
  });
});
And then, to enable an API or service, use the Service Usage API 'services.enable' method. In this case, I will enable the Cloud Pub/Sub API. Replace 123 with your project number:
const {google} = require('googleapis');
var serviceUsage = google.serviceusage('v1');

authorize(function(authClient) {
  var request = {
    name: "projects/123/services/pubsub.googleapis.com", // TODO
    auth: authClient,
  };
  serviceUsage.services.enable(request, function(err, response) {
    if (err) {
      console.error(err);
      return;
    }
    console.log(JSON.stringify(response, null, 2));
  });
});
Alternatively you may use the 'services.batchEnable' method to enable multiple APIs in a single call. You can find a full list of the APIs you can enable here.
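For example, a batchEnable call in the same callback style could look like the sketch below (the project number and the service IDs are placeholders to replace with your own):
const {google} = require('googleapis');
var serviceUsage = google.serviceusage('v1');

authorize(function(authClient) {
  var request = {
    parent: "projects/123", // TODO: your project number or project ID
    resource: {
      "serviceIds": [
        "pubsub.googleapis.com", // placeholder services to enable
        "storage.googleapis.com"
      ]
    },
    auth: authClient,
  };
  serviceUsage.services.batchEnable(request, function(err, response) {
    if (err) {
      console.error(err);
      return;
    }
    console.log(JSON.stringify(response, null, 2));
  });
});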
You can define each call with:
function authorize(callback) {
  google.auth.getClient({
    scopes: ['https://www.googleapis.com/auth/cloud-platform']
  }).then(client => {
    callback(client);
  }).catch(err => {
    console.error('authentication failed: ', err);
  });
}
Note that you should adapt the code to your needs, simplifying it, and modifying or adding any additional parameters you require for your API calls.
The Deployment Manager allows you to provision all these resources and can be triggered through the API.
There is even an official example on GitHub that does the following:
Creates a new project.
Sets the billing account on the new project.
Sets IAM permissions on the new project.
Turns on a set of apis in the new project.
Creates service accounts in the new project.
Remember to replace the values in the config.yaml file. To get the billing account ID, you can use the billingAccounts.list method, and to get the organization ID, you can use the gcloud organizations list command.
Keep in mind that you will need to set up the requirements specified in the README file of the example repository, but this only needs to be done once. The permissions to the DM Service Account can be set in the Manage Resources section of the Cloud Console.
Once you have changed the required configurations, you can test the deployment with the gcloud deployment-manager deployments create command and capture the request body sent to the Deployments: insert API by adding the --log-http flag. Note that what interests you is the first request; the others are made to check the progress of the operation.
Finally, with the request body contents, you can change the values you need and make this call to the API using Node.js. This post provides examples of how to use the google-api-nodejs-client to create deployments.
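For illustration, a call to the Deployments: insert API from Node.js could look roughly like this sketch (the project ID, deployment name, and config path are placeholders, and any templates referenced by config.yaml would also need to be listed as imports):
const fs = require('fs');
const {google} = require('googleapis');

async function createDeployment() {
  const auth = new google.auth.GoogleAuth({
    scopes: ['https://www.googleapis.com/auth/cloud-platform'],
  });
  const deploymentManager = google.deploymentmanager({version: 'v2', auth});

  // The request body mirrors what --log-http shows for the insert call.
  const res = await deploymentManager.deployments.insert({
    project: 'my-host-project', // placeholder: project running the deployment
    requestBody: {
      name: 'new-project-deployment', // placeholder deployment name
      target: {
        config: {content: fs.readFileSync('config.yaml', 'utf8')},
        // imports: [{name: 'project.py', content: ...}] if the config uses templates
      },
    },
  });
  console.log(res.data);
}

createDeployment();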
The advantage of using the Deployment Manager is that all resources can be created in a single request.
To my knowledge, administering projects using Client Libraries is not yet supported.
You'll need to rely on the gcloud CLI to manage projects and enable APIs to achieve this.
If you are looking to automate this you can still do this using a service account while using gcloud cli.
And if you are looking to build this for internal users you can use a CI/CD tool like Jenkins.
For external users you can use something like Cloud Run to achieve this, although I can't think of a scenario where you would want to do this.

Why are my API calls using google-api-nodejs-client to Google Analytics not working in production?

I'm calling the Google Analytics Reporting API using google-api-nodejs-client to show the number of visits inside a blog.
This blog is hosted inside Google App Engine Standard Environment.
In development, I'm authenticating my API calls using the Application Default Credentials. I downloaded the JSON file with the credentials from the service account I created exclusively for analytics purposes, set the file path in the GOOGLE_APPLICATION_CREDENTIALS environment variable, and everything worked. I'm able to get the data from Google Analytics and display it on the website.
But this is not working in production. I suppose getClient() is not getting the credentials in that environment.
Things to note: 1) I did not upload the downloaded JSON file with the credentials from the service account (I think it would be counter intuitive and unsafe to do that, and from what I understood in the docs, GCP is able to deal automatically with the API authentications);
const {google} = require("googleapis");
async function main () {
// This method looks for the GCLOUD_PROJECT and GOOGLE_APPLICATION_CREDENTIALS
// environment variables.
const auth = await google.auth.getClient({
// Scope of the analytics reporting,
// with only reading access.
scopes: 'https://www.googleapis.com/auth/analytics.readonly',
});
// Create the analytics reporting object
const analyticsreporting = await google.analyticsreporting({
version: 'v4',
auth: auth,
});
// Fetch the analytics reporting
const res = await analyticsreporting.reports.batchGet({...});
return res.data;
}
I've already run out of options. Can someone help me with this?
This is a problem with the default scopes and application default credentials. By default, if you don't create a new service account, you are going to get 'application default credentials' from the GCE metadata service:
https://cloud.google.com/docs/authentication/production#auth-cloud-implicit-nodejs
Those credentials usually only have the cloud-platform scope, and the set of scopes cannot be changed (as of today). To make this work, you have a few options.
You could create a new service account, download the service account key, and use the keyFile property in the getClient method options to reference the key. If you do it this way, the scopes you pass into getClient will be respected.
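As an illustration of that first option, a minimal sketch using a GoogleAuth instance with an explicit key file could look like this (the key file path is a placeholder):
const {google} = require("googleapis");

async function main() {
  // Point explicitly at the downloaded service account key so the
  // requested scopes are respected, instead of relying on the default
  // GAE credentials and their fixed scopes.
  const auth = new google.auth.GoogleAuth({
    keyFile: "/path/to/analytics-service-account.json", // placeholder path
    scopes: ["https://www.googleapis.com/auth/analytics.readonly"],
  });
  const authClient = await auth.getClient();

  const analyticsreporting = google.analyticsreporting({
    version: "v4",
    auth: authClient,
  });

  // ...then call analyticsreporting.reports.batchGet() as before.
}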
You could play with the scopes available to the service account under which your GAE application is running. I haven't personally tried that, but it theoretically should be possible.
Best of luck!

Nodejs cloud vision api PERMISSION_DENIED wrong project #

When I try to run Firebase functions with the Cloud Vision API and test the functions, I get this error:
ERROR: { Error: 7 PERMISSION_DENIED: Cloud Vision API has not been used in project 563584335869 before or it is disabled. Enable it by visiting https://console.developers.google.com/apis/api/vision.googleapis.com/overview?project=563584335869 then retry. If you enabled this API recently, wait a few minutes for the action to propagate to our systems and retry.
I do not recognize this project number and I have already enabled the API with the project that I am using. I set the GOOGLE_APPLICATION_CREDENTIALS using the project with the enabled API. What is it that I'm doing wrong?
For those of you who are still having this issue here is what worked for me:
const vision = require('@google-cloud/vision');

const client = new vision.ImageAnnotatorClient({
  keyFilename: 'serviceAccountKey.json'
});
This error message is usually thrown when the application is not being authenticated correctly due to several reasons such as missing files, invalid credential paths, incorrect environment variables assignations, among other causes.
Based on this, I recommend that you validate that the credential file and file path are being correctly assigned, and follow the Obtaining and providing service account credentials manually guide in order to specify your service account file directly in your code; this way, you can set it permanently and verify that you are passing the service credentials correctly. Additionally, you can take a look at this link, which contains a useful step-by-step guide for using Firebase functions with the Vision API, including the Vision object authentication code for Node.js.
Passing the path to the service account key in code example:
// Imports the Google Cloud client library.
const {Storage} = require('@google-cloud/storage');

// Instantiates a client. Explicitly use service account credentials by
// specifying the private key file. All clients in google-cloud-node have this
// helper, see https://github.com/GoogleCloudPlatform/google-cloud-node/blob/master/docs/authentication.md
const storage = new Storage({
  keyFilename: '/path/to/keyfile.json'
});

// Makes an authenticated API request.
storage
  .getBuckets()
  .then((results) => {
    const buckets = results[0];
    console.log('Buckets:');
    buckets.forEach((bucket) => {
      console.log(bucket.name);
    });
  })
  .catch((err) => {
    console.error('ERROR:', err);
  });
API has not been used in project 563584335869
If you click the link printed in the console, it takes you to this URL:
https://console.developers.google.com/apis/api/vision.googleapis.com/overview?project=firebase-cli
So that project ID means you used the credentials of the 'firebase-cli' project, not yours.
If you tried to set it via the environment variable, you can find the credentials file in your directory:
~/.config/firebase/...credentials.json
And sometimes it is not replaced after you try to override it.
But you can set your credentials in code.
You can find the way to get a credential here:
https://cloud.google.com/iam/docs/creating-managing-service-account-keys
And the credential format looks like this:
{
  "type": "service_account",
  "project_id": "...",
  "private_key_id": "...",
  "private_key": "...",
  "client_email": "...",
  "client_id": "...",
  "auth_uri": "...",
  "token_uri": "...",
  "auth_provider_x509_cert_url": "...",
  "client_x509_cert_url": "..."
}
I've faced exactly the same error you have when I used another Google API, and resolved it that way, by including the credentials in code.
const textToSpeech = require("@google-cloud/text-to-speech");
const keyfile = require(".././my-project.json");

const config = {
  projectId: keyfile.project_id,
  keyFilename: require.resolve(".././my-project.json")
};

const TTS_Client = new textToSpeech.TextToSpeechClient(config);

Google Cloud Function : support for Google Cloud KMS

I am using a Google Cloud Function (GCF) with a Pubsub trigger which sends a HTTP request to a third party API.
The GCF receives notifications from a Pubsub topic used by a service which should not be aware of the third party API.
The third party API requires an authentication using Basic HTTP Authentication.
In order not to have to hardcode the password in my source code, I am using Google KMS to generate a new encrypted key each time I deploy my function, and Google Cloud KMS to decrypt the secret each time the function is instantiated.
For decrypting using KMS I have to provide a private key for a service account to the NodeJS Google API.
My main problem today is that I have to push my private key to the GCloud Bucket if I want my GCF to work properly.
Is it possible by using either the Runtime Configurator or the Deployment Manager to configure secrets for a Google Cloud Function?
Thank you.
As of December 2019, the preferred way to store and manage secrets on Google Cloud is Secret Manager:
$ echo -n "user:pass" | gcloud beta secrets create "my-basic-auth" \
--data-file=- \
--replication-policy "automatic"
You can also create and manage secrets from API:
// Import the library
const {SecretManagerServiceClient} = require('@google-cloud/secret-manager');

// Create the client
const client = new SecretManagerServiceClient();

// Create the secret
const [secret] = await client.createSecret({
  parent: "projects/<YOUR-PROJECT-ID>",
  secretId: "my-basic-auth",
  secret: {
    replication: {
      automatic: {},
    },
  },
});

// Add the version with your data
const [version] = await client.addSecretVersion({
  parent: secret.name,
  payload: {
    data: Buffer.from("user:pass", "utf8"),
  },
});
Then, in your Cloud Function:
const [version] = await client.accessSecretVersion({
  name: "projects/<YOUR-PROJECT-ID>/secrets/<MY-SECRET>/versions/1",
});

const auth = version.payload.data.toString('utf-8');
// auth is user:pass
The service account with which you deploy your Cloud Function will need roles/secretmanager.secretAccessor permissions.
The other solution to this which came out only in the last few months, is to use Google Cloud Runtime Configuration with Firebase for Functions:
https://firebase.google.com/docs/functions/config-env
Firebase for Functions seems to provide access to several features that are not yet available via other means.
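For reference, a Pub/Sub-triggered function reading a value set with firebase functions:config:set could look like this sketch (the thirdparty.auth key and the topic name are placeholder names):
const functions = require('firebase-functions');

// Value set beforehand with:
//   firebase functions:config:set thirdparty.auth="user:pass"
exports.notifyThirdParty = functions.pubsub
  .topic('my-topic') // placeholder topic name
  .onPublish(async (message) => {
    const basicAuth = functions.config().thirdparty.auth;
    // ...use basicAuth in the Basic HTTP Authentication header of the
    // outgoing request to the third party API.
  });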
Runtime Configurator does not charge for use, but enforces the following API limits and quotas:
1200 Queries Per Minute (QPM) for delete, create, and update requests
600 QPM for watch requests.
6000 QPM for get and list requests.
4MB of data per user, which consists of all data written to the Runtime Configurator service and accompanying metadata.
https://cloud.google.com/deployment-manager/pricing-and-quotas#runtime_configurator
As an aside, I find this conflict in the Firebase for Functions comical:
The Firebase SDK for Cloud Functions offers built-in environment configuration to make it easy to store and retrieve this type of data for your project without having to redeploy your functions.
Then a moment later:
After running functions:config:set, you must redeploy functions to make the new configuration available.
The KMS solution is a viable alternative, however it seems costly for functions. KMS is billed at $0.06 per month per active key, as well as $0.03 per 10,000 operations.
This would then change the cost of your Cloud Function from $0.40 per million invocations, to $3.40 per million invocations. That is quite the jump.
https://cloud.google.com/kms/
https://cloud.google.com/functions/
Is it possible by using either the Runtime Configurator or the Deployment Manager to configure secrets for a Google Cloud Function?
There is no built-in service that will let you configure secrets to be directly accessed by Google Cloud Functions at this time, so the method you are currently using is the proper way to handle secrets on Cloud functions for the time being. This could change as the product is still in beta.
If you want you can make a feature request to the Cloud Function team by using the appropriate issue tracker.
There's also a Google Cloud Key Management Service: Node.js Client.
cd functions
npm install @google-cloud/kms
For example:
// Imports the Cloud KMS library
const {KeyManagementServiceClient} = require('@google-cloud/kms');
// firebase-functions provides functions.config() inside Cloud Functions for Firebase
const functions = require('firebase-functions');

// Instantiates a client
const client = new KeyManagementServiceClient();

// Build the location name
const locationName = client.locationPath(
  functions.config().firebase.projectId,
  functions.config().firebase.locationId
);

async function listKeyRings() {
  const [keyRings] = await client.listKeyRings({
    parent: locationName,
  });
  for (const keyRing of keyRings) {
    console.log(keyRing.name);
  }
  return keyRings;
}

return listKeyRings();
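To actually decrypt a ciphertext with this client, as the question describes, a minimal sketch could look like this (the project, location, key ring, key name, and environment variable are all placeholders):
const {KeyManagementServiceClient} = require('@google-cloud/kms');
const client = new KeyManagementServiceClient();

async function decryptBasicAuth() {
  // Placeholder resource name: must match the key used to encrypt the secret.
  const keyName = client.cryptoKeyPath(
    'my-project', 'global', 'my-key-ring', 'my-key');

  // ENCRYPTED_BASIC_AUTH is a placeholder env var holding the base64
  // ciphertext produced at deploy time.
  const [result] = await client.decrypt({
    name: keyName,
    ciphertext: Buffer.from(process.env.ENCRYPTED_BASIC_AUTH, 'base64'),
  });

  return Buffer.from(result.plaintext).toString('utf8'); // e.g. "user:pass"
}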
