I'm trying to use @google-cloud/tasks, and I have a piece of code modeled after the Node.js documentation for creating an HTTP Target task. The Cloud Tasks API is enabled, and I've created a queue from the gcloud CLI.
The backend in my case is Cloud Run. I've followed the instructions to configure a service account for HTTP Target handler authentication. The service account has the following permissions:
Cloud Run Invoker
Cloud Tasks Enqueuer
Cloud Tasks Task Runner
Service Account User
Lastly, regarding the section "Using HTTP Target tasks with authentication tokens", I've added:
oidcToken: {
  serviceAccountEmail,
},
to the task object. After all this, the response is still PERMISSION_DENIED: Request had insufficient authentication scopes.
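For completeness, the whole create-task call I'm modeling looks roughly like this (a minimal sketch; the project, region, queue, handler URL, and service account email are all placeholder values):

```javascript
// Sketch only: project, region, queue, URL and service account are placeholders.
const serviceAccountEmail = 'my-invoker@my-project.iam.gserviceaccount.com';

// Pure helper that builds the task object, so it can be inspected on its own.
function buildHttpTask(url, payload) {
  return {
    httpRequest: {
      httpMethod: 'POST',
      url,
      headers: {'Content-Type': 'application/json'},
      body: Buffer.from(JSON.stringify(payload)).toString('base64'),
      oidcToken: {serviceAccountEmail},
    },
  };
}

async function enqueue() {
  // Required inside the function so buildHttpTask stays usable on its own.
  const {CloudTasksClient} = require('@google-cloud/tasks');
  const client = new CloudTasksClient();
  const parent = client.queuePath('my-project', 'my-region', 'my-queue-name');
  const task = buildHttpTask('https://my-service.run.app/handler', {id: 123});
  const [response] = await client.createTask({parent, task});
  console.log(`Created task ${response.name}`);
}
```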
Actually, even if I try something basic, like listing the queues:
process.env.GOOGLE_APPLICATION_CREDENTIALS = '/path-to-my-service-account.json';

const {CloudTasksClient} = require('@google-cloud/tasks');

const client = new CloudTasksClient();
const parent = client.queuePath('my-project', 'my-region', 'my-queue-name');
client.listQueues({parent})
  .then(([queues]) => queues.forEach(q => console.log(q.name)))
  .catch(console.error);
I get the same authentication error. I'm running node inside a Docker container, so maybe that's why Application Default Credentials aren't working? Any idea how I can authenticate successfully?
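One thing worth trying, in case ADC discovery is the problem inside the container, is pointing the client at the key file explicitly via the constructor's keyFilename option instead of relying on the environment variable (the path is a placeholder):

```javascript
// Pass the mounted key file to the client directly; this rules out problems
// with GOOGLE_APPLICATION_CREDENTIALS not being visible inside the container.
async function listQueues() {
  // Required inside the function so this sketch loads without the package.
  const {CloudTasksClient} = require('@google-cloud/tasks');
  const client = new CloudTasksClient({
    keyFilename: '/path-to-my-service-account.json',
  });
  const parent = client.queuePath('my-project', 'my-region', 'my-queue-name');
  const [queues] = await client.listQueues({parent});
  queues.forEach(q => console.log(q.name));
}
```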
Related
I want to trigger a GCP Cloud Function from a simple Node.js app running locally.
Reading the documentation, it should be simple:
run gcloud auth application-default login to write ADC to the file used by client libraries.
use google-auth-library to get an HTTP client to use to trigger the function.
/**
 * TODO(developer): Uncomment these variables before running the sample.
 */
// Example: https://my-cloud-run-service.run.app/books/delete/12345
// const url = 'https://TARGET_HOSTNAME/TARGET_URL';

// Example (Cloud Functions): https://project-region-projectid.cloudfunctions.net/myFunction
const targetAudience = 'https://<REGION>-<PROJECTID>.cloudfunctions.net/<FUNCTIONNAME>';

const {GoogleAuth} = require('google-auth-library');
const auth = new GoogleAuth();

const payload = {prop1: 'prop1Value'};

async function request() {
  // Mint an ID token client for the target audience, then call the function.
  const client = await auth.getIdTokenClient(targetAudience);
  const resp = await client.request({url: targetAudience, method: 'POST', data: payload});
  console.info(`Resp status: ${resp.status}; resp.data: ${resp.data}`);
}

(async () => {
  await request();
})();
My understanding was that google-auth-library would pick up the ADC from the file set up by running gcloud auth application-default login, and everything would work.
My user has permission to invoke GCP functions, as I can trigger the function using curl with the header -H "Authorization: Bearer $(gcloud auth print-identity-token)".
However when I run this, it doesn't get past the line:
const client = await auth.getIdTokenClient(targetAudience);
Failing with:
Cannot fetch ID token in this environment, use GCE or set the GOOGLE_APPLICATION_CREDENTIALS environment variable to a service account credentials JSON file.
Using the PubSub library works fine, so I expect ADC does work; I'm just not sure what I'm missing when trying to trigger the GCP function.
Am I using google-auth-library correctly here?
Thanks
As mentioned in the thread:
gcloud auth activate-service-account --key-file is only for "you" running gcloud commands; it won't be picked up by "applications" that need GOOGLE_APPLICATION_CREDENTIALS. As you can see from Invoke a Google Cloud Run from java or How to call Cloud Run from outside of Cloud Run/GCP?, you either need to have the JSON key file of the Service Account, or have to be running inside a GCE/GKE/Cloud Run/App Engine/GCF instance.
For this to work in your local environment, I recommend logging in with the gcloud auth application-default login command (this command is meant to work as if you've set GOOGLE_APPLICATION_CREDENTIALS locally).
If that doesn't work, as a last resort you can refactor your code to pick up an identity token from an environment variable (if set) while working locally, such as:
$ export ID_TOKEN="$(gcloud auth print-identity-token -q)"
$ ./your-app
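A minimal sketch of that fallback, with an illustrative idTokenFromEnv helper and getAuthHeader wrapper (both names are made up for this example): prefer ID_TOKEN when set, otherwise mint a token through google-auth-library.

```javascript
// Hypothetical fallback: use an ID token from the environment while working
// locally, otherwise mint one through google-auth-library (e.g. on GCE).
function idTokenFromEnv(env) {
  // Pure helper so the fallback decision is easy to test in isolation.
  return env.ID_TOKEN && env.ID_TOKEN.trim() !== '' ? env.ID_TOKEN : null;
}

async function getAuthHeader(targetAudience) {
  const local = idTokenFromEnv(process.env);
  if (local) {
    return {Authorization: `Bearer ${local}`};
  }
  const {GoogleAuth} = require('google-auth-library');
  const auth = new GoogleAuth();
  const client = await auth.getIdTokenClient(targetAudience);
  // getRequestHeaders() returns headers including Authorization.
  return client.getRequestHeaders();
}
```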
To learn more about how the code does it with a JSON key file, refer to the link and the similar implementation there.
For more information, you can refer to a similar thread, which suggests:
Give the default service account access rights to the Workspace resource(s) you're attempting to access.
Use the JSON key file you already set up locally, to have the Cloud Function run as the same user as when you run locally.
Essentially, do a hybrid: create a new service account that ONLY has the permissions you want (instead of using the default service account or your personal user, both of which might have far more permissions than desired for safety/security), use a key file to run the Cloud Function under that identity, and grant only the desired permissions to that service account.
I am trying to use Google Text-to-Speech and other translation services in my Node.js app, but when I connect to the Google API I get this error message:
"Your application has authenticated using end user credentials from the Google Cloud SDK or Google Cloud Shell which are not supported by the texttospeech.googleapis.com. We recommend configuring the billing/quota_project setting in gcloud or using a service account through the auth/impersonate_service_account setting. For more information about service accounts and how to use them in your application, see https://cloud.google.com/docs/authentication/. If you are getting this error with curl or similar tools, you may need to specify 'X-Goog-User-Project' HTTP header for quota and billing purposes. For more information regarding 'X-Goog-User-Project' header, please check https://cloud.google.com/apis/docs/system-parameters.",
metadata: Metadata {
internalRepr: Map(2) {
'google.rpc.errorinfo-bin' => [Array],
'grpc-status-details-bin' => [Array]
},
options: {}
},
note: 'Exception occurred in retry method that was not classified as transient'
}
so after much research I tried to verify that I am authenticating using my service account credentials. I ran this command:
gcloud auth activate-service-account --key-file=./auth/service_acct_key.json
and it shows this
Activated service account credentials for: [firebase-adminsdk-uwecx@xxxxx.iam.gserviceaccount.com]
but when I run the server again:
node server.js
I still get the error.
What is causing this error, and how can I authenticate correctly?
With the gcloud CLI, you have two levels of authentication:
The CLI level
The Google Cloud auth library level (also named ADC, for Application Default Credentials)
When you run a gcloud auth ... command, you are at the CLI level.
When you run a gcloud auth application-default ... command, you are at the ADC level.
In your case, you only set the authentication at the CLI level, and, of course, that authentication isn't detected by your Node app, which uses the Google Cloud libraries and searches for credentials at the ADC level.
When you use a service account key file (which is a bad practice, but too often proposed and shared in tutorials, even in Google Cloud tutorials (...)), you have to set the environment variable GOOGLE_APPLICATION_CREDENTIALS to the absolute path of your service account key file. Try that:
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/auth/service_acct_key.json
node server.js
It should work.
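If you prefer not to rely on the environment variable, most Google Cloud clients also accept the key file path directly in the constructor. A minimal Text-to-Speech sketch under that assumption (the path, voice, and encoding are placeholder choices):

```javascript
// Alternative to exporting GOOGLE_APPLICATION_CREDENTIALS: hand the key file
// path to the client when constructing it. The path below is a placeholder.
async function synthesize(text) {
  // Required inside the function so this sketch loads without the package.
  const textToSpeech = require('@google-cloud/text-to-speech');
  const client = new textToSpeech.TextToSpeechClient({
    keyFilename: '/path/to/auth/service_acct_key.json',
  });
  const [response] = await client.synthesizeSpeech({
    input: {text},
    voice: {languageCode: 'en-US'},
    audioConfig: {audioEncoding: 'MP3'},
  });
  return response.audioContent;
}
```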
Usually in Python, what I do is get the application default credentials and the access token, then refresh the token to be able to authenticate to a private environment.
Code in Python:
# Get the credentials and project details for the GCP project.
credentials, your_project_id = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"])

# Get a request object and refresh the credentials to obtain a token.
auth_req = google.auth.transport.requests.Request()
print(f"Checking Authentication : {credentials.valid}")
print('Refreshing token ....')
credentials.refresh(auth_req)

# Check for valid credentials.
print(f"Checking Authentication : {credentials.valid}")
access_token = credentials.token

credentials = google.oauth2.credentials.Credentials(access_token)
storage_client = storage.Client(project='itg-ri-consumerloop-gbl-ww-dv',
                                credentials=credentials)
I am entirely new to Node.js, and I am trying to do the same thing.
My goal later is to create an App Engine application that exposes an image found in a private bucket, so credentials are a must.
How is this done?
For authentication, you can rely on the default application credentials that are present within the GCP platform (GAE, Cloud Functions, VM, etc.). Then you can just run the following piece of code from the documentation:
const {Storage} = require('@google-cloud/storage');

const storage = new Storage();
const bucket = storage.bucket('albums');
const file = bucket.file('my-existing-file.png');
In most circumstances, there is no need to use authentication packages explicitly, since they are already used under the hood by the @google-cloud/storage package in Node.js. The same holds for the google-cloud-storage package in Python. It can help to look at the source code of both packages on GitHub; for me, this really helped in understanding the authentication mechanism.
When I develop code on my own laptop, that interacts with google cloud storage, I first tell the gcloud SDK what my credentials are and on which GCP project I am working. I use the following commands for this:
gcloud config set project [PROJECT_ID]
gcloud auth application-default login
You could also set GOOGLE_APPLICATION_CREDENTIALS as an environment variable that points to a credentials file. Then within your code, you can pass the project name when initializing the client. This can be helpful if you are running your code outside of GCP, on another server for example.
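A small sketch of that idea, with an illustrative storageOptions helper (the project ID, bucket, and file names are placeholders): on GCP you omit the key file and ADC takes over; on an external server you point it at the key file.

```javascript
// Pure helper: only include keyFilename when one is provided, so the same
// code works on GCP (ADC) and on an external server (explicit key file).
function storageOptions(projectId, keyFilename) {
  const opts = {projectId};
  if (keyFilename) opts.keyFilename = keyFilename;
  return opts;
}

async function readImage(bucketName, fileName) {
  // Required inside the function so this sketch loads without the package.
  const {Storage} = require('@google-cloud/storage');
  const storage = new Storage(
      storageOptions('my-project', process.env.GOOGLE_APPLICATION_CREDENTIALS));
  const [contents] = await storage.bucket(bucketName).file(fileName).download();
  return contents;
}
```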
I'm trying to make some calls to the new Azure Scheduler API. However, all my requests come back with this error:
<Error xmlns="http://schemas.microsoft.com/windowsazure" xmlns:i="http://www.w3.org/2001/XMLSchema-instance">
<Code>AuthenticationFailed</Code>
<Message>The server failed to authenticate the request. Verify that the certificate is valid and is associated with this subscription.</Message>
</Error>
I'm pretty sure that I have everything setup correct because I can make calls using the same code and certificate to the Azure Service Management API.
The code I'm using to attach the certificate to the web request is from the MSDN Sample. The Scheduler API calls that I've tried to make are the Check Name Availability, Create Cloud Service, and Create Job Collection.
I've also verified that my subscription is Active for the preview of the Scheduler.
Here is an example of a request I've tried:
Create Cloud Service
Request: A cloud service is created by submitting an HTTP PUT operation to the CloudServices OData collection of the Service Management API Tenant. Replace <subscription-id> with your subscription ID and <cloud-service-id> with your cloud service ID.
So for this I create a web request pointing to:
https://management.core.windows.net/[MySubId]/cloudServices/[MyNewServiceName]
HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(requestUri);
// Define the required headers to specify the API version and operation type.
request.Headers.Add("x-ms-version", "2012-03-01");
request.Method = "PUT";
request.ContentType = "application/xml";
Next I add the request body as specified in the documentation:
<CloudService xmlns:i='http://www.w3.org/2001/XMLSchema-instance' xmlns='http://schemas.microsoft.com/windowsazure'>
<Label>[MyServiceName]</Label>
<Description>testing</Description>
<GeoRegion>uswest</GeoRegion>
</CloudService>
And finally I attach the certificate that I use with my subscription to the request.
// Attach the certificate to the request.
request.ClientCertificates.Add(certificate);
I try to get the response and instead I get the error shown above.
BTW - I've also tried different regions, thinking maybe it was a region issue since the scheduler isn't supported in all regions, but I still get the same response.
You need to register the scheduler resource provider in your subscription first by calling (PUT):
<subscription id>/services?service=scheduler.JobCollections&action=register
If you want to do this in .NET you can use the new Management libraries:
var schedulerServiceClient = new SchedulerManagementClient(credentials);
var result = schedulerServiceClient.RegisterResourceProvider();
Console.WriteLine(result.RequestId);
Console.WriteLine(result.StatusCode);
Console.ReadLine();
More detail: http://fabriccontroller.net/blog/posts/a-complete-overview-to-get-started-with-the-windows-azure-scheduler/
I have a Pull Task Queue running on App Engine. I am trying to access the queue externally from the NodeJS REST client: https://github.com/google/google-api-nodejs-client
I'm passing my Server API key in with the request:
var googleapis = require('googleapis'),
    API_KEY = '...';

googleapis
  .discover('taskqueue', 'v1beta2')
  .execute(function(err, client) {
    var req = client.taskqueue.tasks.insert({
      project: 'my-project',
      taskqueue: 'pull-queue',
      key: API_KEY
    });
    req.execute(function(err, response) {
      ...
    });
  });
But I am getting back a 401 "Login Required" message. What am I missing?
If I need to use OAuth, how can I get an access token if my client is a NodeJS server instead of user/browser that can process the OAuth redirect URL?
The best way to do this is to take advantage of Service Accounts in GCE. This is a synthetic user account that is usable by anyone in the GCE project. Getting all of the auth lined up can be a little tricky. Here is an example of how to do this in Python.
The general outline of what you need to do:
Start the GCE instance with the task queue OAuth scope.
Add the GCE service account to the task queue ACL in queue.yaml.
Acquire an access token. It looks like you can use the computeclient.js credential object to automate the HTTP call to http://metadata/computeMetadata/v1beta1/instance/service-accounts/default/token
Use this token in any API calls to the task queue API.
I'm not a Node expert, but searching around I found an example of how to connect to the Datastore API from Node using service accounts on GCE. It should be straightforward to adapt this to the task queue API.