How to create a Google API key using Node.js or REST?

I was able to do this using the gcloud CLI:
gcloud --project=some-project alpha services api-keys create
But I could not find any way to do this using googleapis, nor was I able to find any leads in their Node repository, google-api-nodejs-client.
For context, I will be running these functions in AWS Lambda.

I think (!?) that this API is not yet exposed through APIs Explorer:
E.g., the following 404s (NOT_FOUND):
API=apikeys
VER=v2alpha1
curl https://www.googleapis.com/discovery/v1/apis/${API}/${VER}/rest
Unfortunately, until it is, (there's no discovery document and) the API Client library is unable to auto-generate the SDK for it.
It's unclear to me whether this is policy or an oversight.
I recommend you pester the Cloud SDK team on Google's Issue Tracker.
Note:
If you append --log-http to (any) gcloud command, it will display the underlying REST calls for the command. Absent a Google-provided SDK for these methods, you could introspect the API and code the REST calls directly:
gcloud alpha services api-keys create ... \
--project=${PROJECT} \
--log-http
Yields:
==== request start ====
uri: https://apikeys.googleapis.com/v2alpha1/projects/${PROJECT}/keys?alt=json
method: POST
== headers start ==
b'accept': b'application/json'
b'authorization': b'Bearer ya29...'
== headers end ==
== body start ==
== body end ==
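For illustration, a minimal sketch of that REST call in Python (it translates directly to Node.js), assuming Application Default Credentials with the cloud-platform scope; the v2alpha1 surface comes from the log above and is not a published contract:
import google.auth
from google.auth.transport.requests import AuthorizedSession
# Application Default Credentials; the scope below is an assumption.
credentials, project = google.auth.default(
    scopes=['https://www.googleapis.com/auth/cloud-platform']
)
session = AuthorizedSession(credentials)
# POST to the endpoint observed in the --log-http output above.
response = session.post(
    f'https://apikeys.googleapis.com/v2alpha1/projects/{project}/keys',
    json={},  # gcloud sent an empty body
)
response.raise_for_status()
print(response.json())  # gcloud treats this response as a long-running operation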

Related

Unable to fetch Jira Cloud assets through atlassian-python-api

I am working on an Atlassian Jira Cloud product. I have a requirement to get the details of Jira Cloud assets and update a field with some specific key-value pairs. To achieve this, I chose the Python API to interact with Atlassian Jira Cloud and make those changes.
But I am unable to access Jira Cloud assets using the atlassian-python-api package from https://pypi.org/project/atlassian-python-api/. I have tried several versions of it, with the same result.
Here is my sample code.
from atlassian import Jira
# Enter your Jira Cloud site URL and API token
JIRA_URL = "https://<your-domain>.atlassian.net"
JIRA_API_TOKEN = "<your-api-token>"
# Initialize the Jira API client
jira = Jira(
    url=JIRA_URL,
    username="",
    password="",
    cloud=True,
    api_token=JIRA_API_TOKEN,
)
# Fetch a list of all assets
assets = jira.assets().search("")
# Print the asset details in a tabular format
print("Asset ID\tAsset Type\tName\tDescription")
for asset in assets:
    print(f"{asset['id']}\t{asset['type']}\t{asset['name']}\t{asset['description']}")
But I get the error below:
assets = jira.assets().search("")
AttributeError: 'Jira' object has no attribute 'assets'
After that, I tried other entry points such as jira.jira_service_desk().get_all_assets(), jira.get_all_assets(), and jira.cloud.asset_management.get_all_assets(), but every time I hit a similar error: 'Jira' object has no attribute '<attribute>'.
Could you please suggest a way to perform bulk operations on Jira Cloud assets?
FYI, I also went with the Atlassian-provided API (https://developer.atlassian.com/cloud/assetsapi/rest/api-group-assets/#api-asset-get), but it also doesn't help to get them.
I am expecting a solution that enables write operations on Jira Cloud assets.
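One possible fallback, sketched under the assumption that the wrapper simply lacks an assets helper: call the Assets REST API linked above directly with requests and basic auth (email plus API token). The endpoint path below is a placeholder, not a confirmed route; take the exact URL from the Assets API reference:
import requests
JIRA_SITE = "https://<your-domain>.atlassian.net"
EMAIL = "<your-email>"          # Atlassian Cloud basic auth: email + API token
API_TOKEN = "<your-api-token>"
# Placeholder endpoint: substitute the real route from the Assets API docs.
response = requests.get(
    f"{JIRA_SITE}/rest/assets/latest/<endpoint>",
    auth=(EMAIL, API_TOKEN),
    headers={"Accept": "application/json"},
)
response.raise_for_status()
print(response.json())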

How to create a Node.js AWS Lambda function locally using VS Code and not Serverless or the SAM CLI?

I've gone through hundreds of blogs/videos/resources, but nowhere is it mentioned how to create a simple Lambda function for a Node.js REST API locally using VS Code, the AWS Toolkit extension, and the AWS CLI. Is there any way I can create a simple Node.js endpoint on my local machine and run it using the above, and not Serverless or SAM? (There are some internal restrictions, hence I can't use them.)
What you need is to set up an API Gateway and an event trigger for your Lambda that fires whenever an HTTP request comes in. Here are the steps:
1. Look into the Serverless Framework, where you define a serverless.yaml file whose configuration describes how your Lambda gets invoked (in this case, an HTTP event).
2. In your IDE of choice, use the serverless-offline npm package.
3. Your IDE config will look something like this (this example uses the IntelliJ IDE):
[Screenshot: IDE config to start up the Lambda locally]
4. Once you start up the service locally, you should be able to hit the REST endpoint using any REST client, such as Postman.
5. Instead of (4) above, you could also invoke your Lambda function locally using the AWS CLI:
aws lambda invoke /dev/null \
  --endpoint-url http://localhost:3002 \
  --function-name <Your lambda function name> \
  --payload '{<Your payload>}'
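Whichever invocation you choose, the handler contract is the same. A minimal sketch, in Python for illustration (the question targets Node.js, but the API Gateway proxy event/response shape is identical across runtimes):
import json
def handler(event, context):
    # API Gateway passes the HTTP request in `event`; the proxy
    # integration expects a statusCode/body response like this.
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"path": event.get("path"), "ok": True}),
    }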

How to connect to Google Datastore from a script in Python 3

We want to do some work with the data that is in Google Datastore. We already have a database, and we would like to use Python 3 to handle the data and run queries from a script on our development machines. What would be the easiest way to accomplish this?
From the Official Documentation:
You will need to install the Cloud Datastore client library for Python:
pip install --upgrade google-cloud-datastore
Set up authentication by creating a service account and setting an environment variable. This step is easier to follow visually, so please take a look at the official documentation for more info. You can perform it using either the GCP Console or the command line.
Then you will be able to connect to your Cloud Datastore client and use it, as in the example below:
# Imports the Google Cloud client library
from google.cloud import datastore
# Instantiates a client
datastore_client = datastore.Client()
# The kind for the new entity
kind = 'Task'
# The name/ID for the new entity
name = 'sampletask1'
# The Cloud Datastore key for the new entity
task_key = datastore_client.key(kind, name)
# Prepares the new entity
task = datastore.Entity(key=task_key)
task['description'] = 'Buy milk'
# Saves the entity
datastore_client.put(task)
print('Saved {}: {}'.format(task.key.name, task['description']))
As @JohnHanley mentioned, you will find a good example in this Bookshelf app tutorial that uses Cloud Datastore to store its persistent data and metadata for books.
You can create a service account, download the credentials as JSON, and then set an environment variable called GOOGLE_APPLICATION_CREDENTIALS pointing to the JSON file. You can see the details at the link below.
https://googleapis.dev/python/google-api-core/latest/auth.html
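Since the question also asks about running queries, here is a short follow-up sketch, assuming the 'Task' kind written in the example above; the add_filter call is optional and shown only for illustration:
from google.cloud import datastore
client = datastore.Client()
# Query entities of the kind created above; the filter is optional.
query = client.query(kind='Task')
query.add_filter('description', '=', 'Buy milk')
for task in query.fetch(limit=10):
    print(task.key.name, dict(task))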

Get Google App Engine LocationId at runtime

The new Cloud Tasks Python libraries require the location as a task-creation parameter. I can always look up the location and hardcode it, but everything else, including the project name, is available through environment variables. Is there a way to get the locationId (e.g. us-central1) from the Python 3 standard environment?
The REST API (and presumably the Python client library) for App Engine can return the location ID if you know the application name:
https://cloud.google.com/appengine/docs/admin-api/reference/rest/v1/apps/get
The Application object that is returned has a "locationId" key.
However, note that the Cloud Tasks documentation calls out two exceptions to using this identifier verbatim: europe-west and us-central need to be passed to Tasks as europe-west1 and us-central1, respectively.
It's possible to get this information from the Metadata server. Accessing http://metadata.google.internal/computeMetadata/v1/instance/region from your app will return a string of the form 'projects/[numeric-project-id]/regions/[locationId]'.
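A minimal sketch of that lookup, assuming the requests package is available; the Metadata-Flavor header is required or the metadata server rejects the request:
import requests
resp = requests.get(
    'http://metadata.google.internal/computeMetadata/v1/instance/region',
    headers={'Metadata-Flavor': 'Google'},
)
resp.raise_for_status()
# The response looks like 'projects/[numeric-project-id]/regions/[locationId]';
# keep only the final path segment.
location_id = resp.text.rsplit('/', 1)[-1]
print(location_id)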

What is the suggested method to get service versions?

What is the best way to get the list of service versions in Google App Engine in the flex environment (from a service instance, in Python 3)? I want to authenticate using a service account JSON keys file. I need to find the current default version (the one with most of the traffic).
Is there any lib I can use, like googleapiclient.discovery or google.appengine.api.modules? Or should I build it from scratch and call the REST API (apps.services.versions.list) using OAuth? I couldn't find any information in the Google docs.
https://cloud.google.com/appengine/docs/standard/python3/python-differences#cloud_client_libraries
Finally I was able to solve it. Simple things on GAE can become big problems.
SOLUTION:
I have the path to service_account.json set in the GOOGLE_APPLICATION_CREDENTIALS environment variable. Then you can use google.auth.default:
from googleapiclient.discovery import build
import google.auth
# Application Default Credentials are picked up from GOOGLE_APPLICATION_CREDENTIALS
creds, project = google.auth.default(scopes=['https://www.googleapis.com/auth/cloud-platform.read-only'])
service = build('appengine', 'v1', credentials=creds, cache_discovery=False)
data = service.apps().services().get(appsId=APPLICATION_ID, servicesId=SERVICE_ID).execute()
print(data['split']['allocations'])
The return value is an allocations dictionary with version IDs as keys and traffic split fractions as values.
All the best!
You can use Google's Python Client Library to interact with the Google App Engine Admin API in order to get the list of a GAE service's versions.
Once you have google-api-python-client installed, you might want to use the list method to list all services in your application:
list(appsId, pageSize=None, pageToken=None, x__xgafv=None)
The arguments of the method should include the following:
appsId: string, Part of `name`. Name of the resource requested. Example: apps/myapp. (required)
pageSize: integer, Maximum results to return per page.
pageToken: string, Continuation token for fetching the next page of results.
x__xgafv: string, V1 error format. Allowed values: v1 error format, v2 error format
You can find more information on this method in the link mentioned above.
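Putting the two answers together, a short sketch of finding the default version (the biggest traffic share) for each service, reusing the split.allocations structure from the solution above; appsId is assumed to be the project ID returned by google.auth.default:
from googleapiclient.discovery import build
import google.auth
creds, project = google.auth.default(scopes=['https://www.googleapis.com/auth/cloud-platform.read-only'])
appengine = build('appengine', 'v1', credentials=creds, cache_discovery=False)
# For App Engine, appsId is the project ID.
services = appengine.apps().services().list(appsId=project).execute()
for service in services.get('services', []):
    allocations = service['split']['allocations']
    default_version = max(allocations, key=allocations.get)
    print(service['id'], default_version, allocations[default_version])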
