How to filter Google Cloud Instances using google-compute-engine python client? - python-3.x

I am using
Package Version
------------------------ ---------
google-api-core 2.0.1
google-auth 2.0.2
google-cloud-compute 0.5.0
google-compute-engine 2.8.13
to retrieve Google Cloud instance data. I was referring to the docs to get an aggregated list of instances. I wasn't able to filter instances based on the tags of the compute VM instances. Is there a particular way to do it using Python, since there is no mention of it in the documentation?

Please include your code.
You should be able to apply a Filter to AggregatedListInstancesRequest, and you should be able to specify labels; see Filtering searches using labels. I'm confident (!?) the API is consistent.
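A minimal sketch of that approach with a recent google-cloud-compute client (the project ID and label key/value are placeholders; note that network tags, unlike labels, do not appear to be filterable server-side, which matches the answer below):
from google.cloud import compute_v1

client = compute_v1.InstancesClient()
request = compute_v1.AggregatedListInstancesRequest(
    project='my-project',           # placeholder project ID
    filter='labels.env = "prod"',   # label filter; adjust key/value to your labels
)
# aggregated_list yields (zone, InstancesScopedList) pairs across all zones
for zone, scoped_list in client.aggregated_list(request=request):
    for instance in scoped_list.instances:
        print(zone, instance.name)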

The documentation refers to the filter query parameter. I didn't manage to get it to work with tags, but you can still list all the instances and filter them client-side.
In the example below I'm looking for the gke-nginx-1-cluster-846d7440-node tag.
from googleapiclient import discovery
from google.oauth2 import service_account
scopes = ['https://www.googleapis.com/auth/compute']
sa_file = 'alon-lavian-0474ca3b9309.json'
credentials = service_account.Credentials.from_service_account_file(sa_file, scopes=scopes)
service = discovery.build('compute', 'v1', credentials=credentials)
project = 'alon-lavian'
zone = 'us-central1-a'
request = service.instances().list(project=project, zone=zone)
while request is not None:
    response = request.execute()
    for instance in response.get('items', []):
        # 'tags' may be absent when an instance has no network tags
        if 'gke-nginx-1-cluster-846d7440-node' in instance.get('tags', {}).get('items', []):
            print(instance)
    request = service.instances().list_next(previous_request=request, previous_response=response)

Related

Unable to fetch the jira cloud assets through atlassian-python-api

I am working on an Atlassian Jira Cloud product. I have a requirement to get the details of Jira Cloud assets and update a field with some specific key-value pairs. To achieve this, I chose the Python API to interact with Atlassian Jira Cloud and make those changes.
But I am unable to access the Jira Cloud assets using the atlassian-python-api package from https://pypi.org/project/atlassian-python-api/. I even tried different versions of it.
Here is my sample code.
from atlassian import Jira
# Enter your Jira Cloud site URL and API token
JIRA_URL = "https://<your-domain>.atlassian.net"
JIRA_API_TOKEN = "<your-api-token>"
# Initialize the Jira API client
jira = Jira(
    url=JIRA_URL,
    username="",
    password="",
    cloud=True,
    api_token=JIRA_API_TOKEN,
)
# Fetch a list of all assets
assets = jira.assets().search("")
# Print the asset details in a tabular format
print("Asset ID\tAsset Type\tName\tDescription")
for asset in assets:
    print(f"{asset['id']}\t{asset['type']}\t{asset['name']}\t{asset['description']}")
But I am getting the error below.
assets = jira.assets().search("")
AttributeError: 'Jira' object has no attribute 'assets'
After that, I tried different accessors like jira.jira_service_desk().get_all_assets(), jira.get_all_assets(), and jira.cloud.asset_management.get_all_assets(), but every time I hit a corresponding error like 'Jira' object has no attribute <...>.
Could you please suggest a way to make bulk operations on Jira Cloud assets?
FYI, I also tried the Atlassian-provided API - https://developer.atlassian.com/cloud/assetsapi/rest/api-group-assets/#api-asset-get - but it also doesn't help to get them.
I am expecting a solution that allows write operations on Jira Cloud assets.
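In the absence of an answer here, a rough sketch of talking to the Assets (formerly Insight) REST API directly with requests; the workspace-discovery and AQL endpoints below are taken from the Atlassian docs linked above and should be treated as assumptions to verify:
import requests
from requests.auth import HTTPBasicAuth

SITE = 'https://<your-domain>.atlassian.net'         # placeholder site URL
auth = HTTPBasicAuth('<email>', '<your-api-token>')  # placeholder credentials

# Discover the Assets workspace ID (Jira Service Management endpoint)
ws = requests.get(SITE + '/rest/servicedeskapi/assets/workspace', auth=auth).json()
workspace_id = ws['values'][0]['workspaceId']

# Search objects with AQL (endpoint path per the Assets REST docs; verify before use)
url = 'https://api.atlassian.com/jsm/assets/workspace/' + workspace_id + '/v1/object/aql'
resp = requests.post(url, json={'qlQuery': 'objectType = "Laptop"'}, auth=auth)
for obj in resp.json().get('values', []):
    print(obj['id'], obj['label'])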

Building a jump-table for boto3 clients/methods

I'm trying to build a jump table of API methods for a variety of boto3 clients, so I can pass an AWS service name and an already-authenticated low-level boto3 client to my utility code and execute the appropriate method to get a list of resources from the AWS service.
I'm not willing to hand-code and maintain a massive if..elif..else statement with >100 clauses.
I have a dictionary of service names (keys) and API method names (values), like this:
jumpTable = { 'lambda' : 'list_functions' }
I'm passed the service name ('lambda') and a boto3 client object ('client') already connected to the right service in the region and account I need.
I use the dict's get() to find the method name for the service, and then use a standard getattr() on the boto3 client object to get a method reference for the desired API call (which of course vary from service to service):
apimethod = jumpTable.get(service)
methodptr = getattr(client, apimethod)
Sanity-checking says I've got a 'botocore.client.Lambda object' for client (that looks OK to me) and a 'bound method ClientCreator._create_api_method.<locals>._api_call of botocore.client.Lambda' for methodptr, which reports itself as type 'method'.
None of the API methods I'm using require arguments. When I invoke the reference directly:
response = methodptr()
it returns a boto3 ClientError, while invoking it through the client:
response = client.methodptr()
returns a boto3 AttributeError.
Where am I going wrong here?
I'm locked into boto3, Python3, AWS and have to talk to 100s of AWS services, each of which has a different API method that provides the data I need to gather. To an old C coder, a jump-table seems obvious; a more Pythonic approach would be welcome...
The following works for me:
import boto3

apimethod = 'list_functions'  # looked up from the jump table for 'lambda'
client = boto3.Session().client("lambda")
methodptr = getattr(client, apimethod)
methodptr()
Note that the boto3.Session() part is required: when calling boto3.client(..) directly, I get an 'UnrecognizedClientException' exception.
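For completeness, a small self-contained sketch of the jump-table dispatch (the table entries are just examples). It also shows why client.methodptr() raises AttributeError: plain attribute access looks up the literal name 'methodptr' on the client, which is exactly what getattr avoids:
import boto3

# Example jump table: service name -> zero-argument 'list resources' method
jumpTable = {
    'lambda': 'list_functions',
    's3': 'list_buckets',
}

def list_resources(service):
    client = boto3.Session().client(service)  # credentials/region come from the environment
    apimethod = jumpTable.get(service)
    if apimethod is None:
        raise ValueError('no list method registered for ' + service)
    methodptr = getattr(client, apimethod)    # bound method on this client
    return methodptr()

print(list_resources('lambda').get('Functions', []))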

How to connect Google Datastore from a script in Python 3

We want to do some work with the data that is in Google Datastore. We already have a database, and we would like to use Python 3 to handle the data and make queries from a script on our development machines. What would be the easiest way to accomplish this?
From the Official Documentation:
You will need to install the Cloud Datastore client library for Python:
pip install --upgrade google-cloud-datastore
Set up authentication by creating a service account and setting an environment variable. Please take a look at the official documentation for more info about this step; you can perform it using either the GCP console or the command line.
Then you will be able to connect to your Cloud Datastore client and use it, as in the example below:
# Imports the Google Cloud client library
from google.cloud import datastore
# Instantiates a client
datastore_client = datastore.Client()
# The kind for the new entity
kind = 'Task'
# The name/ID for the new entity
name = 'sampletask1'
# The Cloud Datastore key for the new entity
task_key = datastore_client.key(kind, name)
# Prepares the new entity
task = datastore.Entity(key=task_key)
task['description'] = 'Buy milk'
# Saves the entity
datastore_client.put(task)
print('Saved {}: {}'.format(task.key.name, task['description']))
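Since the question is also about queries, a minimal follow-up sketch continuing the snippet above (same client, same 'Task' kind):
# Query entities of kind 'Task' filtered by a property value
query = datastore_client.query(kind='Task')
query.add_filter('description', '=', 'Buy milk')
for task in query.fetch():
    print(task.key.name, task['description'])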
As #JohnHanley mentioned, you will find a good example on this Bookshelf app tutorial that uses Cloud Datastore to store its persistent data and metadata for books.
You can create a service account, download the credentials as JSON, and then set an environment variable called GOOGLE_APPLICATION_CREDENTIALS pointing to the JSON file. You can see the details at the link below.
https://googleapis.dev/python/google-api-core/latest/auth.html
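Alternatively, instead of the environment variable you can point the client at the key file explicitly (the path below is a placeholder):
from google.cloud import datastore

# Build a client directly from the downloaded service account key
datastore_client = datastore.Client.from_service_account_json('/path/to/service-account.json')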

What is suggested method to get service versions

What is the best way to get the list of service versions in Google App Engine in the flex environment (from a service instance, in Python 3)? I want to authenticate using a service account JSON key file. I need to find the current default version (the one receiving most of the traffic).
Is there any library I can use, like googleapiclient.discovery or google.appengine.api.modules? Or should I build it from scratch and request the REST API on apps.services.versions.list using OAuth? I couldn't find any information in the Google docs.
https://cloud.google.com/appengine/docs/standard/python3/python-differences#cloud_client_libraries
Finally I was able to solve it. Simple things on GAE became big problems...
SOLUTION:
I have the path to service_account.json set in the GOOGLE_APPLICATION_CREDENTIALS env variable. Then you can use google.auth.default:
from googleapiclient.discovery import build
import google.auth
creds, project = google.auth.default(scopes=['https://www.googleapis.com/auth/cloud-platform.read-only'])
service = build('appengine', 'v1', credentials=creds, cache_discovery=False)
data = service.apps().services().get(appsId=APPLICATION_ID, servicesId=SERVICE_ID).execute()
print(data['split']['allocations'])
The return value is an allocations dictionary with version IDs as keys and their traffic allocations as values.
All the best!
You can use Google's Python Client Library to interact with the Google App Engine Admin API, in order to get the list of a GAE service versions.
Once you have google-api-python-client installed, you might want to use the list method to list all services in your application:
list(appsId, pageSize=None, pageToken=None, x__xgafv=None)
The arguments of the method should include the following:
appsId: string, Part of `name`. Name of the resource requested. Example: apps/myapp. (required)
pageSize: integer, Maximum results to return per page.
pageToken: string, Continuation token for fetching the next page of results.
x__xgafv: string, V1 error format. Allowed values: v1 error format, v2 error format
You can find more information on this method in the link mentioned above.
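Putting both answers together, a short sketch that lists the versions of one service through the Admin API (appsId is the GCP project ID; the service ID 'default' is a placeholder):
from googleapiclient.discovery import build
import google.auth

creds, project = google.auth.default(
    scopes=['https://www.googleapis.com/auth/cloud-platform.read-only'])
service = build('appengine', 'v1', credentials=creds, cache_discovery=False)

# List all versions of one service in the application
versions = service.apps().services().versions().list(
    appsId=project, servicesId='default').execute()
for v in versions.get('versions', []):
    print(v['id'], v.get('servingStatus'))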

Get list of application packages available for a batch account of Azure Batch

I'm making a Python app that launches a batch.
I want, via user inputs, to create a pool.
For simplicity, I'll just add all the application packages present in the batch account to the pool.
However, I'm not able to get the list of available application packages.
This is the portion of code:
import azure.batch.batch_service_client as batch
from azure.common.credentials import ServicePrincipalCredentials
credentials = ServicePrincipalCredentials(
    client_id='xxxxx',
    secret='xxxxx',
    tenant='xxxx',
    resource="https://batch.core.windows.net/"
)
batch_client = batch.BatchServiceClient(
    credentials,
    base_url=self.AppData['CloudSettings'][0]['BatchAccountURL'])
# Get list of applications
batchApps = batch_client.application.list()
I can create a pool, so the credentials are good and there are applications, but the returned list is empty.
Can anybody help me with this?
Thank you,
Guido
Update:
I tried:
import azure.batch.batch_service_client as batch
batchApps = batch.ApplicationOperations.list(batch_client)
and
import azure.batch.operations as batch_operations
batchApps = batch_operations.ApplicationOperations.list(batch_client)
but neither seems to work; batchApps is always empty.
I don't think it's an authentication issue, since I'd get an error otherwise.
At this point I wonder if it's just a bug in the Python SDK?
The SDK version I'm using is:
azure.batch: 4.1.3
azure: 4.0.0
A screenshot (not reproduced here) showed the empty batchApps variable.
Are these the links you are looking for?
Understanding the application package concept: https://learn.microsoft.com/en-us/azure/batch/batch-application-packages
Since it's the Python SDK in action here: https://learn.microsoft.com/en-us/python/api/azure-batch/azure.batch.operations.applicationoperations?view=azure-python
That page documents both the list and get operations.
Hope this helps.
I haven't tried the Azure Python SDK lately, but the way I solved this was to use the Azure REST API:
https://learn.microsoft.com/en-us/rest/api/batchservice/application/list
For the authorization, I had to create an application and give it access to the Batch service, and then I programmatically generated the token with the following request:
import requests

data = {'grant_type': 'client_credentials',
        'client_id': clientId,
        'client_secret': clientSecret,
        'resource': 'https://batch.core.windows.net/'}
postReply = requests.post('https://login.microsoftonline.com/' + tenantId + '/oauth2/token', data)
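Continuing that snippet, a sketch of calling the "list applications" endpoint with the obtained token (the Batch account URL and api-version value are placeholders to adapt from the REST docs):
token = postReply.json()['access_token']
headers = {'Authorization': 'Bearer ' + token}
# Placeholder account URL, e.g. https://<account>.<region>.batch.azure.com
batch_url = 'https://<account>.<region>.batch.azure.com'
reply = requests.get(batch_url + '/applications',
                     params={'api-version': '2020-09-01.12.0'},  # example version; check the docs
                     headers=headers)
for app in reply.json().get('value', []):
    print(app['id'], app.get('versions'))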
