cds toolbox - Exception: Missing/incomplete configuration
This is the code:
import cdsapi

c = cdsapi.Client()
c.retrieve(
    'reanalysis-era5-single-levels',
    {
        'product_type': 'reanalysis',
        'variable': [
            '2m_temperature', 'cloud_base_height', 'total_precipitation',
        ],
        'year': [
            '1980', '1981', '1982',
            '1983', '1984', '1985',
            '1986', '1987', '1988',
            '1989', '1990', '1991',
            '1992', '1993', '1994',
            '1995', '1996', '1997',
            '1998', '1999',
        ],
        'month': '10',
        'day': '16',
        'time': '21:00',
        'format': 'netcdf',
    },
    'download.nc')
This is my output:

Exception: Missing/incomplete configuration

The exception means the CDS API credentials file has not been set up. To fix it:

Go to your user directory C:\Users\username.
Create a text file (like sample.txt) and rename it to .cdsapirc, i.e. you have to create a file that has no name, only the extension .cdsapirc (similar to files like .conda).
Open this file with any text editor, like Notepad++.
Paste this into your .cdsapirc file:
url: https://cds.climate.copernicus.eu/api/v2
key: xxxxx:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
You will get your own UID and API key from your user information page, via this URL (or simply click on your username in the upper right corner, which will direct you to the page):
https://cds.climate.copernicus.eu/user/
So in the key line, change the credentials to:
key: UID:API-Key
Your Python file should work now.
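
As a quick check that the credentials themselves are valid, cdsapi also accepts the URL and key directly as constructor arguments, bypassing the .cdsapirc file. A minimal sketch (the key is a placeholder; use your own UID:API-Key string):
import cdsapi

# Pass credentials explicitly instead of reading them from .cdsapirc.
c = cdsapi.Client(
    url='https://cds.climate.copernicus.eu/api/v2',
    key='xxxxx:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx',
)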

Related

Is there a hotkey for adding quotes to Python dictionary in VS Code?

Copied "Query String Parameters" from Chrome Dev Tools, and it was like:
param = {
type: 24
interval_id: 100:90
action:
start: 60
limit: 20
}
To make it work correctly as a dictionary in Python, I have to add commas and quotes like this:
param = {
    'type': '24',
    'interval_id': '100:90',
    'action': '',
    'start': '60',
    'limit': '20',
}
Is there a hotkey to add quotes automatically?
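
No answer is shown here, but one workable approach (an editor's suggestion, not a dedicated hotkey) is VS Code's regex find-and-replace: press Ctrl+H, enable "Use Regular Expression", and run a pattern such as the following over the selected lines. The exact pattern is an assumption; adjust it to your data:
Find: ^(\s*)(\w+): ?(.*)$
Replace: $1'$2': '$3',
Applied to the pasted block above, this quotes each key and value and appends a comma, while leaving the surrounding braces untouched (they contain no colon-separated word, so they never match).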

API to pull the list of supported languages in AWS Translate

I am currently working on a project where I need to translate customer comments into English from the source language on AWS. It is easy to do with AWS Translate, but before I call the translate API, I want to check whether the source language is supported by AWS at all.
One solution is to put all the language codes supported by AWS Translate into a list and then check the source language against this list. This is easy, but it is going to be messy, and I want to make it more dynamic.
So I am thinking of code like this:
import boto3

def translateUserComment(source_language):
    translate = boto3.client(service_name='translate', region_name='region', use_ssl=True)
    languages_supported = translate.<SomeMethod>()
    if source_language in languages_supported:
        result = translate.translate_text(Text="Hello, World",
                                          SourceLanguageCode=source_language, TargetLanguageCode="en")
        print('TranslatedText: ' + result.get('TranslatedText'))
        print('SourceLanguageCode: ' + result.get('SourceLanguageCode'))
        print('TargetLanguageCode: ' + result.get('TargetLanguageCode'))
    else:
        print("The source language is not supported by AWS Translate")
The problem is that I am not able to find any API call that returns the list of languages/language codes supported by AWS Translate.
Before I posted this question:
I tried to search for similar questions on Stack Overflow.
I went through the AWS Translate Developer Guide, but still no luck.
Any suggestion/redirection to the right approach is highly appreciated.
Currently there is no API call for this. The code below works around that: it defines a class Translate_lang holding all the language names and codes listed in the AWS documentation (https://docs.aws.amazon.com/translate/latest/dg/what-is.html). You can import this class into your program and use it by creating an instance:
translate_lang_check.py
class Translate_lang:
    def __init__(self):
        # Language names mapped to AWS Translate language codes.
        self.t_lang = {'Afrikaans': 'af', 'Albanian': 'sq', 'Amharic': 'am',
                       'Arabic': 'ar', 'Armenian': 'hy', 'Azerbaijani': 'az',
                       'Bengali': 'bn', 'Bosnian': 'bs', 'Bulgarian': 'bg',
                       'Catalan': 'ca', 'Chinese (Simplified)': 'zh',
                       'Chinese (Traditional)': 'zh-TW', 'Croatian': 'hr',
                       'Czech': 'cs', 'Danish': 'da', 'Dari': 'fa-AF',
                       'Dutch': 'nl', 'English': 'en', 'Estonian': 'et',
                       'Farsi (Persian)': 'fa', 'Filipino Tagalog': 'tl',
                       'Finnish': 'fi', 'French': 'fr', 'French (Canada)': 'fr-CA',
                       'Georgian': 'ka', 'German': 'de', 'Greek': 'el', 'Gujarati': 'gu',
                       'Haitian Creole': 'ht', 'Hausa': 'ha', 'Hebrew': 'he', 'Hindi': 'hi',
                       'Hungarian': 'hu', 'Icelandic': 'is', 'Indonesian': 'id', 'Italian': 'it',
                       'Japanese': 'ja', 'Kannada': 'kn', 'Kazakh': 'kk', 'Korean': 'ko',
                       'Latvian': 'lv', 'Lithuanian': 'lt', 'Macedonian': 'mk', 'Malay': 'ms',
                       'Malayalam': 'ml', 'Maltese': 'mt', 'Mongolian': 'mn', 'Norwegian': 'no',
                       'Persian': 'fa', 'Pashto': 'ps', 'Polish': 'pl', 'Portuguese': 'pt',
                       'Romanian': 'ro', 'Russian': 'ru', 'Serbian': 'sr', 'Sinhala': 'si',
                       'Slovak': 'sk', 'Slovenian': 'sl', 'Somali': 'so', 'Spanish': 'es',
                       'Spanish (Mexico)': 'es-MX', 'Swahili': 'sw', 'Swedish': 'sv',
                       'Tagalog': 'tl', 'Tamil': 'ta', 'Telugu': 'te', 'Thai': 'th',
                       'Turkish': 'tr', 'Ukrainian': 'uk', 'Urdu': 'ur', 'Uzbek': 'uz',
                       'Vietnamese': 'vi', 'Welsh': 'cy'}

    def check_lang_in_translate(self, given_country):
        # Return the language code for a language name, or None if unsupported.
        if given_country in self.t_lang:
            return self.t_lang[given_country]
        else:
            return None

    def check_lang_code_in_translate(self, given_lang):
        # Return True if the given code is a supported AWS Translate code.
        return given_lang in self.t_lang.values()
You can call and check your lang codes using the methods of this class:
from translate_lang_check import Translate_lang
tl = Translate_lang()
print(tl.check_lang_code_in_translate('en'))
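
One more option worth noting: newer releases of boto3 expose AWS Translate's ListLanguages operation, which makes the hard-coded table unnecessary. A minimal sketch, assuming your boto3 version includes list_languages:
import boto3

translate = boto3.client('translate', region_name='us-east-1')

# Ask the service itself which languages it supports.
response = translate.list_languages(DisplayLanguageCode='en')
supported_codes = {lang['LanguageCode'] for lang in response['Languages']}

print('en' in supported_codes)  # True if English is supported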

Issue while creating a VM instance using Python code in GCP

I am trying to write code that will read values from a CSV file and create VMs in Google Cloud. I am facing problems in two places: while creating tags, if I use 'items': [tag], and while creating the service account scopes; both start giving me errors.
import os, json
import googleapiclient.discovery
from google.oauth2 import service_account
import csv

credentials = service_account.Credentials.from_service_account_file('G:/python/json/mykids-280210.json')
compute = googleapiclient.discovery.build('compute', 'v1', credentials=credentials)

def create_instance(compute, vm_name, image_project, image_family, machinetype, startupscript, zone, network,
                    subnet, project, scope, tag):
    # Get the latest Debian Jessie image.
    image_response = compute.images().getFromFamily(
        project=image_project, family=image_family).execute()
    source_disk_image = image_response['selfLink']

    # Configure the machine
    machine_type = "zones/" + zone + "/machineTypes/" + machinetype
    startup_script = startupscript

    config = {
        'name': vm_name,
        'machineType': machine_type,
        'description': 'This VM was created with python code',
        'tags': {
            'items': ['external', 'home', 'local']  # 'items': [tag] <~~~~~~~~~~~
        },
        'deletionProtection': False,
        'labels': {'env': 'dev', 'server': 'mytower', 'purpose': 'personal'},
        # Specify the boot disk and the image to use as a source.
        'disks': [
            {
                'boot': True,
                'autoDelete': True,
                'initializeParams': {
                    'sourceImage': source_disk_image,
                }
            }
        ],
        # Specify a network interface with NAT to access the public
        # internet.
        'networkInterfaces': [{
            'network': 'global/networks/' + network,
            'subnetwork': 'regions/us-central1/subnetworks/' + subnet,
            'accessConfigs': [
                {'type': 'ONE_TO_ONE_NAT', 'name': 'External NAT'}
            ]
        }],
        # Allow the instance to access cloud storage and logging.
        'serviceAccounts': [{
            'email': 'default',
            'scopes': [
                # 'https://www.googleapis.com/auth/devstorage.read_write', 'https://www.googleapis.com/auth/logging.write'
                # scope <~~~~~~~~~~~~~~~~~~~~
            ]
        }],
        'scheduling': {
            "preemptible": True
        },
        # Metadata is readable from the instance and allows you to
        # pass configuration from deployment scripts to instances.
        'metadata': {
            'items': [{
                # Startup script is automatically executed by the
                # instance upon startup.
                'key': 'startup-script',
                'value': startup_script
            }]
        }
    }

    return compute.instances().insert(
        project=project,
        zone=zone,
        body=config).execute()
# [END create_instance]

with open('vms.csv', newline='') as csvfile:
    data = csv.DictReader(csvfile)
    for row in data:
        vm_name = row['vm_name']
        image_project = row['image_project']
        image_family = row['image_family']
        machinetype = row['machinetype']
        startupscript = row['startupscript']
        zone = row['zone']
        network = row['network']
        subnet = row['subnet']
        project = row['project']
        scope = row['scopes']
        tag = row['tags']
        print(create_instance(compute, vm_name, image_project, image_family, machinetype, startupscript, zone, network,
                              subnet, project, scope, tag))
csvfile.close()
This is the error when I use the scope variable:
G:\python\pythonProject\venv\Scripts\python.exe G:/python/pythonProject/read-excel-gcp/vm/create_vm.py
Traceback (most recent call last):
File "G:\python\pythonProject\read-excel-gcp\vm\create_vm.py", line 100, in <module>
print(create_instance(compute, vm_name, image_project, image_family, machinetype, startupscript, zone, network,
File "G:\python\pythonProject\read-excel-gcp\vm\create_vm.py", line 79, in create_instance
return compute.instances().insert(
File "G:\python\pythonProject\venv\lib\site-packages\googleapiclient\_helpers.py", line 134, in positional_wrapper
return wrapped(*args, **kwargs)
File "G:\python\pythonProject\venv\lib\site-packages\googleapiclient\http.py", line 915, in execute
raise HttpError(resp, content, uri=self.uri)
googleapiclient.errors.HttpError: <HttpError 400 when requesting https://compute.googleapis.com/compute/v1/projects/mykids-280210/zones/us-central1-a/instances?alt=json returned "One or more of the service account scopes are invalid: 'https://www.googleapis.com/auth/devstorage.read_write', 'https://www.googleapis.com/auth/logging.write''". Details: "One or more of the service account scopes are invalid: 'https://www.googleapis.com/auth/devstorage.read_write', 'https://www.googleapis.com/auth/logging.write''">
Process finished with exit code 1
I get a similar error when I use the tag variable.
I have commented out (#) the values to show the way I am passing them in the above code.
Below are my CSV file details:
vm_name,image_project,image_family,machinetype,startupscript,zone,network,subnet,project,scopes,tags
python-vm1,debian-cloud,debian-9,e2-micro,G:/python/json/startup-script.sh,us-central1-a,myvpc,subnet-a,mykids-280210,"https://www.googleapis.com/auth/devstorage.read_write', 'https://www.googleapis.com/auth/logging.write'","external', 'home', 'local'"
python-vm2,debian-cloud,debian-9,e2-micro,G:/python/json/startup-script.sh,us-central1-a,myvpc,subnet-a,mykids-280210,"https://www.googleapis.com/auth/devstorage.read_write', 'https://www.googleapis.com/auth/logging.write'","external', 'home', 'local'"
When the values are passed in directly it works, but when they are passed through the variables it fails, and I am not sure why.
I have marked the problem areas with <~~~~~~~~~~~~.
Please suggest if anyone understands the issue.
@d.s, can you try changing your scope format to something like this:
'serviceAccounts': [
    {
        'email': 'default',
        'scopes': [
            'https://www.googleapis.com/auth/compute',
            'https://www.googleapis.com/auth/servicecontrol',
            'https://www.googleapis.com/auth/service.management.readonly',
            'https://www.googleapis.com/auth/logging.write',
            'https://www.googleapis.com/auth/monitoring.write',
            'https://www.googleapis.com/auth/trace.append',
            'https://www.googleapis.com/auth/devstorage.read_write']}]
The listed scopes are the default scopes that you will need for an instance. I think the problem you are facing is that you were trying to list only two scopes, which are not enough to allow you to deploy your instance.
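
An additional observation on the original error (an editor's note, not part of the answer above): the stray quotes visible in the error message suggest each CSV cell arrives as one string with literal quotes and commas inside it, so row['scopes'] is a single string rather than the list of clean URL strings the API expects; the same applies to row['tags']. A minimal sketch of splitting such a cell before passing it to create_instance, assuming the CSV layout shown in the question:
def split_cell(cell):
    # Split a CSV cell like "a', 'b', 'c" into clean strings,
    # stripping whitespace and stray single quotes.
    return [part.strip().strip("'") for part in cell.split(',')]

scope = split_cell(row['scopes'])  # ['https://www.googleapis.com/auth/devstorage.read_write', ...]
tag = split_cell(row['tags'])      # ['external', 'home', 'local']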

Pulling item properties from a Microsoft SharePoint document library with the Microsoft Graph API

I'm able to successfully pull file metadata from my SharePoint library with the Microsoft Graph API, but am having trouble pulling the properties of an item:
I can get a partial list of properties using this endpoint:
https://graph.microsoft.com/v1.0/sites/{site-id}/drives/{}/items/{}/children?$expand=listItem($expand=fields)
But the list that comes from this endpoint doesn't match the list of properties that exists on the item.
For example, below is the set of fields that comes from that endpoint. You can see that '.Push Too Salsify.' (one of the fields I need) is not present; there are also fields returned here that don't appear in the item properties:
{'ParentLeafNameLookupId': '466', 'CLIPPING_x0020_STATUS': 'Not Started', 'Edit': '0',
 'EditorLookupId': '67', '_ComplianceTagWrittenTime': '',
 'RequiredField': 'teams/WORKFLOWDEMO/Shared Documents/1062CQP6.Phase4/1062CQP-Phase4-Size.tif',
 'PM_x0020_SIGN_x0020_OFF': 'No', 'QA_x0020_APPROVED': 'No', 'ImageWidth': 3648,
 'PM_x0020_Approval_x0020_Status': '-', 'AuthorLookupId': '6', 'SelectedFlag': '0',
 'NameOrTitle': '1062CQP-Phase4-Size.tif', 'ItemChildCount': '0', 'FolderChildCount': '0',
 'LinkFilename': '1062CQP-Phase4-Size.tif', 'ParentVersionStringLookupId': '466',
 'PHOTOSTATUS': 'Not Started', '#odata.etag': '"c4b7516e-64df-46d2-b916-a1ee6f29d24a,8"',
 'Thumbnail': '3648', '_x002e_Approval_x0020_Status_x002e_': 'Approved',
 'Date_x0020_Created': '2019-10-09T04:25:40Z', '_CommentCount': '',
 'Created': '2019-10-09T04:25:33Z', 'PreviewOnForm': '0', '_ComplianceTag': '',
 'FileLeafRef': '1062CQP-Phase4-Size.tif', 'ImageHeight': 3648,
 'LinkFilenameNoMenu': '1062CQP-Phase4-Size.tif', '_ComplianceFlags': '',
 'ContentType': 'Document', 'Preview': '3648', 'ImageSize': '3648',
 'Product_x0020_Category': 'Baseball', 'DATE_x0020_ASSIGNED': '2019-10-09T04:25:40Z',
 'DateCreated': '2019-10-09T04:25:40Z', 'WORKFLOW_x0020_SELECTION': ['Select'],
 'Predecessors': [], 'FileType': 'tif', 'LEGAL_x0020_APPROVED': 'No',
 'PUSH_x0020_READY': False, 'FileSizeDisplay': '74966432', 'id': '466', '_LikeCount': '',
 '_ComplianceTagUserId': '', 'Modified': '2019-10-09T14:41:25Z', 'DocIcon': 'tif',
 '_UIVersionString': '0.7', '_CheckinComment': ''}
Any help would be greatly appreciated. I've scoured the documentation and can't seem to find the correct endpoint to pull item properties from a SharePoint DriveItem.
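
No answer appears in this thread, but one avenue worth checking (an assumption based on Microsoft Graph's documented handling of list item fields, not something confirmed here): the fields expansion returns only a default subset of columns, and further columns can be requested by their internal names with a nested $select. The column name below is hypothetical; look up the real internal name of '.Push Too Salsify.' in the list's column settings:
https://graph.microsoft.com/v1.0/sites/{site-id}/lists/{list-id}/items/{item-id}?$expand=fields($select=Push_x0020_Too_x0020_Salsify)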

How to attach AWS SCP (service control policy) policies to a particular organization unit (OU) using a Python script?

YAML file:
Organization-unit:
  xyz
Organization-unit:
  xyz2
Policies:
  - name: scp-xyz
    description:
  - name: scp-xyz2
    description:
Using Python, how can I attach the scp-xyz policy to only the xyz organization unit under AWS Organizations?
Output (list of policies and organization unit info):
{'Policies': [{'Id': 'p-xw7j86as', 'Arn': '', 'Name': 'scp-xyz', 'Description': 'Allows access to every operation', 'Type': 'SERVICE_CONTROL_POLICY', 'AwsManaged': True}
{'Policies': [{'Id': 'p-xw7j006as', 'Arn': '', 'Name': 'scp-xyz2', 'Description': 'Allows access to every operation', 'Type': 'SERVICE_CONTROL_POLICY', 'AwsManaged': True}
{'OrganizationalUnit': {'Id': 'ou-uwjh-87', 'Arn': 'a', 'Name': 'xyz'}
{'OrganizationalUnit': {'Id': 'ou-uw-87', 'Arn': 'a', 'Name': 'xyz2'}
I have the below sample to attach a policy to an OU:
response = client.attach_policy(
    PolicyId='string',  # policy ID string requires "p-" followed by from 8 to 128 lower-case letters or digits
    TargetId='string'   # ID of OU
)
Can anyone please guide me on how to do this task?
The result should be like below:
(screenshot: the SCP-xyz policy attached to the xyz organization unit)
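
No answer is shown here, but a minimal sketch of the usual boto3 approach, resolving the policy and OU by the names from the YAML file (the names, and the assumption that xyz sits directly under the organization root, come from the question):
import boto3

client = boto3.client('organizations')

# Find the policy ID for scp-xyz among the service control policies.
policies = client.list_policies(Filter='SERVICE_CONTROL_POLICY')['Policies']
policy_id = next(p['Id'] for p in policies if p['Name'] == 'scp-xyz')

# Find the OU named xyz directly under the organization root.
root_id = client.list_roots()['Roots'][0]['Id']
ous = client.list_organizational_units_for_parent(ParentId=root_id)['OrganizationalUnits']
ou_id = next(ou['Id'] for ou in ous if ou['Name'] == 'xyz')

# Attach the SCP to the OU.
response = client.attach_policy(PolicyId=policy_id, TargetId=ou_id)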
