Read credentials from AWS with boto3 / botocore

import boto3
from botocore.client import Config

aws_key = config.get('aws_access_key_id')
aws_sec = config.get('aws_secret_access_key')
client = boto3.client(
    's3',
    # Hard-coded strings as credentials, not recommended.
    aws_access_key_id=aws_key,
    aws_secret_access_key=aws_sec
)
I am getting the error:
Traceback (most recent call last):
  File "C:\Freedom\Comparing_Files_in_windows.py", line 18, in <module>
    aws_key = config.get('aws_access_key_id')
NameError: name 'config' is not defined.
I also installed botocore and I still get this error.

You haven't defined config. I assume it is meant to be a configuration object that stores your credentials. However, there are better ways to set your credentials: you can, for example, use an AWS credentials file or set them as environment variables.
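For example, a credentials file at ~/.aws/credentials (%UserProfile%\.aws\credentials on Windows) looks like this; the key values are placeholders:
[default]
aws_access_key_id = YOUR_ACCESS_KEY
aws_secret_access_key = YOUR_SECRET_KEY
Or, as environment variables on Windows, set before running the script:
set AWS_ACCESS_KEY_ID=YOUR_ACCESS_KEY
set AWS_SECRET_ACCESS_KEY=YOUR_SECRET_KEY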
Boto3 will pick up these credentials automatically, so there is no need to extract them explicitly. However, if you do need them, for example for backwards compatibility (that is how I stumbled upon this post), you can retrieve them with the get_credentials method:
import boto3

# The session resolves credentials from the standard provider chain
# (environment variables, shared credentials file, IAM role, ...)
session = boto3.Session()
credentials = session.get_credentials()
access_key = credentials.access_key
secret_key = credentials.secret_key
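One caveat: if the resolved credentials are temporary (for example, from an assumed role) they can refresh underneath you. A short sketch of freezing them for a consistent snapshot:
# Freeze to get a consistent access_key/secret_key/token snapshot
frozen = credentials.get_frozen_credentials()
access_key = frozen.access_key
secret_key = frozen.secret_key
session_token = frozen.token  # None for long-lived credentials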

Related

Azure-quantum authentication is nearly impossible with EnvironmentCredential

The code below, which solves a simple problem on Azure Quantum, never works. Microsoft, please help me with authentication.
from typing import List
from azure.quantum import Workspace
from azure.quantum.optimization import Problem, ProblemType, Term, ParallelTempering
from azure.identity import EnvironmentCredential

problem = Problem(name="My First Problem", problem_type=ProblemType.ising)
workspace = Workspace(
    subscription_id="my-subscription-id",  # Add your subscription_id
    resource_group="AzureQuantum",  # Add your resource_group
    name="my-workspace-name",  # Add your workspace name
    location="my-workspace-location",  # Add your workspace location (for example, "westus")
    credential=EnvironmentCredential(AZURE_USERNAME="my-email-id", AZURE_PASSWORD="my-microsoft-password")
    # credential = ManagedIdentityCredential()
)
terms = [
    Term(c=-9, indices=[0]),
    Term(c=-3, indices=[1, 0]),
    Term(c=5, indices=[2, 0]),
    Term(c=9, indices=[2, 1]),
    Term(c=2, indices=[3, 0]),
    Term(c=-4, indices=[3, 1]),
    Term(c=4, indices=[3, 2])
]
problem.add_terms(terms=terms)
solver = ParallelTempering(workspace, timeout=100)
result = solver.optimize(problem)
print(result)
The above code throws the error:
EnvironmentCredential.get_token failed: EnvironmentCredential authentication unavailable. Environment variables are not fully configured.
Visit https://aka.ms/azsdk/python/identity/environmentcredential/troubleshoot to troubleshoot this issue.
---------------------------------------------------------------------------
CredentialUnavailableError                Traceback (most recent call last)
<ipython-input-19-90cd448f8194> in <module>()
      3 solver = ParallelTempering(workspace, timeout=100)
      4
----> 5 result = solver.optimize(problem)
      6 print(result)

19 frames
/usr/local/lib/python3.7/dist-packages/azure/identity/_credentials/environment.py in get_token(self, *scopes, **kwargs)
    141                 "this issue."
    142             )
--> 143             raise CredentialUnavailableError(message=message)
    144         return self._credential.get_token(*scopes, **kwargs)

CredentialUnavailableError: EnvironmentCredential authentication unavailable. Environment variables are not fully configured.
Visit https://aka.ms/azsdk/python/identity/environmentcredential/troubleshoot to troubleshoot this issue.
The above code works perfectly fine when I do not pass any credentials to the workspace, but that pops up a browser window for authentication. I do not want to click through the browser manually every time I run something just to authenticate. I just want to pass the credentials in code, without having to deal with all the complicated things the docs define for authentication.
Note: I'm passing my email and password to the EnvironmentCredential (obviously I'm not writing confidential info here, such as the subscription id passed to the workspace).
The EnvironmentCredential takes its parameters from environment variables, not from constructor arguments, so to avoid being prompted for credentials you need to set the corresponding environment variables to the correct values and then run your program. Something like:
set AZURE_USERNAME=my-email-id
set AZURE_PASSWORD=my-microsoft-password
python myprogram.py
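One caveat, based on the azure-identity documentation: username/password authentication through EnvironmentCredential also expects the client id of an application registration, so if the error still says the variables are not fully configured you may additionally need (the value is a placeholder):
set AZURE_CLIENT_ID=my-app-client-id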
In your code you would then do something like:
workspace = Workspace(
    subscription_id="my-subscription-id",  # Add your subscription_id
    resource_group="AzureQuantum",  # Add your resource_group
    name="my-workspace-name",  # Add your workspace name
    location="my-workspace-location",  # Add your workspace location (for example, "westus")
    credential=EnvironmentCredential()
)
That being said, an easier way to avoid being prompted for credentials is to install the Azure CLI and log in with az login; that prompts once for credentials and then persists them locally on your machine, so you don't have to log in again.
Optionally, if you are using VS Code you can install the Azure Account extension.
For all these options, you don't need to provide any credential in the Workspace constructor. It will automatically try to discover whether you've set the environment variables or logged in with the CLI or the extension, and use that. It only falls back to prompting in the browser if it can't find anything else.
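With any of those sign-in options in place, the workspace can be created without an explicit credential; a sketch using the same placeholder values as above:
workspace = Workspace(
    subscription_id="my-subscription-id",
    resource_group="AzureQuantum",
    name="my-workspace-name",
    location="my-workspace-location"
)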

I am trying to read a file in a Google Cloud Storage bucket with Python but am getting an error

I am trying to read a file stored in a Google Cloud Storage bucket with Python:
textfile = open("${gcs_bucket}mdm/OFF-B/test.txt", 'r')
times = textfile.read().splitlines()
textfile.close()
print(getcwd())
print(times)
The file is present in that location but I am receiving the following error:
File "/var/cache/tomcat/temp/interpreter-9196592956267519250.tmp", line 3, in <module>
textfile = open("gs://tp-bi-datalake-mft-landing-dev/mdm/OFF-B/test.txt", 'r')
IOError: [Errno 2] No such file or directory: 'gs://tp-bi-datalake-mft-landing-dev/mdm/OFF-B/test.txt'
That's because you are trying to read it as a local file.
To read from Cloud Storage you need to import the library and use the client.
Check this similar Stack Overflow question.
In your case it would be something like:
from google.cloud import storage
# Instantiates a client
client = storage.Client()
bucket_name = 'tp-bi-datalake-mft-landing-dev'
bucket = client.get_bucket(bucket_name)
blob = bucket.get_blob('mdm/OFF-B/test.txt')
downloaded_blob = blob.download_as_string()
print(downloaded_blob)
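Note that download_as_string returns bytes, so to reproduce the times list from the original snippet you would decode it first; a small sketch, assuming the file is UTF-8 encoded:
times = downloaded_blob.decode('utf-8').splitlines()
print(times)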
Also, you will need to install the library, which you can do simply by running pip install google-cloud-storage before you run your code.
You can also find some more Google Cloud Storage Python Samples here.

Why are the Airtable API and Python not working?

I am new to Python and APIs and I am trying to get some data from the API. More specifically I am trying to get some tables from a base. I am following this GitHub repository's instructions:
https://github.com/bayesimpact/airtable-python
So far I have created a virtual environment called mypyth, and in there I have created a script called get_data.py.
This is what my script looks like (I have written API KEY instead of my real API key):
import requests
from airtable import airtable
at = airtable.Airtable('appa3r2UUo4JxpjSv', 'API KEY')
at.get('Campaigns')
Now these are the commands I run on the console:
(mypyth) PS C:\Users\andri\PythonProjects\mypyth> py do_get_account.py
Traceback (most recent call last):
File "do_get_account.py", line 2, in <module>
from airtable import airtable
ModuleNotFoundError: No module named 'airtable'
(mypyth) PS C:\Users\andri\PythonProjects\mypyth>
Does anyone understand why I get this error? Perhaps there is a step I haven't followed? Thanks in advance for any answer.
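The traceback says Python cannot import the airtable module, which usually means the package was never installed into the active mypyth virtual environment. A hedged guess at the fix, using the install name given in that repository's README, before rerunning the script:
(mypyth) PS C:\Users\andri\PythonProjects\mypyth> pip install airtable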

Unable to create s3 bucket using boto3

I'm trying to create an AWS S3 bucket from Python 3 using boto3. create_bucket() is the method I use, yet I get the error botocore.errorfactory.BucketAlreadyExists.
MY CODE:
import boto3

ACCESS_KEY = 'theaccesskey'
SECRET_KEY = 'thesecretkey'

S3 = boto3.client('s3',
                  aws_access_key_id=ACCESS_KEY,
                  aws_secret_access_key=SECRET_KEY)

response = S3.create_bucket(Bucket='mynewbucket',
                            CreateBucketConfiguration={'LocationConstraint': 'ap-south-1'})
ERROR:
botocore.errorfactory.BucketAlreadyExists: An error occurred (BucketAlreadyExists)
when calling the CreateBucket operation: The requested bucket name is not available.
The bucket namespace is shared by all users of the system.
Please select a different name and try again.
However, the bucket does not exist in my account, and it still fails to create the bucket.
EDIT
I found the reason from the link below and also posted it as an answer in order to help someone.
I got it after reading a few articles online. The bucket name must be globally unique; once it satisfies that condition, it works as I expect.
I share this to help anyone who wonders just like me.
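A small sketch of handling the collision in code, reusing the S3 client from the question; the fallback message is illustrative:
try:
    S3.create_bucket(Bucket='mynewbucket',
                     CreateBucketConfiguration={'LocationConstraint': 'ap-south-1'})
except S3.exceptions.BucketAlreadyExists:
    # The name is taken by some other AWS account; bucket names are global
    print('Bucket name is already taken; pick a globally unique name')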
Reference

migrating from boto2 to 3

I have this boto2 code that I need to port to boto3, and frankly I got a little lost in the boto3 docs:
connection = boto.connect_s3(host=hostname,
                             aws_access_key_id=access_key,
                             aws_secret_access_key=secret_key,
                             is_secure=False,
                             calling_format=boto.s3.connection.OrdinaryCallingFormat())
s3_bucket = connection.get_bucket(bucket_name)
I also need to make this work with other object stores that aren't aws S3.
import boto3

s3 = boto3.client('s3',
                  aws_access_key_id=access_key,
                  aws_secret_access_key=secret_key,
                  endpoint_url=hostname,  # endpoint_url must include the scheme, e.g. "http://hostname"
                  use_ssl=False)
# boto3 clients have no get_bucket(); head_bucket() checks the bucket exists and is accessible
response = s3.head_bucket(Bucket=bucket_name)
See the client docs and the s3 docs.
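If you want something closer to boto2's Bucket object, the resource API may be a better fit; a sketch with the same placeholder variables:
import boto3

s3 = boto3.resource('s3',
                    aws_access_key_id=access_key,
                    aws_secret_access_key=secret_key,
                    endpoint_url=hostname,
                    use_ssl=False)
s3_bucket = s3.Bucket(bucket_name)
# Iterate objects, much like boto2's bucket.list()
for obj in s3_bucket.objects.all():
    print(obj.key)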
boto3 and boto are incompatible; most of the naming is NOT backward compatible.
You MUST read the boto3 documentation to recreate your script. The good news is that the boto3 documentation is better than boto's, though not superb (examples for many tricky parameters are not provided).
If you have apps that use old functions, you should create wrapper code to make the switch transparent.
That way you create every object-store connection through the wrapper, then instantiate various buckets using different connectors. Here is the idea:
# AWS
# object_wrapper is your bucket wrapper that all the applications will call
from object_wrapper import object_bucket
from boto3lib.s3 import s3_connector

connector = s3_connector()
bucket = object_bucket(BucketName="xyz", Connector=connector)

# Say you use boto2 to connect to the Google object store
from object_wrapper import object_bucket
from boto2lib.s3 import s3_connector

connector = s3_connector()
bucket = object_bucket(BucketName="xyz", Connector=connector)

# Say for Azure
from object_wrapper import object_bucket
from azure.storage.blob import BlockBlobService

connector = BlockBlobService(......)
bucket = object_bucket(BucketName="xyz", Connector=connector)
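The object_wrapper module above is left undefined in the answer; a minimal hypothetical sketch of what such an adapter could look like (all names here are illustrative, not a real library):
class object_bucket:
    """Thin adapter so callers never touch the SDK-specific connector."""

    def __init__(self, BucketName, Connector):
        self._name = BucketName
        self._connector = Connector

    def list_keys(self):
        # Dispatch to whatever SDK the injected connector wraps
        return self._connector.list_keys(self._name)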
