Python Flask Azure mail daemon options and O365 - python-3.x

[UPDATE]
I have implemented the below solution but am running into problems when running as a service due to the manual intervention required from the print output.
Any other suggestions for a solution would be most welcome :(
[ORIGINAL]
I have been looking for a mail daemon type solution for a Flask application. I'd like to integrate with Azure rather than run a local mail server, so the MS Graph API looked to be a great choice. I then came across O365, which looked to be a perfect solution.
So I set up some testing and it works great. I even tried setting up the two-step authentication as an option for login identity (web-app-based authentication interface). This works but is not something I want for this project. I would just like the Flask app to be able to send emails using Azure (app invites, password resets, various mail automations). Moving on...
I found the following authentication method to work perfectly well, and I don't need to handle any tokens or refreshing. However, I can't find anything in the docs to suggest I'd be able to authenticate this way but redirect the user instead of print()ing the URL. I'd like to then capture the redirection in a route flow.
Have I missed something here?
from O365 import Account

credentials = ('CLIENT-ID', 'CLIENT-SECRET')
account = Account(credentials)
if not account.is_authenticated:
    account.authenticate(scopes=['basic', 'message_all'])
print('Authenticated!')
# A URL is printed if not authed; you would then be required to navigate to this URL,
# give consent to the app, then paste the returned URL back into the console.
# I want to avoid this direct console approach.

Update:
As no one responded to my calls for help, I thought this might be useful to some.
I resolved this by using the two-stage authentication flow and TokenBackendStorage; details are here.
I collected the Azure app details via a web front-end. It supplies instructions for configuring the application in Azure and captures the relevant details to set up token-based access.
My issues were threefold: I wasn't setting permissions correctly (delegated vs. application), I wasn't reading the documentation thoroughly enough, and as a result I was implementing a mishmash of auth methods.
This will need refining as it's currently a POC. Storage of the token, for instance, should be encrypted, either on disk or within a table in the DB, and permissions should be set appropriately, following the principle of least privilege.
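As a starting point for that refinement, something like the following could encrypt the token file at rest. This is only a sketch and not part of my solution: the key handling and file names are placeholder assumptions, and in practice the key should live in a proper secrets store.

from cryptography.fernet import Fernet

def encrypt_token_file(path='o365_token.txt', key_path='token.key'):
    # Generate a symmetric key once and keep it outside the web root / repo.
    key = Fernet.generate_key()
    with open(key_path, 'wb') as f:
        f.write(key)
    with open(path, 'rb') as f:
        plaintext = f.read()
    with open(path, 'wb') as f:
        f.write(Fernet(key).encrypt(plaintext))

def decrypt_token_file(path='o365_token.txt', key_path='token.key'):
    with open(key_path, 'rb') as f:
        fernet = Fernet(f.read())
    with open(path, 'rb') as f:
        return fernet.decrypt(f.read())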
Example authentication two-stage flow:
# app, db, mail_conf and logger come from the application; imports shown for completeness.
from flask import flash, redirect, request, url_for
from O365 import Account, FileSystemTokenBackend

@app.route('/mail-config-step-one', methods=['GET'])
def mail_config_step_one():
    if request.method == 'GET':
        scopes = ['message_all']
        try:
            mail_data = db.session.query(mail_conf).one()
        except Exception:
            mail_data = None
        if mail_data:
            application_id = mail_data.client_id
            application_secret = mail_data.client_s
            web_redirect = mail_data.web_redir
            callback = web_redirect
            credentials = (application_id, application_secret)
            token_backend = FileSystemTokenBackend(token_path='./', token_filename='o365_token.txt')
            account = Account(credentials, scopes=scopes, token_backend=token_backend, tenant_id='TENANT ID')
            url, state = account.con.get_authorization_url(redirect_uri=callback)
            # persist the state so mail-config-step-two can validate the callback
            mail_conf.query.filter_by(id=1).update(dict(auth_s=state))
            print(url)
            db.session.commit()
            return redirect(url)
        else:
            return redirect(url_for('login'))
    else:
        return redirect(url_for('login'))
@app.route('/mail-config-step-two/', methods=['GET', 'POST'])
def mail_config_step_two():
    scopes = ['message_all']
    mail_data = db.session.query(mail_conf).one()
    application_id = mail_data.client_id
    application_secret = mail_data.client_s
    web_redirect = mail_data.web_redir
    credentials = (application_id, application_secret)
    token_backend = FileSystemTokenBackend(token_path='./', token_filename='o365_token.txt')
    account = Account(credentials, scopes=scopes, token_backend=token_backend, tenant_id='TENANT ID')
    # retrieve the state saved in mail-config-step-one
    saved_state = mail_data.auth_s
    # rebuild the redirect_uri used in mail-config-step-one
    callback = web_redirect
    result = account.con.request_token(request.url, state=saved_state, redirect_uri=callback)
    # if the result is True, authentication was successful
    # and the auth token is stored in the token backend
    if result:
        flash('MailConfigurationForm: Authentication Was Successful!', 'success')
        return redirect(url_for('configuration'))
    else:
        logger.error('mail_config_step_two failed: token request was unsuccessful')
        return redirect(url_for('configuration'))
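With the token stored by the backend, a later request can send mail without any further interaction. A minimal sketch of the sending side, assuming the same credentials, tenant and FileSystemTokenBackend as above (the recipient and message text are placeholders):

from O365 import Account, FileSystemTokenBackend

credentials = ('CLIENT-ID', 'CLIENT-SECRET')
token_backend = FileSystemTokenBackend(token_path='./', token_filename='o365_token.txt')
account = Account(credentials, scopes=['message_all'], token_backend=token_backend, tenant_id='TENANT ID')

if account.is_authenticated:  # loads the stored token, refreshing it if needed
    m = account.new_message()
    m.to.add('someone@example.com')   # placeholder recipient
    m.subject = 'Password reset'      # placeholder subject
    m.body = 'Automated mail sent from the Flask app.'
    m.send()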

Related

Google Calendar API v3 error after allowing permission using oauth2.0

I followed the quickstart example to integrate my Django app with Google Calendar. The difference from the quickstart to my situation is that I just want to generate a URL and send it back to my user, through:
from google_auth_oauthlib.flow import InstalledAppFlow

SCOPES = ['https://www.googleapis.com/auth/calendar']
flow = InstalledAppFlow.from_client_secrets_file(f"{PATH_TO_FILE}/{CLIENT_SECRET_FILE}", SCOPES)
(auth_url, state) = flow.authorization_url()
if is_dev():
    auth_url += '&redirect_uri=http%3A%2F%2Flocalhost%3A43759%2F'
print(auth_url)
(Note: I added this is_dev option because otherwise no redirect_uri was included.)
I get this printed URL and go through these steps:
1- Open the URL from auth_url printed when I ran the program
2- Choose my user
3- and BAM, I can't proceed (I am redirected to localhost:47759 and can't access it)
What should I do?
We went through one solution; three steps are important to talk about.
1- Create a new credential on Google Cloud, an OAuth 2.0 Client ID for a Web Application, with my local URL as a JavaScript origin and another authorized redirect URL (this redirect solved error number 3 from the question)
2- Also, from some examples I read: to get the user's authorization we send them a URL, and if everything goes OK they are redirected to the endpoint described above
from google_auth_oauthlib.flow import InstalledAppFlow
SCOPES = ['https://www.googleapis.com/auth/calendar']
flow = InstalledAppFlow.from_client_secrets_file(f"{PATH_TO_FILE}/{CLIENT_SECRET_FILE}", SCOPES)
flow.redirect_uri = URL_SAVED_ON_STEP_1
(auth_url, state) = flow.authorization_url()
print(auth_url)
3- And for the URL receiving my code, an endpoint was necessary where we could save the user's credentials and use them if we wanted to add an event to the user's calendar
flow = InstalledAppFlow.from_client_secrets_file(f"{CONFIG_FILES_PATH}/{CLIENT_SECRET_FILE}", SCOPES)
flow.redirect_uri = URL_SAVED_ON_STEP_1
flow.fetch_token(authorization_response=request.url)
creds = flow.credentials
with open(f"{CONFIG_FILES_PATH}/token.json", 'w') as token:
    token.write(creds.to_json())
So we can let any user share their calendar, and we can manage it as far as they allow.
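For completeness, a sketch of how the saved token.json could later be used to insert an event; the token path, calendar ID and event fields are placeholders, not from the answer above:

from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

SCOPES = ['https://www.googleapis.com/auth/calendar']
creds = Credentials.from_authorized_user_file('token.json', SCOPES)  # file written in step 3
service = build('calendar', 'v3', credentials=creds)

event = {  # placeholder event body
    'summary': 'Example event',
    'start': {'dateTime': '2024-01-01T10:00:00Z'},
    'end': {'dateTime': '2024-01-01T11:00:00Z'},
}
service.events().insert(calendarId='primary', body=event).execute()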

Search User Information Across different Microsoft Tenants

I want to be able to search for users across multiple tenants, so my thought was to create a Python script that runs on an HTTP-triggered Azure Function. This Python script can authenticate to the Microsoft Graph API for different tenants via service principals and then search for a user and return the data. Is this a good idea, or is there a better way of doing this?
Let's discuss how to achieve this.
I find that one multi-tenant Azure AD application is enough for querying users in different tenants through the Graph API. For example, with two tenants, I created a multi-tenant application in Azure AD app registration, then generated a client secret and added the User.Read.All API permission.
Now I have an app with its client ID and secret in 'tenant_a'. Next, visit https://login.microsoftonline.com/{tenant_b}/adminconsent?client_id={client-id} in the browser. After signing in with an admin account of tenant_b, a permissions window appears asking for consent so that the application has permission in tenant_b; after the consent, you will see the app created in tenant_a appear in the list of Enterprise applications in tenant_b.
Now we need to generate an access token per tenant to call the Graph API. Generating a token for each tenant is necessary: I tried using 'common' in place of the tenant in the request (https://login.microsoftonline.com/common/oauth2/v2.0/token), and it generates an access token successfully, but that token can't be used in the API to query user information. The query-user API needs the user principal name as the input parameter. For example, I have a user whose account is 'bob@tenant_b.onmicrosoft.com'; using that account as the parameter gets a response, but if I use just 'bob' as the parameter it returns 'Resource xxx does not exist...'.
I'm not an expert in Python; I only found a sample and tested it successfully. Here's my code; it queries in a loop until the user is found. And if you want an Azure Function, you can create an HTTP trigger based on it (see the sketch after the code).
import sys
import json
import logging
import requests
import msal

config = json.load(open(sys.argv[1]))
authorityName = ["<tenant_a>.onmicrosoft.com", "<tenant_b>.onmicrosoft.com"]
username = "userone@<tenant_a>.onmicrosoft.com"

for domainName in authorityName:
    # Create a preferably long-lived app instance which maintains a token cache.
    print("==============:" + config["authority"] + domainName)
    app = msal.ConfidentialClientApplication(
        "<client_id>", authority="https://login.microsoftonline.com/" + domainName,
        client_credential="<client_secret>",
    )
    # The pattern to acquire a token looks like this.
    result = None
    # Firstly, look up a token from the cache.
    # Since we are looking for a token for the current app, NOT for an end user,
    # notice we pass the account parameter as None.
    result = app.acquire_token_silent(["https://graph.microsoft.com/.default"], account=None)
    if not result:
        result = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])
    if "access_token" in result:
        print("access token===:" + result['access_token'])
        # Calling Graph using the access token
        graph_data = requests.get(  # Use token to call downstream service
            "https://graph.microsoft.com/v1.0/users/" + username,
            headers={'Authorization': 'Bearer ' + result['access_token']},
        ).json()
        if "error" in graph_data:
            print("error====" + json.dumps(graph_data, indent=2))
        else:
            print(json.dumps(graph_data, indent=2))
            break  # user found; stop looping over tenants
    else:
        print(result.get("error"))
        print(result.get("error_description"))
        print(result.get("correlation_id"))
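A sketch of the HTTP-triggered Azure Function mentioned above (Python v1 programming model). This is not from the answer; the client ID, secret and tenant list are placeholders:

import json
import msal
import requests
import azure.functions as func

TENANTS = ["<tenant_a>.onmicrosoft.com", "<tenant_b>.onmicrosoft.com"]

def main(req: func.HttpRequest) -> func.HttpResponse:
    username = req.params.get('username')
    if not username:
        return func.HttpResponse("Pass a 'username' query parameter.", status_code=400)
    for domain in TENANTS:
        app = msal.ConfidentialClientApplication(
            "<client_id>",
            authority="https://login.microsoftonline.com/" + domain,
            client_credential="<client_secret>",
        )
        token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])
        if "access_token" not in token:
            continue
        user = requests.get(
            "https://graph.microsoft.com/v1.0/users/" + username,
            headers={"Authorization": "Bearer " + token["access_token"]},
        ).json()
        if "error" not in user:
            return func.HttpResponse(json.dumps(user), mimetype="application/json")
    return func.HttpResponse("User not found in any tenant.", status_code=404)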

google-api-python-client : Generate oauth URL

Working with google-api-python-client for the first time, I'm trying to generate a "link" in my authorization flow that I can pass to a user for them to allow the app to access their calendar; then I need Google to pass the token back to my app.
Currently I have something like this (basically the getcredentials() function from the quickstart demo, with user-specific tokens and WebApplication credentials.json):
def find_creds(user_id):
    creds = None
    token_pickle = f'./credentials/{user_id}.token.pickle'
    if os.path.exists(token_pickle):
        with open(token_pickle, 'rb') as token:
            creds = pickle.load(token)
    if not creds or not creds.valid:
        if creds and creds.expired and creds.refresh_token:
            creds.refresh(Request())
        else:
            flow = InstalledAppFlow.from_client_secrets_file(
                'OAUTH.json', SCOPES)  # Google WebApplication.json for OAUTH
            creds = flow.run_local_server(port=0)
        with open(token_pickle, 'wb') as token:
            pickle.dump(creds, token)
    return creds
This would work great if the user were running my application locally; however, the issue is that it prompts for a login on the server (VM) rather than passing the request to my users. No bueno.
The users are accessing my application through another application (which I don't control), so I can't really serve them a page to authorize the app - though I could pass them a URL/link to click.
This introduces a few new hurdles since the user isn't logging in locally, so I can't just "save" their authorization token.
The Authorization "flow" I'm trying to achieve should be (I think) something like this:
Pass the user a Google authorization URL (I'm not sure how to generate this URL/link, though I think it can be done with the google_auth_oauthlib.flow.InstalledAppFlow class, maybe using authorization_url()?)
User authorizes the app to access a limited scope (calendar)
Google returns the user's token back to my app (I guess this will need to be done via a redirect URI? So I think my server will need to run Apache and have a listener running to collect/store credentials accordingly)
In tackling that first step, I'm already getting stuck though. I suspect that my flow object needs to change but I'm having a difficult time finding documentation on InstalledAppFlow:
Does it sound like I'm on the right track here? Any help/tips (or documentation) on InstalledAppFlow or google.oauth2.credentials class would be helpful too.
I've read through google-auth-library-python so far without figuring it out.
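The flow described above maps roughly onto google_auth_oauthlib's Flow class (the web-server counterpart of InstalledAppFlow). A minimal sketch, assuming a Flask app handles the callback; the client secrets file name, redirect URI and secret key are placeholders:

from flask import Flask, redirect, request, session
from google_auth_oauthlib.flow import Flow

SCOPES = ['https://www.googleapis.com/auth/calendar']
app = Flask(__name__)
app.secret_key = 'replace-me'  # placeholder

def make_flow():
    return Flow.from_client_secrets_file(
        'OAUTH.json', scopes=SCOPES,                        # web-application client secrets
        redirect_uri='https://example.com/oauth2callback')  # placeholder redirect URI

@app.route('/authorize')
def authorize():
    flow = make_flow()
    auth_url, state = flow.authorization_url(access_type='offline', prompt='consent')
    session['state'] = state
    return redirect(auth_url)  # this is the "link" to hand to the user

@app.route('/oauth2callback')
def oauth2callback():
    flow = make_flow()
    flow.fetch_token(authorization_response=request.url)
    creds = flow.credentials  # pickle or otherwise persist per user here
    return 'Authorized'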

Youtube Data API v.3 - fully automated oAuth flow (Python)?

I have been exploring the YouTube Data API. The premise of my project is simple: using the API, authenticate (yes, I have the credentials for the account) and then simply retrieve the list of all my videos, public and private.
I have been able to accomplish this successfully, except for the fully automated part. I have used code from various sources and when I run it on the command line, it provides me a link to be used in a browser so that the authorization takes place.
It looks something like this:
Please visit this URL to authorize this application: https://accounts.google.com/o/oauth2/auth?response_type=code&client_id=7932902759886-cb8ai84grcqshe24nn459ka46uh45ssj.apps.googleusercontent.com&redirect_uri=urn%3Aietf%3Awg%3Aoauth%3A2.0%3Aoob&scope=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fyoutube.readonly&state=zNVvgEyO47nmacvdEEAhDsQipY194k&prompt=consent&access_type=offline&code_challenge=aF7uTCghjwgwjg49o3fgiIU-_ryK19rDeX4l1uzr37w&code_challenge_method=S256
Enter the authorization code:
....
Here's a snippet of my python code:
import google_auth_oauthlib.flow
import googleapiclient.discovery
import googleapiclient.errors
...
...
# Get credentials and create an API client
flow = google_auth_oauthlib.flow.InstalledAppFlow.from_client_secrets_file(
    client_secrets_file, scopes)
credentials = flow.run_console()
youtube = googleapiclient.discovery.build(
    api_service_name, api_version, credentials=credentials)
## MAKE youtube SEARCH REQUEST
last_date = '2018-10-01T00:00:00Z'
request = youtube.search().list(
    part="snippet",
    forMine=True,
    maxResults=50,
    order="date",
    type="video"
)
all_items = []
response = request.execute()
My question here is the following: Is it possible to programmatically perform the authorization so that the app can run standalone and not have to wait for this user action (to literally copy the URL from the console, visit it to get the token, and then copy and paste the token back)? I'd like to schedule this and therefore would like it to run and authenticate without human intervention. Is this possible at all? If so, can someone please point me to some working examples and/or other resources to help me get there? Thanks a million.
# -*- coding: utf-8 -*-
# Sample Python code for youtube.channels.list
# See instructions for running these code samples locally:
# https://developers.google.com/explorer-help/guides/code_samples#python
#!/usr/bin/python3.7
import os
import pickle
import google_auth_oauthlib.flow
import googleapiclient.discovery
import googleapiclient.errors

scopes = ["https://www.googleapis.com/auth/youtube.readonly"]
client_secrets_file = "client_secret.json"
api_service_name = "youtube"
api_version = "v3"
CREDENTIALS_PICKLE_FILE = "credentials.pickle"  # local cache for the credentials object

def main():
    # Disable OAuthlib's HTTPS verification when running locally.
    # *DO NOT* leave this option enabled in production.
    os.environ["OAUTHLIB_INSECURE_TRANSPORT"] = "1"
    # Get credentials and create an API client
    youtube = get_authenticated_service()
    request = youtube.channels().list(
        part="contentDetails",
        mine=True
    )
    response = request.execute()
    print(response)

def get_authenticated_service():
    if os.path.exists(CREDENTIALS_PICKLE_FILE):
        with open(CREDENTIALS_PICKLE_FILE, 'rb') as f:
            credentials = pickle.load(f)
    else:
        flow = google_auth_oauthlib.flow.InstalledAppFlow.from_client_secrets_file(client_secrets_file, scopes)
        credentials = flow.run_console()
        with open(CREDENTIALS_PICKLE_FILE, 'wb') as f:
            pickle.dump(credentials, f)
    return googleapiclient.discovery.build(
        api_service_name, api_version, credentials=credentials)

if __name__ == "__main__":
    main()
The Credentials instance from credentials = flow.run_console() has built-in functionality to refresh the token.
It will refresh the token, if needed, when a request is executed.
Therefore you can save the credentials object into a pickle and read it back when you need it.
A few alterations to Google's Python sample code:
def get_authenticated_service():
    if os.path.exists(CREDENTIALS_PICKLE_FILE):
        with open(CREDENTIALS_PICKLE_FILE, 'rb') as f:
            credentials = pickle.load(f)
    else:
        flow = InstalledAppFlow.from_client_secrets_file(CLIENT_SECRETS_FILE, SCOPES)
        credentials = flow.run_console()
        with open(CREDENTIALS_PICKLE_FILE, 'wb') as f:
            pickle.dump(credentials, f)
    return build(API_SERVICE_NAME, API_VERSION, credentials=credentials)
copied from https://developers.google.com/identity/protocols/OAuth2InstalledApp
Step 3: Google prompts user for consent
In this step, the user decides whether to grant your application the requested access. At this stage, Google displays a consent window that shows the name of your application and the Google API services that it is requesting permission to access with the user's authorization credentials. The user can then consent or refuse to grant access to your application.
Your application doesn't need to do anything at this stage as it waits for the response from Google's OAuth 2.0 server indicating whether the access was granted. That response is explained in the following step.
Where this is important:
At this stage, Google displays a consent window that shows the name of your application and the Google API services that it is requesting permission to access with the user's authorization credentials.
So, at least as I interpret it, what you want to do should not be done for security reasons.
However: you can "simulate" a browser by how ever python have libs for do such. On the other hand: Once you got the auth-token you can re-use it instead of request a new token each time. I couldn't find it in provided doc on GitHub, but Java as example supports to store an obtained token along with its refresh token so it can be reused once obtained and auto-refreshed. Maybe python provides some way to store the obtained token (check if it contains a refresh token) and re-load it. Also: if you load such token, first you have to do is to refresh it before using it. Java provieds a way to just save a refresh token instead of the whole auth-token wich can be used in a later run to automatic obtain a new auth-token.
As response is a JSON maybe you could build some yourself if the lib doesn't already offer this.
// edit
In addition, from https://github.com/googleapis/google-auth-library-python/blob/master/google/oauth2/credentials.py
there are methods to load a credentials object either from "authorized user info" (which, I found elsewhere, can also be loaded from a file) or to load it directly from a file. So I guess you just have to figure out how to store the token. As the doc says for from_authorized_user_file:
Creates a Credentials instance from an authorized user json file.
I guess that means you just have to save the token response you get after the initial authorization was done.
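A minimal sketch of that idea, assuming the credentials obtained once via run_console() are saved as JSON (the token and client secret file names are placeholders):

import os
from google.auth.transport.requests import Request
from google.oauth2.credentials import Credentials
from google_auth_oauthlib.flow import InstalledAppFlow

SCOPES = ["https://www.googleapis.com/auth/youtube.readonly"]
TOKEN_FILE = "token.json"  # placeholder path

def load_credentials():
    if os.path.exists(TOKEN_FILE):
        creds = Credentials.from_authorized_user_file(TOKEN_FILE, SCOPES)
        if creds.expired and creds.refresh_token:
            creds.refresh(Request())   # silent refresh, no browser or console needed
    else:
        flow = InstalledAppFlow.from_client_secrets_file("client_secret.json", SCOPES)
        creds = flow.run_console()     # one-time interactive consent
    with open(TOKEN_FILE, "w") as f:
        f.write(creds.to_json())       # persists the refresh token as well
    return creds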

How to get multiple google credentials (Gmail API, Pub/Sub, Contacts) during signup process

I'm working on a Python-based project that needs access to various Google APIs such as the Google Contacts API, Pub/Sub API, Gmail API etc.
Getting the relevant tokens and credentials with OAuth 2.0 for those APIs is highly manual at the moment via the Google API console. I'd like to automate this for multiple users who are willing to let me manage their Gmail mailbox through the APIs mentioned above (not just the Gmail API).
How can I get the credentials for all these APIs during the signup process, so that I can save the credentials JSON file in a DB and then manage the mailboxes? The "Sign up with Google" feature produces only basic credentials, and I couldn't figure out how to route users to a page where I ask for their permission to access the mailbox with those APIs (Google Contacts, Gmail and Pub/Sub). I'm then planning to use these credentials (object) in my Python script programmatically.
Here is the script where I create the credentials via get_credentials(). As you can see, I first need to manually get the client secret file from the API Console and then generate credentials for the scopes with the following script (this is where I need to automate things and obtain several other credentials during the signup process):
import os

import pandas as pd
from httplib2 import Http
from apiclient import discovery
import oauth2client
from oauth2client import client, tools

SCOPES = 'https://www.googleapis.com/auth/gmail.modify'
CLIENT_SECRET_FILE = "client_secret_pubsub.json"
APPLICATION_NAME = "pub-sub-project-te"

def get_credentials():
    home_dir = os.path.expanduser('~')
    credential_dir = os.path.join(home_dir, '.credentials')
    if not os.path.exists(credential_dir):
        os.makedirs(credential_dir)
    credential_path = os.path.join(credential_dir,
                                   'gmail-python-quickstart.json')
    store = oauth2client.file.Storage(credential_path)
    credentials = store.get()
    if not credentials or credentials.invalid:
        flow = client.flow_from_clientsecrets(CLIENT_SECRET_FILE, SCOPES)
        flow.user_agent = APPLICATION_NAME
        credentials = tools.run_flow(flow, store)  # runs the interactive consent flow
        print('Storing credentials to ' + credential_path)
    return credentials

def pull_emails_from_mailbox(credentials_obj):
    global GMAIL  # the service is reused by prepare_raw_db() below
    credentials = get_credentials()
    http = credentials.authorize(Http())
    GMAIL = discovery.build('gmail', 'v1', http=http)
    user_id = 'me'
    label_id_one = 'INBOX'
    label_id_two = 'UNREAD'
    # Getting all the unread messages from Inbox
    # labelIds can be changed accordingly
    messages = GMAIL.users().messages().list(userId=user_id, maxResults=1000).execute()
    # unread_msgs = GMAIL.users().messages().list(userId='me', labelIds=[label_id_one, label_id_two]).execute()
    # We get a dictionary. Now reading values for the key 'messages'
    mssg_list = messages['messages']
    print("Total messages in inbox: ", str(len(mssg_list)))
    final_list = []
    new_messages = []
    for mssg in mssg_list:
        m_id = mssg['id']  # get id of individual message
        new_messages.append(GMAIL.users().messages().get(userId=user_id, id=m_id).execute())  # fetch the message using the API
    return new_messages

def prepare_raw_db(raw_messages):
    messageId = []
    historyId = []
    raw = []
    print("Total number of emails to be parsed:", len(raw_messages))
    for msg in raw_messages:
        messageId.append(msg["id"])
        historyId.append(msg['historyId'])
        raw.append(msg)
        # 'addLabelIds': ['UNREAD']
        GMAIL.users().messages().modify(userId="me", id=msg["id"], body={'removeLabelIds': ['UNREAD']}).execute()
    msg_dict = {"messageId": messageId, "historyId": historyId, "raw": raw}
    df = pd.DataFrame(msg_dict)
    df.raw = df.raw.astype(str)
    return df
thanks
You have to make a web server to do this. The flow will be the following:
User goes to your web app.
User clicks on Sign in with Google.
User is redirected to the Google OAuth2 URL with the required scopes (in your case, the Google Contacts API, Pub/Sub API, Gmail API etc.).
User gives access to your application created on the Google Developer Console.
It returns a token/code to your application with the required access as per the OAuth2 request.
You can store this in a database and use it as per OAuth2.
Above process is given step-by-step here.
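A sketch of what that looks like with google_auth_oauthlib, requesting all the scopes in one consent screen and reusing the resulting credentials for each API. The redirect URI and callback handling are assumptions; the client secrets file name follows the script above:

from google_auth_oauthlib.flow import Flow

SCOPES = [
    'https://www.googleapis.com/auth/gmail.modify',
    'https://www.googleapis.com/auth/contacts.readonly',
    'https://www.googleapis.com/auth/pubsub',
]

flow = Flow.from_client_secrets_file(
    'client_secret_pubsub.json', scopes=SCOPES,
    redirect_uri='https://example.com/oauth2callback')  # placeholder redirect URI

auth_url, state = flow.authorization_url(access_type='offline', prompt='consent')
print(auth_url)  # send this link to the user during signup
# In the callback route, exchange the code and persist the credentials:
#   flow.fetch_token(authorization_response=request.url)
#   creds = flow.credentials              # store creds.to_json() per user in the DB
# The same creds object then works for each API, e.g.:
#   googleapiclient.discovery.build('gmail', 'v1', credentials=creds)
#   googleapiclient.discovery.build('people', 'v1', credentials=creds)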
Make your scopes a list and then add them like this, for example:
SCOPES = ['https://www.googleapis.com/auth/drive.metadata.readonly', 'https://www.googleapis.com/auth/spreadsheets']
This worked for me, so I hope it helps.
