FastAPI: how to read nested JSON as a dictionary? - python-3.x

I am trying to receive the following JSON:
{
"va": "{1: 5, 2:1, 3:5}"
}
in my main.py I have the following:
from typing import Optional, Dict
from fastapi import FastAPI
from pydantic import BaseModel
class rq(BaseModel):
    va: Dict[str, str]

app = FastAPI(debug=True)

@app.post("/hello")
async def create_item(rq: rq):
    return 1
but I get
"msg": "value is not a valid dict",
"type": "type_error.dict"
How may I receive va as a dict so that I can iterate over it?

When you create a model, every field is actually a key-value pair, so with your example it expects something like this:
{
"va": {"some":"value"}
}
But what you send is
"va": str
So I don't know exactly how you are sending the value, but you are definitely sending a str instead of a Dict[str, str].
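As a rough sketch of the fix (my own example, not part of the original answer): send the mapping as real JSON and declare the value types you expect; Pydantic will coerce the string keys of the JSON object for you.
# Minimal sketch, assuming you control the client and can send proper JSON.
from typing import Dict
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(debug=True)

class Rq(BaseModel):
    va: Dict[int, int]  # JSON object keys arrive as strings; Pydantic coerces them to int

@app.post("/hello")
async def create_item(rq: Rq):
    # rq.va is a real dict here, e.g. {1: 5, 2: 1, 3: 5}
    return {"sum": sum(rq.va.values())}

# Request body that validates:
# {"va": {"1": 5, "2": 1, "3": 5}}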

Related

FastAPI using Pydantic returns {'detail': [{'loc': ['body', 'dict'], 'msg': 'field required', 'type': 'value_error.missing'}]}

Using the code posted below, FastAPI returns a 422 Unprocessable Entity error when I pass a JSON dictionary through a POST request. I am using a Pydantic BaseModel as advised in this documentation: https://fastapi.tiangolo.com/tutorial/body/#__tabbed_3_1
I've shortened the code snippets so it's easier to read and reproduce.
##CLIENT.PY##
import requests
import json

url = 'http://127.0.0.1:8000/'
param1 = 0
param2 = 'hello world'
param3 = True
data = {"param1": param1,
        "param2": param2,
        "param3": param3,
        }
resp = requests.post(url=url, json=json.dumps(data))
print(resp.json())
##MAIN.PY##
from fastapi import FastAPI, UploadFile
from typing import List, Union
import uvicorn
from pydantic import BaseModel

app = FastAPI()

class Params(BaseModel):
    orm_mode = True
    param1: int
    param2: Union[str, None] = None
    param3: Union[bool, None] = None

@app.post("/")
async def API(dict: Params):
    return dict

if __name__ == '__main__':
    uvicorn.run(app='main:app', host='127.0.0.1', port=8000)
Thank you in advance!
I've found similar posts on Stack Overflow, but unfortunately those solutions are not relevant.
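No answer is recorded here, but a likely culprit (my guess, not confirmed in the thread) is the json.dumps call: requests already serializes whatever you pass to json=, so wrapping the dict in json.dumps sends a JSON string instead of a JSON object, and FastAPI then reports a 422. A minimal sketch of the client-side fix, replacing the last two lines of CLIENT.PY above:
# Sketch: pass the dict directly and let requests serialize it.
resp = requests.post(url=url, json=data)
print(resp.json())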

Sending Notifications to certain clients using Server Sent Events in FastApi?

I have been working with server-sent events to send certain types of notifications to only certain clients. I am using the module called sse-starlette to try and achieve this.
I am fairly new to FastAPI, so I am not able to figure out how to send the data to only certain clients instead of broadcasting to everyone.
This is what I have thought of so far:
Subscribe request with query param
localhost:8000/subscribe?id=1
from sse_starlette.sse import EventSourceResponse

class EmitEventModel(BaseModel):
    event_name: str
    event_data: Optional[str] = "No Event Data"
    event_id: Optional[int] = None
    recipient_id: str

async def connection_established():
    yield dict(data="Connection established")

clients = {}

@app.get("/subscribe")
async def loopBackStream(req: Request, id: str = ""):
    clients[id] = EventSourceResponse(connection_established())
    return clients[id]

@app.post("/emit")
async def emitEvent(event: EmitEventModel):
    if clients[event.recipient_id]:
        clients[event.recipient_id](publish_event())
Whenever there is an API call to localhost:8000/emit containing that body, the event should be routed based on the recipient_id.
Of course, this doesn't work so far. Any pointers as to what should be done to achieve this?
sse_starlette for reference:
https://github.com/sysid/sse-starlette/blob/master/sse_starlette/sse.py
The idea here is that you're going to need to identify the recipient_id on the SSE generator. I've slightly modified your code to show what I mean:
from __future__ import annotations

import asyncio
import itertools
from collections import defaultdict
from typing import NoReturn, Optional

from fastapi import FastAPI, Request
from pydantic import BaseModel
from sse_starlette.sse import EventSourceResponse

app = FastAPI()
clients = defaultdict(list)

class EmitEventModel(BaseModel):
    event_name: str
    event_data: Optional[str] = "No Event Data"
    event_id: Optional[int] = None
    recipient_id: str

async def retrieve_events(recipient_id: str) -> NoReturn:
    yield dict(data="Connection established")
    while True:
        if recipient_id in clients and len(clients[recipient_id]) > 0:
            yield clients[recipient_id].pop()
        await asyncio.sleep(1)
        print(clients)

@app.get("/subscribe/{recipient_id}")
async def loopBackStream(req: Request, recipient_id: str):
    return EventSourceResponse(retrieve_events(recipient_id))

@app.post("/emit")
async def emitEvent(event: EmitEventModel):
    clients[event.recipient_id].append(event)
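To show how this might be exercised end to end (my own usage sketch, assuming the app above runs on localhost:8000): one client keeps the SSE stream open for its recipient_id, and a separate request pushes an event that only that client receives.
# Hypothetical usage sketch.
# In one terminal, open the event stream for recipient "1", e.g.:
#   curl -N localhost:8000/subscribe/1
# Then push an event addressed only to recipient "1":
import requests

requests.post(
    "http://localhost:8000/emit",
    json={"event_name": "greeting", "event_data": "hello", "recipient_id": "1"},
)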

How to pass typedDict as arguments in Python FastAPI?

I am working on this simple FastAPI example, and I would like to replace the read_books arguments with an equivalent TypedDict, like the commented-out line.
Is it possible?
from fastapi import FastAPI, Query
from typing import Optional, TypedDict

# would like to define the model for the query parameter in a separate object
class GetModel(TypedDict):
    test: Optional[str]

query_param = GetModel(test=Query('default value', max_length=10, title='titulo', description="description 123", regex="^[a-z]+"))

app = FastAPI()

@app.get("/books")
def read_books(test: Optional[str] = Query('default value', max_length=10, title='titulo', description="description 123", regex="^[a-z]+")):
    # def read_books(query_param):
    """
    Endpoint comment.
    """
    results = {"books": [{"book_name": "The Great Hunt"}, {"book_name": "The Dragon Reborn"}]}
    if test:
        results.update({"test": {"name": test}})
    return results
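No answer is recorded here, but as far as I know FastAPI cannot read query parameters from a TypedDict, because it needs a callable signature (with defaults such as Query(...)) to inspect. A common workaround, sketched below on my own initiative, is to group the parameters in a dependency class and inject it with Depends; the class name BookQueryParams is mine.
# Sketch of the "classes as dependencies" pattern (not from the original thread).
from typing import Optional
from fastapi import Depends, FastAPI, Query

app = FastAPI()

class BookQueryParams:
    def __init__(
        self,
        test: Optional[str] = Query('default value', max_length=10, title='titulo',
                                    description="description 123", regex="^[a-z]+"),
    ):
        self.test = test

@app.get("/books")
def read_books(params: BookQueryParams = Depends()):
    results = {"books": [{"book_name": "The Great Hunt"}, {"book_name": "The Dragon Reborn"}]}
    if params.test:
        results.update({"test": {"name": params.test}})
    return results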

Download a database in Excel format

I am creating an application in which there is a button to download the database. I am working on the back end with FastAPI and MongoDB.
filtered_db = db.collection.find(base_query)
docs = []
async for doc in docs:
    docs.append(Document(**doc).dict())
df = pd.DataFrame(docs)
return Response(content=df, media_type="text/csv")
Here is a sample of my code: it filters the database in MongoDB, then applies a model and turns the result into a DataFrame. But this is not working, could you help me?
I get the error: "AttributeError: 'DataFrame' object has no attribute 'encode'"
First, you are trying to append the retrieved filtered_db data to a docs list, but you're actually iterating over the empty list docs.
It should be:
filtered_db = db.collection.find(base_query)
docs = []
async for doc in filtered_db:
If you are trying to export the data to a .csv file, you can use the DataFrame.to_csv function:
return df.to_csv('output.csv')
If you want your API to show the .csv file in the Response, you can do it like this
return Response(content=df.to_csv(), media_type="text/csv")
By the way, here is a complete FastAPI example
from fastapi import FastAPI, Response
from pydantic import BaseModel
import pandas as pd

app = FastAPI()

class Document(BaseModel):
    id: int
    column1: str
    column2: str

filtered_db = [
    {"id": "1", "column1": "c1-value1", "column2": "c2-value1"},
    {"id": "2", "column1": "c1-value2", "column2": "c2-value2"}
]

async def read_data(data: list):
    docs = []
    for doc in data:
        docs.append(Document(**doc).dict())
    df = pd.DataFrame(data=docs)
    return df

@app.get("/xlsx")
async def get_excel():
    df = await read_data(filtered_db)
    # export data to csv
    # df.to_csv('output.csv')
    return Response(content=df.to_csv(), media_type="text/csv")
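Since the question title asks for Excel specifically, here is a hedged variation of the endpoint above that returns an actual .xlsx workbook instead of CSV; it assumes pandas has an Excel engine such as openpyxl installed, and it reuses read_data and filtered_db from the example.
# Sketch only: same data as above, returned as an .xlsx download.
from io import BytesIO

@app.get("/xlsx-file")
async def get_excel_file():
    df = await read_data(filtered_db)
    buffer = BytesIO()
    df.to_excel(buffer, index=False)  # requires an Excel writer, e.g. openpyxl
    return Response(
        content=buffer.getvalue(),
        media_type="application/vnd.openxmlformats-officedocument.spreadsheetml.sheet",
        headers={"Content-Disposition": "attachment; filename=output.xlsx"},
    )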

Mocking global/module level variables in Python

I have a module that looks like this:
import psycopg2

client = vault_client(vault_url, vault_certs_path, credentials)
vault_data = client.read(vault_path)['data']

def do_thing():
    connection = psycopg2.connect(
        dbname=vault_data['database'],
        host=vault_data['cluster_name'],
        ...
    )
How do I test this do_thing method? I need to mock out vault_data and the import of psycopg2. I need to ensure:
That the psycopg2.connect method receives the right arguments
How do I mock the vault_client method to return a mock that then returns a dictionary when the read method is called on it?
I have this, but the real methods still get called:
#mock.patch("sources.segment.handler")
#mock.patch("sources.segment.handler.psycopg2")
def test_attempts_to_connect_to_redshift(self, mock_psycopg2, mock_handler):
mock_handler.vault_client.return_value = {
"data": {
"database": "some_database",
"cluster_name": "some_cluster_name",
"port": "some_port",
"username": "some_username",
"password": "some_password",
}
}
do_thing()
mock_psycopg2.connect.assert_called_with("some database")
...
It seems that the problem here is that it's too late to mock something by the time the module with do_thing() has already been imported. You can try the following trick:
import sys
import unittest
from unittest.mock import MagicMock

psycopg2_mock = MagicMock()
sys.modules['psycopg2'] = psycopg2_mock

import module_with_do_thing

del sys.modules['psycopg2']

class TestSomething(unittest.TestCase):
    ...

    def test_attempts_to_connect_to_redshift(self):
        ...
        assert psycopg2_mock.has_required_properties
Moving the import module_with_do_thing line to the actual test method with the right patch might work as well.
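For completeness, here is a hedged sketch of how the import trick can be combined with patching the module-level vault_data once the module is loaded. The module path sources.segment.handler comes from the question, and the sketch assumes the vault client dependency is also stubbed out (for example via sys.modules) so the module-level code can run at import time.
import sys
import unittest
from unittest import mock

# Stub psycopg2 (and, if needed, the vault client module) before importing
# the module under test, so its module-level code never touches the real
# dependencies.
psycopg2_mock = mock.MagicMock()
sys.modules['psycopg2'] = psycopg2_mock

import sources.segment.handler as handler  # assumed module path from the question


class TestDoThing(unittest.TestCase):
    # Replace the module-level vault_data with a plain dict for this test only.
    @mock.patch.object(handler, "vault_data", {"database": "some_database",
                                               "cluster_name": "some_cluster_name"})
    def test_attempts_to_connect_to_redshift(self):
        handler.do_thing()
        _, kwargs = psycopg2_mock.connect.call_args
        self.assertEqual(kwargs["dbname"], "some_database")
        self.assertEqual(kwargs["host"], "some_cluster_name")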
