Hypothesis has a lot of strategies, and I'm still struggling to understand them. It would help me a lot to see which values they generate. Is that possible?
MVCE
With hypothesis==5.18.3 and pydantic==1.5.1:
from typing import Optional
from hypothesis import given
from hypothesis.strategies import from_type
from pydantic import BaseModel
class Adress(BaseModel):
    city: str
    street: str
    house_number: int
    postal_code: int

class Person(BaseModel):
    prename: str
    middlename: Optional[str]
    lastname: str
    address: Adress
@given(from_type(Person))
def test_me(person: Person):
    seen = [
        Person(
            prename="",
            middlename=None,
            lastname="",
            address=Adress(city="", street="", house_number=0, postal_code=0),
        ),
        Person(
            prename="0",
            middlename=None,
            lastname="",
            address=Adress(city="", street="", house_number=0, postal_code=0),
        ),
        Person(
            prename="",
            middlename=None,
            lastname="0",
            address=Adress(city="", street="", house_number=0, postal_code=0),
        ),
        Person(
            prename="",
            middlename=None,
            lastname="",
            address=Adress(city="", street="0", house_number=0, postal_code=0),
        ),
    ]
    assert person in seen
As you can see, the way I currently figure out what Hypothesis is doing is by manually adding examples to this seen list. Is there a way to use a strategy as a generator, or otherwise produce the list of values that the strategy tests?
I would recommend turning up the verbosity setting, which will print all the examples Hypothesis generates for your test.
If you're using pytest, you'll also need to disable output capturing: pytest -s --hypothesis-verbosity=verbose MY_TEST_HERE
Alternatively, in an interactive session you can call the .example() method on strategy objects to get an arbitrary example.
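A minimal sketch of both approaches (the strategy and test here are illustrative, not taken from the question):

```python
from hypothesis import Verbosity, given, settings
from hypothesis.strategies import integers

# With verbosity turned up, Hypothesis prints every example it tries
@settings(verbosity=Verbosity.verbose)
@given(integers())
def test_ints(n):
    assert isinstance(n, int)

# In an interactive session, .example() draws one arbitrary value
print(integers().example())
```

The same pattern works with `from_type(Person)`: call `from_type(Person).example()` in a REPL to see a generated `Person` instance.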
Is it possible to reference multiple vars defined within a class by classmethod (or by some other means)?
For context I'm trying to consolidate the CRUD and model classes for a SQL database to simplify the codebase.
For example I'm looking to implement something like the below:
from __future__ import annotations

class Person:
    name: str
    gender: str
    age: int

    @classmethod
    def get_person(cls, db: Session) -> list[Person]:
        return db.query(cls.Person)  # <-- Key part is here. I'll need to send name,
                                     #     gender, and age to the database. Currently
                                     #     this is implemented separately as
                                     #     `class CrudPerson` and `class ModelPerson`.
Adding from __future__ import annotations and referencing the class directly seems to work (e.g. db.query(Person)).
Additional information on this can be found in PEP 563
If you make Person a NamedTuple, you can use cls._fields.
Or if you make Person a dataclass, you can use dataclasses.fields(cls).
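A quick sketch of both suggestions (the class names here are illustrative):

```python
from dataclasses import dataclass, fields
from typing import NamedTuple

class PersonTuple(NamedTuple):
    name: str
    gender: str
    age: int

@dataclass
class PersonData:
    name: str
    gender: str
    age: int

print(PersonTuple._fields)                   # ('name', 'gender', 'age')
print([f.name for f in fields(PersonData)])  # ['name', 'gender', 'age']
```

Either gives you the field names programmatically, so the class can describe its own columns without a parallel CRUD class.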
I'm porting a Laravel PHP code to Python Django/Django Rest Framework.
My endpoint will output JSON.
I need to output many objects, but I need to add extra computed values for each object.
How can I achieve this?
For example, my model is:
from django.db import models
from rest_framework.serializers import ModelSerializer
class MyObject(models.Model):
    name = models.CharField(max_length=255)
    score = models.IntegerField()

class MyObjectSerializer(ModelSerializer):
    class Meta:
        model = MyObject
        fields = ('name', 'score')
I retrieve a queryset with MyObject.objects.all() (or with filter).
For each MyObject in my queryset, I compute an extra value, called 'stats', that I want to output in my JSON output.
For example, if I have two objects MyObject(name='foo', score=1) and MyObject(name='bar', score=2), I will compute a stats value for each object.
And my JSON output should be like:
[
    {
        "name": "foo",
        "score": 1,
        "stats": 1.2
    },
    {
        "name": "bar",
        "score": 2,
        "stats": 1.3
    }
]
What is the cleanest way, if any, to achieve this?
I could loop over the queryset, serialize each MyObject one by one, and update the resulting dictionary by adding a 'stats' key, but I'm afraid about performance.
And what if I compute the stats value only for some objects, mixing the two kinds of output?
You can use SerializerMethodField:
from rest_framework.serializers import ModelSerializer, SerializerMethodField

class MyObjectSerializer(ModelSerializer):
    stat = SerializerMethodField()

    class Meta:
        model = MyObject
        fields = ('name', 'score', 'stat')

    def get_stat(self, obj):
        # obj is the model instance (it is passed one at a time, even when many=True)
        # do calculations with obj and return the value
        return None
If performance is a concern because the stat field uses related/foreign-key models, you can use either annotations or select_related/prefetch_related. Using annotations is more efficient, but they can be difficult to create depending on the requirement.
If it's possible to annotate you can use other serializer fields like:
from rest_framework.serializers import FloatField, ModelSerializer

class MyObjectSerializer(ModelSerializer):
    stat = FloatField(read_only=True)

    class Meta:
        model = MyObject
        fields = ('name', 'score', 'stat')
Apart from what @kyell wrote, you can also create a property on the model using the @property decorator and return your calculated data; this property is always read only.
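A plain-Python sketch of the property idea (outside Django for brevity; a @property on a Django model works the same way, and the serializer can expose it via a read-only field). The stats formula here is made up:

```python
class MyObjectSketch:
    """Stand-in for the Django model; the @property pattern is identical."""

    def __init__(self, name, score):
        self.name = name
        self.score = score

    @property
    def stats(self):
        # hypothetical computation based on the instance's own fields
        return 1.0 + self.score / 5

obj = MyObjectSketch("foo", 1)
print(obj.stats)  # 1.2
```

Note that a property is computed per instance in Python, so unlike an annotation it cannot be pushed down into the SQL query.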
As I used to do with FastAPI routes, I want to make a function that expects a dict, and I want to type hint it with a Pydantic model, as in FastAPI.
Note that I am just using FastAPI as a reference here and this app serves a total different purpose.
What I did:
models.py
from pydantic import BaseModel
class Mymodel(BaseModel):
    name: str
    age: int
main.py
def myfunc(m: Mymodel):
    print(m)
    print(m.name)

myfunc({"name": "abcd", "age": 3})
It prints m as a normal dict, not a Mymodel, and m.name just throws an AttributeError.
I don't understand why it behaves like this, because the same code would work in FastAPI. Am I missing something here? What should I do to make this work?
I am expecting a dict argument in the function, I want to type hint it with a class inherited from pydantic's BaseModel, and then I want to access the attributes of that class.
I don't want to do:
def myfunc(m):
    m = Mymodel(**m)
Thank You.
from pydantic import BaseModel, validate_arguments

class Mymodel(BaseModel):
    name: str
    age: int

@validate_arguments
def myfunc(m: Mymodel):
    print(m)
    print(m.name)

myfunc({"name": "abcd", "age": 3})
This might be what you are looking for: https://pydantic-docs.helpmanual.io/usage/validation_decorator/
Since you pass a dict to your custom function, the attribute should be accessed in the following way:
print(m['name'])
# or
print(m.get('name'))
Otherwise, to use m.name instead, you need to parse the dict into the corresponding Pydantic model before passing it to the function, as shown below:
data = {"name":"abcd", "age":3}
myfunc(Mymodel(**data))
# or
myfunc(Mymodel.parse_obj(data))
The reason that passing {"name":"abcd", "age":3} in FastAPI and then accessing the attributes using the dot operator (e.g., m.name) works is that FastAPI performs the above parsing and validation internally, as soon as a request arrives. That is also why you can convert the model back to a dictionary in your endpoint, using m.dict(). Try, for example, passing an incorrect key, e.g., myfunc(Mymodel(**{"name":"abcd","MYage":3})); you would get a field required (type=value_error.missing) error (as part of Pydantic's error handling), similar to what FastAPI would return (as shown below) if a similar request attempted to go through (you could also test this through the Swagger UI autodocs at http://127.0.0.1:8000/docs). Otherwise, any dictionary passed by the user (in the way shown in the question) would go through without raising an error, even if it did not match the Pydantic model.
{
    "detail": [
        {
            "loc": [
                "body",
                "age"
            ],
            "msg": "field required",
            "type": "value_error.missing"
        }
    ]
}
You could alternatively use Pydantic's validation decorator (i.e., @validate_arguments) on your custom function. As per the documentation:
The validate_arguments decorator allows the arguments passed to a
function to be parsed and validated using the function's annotations
before the function is called. While under the hood this uses the same
approach of model creation and initialisation; it provides an
extremely easy way to apply validation to your code with minimal
boilerplate.
Example:
from pydantic import BaseModel, validate_arguments

class Model(BaseModel):
    name: str
    age: int

@validate_arguments
def myfunc(m: Model):
    print(m)
    print(m.name)

myfunc({"name": "abcd", "age": 3})
I want to use FastAPI without an ORM (using asyncpg) and map the returned values from a select query to a pydantic model. This way the returned values are validated with pydantic and the response that is returned is structured like the pydantic model/schema.
I’ve tried looking for documentation on this but it’s pretty hard to find/not clear. I’d appreciate any help!
Every pydantic model inherits a couple of utility helpers to create objects. One is parse_obj which takes a dict and creates the model object from that.
parse_obj: this is very similar to the __init__ method of the model, except it takes a dict rather than keyword arguments. If the object passed is not a dict a ValidationError will be raised.
From the example on the linked section above:
from datetime import datetime
from pydantic import BaseModel

class User(BaseModel):
    id: int
    name = 'John Doe'
    signup_ts: datetime = None

m = User.parse_obj({'id': 123, 'name': 'James'})
print(m)
#> id=123 signup_ts=None name='James'
You might be able to give parse_obj a Record directly since it implements dict-like accessors, so just try it and see if it works. If not you can use dict(<row record from asyncpg>) to convert the record to an actual dict.
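A minimal sketch of that flow (a plain dict stands in for the asyncpg Record here, since Record supports the same dict-style access; the model and column names are illustrative):

```python
from datetime import datetime
from typing import Optional
from pydantic import BaseModel

class UserOut(BaseModel):
    id: int
    name: str
    signup_ts: Optional[datetime] = None

# In the real app this would come from asyncpg, e.g.:
# row = await conn.fetchrow("SELECT id, name, signup_ts FROM users WHERE id = $1", 123)
row = {"id": 123, "name": "James", "signup_ts": None}  # stand-in for the Record

user = UserOut.parse_obj(dict(row))
print(user)
```

Returning `user` (or a list of them) from a FastAPI endpoint with `response_model=UserOut` then gives you both validation and a structured response.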
Can we declare a model that inherits from BaseModel and put values into it, rather than a normal Python class?
I can forcefully create a PyTimeZoneSetting class, but can I use the existing TimeZoneSetting?
from pydantic import BaseModel
from typing import List, Optional

class TimeZoneSetting(BaseModel):
    time_zone: str
    type: int
    sync_time: Optional[str] = None
    time_server: Optional[str] = None
    time_set_type: Optional[int] = None

class PyTimeZoneSetting:
    time_zone: str
    type: int
    sync_time: str
    time_server: str
    time_set_type: str
def update_system_timezone():
    # Here, I want to create a TimeZoneSetting model and put in a value.
    ...
I am not quite sure of your intentions, but I will try to provide an example that could fulfill your needs.
Here are some ways you could initialize your Pydantic Model:
# initialize every field one by one
model_instance = TimeZoneSetting(time_zone="foo", type=1, sync_time="bar")
# provide your dict object
kwargs = {"time_zone": "foo", "type": 1}
model_instance = TimeZoneSetting(**kwargs)
Here is the way to update your model, where every field is reached as an attribute to your class:
model_instance.time_zone = "bar"
Here is the way to get you model as a dict:
model_instance.dict()
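Putting it together, update_system_timezone could simply build and return the existing TimeZoneSetting model; no separate plain-Python class is needed (the values below are made up for illustration):

```python
from typing import Optional
from pydantic import BaseModel

class TimeZoneSetting(BaseModel):
    time_zone: str
    type: int
    sync_time: Optional[str] = None
    time_server: Optional[str] = None
    time_set_type: Optional[int] = None

def update_system_timezone() -> TimeZoneSetting:
    # hypothetical values; in practice these would come from the system
    return TimeZoneSetting(time_zone="UTC", type=1, sync_time="03:00")

setting = update_system_timezone()
print(setting.dict())
```

Fields you do not pass (time_server, time_set_type) fall back to their Optional defaults of None.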