Declaring a FastAPI (Pydantic) model and populating it with values - python-3.x

Can I declare a model that inherits from BaseModel and populate it with values, rather than using a plain Python class?
I could write a separate PyTimeZoneSetting class by hand, but can I use the existing TimeZoneSetting directly?
from pydantic import BaseModel
from typing import List, Optional

class TimeZoneSetting(BaseModel):
    time_zone: str
    type: int
    sync_time: Optional[str] = None
    time_server: Optional[str] = None
    time_set_type: Optional[int] = None

class PyTimeZoneSetting:
    time_zone: str
    type: int
    sync_time: str
    time_server: str
    time_set_type: str

def update_system_timezone():
    # Here, I want to create a TimeZoneSetting model and put in a value.
    ...

I am not quite sure of your intentions, but I will try to provide an example which could fulfill your needs.
Here are some ways you could initialize your Pydantic Model:
# initialize every field one by one
model_instance = TimeZoneSetting(time_zone="foo", type=1, sync_time="bar")
# provide your dict object
kwargs = {"time_zone": "foo", "type": 1}
model_instance = TimeZoneSetting(**kwargs)
Here is how to update your model; every field can be accessed as an attribute of your class:
model_instance.time_zone = "bar"
Here is how to get your model as a dict:
model_instance.dict()
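If your goal is to build a TimeZoneSetting from an existing plain object such as a PyTimeZoneSetting instance, a minimal sketch (assuming the instance actually has its attributes set) is to unpack its attribute dict into the Pydantic model:
# Hypothetical bridge from the plain class to the Pydantic model
py_setting = PyTimeZoneSetting()
py_setting.time_zone = "UTC"
py_setting.type = 1
model_instance = TimeZoneSetting(**vars(py_setting))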

Related

What is the correct type hint to use when exporting a pydantic model as a dict?

I'm writing an abstraction module which validates an Excel sheet against a pydantic schema and returns the row as a dict using dict(MyCustomModel(**sheet_row)). I would like to use type hinting so any function that uses the abstraction methods gets a type hint for the returned dictionary with its keys, instead of just getting an unhelpful dict. Basically, I'd like to return the keys of the dict that compose the schema so I don't have to keep referring to the schema for its fields, and to catch any errors early on.
My current workaround is having my abstraction library return the pydantic model directly and type hint using the model itself. This means every field has to be accessed using dot notation instead of being accessed like a regular dictionary. I cannot annotate the dict as being the model itself, as it's a dict, not the actual pydantic model, which has some extra attributes as well.
I tried type hinting with the type MyCustomModel.__dict__(). That resulted in the error TypeError: Parameters to generic types must be types. Got mappingproxy({'__config__': <class 'foo.bar.Config'>, '__fields__': {'lab.. Is there a way to send a type hint about the fields in the schema, but as a dictionary? I don't omit any keys during the dict export; all the fields in the model are present in the final dict being returned.
I am going to try to abstract that question and create a minimal reproducible example for you.
Question
Consider this working example:
from typing import Any
from pydantic import BaseModel

class Foo(BaseModel):
    x: str
    y: int

def validate(data: dict[str, Any], model: type[BaseModel]) -> dict[str, Any]:
    return dict(model.parse_obj(data))

def test() -> None:
    data = {"x": "spam", "y": "123"}
    validated = validate(data, Foo)
    print(validated)
    # reveal_type(validated["x"])
    # reveal_type(validated["y"])

if __name__ == "__main__":
    test()
The code works fine and outputs {'x': 'spam', 'y': 123} as expected. But if you uncomment the reveal_type lines and run mypy over it, the revealed type is obviously just Any for both.
Is there a way to annotate validate so that a type checker knows which keys will be present in the returned dictionary, based on the model provided to it?
Answer
Python dictionaries have no mechanism built into them for distinguishing their type via specific keys. The generic dict type is parameterized by exactly two type parameters, namely the key type and the value type.
You can utilize the typing.TypedDict class to define a type based on the specific keys of a dictionary. However (as pointed out by @hernán-alarcón in the comments), the .dict() method still returns just a dict[str, Any]. You can always cast the output of course, and for this particular Foo model this would work:
from typing import Any, TypedDict, cast
from pydantic import BaseModel

class Foo(BaseModel):
    x: str
    y: int

class FooDict(TypedDict):
    x: str
    y: int

def validate(data: dict[str, Any], model: type[BaseModel]) -> FooDict:
    return cast(FooDict, dict(model.parse_obj(data)))

def test() -> None:
    data = {"x": "spam", "y": "123"}
    validated = validate(data, Foo)
    print(validated)
    reveal_type(validated["x"])  # "builtins.str"
    reveal_type(validated["y"])  # "builtins.int"

if __name__ == "__main__":
    test()
But that is not very helpful if validate should be able to deal with any model, not just Foo.
The easiest way to generalize this that I can think of is to make your own base model class that is generic in terms of the corresponding TypedDict. Binding the type argument in a dedicated private attribute should be enough. You won't actually have to set it or interact with it at any point; it is enough to specify it when you subclass your base class. Here is a working example:
from typing import Any, Generic, TypeVar, TypedDict, cast
from pydantic import BaseModel as PydanticBaseModel, PrivateAttr

T = TypeVar("T")

class BaseModel(PydanticBaseModel, Generic[T]):
    __typed_dict__: type[T] = PrivateAttr(...)

class FooDict(TypedDict):
    x: str
    y: int

class Foo(BaseModel[FooDict]):
    x: str
    y: int

def validate(data: dict[str, Any], model: type[BaseModel[T]]) -> T:
    return cast(T, model.parse_obj(data).dict())

def test() -> None:
    data = {"x": "spam", "y": "123"}
    validated = validate(data, Foo)
    print(validated)
    reveal_type(validated["x"])  # "builtins.str"
    reveal_type(validated["y"])  # "builtins.int"
    reveal_type(validated)  # "TypedDict('FooDict', {'x': builtins.str, 'y': builtins.int})"

if __name__ == "__main__":
    test()
This works well enough to convey the dictionary keys and corresponding types.
If you are wondering whether there is a way to dynamically infer the TypedDict rather than duplicating the model fields manually, the answer is no.
Static type checkers do not execute your code, they just read it.
This brings me to the final consideration. I don't know why you would even want to use a dictionary over a model instance in the first place. It seems that for the purposes of dealing with structured data, the model is superior in every aspect, if you are already using Pydantic anyway.
The fact that you access the fields as attributes (via dot notation) is a feature IMHO, not a drawback, of this approach. If for some reason you do need dynamic attribute access via field names as strings, you can always use getattr on the model instance, as sketched below.
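A minimal sketch of that dynamic access (the field name string here is made up):
# Hypothetical dynamic access: the attribute name is held in a string
foo = Foo(x="spam", y=123)
field_name = "x"
value = getattr(foo, field_name)  # equivalent to foo.x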

Python dataclass inferred fields

Is it possible in Python to have fields in a dataclass which infer their value from other fields of the dataclass? In this case, cacheKey is just a combination of other fields, and I don't want to mention it explicitly when instantiating the object.
@dataclass
class SampleInput:
    uuid: str
    date: str
    requestType: str
    cacheKey = f"{self.uuid}:{self.date}:{self.requestType}"  # Expressing the idea
You can use __post_init__ to derive the value from the other fields:
from dataclasses import dataclass

@dataclass
class SampleInput:
    uuid: str
    date: str
    requestType: str

    def __post_init__(self):
        self.cacheKey = f"{self.uuid}:{self.date}:{self.requestType}"
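For instance, a quick check (the values here are made up):
s = SampleInput(uuid="abc", date="2024-01-01", requestType="GET")
print(s.cacheKey)  # abc:2024-01-01:GET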
Just use a Python property in your class definition:
from dataclasses import dataclass

@dataclass
class SampleInput:
    uuid: str
    date: str
    requestType: str

    @property
    def cacheKey(self):
        return f"{self.uuid}:{self.date}:{self.requestType}"
This is the most straightforward approach. The only drawback is that cacheKey won't show up as a field of your class if you use serialization methods such as dataclasses.asdict.
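If you do need cacheKey to appear in the dataclasses.asdict output, a sketch combining the two approaches is to declare it as a real field that is excluded from __init__ and filled in by __post_init__:
from dataclasses import asdict, dataclass, field

@dataclass
class SampleInput:
    uuid: str
    date: str
    requestType: str
    cacheKey: str = field(init=False)  # real field, not a constructor argument

    def __post_init__(self):
        self.cacheKey = f"{self.uuid}:{self.date}:{self.requestType}"

print(asdict(SampleInput(uuid="abc", date="2024-01-01", requestType="GET")))
# {'uuid': 'abc', 'date': '2024-01-01', 'requestType': 'GET', 'cacheKey': 'abc:2024-01-01:GET'}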

How to serialize multiple objects from a Django model and add dynamically computed data to the output JSON for each object?

I'm porting Laravel PHP code to Python Django/Django REST Framework.
My endpoint will output JSON.
I need to output many objects, and I need to add extra computed values for each object.
How can I achieve this?
For example, my model is:
from django.db import models
from rest_framework.serializers import ModelSerializer

class MyObject(models.Model):
    name = models.CharField(max_length=255)
    score = models.IntegerField()

class MyObjectSerializer(ModelSerializer):
    class Meta:
        model = MyObject
        fields = ('name', 'score')
I retrieve a queryset with MyObject.objects.all() (or with filter).
For each MyObject in my queryset, I compute an extra value, called 'stats', that I want to include in the JSON output.
For example, if I have two objects MyObject(name='foo', score=1) and MyObject(name='bar', score=2), I will compute a stats value for each object.
And my JSON output should look like:
[
    {
        "name": "foo",
        "score": 1,
        "stats": 1.2
    },
    {
        "name": "bar",
        "score": 2,
        "stats": 1.3
    }
]
What is the cleanest way, if any, to achieve this?
I could loop over the queryset, serialize each MyObject one by one, and create and update a dictionary for each object, adding the 'stats' key.
But I'm afraid of the performance impact.
And what if I compute the stats value only for some objects, mixing two kinds of output?
You can use SerializerMethodField:
from rest_framework.serializers import ModelSerializer, SerializerMethodField

class MyObjectSerializer(ModelSerializer):
    stat = SerializerMethodField()

    class Meta:
        model = MyObject
        fields = ('name', 'score', 'stat')

    def get_stat(self, obj):
        # obj is the model instance (it is passed one at a time, even with many=True)
        # do calculations with obj and return the value
        return None
If performance is a concern where the stat field uses related/foreign-key models, you can either use annotations or select_related/prefetch_related. Using annotations is more efficient, but they can become difficult to write depending on the requirement.
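For example, a minimal sketch of an annotation (the formula here is made up; replace it with your actual stats computation):
from django.db.models import ExpressionWrapper, F, FloatField

# Hypothetical computation: annotate each row with a 'stat' derived from 'score'
queryset = MyObject.objects.annotate(
    stat=ExpressionWrapper(F('score') * 1.1, output_field=FloatField())
)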
If it's possible to annotate, you can use other serializer fields, like:
from rest_framework.serializers import FloatField, ModelSerializer

class MyObjectSerializer(ModelSerializer):
    stat = FloatField(read_only=True)

    class Meta:
        model = MyObject
        fields = ('name', 'score', 'stat')
Apart from what @kyell wrote, you can also create a property on the model using the @property decorator and return your calculated data; this property is always read-only.
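A minimal sketch of that model property (the stats formula is hypothetical):
from django.db import models

class MyObject(models.Model):
    name = models.CharField(max_length=255)
    score = models.IntegerField()

    @property
    def stats(self):
        # Hypothetical computation; replace with your real stats logic
        return self.score * 1.1
The serializer can then expose the property with a read-only field, for example serializers.ReadOnlyField().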

How to share variables across Python modules when getter and setter methods are required

How can I share variables across different modules of my Python project if these variables need setter and getter methods?
The reason I need setter/getter methods is that, when getting and setting the variables, I need backwards compatibility with code that stored them as environment variables, so I also need to read and write using os.environ.
Usually all I need to do is create a class with class-level variables, import the class in each module, and access the variables as follows:
datastore.py:
class DataStore:
    target_server_ip: str = '10.10.10.100'
consumer.py:
from project.datastore import DataStore

def print_target_server_ip():
    print(DataStore.target_server_ip)
This doesn't work (at least not in Python 3.6.5) if the variables require property getter and setter methods.
The reason is that I cannot define a class-level method as a property. The following code just isn't possible:
datastore.py:
class DataStore:
    target_server_ip: str = '10.10.10.100'

    @classmethod
    @property
    def target_server_ip(cls):
        return cls.target_server_ip

    @classmethod
    @target_server_ip.setter
    def target_server_ip(cls, value):
        cls.target_server_ip = value
To solve this issue I propose the following code section. It is split into two classes.
The first class works at the class level and maintains a two-level nested dictionary keyed by the datastore name and the variable name.
The second class is the datastore itself. It has the minimum required code to keep it visually simple.
This specific implementation has one known error-prone limitation: if you declare two or more variables with the same name in different datastore classes (i.e. you define a class FrameworkDatastore and another class SecondDatastore with the same variable name in both), the environment will hold only one of them.
import inspect
import logging
import os
from typing import Any, Dict, Type

logger = logging.getLogger(__name__)

class _BaseDataStoreWithEnvironSupport:
    """
    The class supports global storing of variables in a class-level dictionary, allowing all instances of the
    datastore to access the same values.
    This class is backward compatible: it stores the global variables in os.environ as well as in the
    class-level dictionary.
    """
    _members: Dict[str, Dict[str, Any]] = {}  # holds all the members of the datastore

    @classmethod
    def get_value(cls) -> Any:
        datastore_name: str = cls.__name__
        member_name: str = inspect.stack()[1][3]  # name of the calling property
        env_value: str = os.environ.get(member_name)
        ds_value: Any = cls._members[datastore_name][member_name]
        if env_value:
            type_ds_value: Type = type(ds_value)
            if type_ds_value is bool:
                value: bool = (env_value == str(True))
            else:
                value: Any = type(ds_value)(env_value)
            if value != ds_value:
                logger.warning('Environment stored value is different from Datastore value. Check your implementation')
        else:
            value: Any = ds_value
        return value

    @classmethod
    def set_value(cls, value: Any) -> None:
        datastore_name: str = cls.__name__
        name: str = inspect.stack()[1][3]  # name of the calling property setter
        if datastore_name not in cls._members.keys():
            cls._members[datastore_name] = {}
        cls._members[datastore_name][name] = value
        os.environ[name] = str(value)

    def validate_datastore(self):
        members = set(attr for attr in dir(self) if not callable(getattr(self, attr)) and not attr.startswith("_"))
        if len(members) == 0:
            raise RuntimeError('There are no members in the datastore or the validation runs at the start of __init__')
        datastore_name: str = self.__class__.__name__
        dict_keys: set = set(self._members[datastore_name].keys())
        if members != dict_keys:
            missing_members: set = members - dict_keys
            raise NotImplementedError(f'Datastore is missing get and set methods for members: {missing_members}')

class FrameworkDatastore(_BaseDataStoreWithEnvironSupport):
    """
    This class stores all variables that are currently saved as global or os.environ variables.
    If the data stored here becomes irrelevant after a code change or is seldom used, remove it and merge its
    functionality into other sections.
    """
    def __init__(self):
        """
        Predefine all the members of the datastore.
        Members which don't implement get/set methods will be flagged by the validate_datastore check.
        """
        self.run_traffic_validations: bool = True  # Should Ixia traffic validations run in the current suite
        # The validation of the datastore must come at the end of the __init__ method
        self.validate_datastore()

    @property
    def run_traffic_validations(self):
        return self.get_value()

    @run_traffic_validations.setter
    def run_traffic_validations(self, value: Any):
        self.set_value(value)

if __name__ == '__main__':
    # This tests the datastore code
    fd1 = FrameworkDatastore()
    fd2 = FrameworkDatastore()
    print(fd1.run_traffic_validations)
    print(fd2.run_traffic_validations)
    fd1.run_traffic_validations = False
    print(fd1.run_traffic_validations)
    print(fd2.run_traffic_validations)
    fd2.run_traffic_validations = True
    print(fd1.run_traffic_validations)
    print(fd2.run_traffic_validations)

Python 3 Dataclass Type Hinting for Model Object

I am trying to define a dataclass with a field whose type is a model object. How do I do it?
a = Model.objects.get(a_id=1)

@dataclass
class B:
    model: ???
    name: str
You can simply use the class as a type hint:
@dataclass
class B:
    model: Model
    name: str
(I suppose that your Model is a Django-like model class, given the syntax of the first line of your snippet.)
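A quick usage sketch (assuming Model is a Django model, and reusing the query from the question; the name value is made up):
a = Model.objects.get(a_id=1)
b = B(model=a, name="example")
print(b.name)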
