update_forward_refs() fails for dynamically-created model - python-3.x

When I create a pydantic model dynamically via create_model(), in some situations update_forward_refs() can't find the relevant definition.
This works:
from typing import List, Union
from pydantic import BaseModel
class Foo(BaseModel):
    foo: List["Bar"]
Bar = Union[Foo, int]
Foo.update_forward_refs()
But the following, which I believe should be equivalent, fails with a NameError:
from typing import List, Union
from pydantic import create_model, BaseModel
Foo = create_model("Foo", foo=(List["Bar"], ...))
Bar = Union[Foo, int]
Foo.update_forward_refs()
resulting in:
Traceback (most recent call last):
File "test_forward_ref.py", line 11, in <module>
Foo.update_forward_refs()
File "pydantic\main.py", line 832, in pydantic.main.BaseModel.update_forward_refs
File "pydantic\typing.py", line 382, in pydantic.typing.update_field_forward_refs
or class checks.
File "pydantic\typing.py", line 62, in pydantic.typing.evaluate_forwardref
'MutableSet',
File "C:\Users\Ian\.conda\envs\tso\lib\typing.py", line 518, in _evaluate
eval(self.__forward_code__, globalns, localns),
File "<string>", line 1, in <module>
NameError: name 'Bar' is not defined
It seems significant that "Bar" is referred to inside a List in the annotation for field foo. If the annotation of field foo is directly "Bar", then there is no problem.
Can someone please point me towards fixing this? What else do I need to do?
Python 3.8 and pydantic 1.8.2

update_forward_refs() accepts a **localns: Any parameter. It seems that in this case you can pass Bar=Bar to update_forward_refs() and it will work:
from typing import List, Union
from pydantic import create_model
Foo = create_model("Foo", foo=(List["Bar"], ...))
Bar = Union[Foo, int]
Foo.update_forward_refs(Bar=Bar)
print(Foo(foo=[1, Foo(foo=[2])]))
Output:
foo=[1, Foo(foo=[2])]
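As a side note (my own variation, not from the original answer): since update_forward_refs() just forwards **localns to the forward-reference evaluation, you can also hand it the whole module namespace, assuming every referenced name (here Bar) is defined at module level by the time you call it.
# Assumption: Bar and any other referenced names already exist in this module's globals.
Foo.update_forward_refs(**globals())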

Related

TypeError: Params.my_params is expected to be <class 'params.MyParams'>, but value {'host': 'txt', 'project': 'txt', 'roi_term_id': 123} is a dict with unexpected keys

Goal: read in parameters from a .yaml file to pass to functions at runtime.
I've seen no reference to this error online, so I decided to make a post.
I have a user-defined params.yaml:
my_params:
  host: txt
  project: txt
  roi_term_id: 123
  # ...
That is read in by params.py:
import os
from dataclasses import dataclass
from pathlib import Path

import yaml
from decouple import config
from typed_json_dataclass import TypedJsonMixin


@dataclass
class MyParams(TypedJsonMixin):
    host: str
    project: str
    roi_term: str

    def __post_init__(self):
        self.public_key = config('KEY')
        assert isinstance(self.public_key, str)
        self.private_key = config('SECRET')
        assert isinstance(self.private_key, str)
        super().__post_init__()
    # ...


@dataclass
class Params(TypedJsonMixin):
    my_params: MyParams
    # ...


def load_params_dict():
    parameter_file = 'params.yaml'
    cwd = Path(os.getcwd())
    params_path = cwd / parameter_file
    if params_path.exists():
        params = yaml.safe_load(open(params_path))
    else:  # If this script is being called from the path directory
        params_path = cwd.parent / parameter_file
        params = yaml.safe_load(open(params_path))
    return params


params_dict = load_params_dict()
print(params_dict)
project_params = Params.from_dict(params_dict)
Traceback:
File "/home/me/miniconda3/envs/myvenv/lib/python3.7/site-packages/typed_json_dataclass/typed_json_dataclass.py", line 152, in __post_init__
expected_type(**field_value)
TypeError: __init__() got an unexpected keyword argument 'roi_term_id'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "path/main.py", line 7, in <module>
from params import project_params
File "/home/me/PycharmProjects/project/path/params.py", line 89, in <module>
project_params = Params.from_dict(params_dict)
File "/home/me/miniconda3/envs/myvenv/lib/python3.7/site-packages/typed_json_dataclass/typed_json_dataclass.py", line 248, in from_dict
return cls(**raw_dict)
File "<string>", line 9, in __init__
File "/home/me/miniconda3/envs/myvenv/lib/python3.7/site-packages/typed_json_dataclass/typed_json_dataclass.py", line 155, in __post_init__
raise TypeError(f'{class_name}.{field_name} '
TypeError: Params.my_params is expected to be <class 'params.MyParams'>, but value {'host': 'txt', 'project': 'txt', 'roi_term_id': 123} is a dict with unexpected keys
Two things were wrong: the field name and its data type.
The problematic field:
roi_term: str
Name:
The key names in params.yaml must be exactly the same as the attribute names in the class (and, I assume, in the same order).
dtype:
The attribute in the class says str, but when params.yaml is parsed the value is treated as an int, since the value in the file is a whole number. I changed str to int in the MyParams class.
Thus, roi_term_id from params.yaml conflicted with roi_term from the MyParams class.
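A minimal sketch of the corrected class, assuming params.yaml is the source of truth (the __post_init__ logic from the original MyParams is omitted for brevity):
from dataclasses import dataclass
from typed_json_dataclass import TypedJsonMixin

@dataclass
class MyParams(TypedJsonMixin):
    host: str
    project: str
    roi_term_id: int  # renamed from roi_term and retyped from str to int to match params.yaml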

How to dynamically validate custom Pydantic models against an object?

I have a function like this:
import logging
from typing import List

import requests
from pydantic import BaseModel

logger = logging.getLogger(__name__)


class Name(BaseModel):
    name_id: str
    first_name: str
    last_name: str


def get_all_names() -> List[Name]:
    names = []
    try:
        # this API returns a list of NAME objects
        names = requests.get("some-url").json()
        # I want to validate that each NAME object conforms to the model "Name" above;
        # this is what I do currently
        validate_name_objects = [Name(**each_name_object) for each_name_object in names]
    except Exception as e:
        # if any of the NAME objects fails the validation check above, it will automatically
        # be caught in this exception block and logged
        logger.info(f"log this error (could be a requests error or a validation error): {e}")
    return names
FastAPI does this validation check automatically: it takes the type hint of the response from the function signature (in this case List[Name]) and automatically raises an exception if the response does not conform to it.
I have these kinds of checks in a lot of places in my code, with different custom Pydantic models.
So I am looking for a mechanism of this sort, where:
# Here SOME_FUNCTION takes 2 arguments: a custom model to compare against (which could be any
# shape built from Pydantic models, like List[some_model] or Dict[str, some_model], etc.)
# and a payload to validate against that model
validate_name_objects = SOME_FUNCTION(List[Name], names)
How to achieve this?
The closest content I found to my problem is provided by Pydantic here: https://pydantic-docs.helpmanual.io/usage/validation_decorator/ but this only validates the input arguments of a given function and does not accept custom models dynamically.
Update:
After the answer by @MatsLindh, here are more flexible ways in which we can use the solution he gave
(but remember, as he pointed out, Pydantic is a parsing library, not a validation library):
We can use it with just native Python data types:
from typing import Dict
from pydantic import parse_obj_as
items = parse_obj_as(Dict[int, str], {"asdasd": "23232"})
print(items)
This, as expected, gives a validation error:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "pydantic/tools.py", line 35, in pydantic.tools.parse_obj_as
File "pydantic/main.py", line 406, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for ParsingModel[Dict]
__root__ -> __key__
value is not a valid integer (type=type_error.integer)
We can also use the same function for custom data models:
from typing import List, Dict
from pydantic import BaseModel, parse_obj_as

class Item(BaseModel):
    id: int
    name: str

items = parse_obj_as(Dict[int, Item], {1: "asdfasdasf"})
print(items)
This, as expected, gives a validation error:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "pydantic/tools.py", line 35, in pydantic.tools.parse_obj_as
File "pydantic/main.py", line 406, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for ParsingModel[Dict]
__root__ -> 1
value is not a valid dict (type=type_error.dict)
Or let's try a much more complicated custom type:
from typing import List, Dict, Tuple, Optional
from pydantic import BaseModel, parse_obj_as

class Item(BaseModel):
    id: int
    name: str

items = parse_obj_as(Dict[Tuple[str, Optional[float], Optional[float]], Item], {(1, "123fdsfds", None): {'id': 1, 'name': 'My Item'}})
print(items)
This gives a validation error too as intended:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "pydantic/tools.py", line 35, in pydantic.tools.parse_obj_as
File "pydantic/main.py", line 406, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for ParsingModel[Dict]
__root__ -> __key__ -> 1
value is not a valid float (type=type_error.float)
You can use parse_obj_as to convert a list of dictionaries to a list of the given Pydantic model, effectively doing the same as FastAPI does when returning the response.
from typing import List
from pydantic import parse_obj_as
...
name_objects = parse_obj_as(List[Name], names)
However, it's important to consider that Pydantic is a parser library, not a validation library, so it will do conversions if your models allow for them.
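To make that last point concrete, here is a small sketch of my own (assuming default, non-strict field types in pydantic v1): an int supplied for a str field is coerced rather than rejected.
from typing import List
from pydantic import BaseModel, parse_obj_as

class Name(BaseModel):
    name_id: str
    first_name: str
    last_name: str

# name_id is an int in the payload, but the plain str field converts it to "123"
parsed = parse_obj_as(List[Name], [{"name_id": 123, "first_name": "Ada", "last_name": "Lovelace"}])
print(parsed[0].name_id)  # '123' - coerced, not rejected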

Multiple bases have instance lay-out conflict in Robot Framework

I'm creating a new testing framework. I started to implement my own functions in *.py files, but when I try to run a test, I get the following stack trace:
(venv) PLAMWS0024:OAT user$ robot -v CONFIG_FILE:"/variables-config.robot" ./catalog/tests/test1.robot
Traceback (most recent call last):
File "/Users/user/PycharmProjects/OAT/venv/bin/robot", line 5, in <module>
from robot.run import run_cli
File "/Users/user/PycharmProjects/OAT/venv/lib/python3.8/site-packages/robot/__init__.py", line 44, in <module>
from robot.rebot import rebot, rebot_cli
File "/Users/user/PycharmProjects/OAT/venv/lib/python3.8/site-packages/robot/rebot.py", line 45, in <module>
from robot.run import RobotFramework
File "/Users/user/PycharmProjects/OAT/venv/lib/python3.8/site-packages/robot/run.py", line 44, in <module>
from robot.running.builder import TestSuiteBuilder
File "/Users/user/PycharmProjects/OAT/venv/lib/python3.8/site-packages/robot/running/__init__.py", line 98, in <module>
from .builder import TestSuiteBuilder, ResourceFileBuilder
File "/Users/user/PycharmProjects/OAT/venv/lib/python3.8/site-packages/robot/running/builder/__init__.py", line 16, in <module>
from .builders import TestSuiteBuilder, ResourceFileBuilder
File "/Users/user/PycharmProjects/OAT/venv/lib/python3.8/site-packages/robot/running/builder/builders.py", line 20, in <module>
from robot.parsing import SuiteStructureBuilder, SuiteStructureVisitor
File "/Users/user/PycharmProjects/OAT/venv/lib/python3.8/site-packages/robot/parsing/__init__.py", line 380, in <module>
from .model import ModelTransformer, ModelVisitor
File "/Users/user/PycharmProjects/OAT/venv/lib/python3.8/site-packages/robot/parsing/model/__init__.py", line 18, in <module>
from .statements import Statement
File "/Users/user/PycharmProjects/OAT/venv/lib/python3.8/site-packages/robot/parsing/model/statements.py", line 453, in <module>
class Error(Statement, Exception):
TypeError: multiple bases have instance lay-out conflict
I suspect it's because in one of my files I'm trying to get variables from Robot Framework's built-in functionality, and I'm thinking it's because I'm trying to use protected methods, but I am not sure.
I found the issue TypeError: multiple bases have instance lay-out conflict and it suggests there might be a mismatch in naming conventions (or am I wrong?), but my project is quite small, so the only option left is that Robot can't see the function itself.
What could I be missing?
Some code:
Test itself:
*** Settings ***
Documentation    TO BE CHANGED
...              SET IT TO CORRECT DESCRIPTION
Library          ${EXECDIR}/file.py
Library          String

*** Test Cases ***
User can do stuff
    foo bar
from datetime import datetime
from robot.api import logger
from robot.libraries.BuiltIn import _Variables
from robot.parsing.model.statements import Error
import json
import datetime

from catalog.resources.utils.clipboardContext import get_value_from_clipboard

Vars = _Variables()

def foo_bar(params):
    # Get all variables
    country = get_value_from_clipboard('${COUNTRY}')
    address = get_value_from_clipboard('${ADDRESS}')
    city = get_value_from_clipboard('${CITY}')
    postcode = get_value_from_clipboard('${POSTALCODE}')
And calling Vars:
from robot.libraries.BuiltIn import _Variables
from robot.parsing.model.statements import Error

Vars = _Variables()

def get_value_from_clipboard(name):
    """
    Returns the value saved inside variables passed in Robot Framework
    :param name: name of the variable; it needs to include the ${} part,
        for example ${var} passed in a config file
    :return: the value itself, as a string
    """
    try:
        return Vars.get_variable_value(name)
    except Error as e:
        raise Error('Missing parameter in the clipboard, stack: ' + str(e))
What fixed the issue:
Uninstalling all requirements from the requirements.txt file and reinstalling them one by one.
Additional steps I tried:
commented out all files one by one and ran only the robot command - failed, got the same errors
cleaned the venv as described here: How to reset virtualenv and pip? (failed)
checked whether any variable has the same name as anything in python3.8/site-packages/robot/parsing/model/statements.py - none
So it looks like there was some clash when the PyCharm IDE installed the requirements.
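As an aside (my own sketch, unrelated to the installation clash that actually caused the error): the same variable lookup can be done through the public BuiltIn API instead of the protected _Variables class, which also avoids importing Error from robot.parsing.
from robot.libraries.BuiltIn import BuiltIn

def get_value_from_clipboard(name):
    """Return the value of a Robot Framework variable such as '${COUNTRY}'."""
    # get_variable_value() is part of the documented BuiltIn API; it returns
    # the given default (None here) when the variable does not exist.
    value = BuiltIn().get_variable_value(name, default=None)
    if value is None:
        raise ValueError('Missing parameter in the clipboard: ' + name)
    return value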

AttributeError: module 'typing' has no attribute 're' in pandas Python 3.7

I can't import pandas as pd because I get this error. I searched on Google but didn't find a fix for it.
>>> import pandas
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "C:\Python37\lib\site-packages\pandas\__init__.py", line 23, in <module>
from pandas.compat.numpy import *
File "C:\Python37\lib\site-packages\pandas\compat\__init__.py", line 431, in <module>
re_type = typing.re.Pattern
AttributeError: module 'typing' has no attribute 're'
I think this is changing underneath us as Python's typing module matures, but in our case the issue was that we did from typing import re, and then later did something like:
def funct(some_re: re.Pattern):
The fix was dumb. Simply change your import to be from typing import Pattern and then do:
def funct(some_re: Pattern):
Bleh. Warts.
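For completeness, a minimal before/after sketch of the fix described above (funct is just the illustrative name from the answer, not a real API; typing.Pattern exists on the Python versions discussed here, though it was later removed in 3.12 in favour of re.Pattern):
import re
from typing import Pattern  # instead of: from typing import re; ... re.Pattern

def funct(some_re: Pattern) -> bool:
    # only the annotation changes; the runtime behaviour stays the same
    return some_re.match("example") is not None

print(funct(re.compile(r"ex.*")))  # True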

Type Error when I try to make a 3D graph in python

I know how to make a 3D graph in Python, but for this one I get an error I've never seen. I want to plot the graph of:
$$f(x,y)=\frac{8\cos(\sqrt{x^2+y^2})}{\sqrt{1+x^2+y^2}}$$
My code :
import math
import matplotlib.pyplot as plt
import numpy as np
from mpl_toolkits.mplot3d import Axes3D

ax = Axes3D(plt.figure())

def f(x, y):
    return (8*math.cos(math.sqrt(x**2+y**2)))/(math.sqrt(1+x**2+y**2))

X = np.arange(-1, 1, 0.1)
Y = np.arange(-1, 1, 0.1)
X, Y = np.meshgrid(X, Y)
Z = f(X, Y)
ax.plot_surface(X, Y, Z)
plt.show()
The error :
runfile('C:/Users/Asus/Desktop/Informatiques/nodgim.py', wdir=r'C:/Users/Asus/Desktop/Informatiques')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "C:\Users\Asus\Desktop\WinPython\python-3.4.3.amd64\lib\site-packages\spyderlib\widgets\externalshell\sitecustomize.py", line 680, in runfile
execfile(filename, namespace)
File "C:\Users\Asus\Desktop\WinPython\python-3.4.3.amd64\lib\site-packages\spyderlib\widgets\externalshell\sitecustomize.py", line 85, in execfile
exec(compile(open(filename, 'rb').read(), filename, 'exec'), namespace)
File "C:/Users/Asus/Desktop/Informatiques/nodgim.py", line 22, in <module>
Z=f(X,Y)
File "C:/Users/Asus/Desktop/Informatiques/nodgim.py", line 16, in f
return (8*math.cos(math.sqrt(x**2+y**2)))/(math.sqrt(1+x**2+y**2))
TypeError: only length-1 arrays can be converted to Python scalars
Can you explain to me what I have to do?
Thank you in advance.
math.cos and math.sqrt expect a scalar value, but they were instead passed an array type that they cannot handle properly, which results in your TypeError. Essentially, Python's built-in math functions don't know how to deal with NumPy arrays, so to fix this you need to use the mathematical functions that NumPy provides for these data types: numpy.cos and numpy.sqrt.
This will then give you the vectorization you need.
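A minimal sketch of that fix applied to the function from the question: replacing the math calls with their NumPy equivalents makes f work element-wise on the meshgrid arrays.
import numpy as np

def f(x, y):
    # np.cos and np.sqrt operate element-wise on arrays (and still accept scalars)
    return 8 * np.cos(np.sqrt(x**2 + y**2)) / np.sqrt(1 + x**2 + y**2)

X, Y = np.meshgrid(np.arange(-1, 1, 0.1), np.arange(-1, 1, 0.1))
Z = f(X, Y)
print(Z.shape)  # (20, 20) - same shape as X and Y, ready for plot_surface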

Resources