Python Pydantic double base model - python-3.x

I'm using FastAPI with Pydantic, and I want my API to accept camelCase parameters. To achieve this, I'm using the following:
from pydantic import BaseModel
from humps import camelize

class CamelModel(BaseModel):
    class Config:
        alias_generator = camelize
        allow_population_by_field_name = True

class MyClass(CamelModel):
    my_field1: int
    my_field2: int
    my_field3: int
So far it works great, but MyClass is a base class for other classes, for example:
class MyNewClass(MyClass):
    my_field4: float
How can I get MyNewClass to also use the camelCase base class? I've tried something like
from typing import Union

class MyNewClass(Union[MyClass, CamelModel]):
    my_field4: float
But I'm getting this error
TypeError: Cannot subclass <class 'typing._SpecialForm'>
Is there any way to accomplish this?
Thanks!

What you are trying to achieve is handled by ordinary inheritance. Since MyClass already inherits from CamelModel, there's no need to inherit from it again: the Config is inherited along with everything else.
The appropriate code should be
class MyNewClass(MyClass):
    my_field4: float
This is standard Python inheritance syntax. If you ever do need several base classes at once, that is multiple inheritance; see an extensive example here: https://www.python-course.eu/python3_multiple_inheritance_example.php#An-Example-of-Multiple-Inheritance
The construct you tried (class MyNewClass(Union[MyClass, CamelModel]):) is for declaring types, not base classes. Union belongs in type annotations, typically on function parameters and return values, and cannot be subclassed, which is exactly what the TypeError is telling you.
NOTE
I did not test the piece of code above, but I'm pretty sure it works. Let me know if there are any problems
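For completeness, a quick usage sketch (untested here, and assuming the pyhumps package from the question is installed):

obj = MyNewClass(myField1=1, myField2=2, myField3=3, myField4=4.5)
print(obj.my_field4)            # 4.5, populated via the camelCase alias
print(obj.dict(by_alias=True))  # {'myField1': 1, 'myField2': 2, 'myField3': 3, 'myField4': 4.5}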

Related

Same mock object has different ids in test class and testable class

I want to write unit tests for the main.py module. In the file structure of my project, my unit tests live in the test/source/code_files folder. I want to mock some methods in the main module too (which uses variables from source/config/config.py). I'm using patch for this.
For example:
import main

@patch('main.config.retry_times')
def test_method(self, mock_retry_times):
    mock_retry_times().return_value = 'retry_times_mock_val'
    # calling the main method
In the main method, retry_times is used like this:
from source.config import config

def method():
    var1 = config.retry_times['url']
    # Do other stuff
This gave me an error.
I tried with a MagicMock object as well, as per this solution, but it didn't work. My imports also work fine.
But I figured out one thing: when I check the mock IDs in both the testable class and the test class, they are different.
It seems like an issue with these IDs; I think they must be the same in both classes. Can anyone help me sort out this issue?
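(For reference, a hedged sketch of one way this kind of lookup is often mocked; the dict value here is hypothetical. Because main does from source.config import config, main.config is the same module object, so patching the attribute there is visible inside method():)

from unittest import mock
import main

# replace the attribute with a plain dict so config.retry_times['url']
# resolves to the mock value inside main.method()
@mock.patch('main.config.retry_times', {'url': 'retry_times_mock_val'})
def test_method():
    main.method()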

Switching multiple inheritance via mixins to composition but keep the same API

Firstly, thank you for taking the time to read and input. It is greatly appreciated.
Question: What kind of approach can we take to keep the same public API of a class currently using multiple mixins, but refactor it internally to be composed of objects that do the same work as the mixins? Autocomplete is a must, so runtime dynamics such as hacking things on via __getattr__ are mostly out. I know this depends on the runtime environment (IPython vs. PyCharm, etc.); for the sake of this question, assume PyCharm, which I don't think can fully leverage __dir__.
Accompanying Information:
I am writing a little assertion library in Python, and I have a core class which is instantiated with a value and subsequently inherits various assertion capabilities against that value via a growing number of mixin classes:
class Asserto(StringMixin, RegexMixin):
    def __init__(self, value: typing.Any, type_of: str = AssertTypes.HARD, description: typing.Optional[str] = None):
        self.value = value
        self.type_of = type_of
        self.description = description
These mixin classes offer various assertion methods for particular types; here is a quick example of one:
from __future__ import annotations

class StringMixin:
    def ends_with(self, suffix: str) -> StringMixin:
        if not self.value.endswith(suffix):
            self.error(f"{self.value} did not end with {suffix}")
        return self

    def starts_with(self, prefix: str) -> StringMixin:
        if not self.value.startswith(prefix):
            self.error(f"{self.value} did not start with {prefix}")
        return self
I would like to refactor the Asserto class to compose itself of various implementations of some sort of Assertable interface, rather than cobble together a god class with mixins; I'm likely to have 10+ mixins by the time I am finished.
Is there a way to achieve the same public facing API as this mixins setup so that client code has access to everything through the Asserto(value).check_something(...) but using composition internally?
I could define every single method in the Asserto class and just delegate to the appropriate concrete object internally, but then I am making a massive god class anyway, and the composition feels like a pointless endeavour in that instance.
For example, in client code I'd like all the current mixin methods to be available on an Asserto instance, with autocomplete:
def test_something():
    Asserto("foo").ends_with("oo")
Thank you for your time. Perhaps using the mixin approach is the correct way here, but it feels kind of clunky.
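(A hedged sketch of one possible direction, with illustrative names: expose each assertion group as a typed attribute, so PyCharm still autocompletes, at the cost of one extra hop in the fluent API:)

import typing

class StringAsserts:
    def __init__(self, asserto: "Asserto") -> None:
        self._asserto = asserto

    def ends_with(self, suffix: str) -> "StringAsserts":
        if not self._asserto.value.endswith(suffix):
            self._asserto.error(f"{self._asserto.value} did not end with {suffix}")
        return self

class Asserto:
    def __init__(self, value: typing.Any) -> None:
        self.value = value
        # a typed attribute, so autocomplete works without __getattr__ tricks
        self.string = StringAsserts(self)

    def error(self, message: str) -> None:
        raise AssertionError(message)

# client code becomes: Asserto("foo").string.ends_with("oo")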

Mocking in Odoo environment?

Does anyone know how you can write mock tests for Odoo objects?
I have these classes and methods:
my_module:
from odoo import models

class MyModel(models.Model):
    _name = 'my.model'

    def action_copy(self):
        IrTranslation = self.env['ir.translation']
        for rec in self:
            if rec.translate:
                IrTranslation.force_translation(rec)
my_module_2:
from odoo import models

class IrTranslation(models.Model):
    _inherit = 'ir.translation'

    def force_translation(self, rec):
        # do stuff
        pass
When I call it, I want to test whether IrTranslation.force_translation was called in the action_copy method, and how many times.
But this method is not imported directly; it is referenced through env.
If, let's say, force_translation were imported like this:
from my_module_2.IrTranslation import force_translation

def action_copy(self):
    # do stuff
    force_translation()
Then I could try doing something like this:
from unittest import mock
from my_module import action_copy

def test_some_1(self):
    with mock.patch('my_module.my_module_2.IrTranslation') as mocked_translation:
        action_copy()
        mocked_translation.force_translation.assert_called_once()
But because modules in Odoo are not imported directly (like you would in plain Python), I don't understand how to specify which methods in the Odoo environment should be mocked.
P.S. I also did not see any mocked tests in standard Odoo, except for base classes that do not inherit from the Model class, in which case you use the _inherit attribute instead of importing the class and inheriting from it directly.
Testing in Odoo does not use the concept of mocking. Instead, tests are derived from standard base classes. The standard class TransactionCase opens a transaction and never commits it, rolling it back to undo any changes.
This is obviously not the same as regular mocking in that you can't replace other methods or classes to return fixed/expected values and/or avoid other side effects apart from persisting changes in the database, like sending emails or calling a remote web service.
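(A minimal sketch of that standard approach, reusing my.model from the question; odoo.tests.common.TransactionCase rolls the transaction back after each test:)

from odoo.tests.common import TransactionCase

class TestMyModel(TransactionCase):
    def test_action_copy(self):
        # every write made through self.env here is rolled back
        # automatically when the test finishes
        rec = self.env['my.model'].create({})
        rec.action_copy()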
It can be done. I have done it all the time since Odoo 8.0 (up to 15.0 now). The key is to know where to patch. Odoo places your module under the odoo.addons package when it is imported, so in your case you may do the following:
from odoo import tests
from mock import patch
from odoo.addons.my_module_2.models.ir_translations import IrTranslation

class TestMyModule2(tests.TransactionCase):
    def some_test_1(self):
        my_model = self.env['my.model'].create({})
        with patch.object(IrTranslation, 'force_translation') as mocked_translation:
            my_model.action_copy()
            mocked_translation.assert_called_once()
Or using just patch, then no need to import:
with patch('odoo.addons.my_module_2.models.ir_translations.IrTranslation.force_translation') as mocked_translation:
    my_model.action_copy()
This patches your specific method in your specific class. This way you can also target the method of a super class.
If you need to patch a method and you don't care where it is defined or where it's overridden, just patch using Python's type() (then there is no need to import the class):
with patch.object(type(self.env['ir.translation']), 'force_translation') as mocked_translation:
    my_model.action_copy()
Some additional notes to save you some headaches:
If you use PyCharm, don't mock socket objects; it messes with PyCharm's mechanisms. Better to put your socket calls into a one-line method and mock that method instead.
datetime.datetime.now() cannot be mocked (like all builtin types), but fields.Datetime.now() can.
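For example, a hedged sketch of the latter, with a hypothetical frozen timestamp:

from datetime import datetime
from unittest.mock import patch
from odoo import fields

# freeze Odoo's wrapper instead of the unpatchable builtin
with patch.object(fields.Datetime, 'now', return_value=datetime(2022, 1, 1)):
    my_model.action_copy()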

Inheriting from both ABC and django.db.models.Model raises metaclass exception

I am trying to implement a Django data model class which is also an interface class, using Python 3. My reason for doing so is that I'm writing a base class for my colleague and need him to implement three methods in all of the classes he derives from mine. I am trying to give him a simplified way to use the functionality of a system I've designed, but he must override a few methods to supply the system with enough information to execute the code in his inherited classes.
I know this is wrong, because it's throwing exceptions, but I'd like to have a class like the following example:
from django.db import models
from abc import ABC, abstractmethod

class AlgorithmTemplate(ABC, models.Model):
    name = models.CharField(max_length=32)

    @abstractmethod
    def data_subscriptions(self):
        """
        This method returns a list of topics this class will subscribe to using websockets.

        NOTE: This method MUST be overridden!
        :rtype: list
        """
I understand I could avoid inheriting from the ABC class, but I'd like to use it for reasons I won't bore you with here.
The Problem
After including a class like the one above in my project and running python manage.py makemigrations, I get the error TypeError: metaclass conflict: the metaclass of a derived class must be a (non-strict) subclass of the metaclasses of all its bases. I have searched Stack Overflow but have only found solutions like the following one:
class M_A(type): pass
class M_B(type): pass

class A(metaclass=M_A): pass
class B(metaclass=M_B): pass

class M_C(M_A, M_B): pass
class C(A, B, metaclass=M_C): pass
I've read the following posts:
Using ABC, PolymorphicModel, django-models gives metaclass conflict
Resolving metaclass conflicts
And I've tried many variations of those solutions, but I still get the dreaded metaclass exception. Help me Obi-Wan Kenobi, you're my only hope. :-)
I had the same need and found this. I've altered the code for clarity and completeness. Basically you need an extra class which you can use for all your model interfaces.
import abc
from django.db import models

class AbstractModelMeta(abc.ABCMeta, type(models.Model)):
    pass

class AbstractModel(models.Model, metaclass=AbstractModelMeta):
    # You may have common fields here.

    class Meta:
        abstract = True

    @abc.abstractmethod
    def must_implement(self):
        pass

class MyModel(AbstractModel):
    code = models.CharField("code", max_length=10, unique=True)

    class Meta:
        app_label = 'my_app'

test = MyModel(code='test')
> TypeError: Can't instantiate abstract class MyModel with abstract methods must_implement
Now you have the best of both worlds.
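A hedged follow-up sketch: once a concrete subclass implements the abstract method, instantiation succeeds as usual (MyConcreteModel is an illustrative name):

class MyConcreteModel(AbstractModel):
    code = models.CharField("code", max_length=10, unique=True)

    class Meta:
        app_label = 'my_app'

    def must_implement(self):
        return self.code

MyConcreteModel(code='test')  # no TypeError this time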
I found a solution that worked for me, so thought I would post it here in case it helps someone else. I decided to not inherit from the ABC class, and instead just raise an exception in the "abstract" methods (the ones the derived class must implement). I did find helpful information in the Django docs, describing using Django data models as an Abstract base class and also Multi-table inheritance.
Django Data Model as an Abstract Base Class
Quoted from the docs:
Abstract base classes are useful when you want to put some common information into a number of other models. You write your base class and put abstract=True in the Meta class. This model will then not be used to create any database table. Instead, when it is used as a base class for other models, its fields will be added to those of the child class.
An example:
from django.db import models

class CommonInfo(models.Model):
    name = models.CharField(max_length=100)
    age = models.PositiveIntegerField()

    class Meta:
        abstract = True

class Student(CommonInfo):
    home_group = models.CharField(max_length=5)
The Student model will have three fields: name, age and home_group.
The CommonInfo model cannot be used as a normal Django model, since it
is an abstract base class. It does not generate a database table or
have a manager, and cannot be instantiated or saved directly.
Fields inherited from abstract base classes can be overridden with
another field or value, or be removed with None.
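As a quick illustration of that last point (a minimal sketch based on the quoted docs), a child model can drop an inherited field by shadowing it with None:

class Student(CommonInfo):
    age = None  # removes the 'age' field inherited from CommonInfo
    home_group = models.CharField(max_length=5)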
Multi-table Inheritance with a Django Data Model
My understanding of "multi-table inheritance" is, you can define a data model and then also use it as a base class for a second data model. The second data model will inherit all the fields from the 1st model, plus its own fields.
Quoted from the docs:
The second type of model inheritance supported by Django is when each
model in the hierarchy is a model all by itself. Each model
corresponds to its own database table and can be queried and created
individually. The inheritance relationship introduces links between
the child model and each of its parents (via an automatically-created
OneToOneField). For example:
from django.db import models

class Place(models.Model):
    name = models.CharField(max_length=50)
    address = models.CharField(max_length=80)

class Restaurant(Place):
    serves_hot_dogs = models.BooleanField(default=False)
    serves_pizza = models.BooleanField(default=False)
All of the fields of Place will also be available in Restaurant,
although the data will reside in a different database table. So these
are both possible:
>>> Place.objects.filter(name="Bob's Cafe")
>>> Restaurant.objects.filter(name="Bob's Cafe")

export class vs functions

In the case of a utility module, I can have either a class with static methods or just exported functions. I think the first solution is better, though I've seen a lot of implementations with the second option. Are there any "nuances" here that I am not considering?
I would argue that a class with static methods is better for the following reasons:
If your class name is Utils, all imports will by default import it as Utils too. With exported functions, they could also be imported as Utils, but that would only be a convention, one that likely won't be followed in all the different places.
A class named Utils in a file named utils.js, with all the utility methods neatly grouped together, is aesthetically more pleasant than flat functions defined all over the place.
A class could have properties that are used among its methods, though for this you'd need @babel/plugin-proposal-class-properties. Again, much nicer than variables defined all over the place.
Exporting functions is safer because you don't give access to class properties. Note also that in JavaScript the concept of a class does not make a lot of sense; it was introduced to make developers with an OO-language background feel more comfortable. Try working with object prototyping instead.
A couple of options I can think of: if the methods are intended to be utility methods on another class, you could use hand-baked mixins (http://coffeescriptcookbook.com/chapters/classes_and_objects/mixins) or rely on something similar in underscore/lodash.
If you want the encapsulation of methods and still have the ability to extend, you can do this:
class Foo
  foo = -> alert 'foo'
  @static: -> foo()

Foo.static() # => 'foo'
Foo.foo # => undefined
new Foo().foo # => undefined

class Bar extends Foo

Bar.static() # => 'foo'
jsfiddle: http://jsfiddle.net/4ne7ccxk/
