Serializing foreign key in Django rest framework (A->B->C) - nested

How do I serialize a foreign key that itself refers to a foreign key of another table? I currently have three tables A, B and C (A->B->C): A refers to B, and B refers to C. I need to get JSON shaped like this:
A:
{
    A_id: 1,
    A_name: 'aaa',
    B: {
        B_id: 1,
        B_name: 'bbb',
        C: {
            C_id: 1,
            C_name: 'ccc'
        }
    }
}
I am able to get the JSON for one level of nesting, i.e. A->B. I used RelatedField and overrode the to_representation() method to achieve this. This is the code I used:
class B_foreign(serializers.RelatedField):
    def to_representation(self, value):
        return value

class ASerializer(serializers.ModelSerializer):
    B = B_foreign(source='B_id', read_only=True)

    class Meta:
        model = A
        fields = '__all__'
Now what do I do to reach C from B? Write another RelatedField subclass and override to_representation() again? I tried that too, and it didn't work. Can someone help me with a solution?

You just use the serializer for the given model:
class CSerializer(serializers.ModelSerializer):
    class Meta:
        model = C
        fields = '__all__'

class BSerializer(serializers.ModelSerializer):
    C = CSerializer()

    class Meta:
        model = B
        fields = '__all__'

class ASerializer(serializers.ModelSerializer):
    B = BSerializer()

    class Meta:
        model = A
        fields = '__all__'
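With the nested serializers wired up, ASerializer(a_instance).data returns the nested structure from the question. The shape can be sketched with plain dicts, independent of DRF (the helper functions and dict layout below are illustrative, not part of the answer's code):

```python
# Plain-dict sketch of the nested output the serializers produce;
# field names mirror the question's models.
def serialize_c(c):
    return {'C_id': c['id'], 'C_name': c['name']}

def serialize_b(b):
    return {'B_id': b['id'], 'B_name': b['name'], 'C': serialize_c(b['c'])}

def serialize_a(a):
    return {'A_id': a['id'], 'A_name': a['name'], 'B': serialize_b(a['b'])}

a = {'id': 1, 'name': 'aaa',
     'b': {'id': 1, 'name': 'bbb',
           'c': {'id': 1, 'name': 'ccc'}}}
print(serialize_a(a))
# {'A_id': 1, 'A_name': 'aaa', 'B': {'B_id': 1, 'B_name': 'bbb', 'C': {'C_id': 1, 'C_name': 'ccc'}}}
```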

Inherit methods in Django rest framework

I'm new to Django REST framework and Python. I'm trying to inherit a few methods but don't know how. The alternative is to copy the same code, which is replication and against code design principles.
An example of what I want to do:
class A(models.Model):
    field1 = models()
    field2 = models()
    field3 = models()
    field4 = models()

    @classmethod
    def method1():
        return something

    def method2():
        return something

class B(models.Model):
    field5 = models()  # new field, no relation to class A fields
    field6 = models()  # new field, no relation to class A fields
    field7 = models()  # new field, no relation to class A fields
    # field8 should link to class A (many-to-one via ForeignKey), so that
    # field8 is also referenced while viewing class A fields
    field8 = models()

    # Here I want class B to reuse the two methods already defined on class A,
    # so that when class B is used, class A's two methods are available along
    # with class B's own new method.
    def method3():  # new method of class B
        return something
You can make a common abstract model class that holds the fields and methods shared by the other classes. For example:
class Common(models.Model):
    common_field1 = models.TextField()
    common_field2 = models.TextField()

    def common_method(self):
        pass

    class Meta:
        abstract = True

class NewModelA(Common):
    # ... model A fields
    # ... model A methods
    pass

class NewModelB(Common):
    # ... model B fields
    # ... model B methods
    pass
Note: you will run into related_name clashes if you put a relational field on the common abstract model. In that case, define that field on each concrete model class instead.
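The inheritance mechanics themselves are plain Python: methods defined once on a base class are available on every subclass. A minimal standalone sketch (no Django required; the class and method names are illustrative):

```python
class Common:
    """Base class holding shared behaviour, analogous to the abstract model."""
    def common_method(self):
        return "shared logic"

class NewModelA(Common):
    def method_a(self):
        return "A-specific logic"

class NewModelB(Common):
    def method_b(self):
        return "B-specific logic"

# Both subclasses inherit common_method without duplicating it.
a, b = NewModelA(), NewModelB()
print(a.common_method())  # shared logic
print(b.common_method())  # shared logic
print(b.method_b())       # B-specific logic
```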

How do I test for str equality using factory_boy faker method?

I have two factory classes; one is linked to the other through a foreign key relationship, and I was hoping their attributes would match. To start with, the models look something like this:
class Track(models.Model):
    response = models.ForeignKey('Response')

    def __str__(self):
        return str(self.response)

class Response(models.Model):
    title = models.CharField(max_length=640)

    def __str__(self):
        return self.title
I should be able to use these classes as I have done below:
r = Response(title='foo')
r.save()
t = Track(response=r)
t.save()
# with this I wanted to test that str(t) == t.response
The factory classes look like this:
class ResponseFactory(factory.django.DjangoModelFactory):
    class Meta:
        model = Response

    title = factory.Faker('text')

class TrackFactory(factory.django.DjangoModelFactory):
    class Meta:
        model = Track

    response = factory.SubFactory(ResponseFactory)
Below is how I used these factories to test for str equality:
track = TrackFactory()  # generates a random title, e.g. `foo`
a = str(track)      # == foo
b = track.response  # == foo
# however I get an assertion error with the statement below
assert a == b
Could you point out where I've gone wrong, thank you.
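The likely culprit is a type mismatch: str(track) is a str, while track.response is a Response instance, so == compares objects of different types and fails. The gotcha can be reproduced with plain Python classes mirroring the models (no Django or factory_boy needed):

```python
class Response:
    def __init__(self, title):
        self.title = title
    def __str__(self):
        return self.title

class Track:
    def __init__(self, response):
        self.response = response
    def __str__(self):
        return str(self.response)

t = Track(Response('foo'))
a = str(t)      # 'foo' (a str)
b = t.response  # a Response instance, not a str

assert a != b         # str vs Response falls back to identity: not equal
assert a == str(b)    # comparing str to str succeeds
```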

How to pass a Count field to export field

I'm working on an export resource but I can't figure out how to pass this field from the view as a column in my export.
issues = Student.objects.annotate(Count('issue'))

def view_student(request):
    issues = Student.objects.annotate(Count('issue'))
    students = Student.objects.filter(school=request.user.school).order_by('year')
    return render(request, 'view_student.html', {'students': students, 'issues': issues})
This is how I tried it in resources.py, but it shows no result:
class ExportStudentsResource(resources.ModelResource):
    books = fields.Field(attribute='books', column_name='books', widget=ForeignKeyWidget(Student, 'issue_count'))

    class Meta:
        model = Student
        fields = ('student_id', 'name', 'year', 'klass', 'stream', 'books')
This field does not come from any model, so I just assumed the Student model could be harbouring it. How can I make it work?
You can override the .get_queryset(…) method [Django-doc] and annotate your Student objects with:
from django.db.models import Count

class ExportStudentsResource(resources.ModelResource):
    books = fields.Field(
        attribute='books',
        column_name='books',
        widget=ForeignKeyWidget(Student, 'issue_count')
    )
    issues_count = fields.Field(attribute='issue_count')

    def get_queryset(self):
        return super().get_queryset().annotate(
            issue_count=Count('issue')
        )

    class Meta:
        model = Student
        fields = ('student_id', 'name', 'year', 'klass', 'stream', 'books')
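Independent of django-import-export, the annotation step is just a per-student count attached to each row before export. A plain-Python sketch of the same idea (the field names and sample data are illustrative):

```python
from collections import Counter

# each issue record references a student id, mirroring the Issue -> Student FK
issues = [{'student_id': 1}, {'student_id': 1}, {'student_id': 2}]
students = [{'student_id': 1, 'name': 'Ann'},
            {'student_id': 2, 'name': 'Ben'},
            {'student_id': 3, 'name': 'Cy'}]

# count issues per student, as Count('issue') does in the annotation
counts = Counter(i['student_id'] for i in issues)

# attach the computed count to each row, as annotate() would
rows = [{**s, 'issue_count': counts.get(s['student_id'], 0)} for s in students]
print(rows[0]['issue_count'])  # 2
print(rows[2]['issue_count'])  # 0
```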

FactoryBoy: How do I define a factory field for a generic foreign key?

Since the type of a generic foreign key's target is not known until a record is created, what SubFactory do I define it as? Or is there another way to approach this?
models.py
class Contract(models.Model):
    offer_type = models.ForeignKey(ContentType, on_delete=models.PROTECT)
    offer_id = models.PositiveIntegerField()
    offer = GenericForeignKey('offer_type', 'offer_id')
    invoice = models.ForeignKey('Invoice', on_delete=models.PROTECT)
    status = models.CharField(max_length=8)
    commission = models.DecimalField(max_digits=100, decimal_places=2)
factories.py
class ContractFactory(factory.DjangoModelFactory):
    class Meta:
        model = models.Contract

    # What to do here???
    offer = factory.SubFactory(????)
    invoice = factory.SubFactory(InvoiceFactory)
    status = 'active'
    commission = 40.00
Somehow you have to tell the factory which model you want (unless it should be random), so your proposal works for a single choice. For a more generic approach, use this:
class BaseContractFactory(dj_factory.DjangoModelFactory):
    class Meta:
        model = Contract
        exclude = ['offer']

    offer_id = factory.SelfAttribute('offer.id')
    offer_type = factory.LazyAttribute(
        lambda obj: ContentType.objects.get_for_model(obj.offer)
    )
    ...

class Concrete1ContractFactory(BaseContractFactory):
    offer = factory.SubFactory(Concrete1Factory)

class Concrete2ContractFactory(BaseContractFactory):
    offer = factory.SubFactory(Concrete2Factory)
where Concrete1 and Concrete2 are some existing Django models.
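The key idea is that the excluded helper attribute (offer) is never stored itself; the two persisted fields are derived from it. Stripped of factory_boy, the derivation looks like this plain-Python sketch (the Offer classes and helper name are made up for illustration):

```python
# Two stand-in "offer" models with different types but a common id attribute.
class Offer1:
    id = 7

class Offer2:
    id = 9

def contract_fields(offer):
    """Derive (offer_type, offer_id) from any offer object, the way the
    SelfAttribute/LazyAttribute pair derives them in the base factory."""
    return type(offer).__name__, offer.id

print(contract_fields(Offer1()))  # ('Offer1', 7)
print(contract_fields(Offer2()))  # ('Offer2', 9)
```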

Dynamic SQLAlchemy ORM relationship generation

Premise: I have a lot of tables that have to be individually created (they cannot be dynamically created), and therefore I find myself constantly making mixins that standardize how tables are related:
class A_Table(Base):
    id = Column(Integer, primary_key=True)

class A_Relator(My_Mixin_Base):
    @declared_attr
    def a_table_id(cls):
        return Column(ForeignKey(A_Table.id))

    @declared_attr
    def a_table(cls):
        return relationship(A_Table)

class B_Table(A_Relator, Base):
    id = Column(Integer, primary_key=True)

class C_Table(A_Relator, Base):
    id = Column(Integer, primary_key=True)

class D_Table(A_Relator, Base):
    id = Column(Integer, primary_key=True)

# ad nauseam
Simple, but when B_Table, C_Table, etc. all have their own Relator classes, it gets very repetitive, and thus, something that should be easily solved in code.
My Solution: I made a class factory (?) that creates a mixin class to be used one time.
def related(clss, defined=False, altName=None):
    class X(Definer if defined else Relator):
        linkedClass = clss

        @classmethod
        def linkedClassFieldName(cls):
            return "{}Id".format(clss.getBackrefName())

        def linkId(cls):
            return Column(ForeignKey(clss.id))

        def linkRe(cls):
            return relationship(
                clss,
                foreign_keys=getattr(cls, "{}Id".format(clss.getBackrefName() if not altName else altName)),
                backref=cls.getBackrefName())

    setattr(X, "{}Id".format(clss.getBackrefName() if not altName else altName), declared_attr(X.linkId))
    setattr(X, "{}".format(clss.getBackrefName() if not altName else altName), declared_attr(X.linkRe))
    del X.linkId
    del X.linkRe
    return X
Which allows you to do the following and be done with it:
class B_Table(related(A_Table), Base):
    id = Column(Integer, primary_key=True)
...but this is messy and confusing, and I would guess there is a much better way to do this that leaves a lot less to uncertainty.
Question: I'm looking for a way to do this in a more direct SQLAlchemy-aligned way with less roundabout "hack". Or in summary: how do I make a generic SQLAlchemy mixin that generates a relationship?
I had a mess around with this. Not sure how well this solution will suit your needs but I did it as more of a learning exercise for myself, and if it helps for you, then great.
So with the objective to be able to have foreign keys and relationships defined on models with as little input as possible, this is what I came up with.
Here are the models that I used:
class Base:
    @declared_attr
    def __tablename__(cls):
        return cls.__name__.lower()

    @declared_attr
    def id(cls):
        return Column(Integer, primary_key=True)

    def __repr__(self):
        return f'<{type(self).__name__}(id={self.id})>'

Base = declarative_base(cls=Base)

class A_Table(Base):
    parents = []

class B_Table(Base):
    parents = ['A_Table']

class C_Table(Base):
    parents = ['A_Table', 'B_Table']
Notice the class variable parents on each model which is a sequence of strings that should be other model names that inherit from the same declarative_base instance. Foreign keys and relationships to the parent classes will be created on the class that declares them as parents.
So then leveraging off of the fact that:
Attributes may be added to the class after its construction, and they
will be added to the underlying Table and mapper() definitions as
appropriate
(see docs)
I iterate through all of the models that are defined on Base and build the required objects according to the parents it's given and plug them in.
Here's the function that does all of that:
from sqlalchemy import inspect  # this would be the only new import you'd need

def relationship_builder(Base):
    """ Finds all models defined on Base, and constructs foreign key
    columns and relationships on each as per their defined parent classes.
    """
    def make_fk_col(parent):
        """ Constructs a Column of the same type as the primary
        key of the parent and establishes it as a foreign key.
        Constructs a name for the foreign key column and attribute.
        """
        parent_pk = inspect(parent).primary_key[0]
        fk_name = f'{parent.__name__}_{parent_pk.name}'
        col = Column(
            fk_name, parent_pk.type,
            ForeignKey(f'{parent.__tablename__}.{parent_pk.name}')
        )
        return fk_name, col

    # this bit gets all the models that are defined on Base and maps them to
    # their class name.
    models = {
        cls.__name__: cls for cls in Base._decl_class_registry.values() if
        hasattr(cls, '__tablename__')
    }

    for model in models.values():
        for parentname in model.parents:
            parent = models.get(parentname)
            if parent is not None:
                setattr(model, *make_fk_col(parent))
                rel = relationship(parent, backref=model.__name__)
                setattr(model, parentname, rel)
To test, this is just at the bottom of the same module that I've got everything else defined in:
if __name__ == '__main__':
    relationship_builder(Base)
    a = A_Table(id=1)
    b = B_Table(id=1)
    c = C_Table(id=1)
    a.B_Table.append(b)
    a.C_Table.append(c)
    b.C_Table.append(c)
    print(b.A_Table)
    print(c.A_Table)
    print(c.B_Table)
    # <A_Table(id=1)>
    # <A_Table(id=1)>
    # <B_Table(id=1)>
Here's the schema it created: each model gets one foreign key column per declared parent, so B_Table gains A_Table_id, while C_Table gains both A_Table_id and B_Table_id.
This won't work for composite primary/foreign keys, but I don't think it would be too much of a stretch to get it there. If len(inspect(parent).primary_key) > 1, you'd need to build ForeignKeyConstraints and add them to the table definition, but I haven't tested that at all.
I also don't think it would be too much of a stretch to make it fully automated if you could name your models in such a manner that the subordination of a model could be inferred from the name of the model itself. Again, just thinking out loud.
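The core trick relationship_builder relies on, bolting attributes onto a class after its definition with setattr, works on any Python class. A minimal standalone sketch (no SQLAlchemy required; the names here are made up):

```python
class Model:
    pass

def attach(cls, name, value):
    # mimic relationship_builder: add a new attribute to an existing class
    setattr(cls, name, value)

# attach a plain attribute and a method after the class was defined
attach(Model, 'parent_id', 42)
attach(Model, 'describe', lambda self: f'parent_id={self.parent_id}')

m = Model()
print(m.parent_id)   # 42
print(m.describe())  # parent_id=42
```

SQLAlchemy extends this ordinary mechanism: attributes added to a mapped class after construction are picked up by the underlying Table and mapper, which is what makes the post-hoc Column and relationship() assignments work.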
