Why are some fields not created when making migrations? - python-3.x

I have these two models in models.py:
class Bid(models.Model):
    bid = models.IntegerField(default=0)
    user = models.ForeignKey(User, on_delete=models.CASCADE, related_name="bid")

    def __str__(self):
        return f"Bid of {self.bid} from {self.user}"


class AuctionListings(models.Model):
    name_of_item = models.CharField(max_length=32)
    description = models.CharField(max_length=400)
    owner = models.ForeignKey(User, on_delete=models.CASCADE, related_name="auctionlistings", default=None)
    bid = models.ForeignKey(Bid, on_delete=models.CASCADE, related_name="auctionlistings", default=None)
    is_closed = models.BooleanField(default=False, blank=True, null=True)
    url = models.CharField(max_length=800)
    watchlist = models.ManyToManyField(User, blank=True, related_name="watch_listings")
    category = models.CharField(max_length=50)
When I run makemigrations, I get:
operations = [
    migrations.CreateModel(
        name='AuctionListings',
        fields=[
            ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
            ('name_of_item', models.CharField(max_length=32)),
            ('description', models.CharField(max_length=400)),
            ('is_closed', models.BooleanField(blank=True, default=False, null=True)),
            ('url', models.CharField(max_length=800)),
            ('category', models.CharField(max_length=50)),
        ],
    ),
    migrations.CreateModel(
        name='Bid',
        fields=[
            ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
            ('bid', models.IntegerField(default=0)),
            ('user', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='bid', to=settings.AUTH_USER_MODEL)),
        ],
    ),
My question is: why did Django not create the fields "bid" and "user" specified in models.py?

The key to understanding this behaviour is the generate_created_models method, which Django uses to figure out the operations needed when generating migrations. In this method's docstring we can read the following:
"""
Find all new models (both managed and unmanaged) and **make create
operations for them as well as separate operations to create any
foreign key or M2M relationships (these are optimized later, if
possible)**.
Defer any model options that refer to collections of fields that might
be deferred (e.g. unique_together, index_together).
"""
So for fields that are foreign keys or many-to-many relationships, Django will try to optimize the operations. The two most common outcomes are:
Include the field when creating the model with migrations.CreateModel (as you can see, you have an example in your code for the Bid.user field).
Create the model with migrations.CreateModel without the field. After the creation, Django adds each new field with migrations.AddField (this is what is happening to your AuctionListings.owner, AuctionListings.bid and AuctionListings.watchlist fields, which aren't included in the migrations.CreateModel call in your code).
All in all, an important thing to note is that fields are not created on their own. Rather, models are created, and these models contain fields, which can be specified during creation or added later.
With that in mind, in your migration file, just below the lines you posted, you should see some migrations.AddField operations that account for the fields not included in migrations.CreateModel.
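The exact content depends on your app, but those trailing operations typically look something like this (a hedged sketch of what makemigrations usually emits for deferred relations, with the app label assumed, not your actual file):

    migrations.AddField(
        model_name='auctionlistings',
        name='bid',
        field=models.ForeignKey(default=None, on_delete=django.db.models.deletion.CASCADE, related_name='auctionlistings', to='yourapp.bid'),  # app label assumed
    ),
    migrations.AddField(
        model_name='auctionlistings',
        name='owner',
        field=models.ForeignKey(default=None, on_delete=django.db.models.deletion.CASCADE, related_name='auctionlistings', to=settings.AUTH_USER_MODEL),
    ),
    migrations.AddField(
        model_name='auctionlistings',
        name='watchlist',
        field=models.ManyToManyField(blank=True, related_name='watch_listings', to=settings.AUTH_USER_MODEL),
    ),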

Related

Django API: manage linked tables using multiple GETs

In my Django app I expose two tables using Django REST Framework to manage data, one table related to the other.
There is result data with this structure:
"results": [
{
"id": 175194,
"device": "f906e9db70b0cc822cb44ccd1b2b89a7",
"res_key": "b865c3125cb4ef173e55377026d94b2b",
"read_date": "2021-03-31T07:06:04.143569Z",
"unit": 2
},
{
"id": 21278,
"device": "f906e9db70b0cc822cb44ccd1b2b89a7",
"res_key": "c8a961f3ef9f8fa0ebdac3c910070055",
"read_date": "2021-03-26T15:54:04.171926Z",
"unit": 1
},
{
"id": 25173,
"device": "f906e9db70b0cc822cb44ccd1b2b89a7",
"res_key": "75126c6b2b4e78fc553ec75c7eb927ea",
"read_date": "2021-03-26T16:48:03.259185Z",
"unit": 1
},
...
and a result_details API endpoint related to results:
"results_details": [
{
"id": 1,
"id_res": 236,
"var_id": 1,
"var_val": "[41]",
"var_hash": "6f241d5445cf3031f6420de63c0a409bad527ea3"
},
{
"id": 2,
"id_res": 326,
"var_id": 1,
"var_val": "[45]",
"var_hash": "e5f03cfbed7ee88445b44ddf8e64365da310f8ec"
},
...
My models:
class Results(models.Model):
    id = models.AutoField(primary_key=True)
    device = models.ForeignKey(Device, null=True, on_delete=models.SET_NULL)
    res_key = models.SlugField(max_length=80, verbose_name="Message unique key", unique=True)
    read_date = models.DateTimeField(verbose_name="Datetime of vals readings")
    unit = models.ForeignKey(ModbusDevice, null=True, on_delete=models.SET_NULL)

    def __str__(self):
        return self.device

    class Meta:
        indexes = [
            models.Index(fields=['device', 'unit']),
        ]


class VarsResults(models.Model):
    """
    Detailed vars results table
    """
    id = models.AutoField(primary_key=True)
    id_res = models.ForeignKey(Results, related_name="mainres", on_delete=models.CASCADE)
    var_id = models.ForeignKey(ModbusVariable, null=True, on_delete=models.SET_NULL)
    var_val = models.CharField(max_length=400, blank=True)
    var_hash = models.CharField(max_length=400)

    def __str__(self):
        return self.var_hash

    class Meta:
        indexes = [
            models.Index(fields=['id_res', 'var_id']),
        ]
Here are my serializers:
class ResultsSerializer(serializers.ModelSerializer):
    # main_res = serializers.RelatedField(read_only=True)

    class Meta:
        model = Results
        fields = ['id', 'device', 'res_key', 'read_date', 'unit']


class VarsResultsSerializer(serializers.ModelSerializer):
    id_res = serializers.ReadOnlyField(source='id_res.id')

    class Meta:
        model = VarsResults
        fields = ['id', 'id_res', 'var_id', 'var_val', 'var_hash']
There is an id_res field that links to the main results data.
So, in this configuration, every time a user wants the results for a specific device, they have to execute a GET on the first endpoint filtering by device and then, inside a for loop, run n GETs on the second endpoint, passing each id to match against id_res.
If the Results API call returns 100 rows, I have to execute 100 different calls to the results_details endpoint.
Does anyone have an idea how to better manage this kind of situation from external API calls?
I want to avoid aggregating the data internally; I would rather the external apps manage the data structure themselves after the API call.
Many thanks in advance.
Django automatically adds an entity_id column for foreign keys. It's not only bad practice to add them manually, it might also break a few things.
docs
Behind the scenes, Django appends "_id" to the field name to create its database column name. In the above example, the database table for the Car model will have a manufacturer_id column. (You can change this explicitly by specifying db_column) However, your code should never have to deal with the database column name, unless you write custom SQL. You’ll always deal with the field names of your model object.
The id primary key is also added automatically.
Your model names should be singular.
class Result(models.Model):
    device = models.ForeignKey(Device, null=True, on_delete=models.SET_NULL)
    res_key = models.SlugField(max_length=80, verbose_name="Message unique key", unique=True)
    read_date = models.DateTimeField(verbose_name="Datetime of vals readings")
    unit = models.ForeignKey(ModbusDevice, null=True, on_delete=models.SET_NULL)

    def __str__(self):
        return self.device

    class Meta:
        indexes = [
            models.Index(fields=['device', 'unit']),
        ]


class VarsResult(models.Model):
    result = models.ForeignKey(Result, related_name="mainres", on_delete=models.CASCADE)
    modbus_variable = models.ForeignKey(ModbusVariable, null=True, on_delete=models.SET_NULL)
    var_val = models.CharField(max_length=400, blank=True)
    var_hash = models.CharField(max_length=400)

    def __str__(self):
        return self.var_hash

    class Meta:
        indexes = [
            models.Index(fields=['result', 'modbus_variable']),
        ]
Now if you pass a Result instance to ResultsSerializer, it will also return all of its mainres items.
class VarsResultsSerializer(serializers.ModelSerializer):
    class Meta:
        model = VarsResult
        fields = ['id', 'result', 'modbus_variable', 'var_val', 'var_hash']


class ResultsSerializer(serializers.ModelSerializer):
    mainres = VarsResultsSerializer(many=True, read_only=True)

    class Meta:
        model = Result
        fields = ['id', 'device', 'res_key', 'read_date', 'unit', 'mainres']
You can take it a step further (I'm not sure what exactly you need) and return all results from a single Device.
class DeviceSerializer(serializers.ModelSerializer):
    results = ResultsSerializer(many=True, read_only=True, source='result_set')

    class Meta:
        model = Device
        fields = ['id', 'results']
Depending on your needs, read about select_related and prefetch_related.
If you want to build a view for Result, you should use Result.objects.prefetch_related('mainres').all(), as in the sketch below.
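For example, a minimal read-only view along these lines (the view name is assumed, not from the question) lets the nested mainres serializer work without firing one extra query per result row:

from rest_framework import viewsets


class ResultViewSet(viewsets.ReadOnlyModelViewSet):
    # prefetch the reverse relation once, then serialize each result with its details
    serializer_class = ResultsSerializer
    queryset = Result.objects.prefetch_related('mainres').all()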
Use a django-rest-framework serializer to do it.
https://www.django-rest-framework.org/api-guide/serializers/
Read the docs and run some tests, and your life will be simplified with Django ;)
Change your serializer to:
class VarsResultsSerializer(serializers.ModelSerializer):
    id_res = ResultsSerializer(read_only=True)

    class Meta:
        model = VarsResults
        fields = ['id', 'id_res', 'var_id', 'var_val', 'var_hash']

How to join two tables in django and serialize the same using one serializer?

I have been learning Django and Django REST Framework for a couple of weeks, and I want to figure out how I can join two tables and serialize the joined data to return a JSON response using Django REST Framework.
I want to return the result as a JSON response:
{ 'user_id_id': 1, 'request_msg': 'Hi', 'response_msg': "Hi, Welcome" }
where the result is produced by
from django.db import connection
cursor = connection.cursor()
con = cursor.execute("SELECT backend_request_messages.user_id_id, backend_request_messages.request_msg as request_msg,backend_response_messages.response_msg as response_msg FROM backend_request_messages,backend_response_messages Where backend_request_messages.user_id_id=backend_response_messages.user_id_id=1 ")
Here is what I have tried:
# backend/models.py
class User(models.Model):
    username = models.CharField(max_length=50)
    name = models.CharField(max_length=50, blank=True, null=True)
    uid = models.CharField(max_length=12, blank=True, null=True)
    age = models.CharField(max_length=3, blank=True, null=True)
    active = models.BooleanField(default=True)

    class Meta:
        default_related_name = 'users'

    def __str__(self):
        return self.name


class Request_Messages(models.Model):
    request_msg = models.CharField(max_length=100)
    request_msg_created_at = models.DateTimeField(auto_now_add=True)
    user_id = models.ForeignKey(User, on_delete=models.CASCADE, null=True)

    class Meta:
        default_related_name = 'request_messages'

    def __str__(self):
        return self.request_msg


class Response_Messages(models.Model):
    response_msg = models.CharField(max_length=400)
    response_msg_created_at = models.DateTimeField(auto_now_add=True)
    user_id = models.ForeignKey(User, on_delete=models.CASCADE, null=True)

    class Meta:
        default_related_name = 'response_messages'

    def __str__(self):
        return self.response_msg
# backend/serializers.py
class ListSerializer(serializers.Serializer):
    user_id_id = serializers.IntegerField()
    request_msg = serializers.CharField(max_length=100)
    # request_msg_created_at = serializers.DateTimeField(read_only=True)
    response_msg = serializers.CharField()
    # response_msg_created_at = serializers.DateTimeField(read_only=True)
# backend/views.py
from rest_framework import status
from rest_framework.decorators import api_view
from rest_framework.response import Response
from .models import Response_Messages, Request_Messages, User
from .serializers import ListSerializer
from django.db import connection


@api_view(['GET', 'POST'])
def chatbot(request):
    if request.method == 'GET':
        cursor = connection.cursor()
        query_set = cursor.execute("SELECT backend_request_messages.user_id_id, backend_request_messages.request_msg as request_msg, backend_response_messages.response_msg as response_msg FROM backend_request_messages, backend_response_messages WHERE backend_request_messages.user_id_id=backend_response_messages.user_id_id=1")
        columns = [column[0] for column in query_set.description]
        results = []
        for row in query_set.fetchall():
            results.append(dict(zip(columns, row)))
        serializer = ListSerializer(results)
        return Response(serializer.data)
About serializers, you should refer to the docs (they're awesome and explain it best).
To give you a direction: I like to create a serializer for every model, and if a model is related to another model, I reference that in its serializer. That way you can easily customize the behavior for each model (although this is not the only way at all).
So, about serializing, I would do the following (notice my comments as well):
from django.contrib.auth.models import User


class User(User):
    # Your user class, except it should inherit Django's User/AbstractUser class.
    pass


class RequestMessages(models.Model):
    request_msg = models.CharField(max_length=100)
    request_msg_created_at = models.DateTimeField(auto_now_add=True)
    user = models.ForeignKey(User, on_delete=models.CASCADE, null=True, related_name='requests_msg')
    # NOTICE THE NEW RELATED NAME, WE'LL USE IT LATER.

    class Meta:
        default_related_name = 'request_messages'

    def __str__(self):
        return self.request_msg


class ResponseMessages(models.Model):
    response_msg = models.CharField(max_length=400)
    response_msg_created_at = models.DateTimeField(auto_now_add=True)
    user = models.ForeignKey(User, on_delete=models.CASCADE, null=True, related_name='responses_msg')

    def __str__(self):
        return self.response_msg


class RequestMsgSerializer(serializers.ModelSerializer):
    # Specify whatever you like...
    class Meta:
        model = RequestMessages
        fields = '__all__'  # or whatever you like to serialize


class ResponseMsgSerializer(serializers.ModelSerializer):
    class Meta:
        model = ResponseMessages
        fields = '__all__'  # or whatever you want serialized


class UserSerializer(serializers.ModelSerializer):
    # Using required=False means a user doesn't have to own messages when created.
    requests_msg = RequestMsgSerializer(many=True, required=False)
    responses_msg = ResponseMsgSerializer(many=True, required=False)

    class Meta:
        model = User
        fields = '__all__'  # same as above...
About your query: using raw SQL in Django is rare. In most cases the built-in Django ORM will do the job, and usually faster and better than hand-written SQL.
In your case, if you call a query like this, for example:
query_set = User.objects.filter(pk=request.user.pk)
the QuerySet created will hit the DB once for the user object and then X more queries for all the messages associated with that user, which is expensive.
But there is no need for a custom query with joins and the like; Django has prefetch_related and select_related.
Example:
query_set = User.objects.filter(pk=request.user.pk).prefetch_related('requests_msg')
will reduce all the queries made for the request messages associated with a user to only one! A view sketch that uses this is shown below.
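Here is a hedged sketch tying the pieces together (the view name is assumed, building on the UserSerializer above): it returns the logged-in user together with all of their request and response messages in a fixed number of queries.

from rest_framework import generics


class UserMessagesView(generics.RetrieveAPIView):
    serializer_class = UserSerializer

    def get_object(self):
        # prefetch both reverse relations so the nested serializers
        # don't trigger one query per message
        return (
            User.objects
            .prefetch_related('requests_msg', 'responses_msg')
            .get(pk=self.request.user.pk)
        )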
Recap:
I wrote a lot because I'm still learning this stuff myself, and if you can teach it to others, you've really got it!
Refer to DRF's docs about serializers (there's even a dedicated section for nested serializers) and API views; they are really great.
Refer to Django's docs about prefetch_related, select_related and queries in general; again, amazing docs that cover everything.
Don't just copy my code or anyone else's. There's no problem with that, but make sure you understand it first; if not, you're bound to get stuck with it again!

How do I enforce a ManyToMany blank=False constraint on my Django model?

I'm using Django 3 and Python 3.8. I have the model below. Note the "types" ManyToMany field, on which I set "blank" to False.
class Coop(models.Model):
    objects = CoopManager()
    name = models.CharField(max_length=250, null=False)
    types = models.ManyToManyField(CoopType, blank=False)
    addresses = models.ManyToManyField(Address)
    enabled = models.BooleanField(default=True, null=False)
    phone = models.ForeignKey(ContactMethod, on_delete=models.CASCADE, null=True, related_name='contact_phone')
    email = models.ForeignKey(ContactMethod, on_delete=models.CASCADE, null=True, related_name='contact_email')
    web_site = models.TextField()
I want to verify that a validation error occurs if I leave that field blank, so I have
@pytest.mark.django_db
def test_coop_create_with_no_types(self):
    """ Verify we can't create a coop with no types """
    coop = CoopFactory.create(types=[])
    self.assertIsNotNone(coop)
    self.assertIsNone(coop.id)
and use the following factory (with FactoryBoy) to build the model
class CoopFactory(factory.DjangoModelFactory):
    """
    Define Coop Factory
    """
    class Meta:
        model = Coop

    name = "test model"
    enabled = True
    phone = factory.SubFactory(PhoneContactMethodFactory)
    email = factory.SubFactory(EmailContactMethodFactory)
    web_site = "http://www.hello.com"

    @factory.post_generation
    def addresses(self, create, extracted, **kwargs):
        if not create:
            # Simple build, do nothing.
            return
        if extracted:
            # A list of addresses was passed in, use them
            for address in extracted:
                self.addresses.add(address)
        else:
            address = AddressFactory()
            self.addresses.add(address)

    @factory.post_generation
    def types(self, create, extracted, **kwargs):
        if not create:
            # Simple build, do nothing.
            return
        if extracted:
            # A list of types was passed in, use them
            for _ in range(extracted):
                self.types.add(CoopTypeFactory())
However, the "self.assertNone( coop.id )" assertion fails (an ID is generated). I would expect this not to happen, since I haven't specified any types. What else do I need to do to enforce my constraint, or should I be using a different constraint?
Edit: In response to @Melvyn's suggestion, I tried modifying the test to the below:
@pytest.mark.django_db
def test_coop_create_with_no_types(self):
    """ Test customer model """  # create customer model instance
    coop = CoopFactory.build(types=[])
    coop.full_clean()
    self.assertIsNotNone(coop)
    self.assertIsNone(coop.id)
but not only did I not get a validation error for the "types" field, I got validation errors for the email and phone fields, which are clearly being populated in the factory:
File "/Users/davea/Documents/workspace/chicommons/maps/web/tests/test_models.py", line 76, in test_coop_create_with_no_types
coop.full_clean()
File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/django/db/models/base.py", line 1221, in full_clean
raise ValidationError(errors)
django.core.exceptions.ValidationError: {'phone': ['This field cannot be blank.'], 'email': ['This field cannot be blank.']}
Edit: Per the answer given by @ArakkalAbu, I implemented the suggestion (https://github.com/chicommons/maps/blob/master/web/directory/serializers.py), but this test continues to pass:
@pytest.mark.django_db
def test_coop_create_no_coop_types(self):
    """ Test coop serializer model """
    name = "Test 8899"
    street = "222 W. Merchandise Mart Plaza, Suite 1212"
    city = "Chicago"
    postal_code = "60654"
    enabled = True
    postal_code = "60654"
    email = "test@example.com"
    phone = "7732441468"
    web_site = "http://www.1871.com"
    state = StateFactory()
    serializer_data = {
        "name": name,
        "types": [
        ],
        "addresses": [{
            "formatted": street,
            "locality": {
                "name": city,
                "postal_code": postal_code,
                "state": state.id
            }
        }],
        "enabled": enabled,
        "phone": {
            "phone": phone
        },
        "email": {
            "email": email
        },
        "web_site": web_site
    }
    serializer = CoopSerializer(data=serializer_data)
    assert serializer.is_valid(True), serializer.errors
You can't enforce this constraint (blank=False) at either the database level or the model level, because every m2m relation is a row in a separate through--(Django Doc) table holding a foreign key to each side of the relation.
Also, in m2m relations, the m2m items are linked by a separate operation under the hood:
Create the CoopFactory instance
Add CoopType items to Coop.types using the .add()--(django doc) or .set()--(django doc) methods
That is, you cannot create an M2M relation directly with
Coop.objects.create(name='foo', types=[1, 2, 3]) # 1,2 & 3 are PKs of `CoopType`
This statement triggers an exception saying:
TypeError: Direct assignment to the forward side of a many-to-many set is prohibited. Use types.set() instead.
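Under the hood (and in your own code) the relation therefore has to be created in two steps, roughly like this (a minimal sketch; the PKs are only for illustration):

coop = Coop.objects.create(name='foo')                      # row created, id assigned
coop.types.set(CoopType.objects.filter(pk__in=[1, 2, 3]))   # link the m2m items afterwards
# or one at a time:
# coop.types.add(CoopType.objects.get(pk=1))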
What is the best bet?
As per this comment of yours:
I'm not using this in a form though. I'm using the model by a serializer as part of the Django rest framework.
Since you are using DRF, you can validate the incoming payload.
class CoopSerializer(serializers.ModelSerializer):
    class Meta:
        model = Coop
        fields = '__all__'
        extra_kwargs = {
            'types': {
                'allow_empty': False
            }
        }

# execution
s = CoopSerializer(data={'name': 'foo coop', 'types': []})
s.is_valid(True)
s.save()

# exception
rest_framework.exceptions.ValidationError: {'types': [ErrorDetail(string='This list may not be empty.', code='empty')]}
This will help you enforce that the M2M data is required.
blank=False on a ManyToManyField is not translated into a DBMS constraint, but it will be checked (for example) during form validation.
In your unit test, you use CoopFactory.create, which does not check this logical (and non-DBMS) constraint.
See https://docs.djangoproject.com/en/3.0/ref/models/fields/#blank
Note that this is different than null. null is purely database-related, whereas blank is validation-related. If a field has blank=True, form validation will allow entry of an empty value. If a field has blank=False, the field will be required.
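To see where blank=False does take effect, here is a hedged sketch (assuming the Coop model from the question; the form itself is hypothetical) of form-level validation rejecting an empty types selection:

from django import forms


class CoopForm(forms.ModelForm):
    class Meta:
        model = Coop
        fields = ['name', 'types']


form = CoopForm(data={'name': 'test coop', 'types': []})
form.is_valid()        # False
form.errors['types']   # ['This field is required.']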

How to convert this model into a non-editable model in the Django admin?

I created a model, but I need it to be non-editable: when I visit the model in the admin panel, it should just display text, without giving any option to update the record.
models.py
class Contact(models.Model):
    name = models.CharField(max_length=160)
    email = models.EmailField()
    subject = models.CharField(max_length=255)
    message = models.TextField()

    def __str__(self):
        return self.subject
While searching I found information about readonly fields, but I still do not understand how to use them.
There are two ways to do this. One, set editable=False on all the fields; then they will not be editable anywhere (model forms and the admin site):
class Contact(models.Model):
    name = models.CharField(max_length=160, editable=False)
    email = models.EmailField(editable=False)
    subject = models.CharField(max_length=255, editable=False)
    message = models.TextField(editable=False)
Two, in the admin site, use readonly_fields:
class ContactAdmin(admin.ModelAdmin):
    readonly_fields = ('name', 'email', 'subject', 'message')

admin.site.register(Contact, ContactAdmin)
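For completeness, here is the same readonly_fields approach with its imports, registered via the @admin.register decorator (equivalent to the admin.site.register call above; a sketch assuming the Contact model shown):

from django.contrib import admin
from .models import Contact


@admin.register(Contact)
class ContactAdmin(admin.ModelAdmin):
    readonly_fields = ('name', 'email', 'subject', 'message')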

NOT NULL constraint failed: product_product.author_id

When I POST objects from Postman, I get this error:
NOT NULL constraint failed: product_product.author_id
I included Basic Auth in the Authorization section, but it still gives me the error.
models.py
class Product(models.Model):
    category = models.ForeignKey(Category, on_delete=models.CASCADE)
    name = models.CharField(max_length=200)
    brand = models.CharField(max_length=200)
    rating = models.PositiveSmallIntegerField(default=0)
    description = models.TextField()
    author = models.ForeignKey(User, on_delete=models.CASCADE)
serializers.py
class ProductSerializer(serializers.HyperlinkedModelSerializer):
    category = serializers.SlugRelatedField(queryset=Category.objects.all(), slug_field='name')
    author = serializers.ReadOnlyField(source='author.username')

    class Meta:
        model = Product
        fields = ['id', 'url', 'category', 'name', 'brand', 'rating', 'description', 'price']
Why does the NOT NULL constraint error happen? How can I solve it? Thanks in advance!
I think you should add models.create_all() in Models.py to create the database
The error happens simply because your create request is not providing a value for the author attribute of the product.
# models.py
class Product(models.Model):
    ...
    author = models.ForeignKey(User, on_delete=models.CASCADE)
How to resolve this issue depends entirely on your business logic.
Is it important for a product to have an author?
[NO] ==> then just make the foreign key null=True, blank=True.
[YES] ==> then you need to modify your creation logic a little bit.
Is the author of a product the same user who created it?
[YES], then this can easily be done by overriding your serializer's create method:
    ...
    # Inside your serializer
    def create(self, validated_data):
        validated_data['author'] = self.context['request'].user
        return super().create(validated_data)
[NO], then you have to make the serializer accept writes on the author field (see the sketch after the note below).
A small note: your ProductSerializer Meta class's fields attribute doesn't include 'author'; make sure it is added there too.
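A hedged sketch of that last option (the specific field type is an assumption, not part of the original answer): expose author as a writable primary-key field and include it in Meta.fields, so the client supplies it explicitly.

class ProductSerializer(serializers.HyperlinkedModelSerializer):
    category = serializers.SlugRelatedField(queryset=Category.objects.all(), slug_field='name')
    # client passes the author's primary key in the payload
    author = serializers.PrimaryKeyRelatedField(queryset=User.objects.all())

    class Meta:
        model = Product
        fields = ['id', 'url', 'category', 'name', 'brand', 'rating', 'description', 'author']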
