Templating Python class-level attributes to create generic REST serializers

I'm using the Django REST Framework and would like to serialize different types using the same format: a list of all instances of a specific type as well as a single selected instance.
My problem is that I have to write a different serializer class for every type that I want to serialize. In C++ I'd solve this by passing the type and its serializer as template arguments. How can I do this in Python?
The generic Object I'd like to serialize:
class OptionSelect(object):
    def __init__(self, options, selected):
        self.options = options
        self.selected = selected
What I currently need to serialize it:
class TypeAOptionSerializer(serializers.Serializer):
    options = TypeASerializer(many=True)
    selected = TypeASerializer()

class TypeBOptionSerializer(serializers.Serializer):
    options = TypeBSerializer(many=True)
    selected = TypeBSerializer()

class TypeCOptionSerializer(serializers.Serializer):
    options = TypeCSerializer(many=True)
    selected = TypeCSerializer()
Instead I'd like to create a Serializer like this:
class OptionSerializer(serializers.Serializer):
    options = serializer(many=True)
    selected = serializer()

    def __init__(self, serializer):
        self.serializer = serializer
        super().__init__()
Is there maybe a different approach that I should be taking?

You can try the following:
def create_serializer(serializer):
    class MySerializer(serializers.Serializer):
        options = serializer(many=True)
        selected = serializer()
    return MySerializer

TypeAOptionSerializer = create_serializer(TypeASerializer)
TypeBOptionSerializer = create_serializer(TypeBSerializer)
TypeCOptionSerializer = create_serializer(TypeCSerializer)
This should be equivalent to your current approach with three separate classes.
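If it helps, here is a minimal usage sketch; the TypeA model, its queryset, and the view below are illustrative assumptions, not part of the question:

from rest_framework.decorators import api_view
from rest_framework.response import Response

TypeAOptionSerializer = create_serializer(TypeASerializer)

@api_view(['GET'])
def type_a_options(request):
    qs = TypeA.objects.all()                          # assumed model/queryset
    selection = OptionSelect(options=qs, selected=qs.first())
    serializer = TypeAOptionSerializer(selection)
    return Response(serializer.data)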

Related

Accessing variables from a method in class A and using them in class B in Python 3.5

I have a BaseClass and two classes (Volume and testing) which inherit from it. The class Volume uses a method driving_style from another Python module. I am trying to write another method, test_score, which needs access to variables computed in driving_style so it can do further computation; those results will then be used by the class testing, as shown below.
from training import Accuracy
import ComputeData
import model

class BaseClass(object):
    def __init__(self, connections):
        self.Type = 'Stock'
        self.A = connections.A
        self.log = self.B.log

    def getIDs(self, assets):
        ids = pandas.Series(assets.ids, index=assets.B)
        return ids

class Volume(BaseClass):
    def __init__(self, connections):
        BaseClass.__init__(self, connections)
        self.daystrade = 30
        self.high_low = True

    def learning(self, data, rootClass):
        params.daystrade = self.daystrade
        params.high_low = self.high_low
        style = Accuracy.driving_style()
        return self.Object(data.universe, style)

class testing(BaseClass):
    def __init__(self, connections):
        BaseClass.__init__(self, connections)

    def learning(self, data, rootClass):
        test_score = Accuracy.test_score()
        return self.Object(data.universe, test_score)

def driving_style(date, modelDays, params):
    daystrade = params.daystrade
    high_low = params.high_low
    DriveDays = model.DateRange(date, params.daystrade)
    StopBy = ComputeData.instability(DriveDays)
    if high_low:
        style = ma.average(StopBy)
    else:
        style = ma.mean(StopBy)
    return style

def test_score(date, modelDays, params):
    # want to access the following from the method driving_style:
    DriveDays = ...
    StopBy = ...
    return test_score  # which I compute using the values DriveDays and StopBy, and
                       # use in the method learning inside the class 'testing',
                       # which inherits some params from the BaseClass
You can't use locals from a call to a function that was made elsewhere and has already returned.
A bad solution is to store them as globals that you can read from later (but that get replaced on every new call). A better solution might be to return the relevant info to the caller along with the existing return values (return style, DriveDays, StopBy) and somehow get it to where it needs to go. If necessary, you could wrap the function into a class and store the computed values as attributes on an instance of the class, while keeping the return type the same.
But the best solution is probably to refactor, so the stuff you want is computed by dedicated methods that you can call directly from test_score and driving_style independently, without duplicating code or creating complicated state dependencies.
In short, basically any time you think you need to access locals from another function, you're almost certainly experiencing an XY problem.
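As a rough sketch of that refactor (the helper name compute_drive_metrics is made up for illustration; model, ComputeData, and ma are assumed to behave as in the question's snippet):

def compute_drive_metrics(date, params):
    # the shared computation lives in one place that both callers use
    drive_days = model.DateRange(date, params.daystrade)
    stop_by = ComputeData.instability(drive_days)
    return drive_days, stop_by

def driving_style(date, modelDays, params):
    drive_days, stop_by = compute_drive_metrics(date, params)
    return ma.average(stop_by) if params.high_low else ma.mean(stop_by)

def test_score(date, modelDays, params):
    drive_days, stop_by = compute_drive_metrics(date, params)
    score = ...  # placeholder: compute the score from drive_days and stop_by
    return score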

add dynamic field to serializer class

In my serializer class I have defined two properties, and a third property can be derived from those two. Please see the code below:
class ItemNameSerializer(NestedCreateUpdateMixin, ModelSerializer):
    nested_child_field_name = 'attribute_names'
    nested_child_serializer = AttributeNameSerializer
    attribute_names = AttributeNameSerializer(many=True)

    class Meta:
        model = ItemName
        fields = '__all__'
From the above code we can see that
attribute_names = AttributeNameSerializer(many=True)
could be derived as
[nested_child_field_name] = nested_child_serializer(many=True)
So my question is: can I add a dynamic field derived from the other fields, to avoid writing redundant code? If yes, then how?
The possible solutions could be of two types:
A. overriding some ModelSerializer method.
B. a generalized solution for any Python class.
Please try to provide both types of solutions if possible (and maybe solutions of some other type, too?).
Well, I found the answer myself.
The serializer-specific answer:
It turns out Django REST Framework initialises its fields from a deepcopy of the declared fields (not directly relevant here), but you can override the serializer's __init__ method and add the field to self.fields. In my case I did it in the NestedCreateUpdateMixin, where nested_child_field_name and nested_child_serializer are already available.
Please see the following code:
def __init__(self, *args, **kwargs):
    super(NestedCreateUpdateMixin, self).__init__(*args, **kwargs)
    self.fields[self.nested_child_field_name] = self.nested_child_serializer(many=True)
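For context, a minimal sketch of what the surrounding mixin and serializer might look like; everything beyond the names given in the question (and the exact mixin body) is an assumption for illustration:

from rest_framework import serializers

class NestedCreateUpdateMixin:
    # concrete serializers are expected to define these two class attributes
    nested_child_field_name = None
    nested_child_serializer = None

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        # inject the nested list field under the configured name
        self.fields[self.nested_child_field_name] = self.nested_child_serializer(many=True)

class ItemNameSerializer(NestedCreateUpdateMixin, serializers.ModelSerializer):
    nested_child_field_name = 'attribute_names'
    nested_child_serializer = AttributeNameSerializer  # assumed to exist

    class Meta:
        model = ItemName  # assumed model
        fields = '__all__'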

How to add members to a class dynamically using options as constructor args in Python?

I want to create a Python class that dynamically builds its members in the constructor based on an options dict.
See the following sample code that I am trying to get working.
options = {
    'param1': 'v1',
    'param2': 2,
    'param3': bool,
    'param4': {},
}

class MyCustomObject:
    def __init__(self, options):
        self.props = {}
        for key in options.keys():
            self.props[key] = options[key]

a1 = MyCustomObject(options)

# I want to access the attributes of props using dot notation:
a1.param1 = 10
print(a1.param1)
If I had known which options I needed to add to my object when defining the class, then I could have added the following to the class definition:

@property
def param1(self):
    return self.props['param1']

@param1.setter
def param1(self, value):
    self.props['param1'] = value
How can I achieve the same behavior dynamically, i.e. providing the options during object instantiation?
You can make use of Python's built-in setattr() function:
setattr(object, name, value)
The setattr() function sets the value of a named attribute on an object.
options = {'param1': 'v1', 'param2': 2, 'param3': "bool", 'param4': {}}

class MyCustomObject:
    def __init__(self, options):
        for key in options.keys():
            setattr(self, key, options[key])

a1 = MyCustomObject(options)
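For completeness, a quick check (not in the original answer) that the attributes set via setattr() are now reachable with dot notation:

print(a1.param1)   # 'v1'
a1.param1 = 10     # plain assignment works from here on as well
print(a1.param1)   # 10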

how to group homogeneous properties in a python class

Using Python, I'm creating a class with some properties that can be divided into homogeneous groups.
In some other languages (like C) I liked to use structures to group fields belonging to the same "topic", to keep the code clean.
For example, let's say that I would like to group all the fields related to the configuration of my program, like filepath, username, version, etc., under a config property.
Folks, what do you do/use to manage this kind of data?
Here is an extract of the class I wrote, but it doesn't work, because assigning to nested attributes like this isn't supported out of the box:
...
...
self.config.filepath = ''
self.config.username = ''
self.config.version = ''
...
...
What is the most elegant way, or the best practice, to handle this situation?
Many thanks to all.
There are a few different ways:
Using a dict:
class MyClass:
    def __init__(self):
        self.config = {}
        self.config['filepath'] = ''
        self.config['username'] = ''
        self.config['version'] = ''
Using argparse.Namespace:
from argparse import Namespace

class MyClass:
    def __init__(self):
        self.config = Namespace(filepath='', username='', version='')
Using types.SimpleNamespace:
from types import SimpleNamespace

class MyClass:
    def __init__(self):
        self.config = SimpleNamespace(filepath='', username='', version='')
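A quick usage sketch (not from the original answer) showing the dot access the question was after, using the SimpleNamespace variant:

obj = MyClass()
obj.config.filepath = '/tmp/settings.ini'   # illustrative value
print(obj.config.filepath)                  # '/tmp/settings.ini'
print(obj.config.username, obj.config.version)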

How to inherit from a class with __init__ and take multiple input parms with default values

I have two classes as follows:
class Buttons():
    def __init__(self, dDict):
        self.TCT = Tool()
        self.dDict = dDict

    def btnName(self):
        # I will use both self.TCT and self.dDict here
        a = self.dDict['setup']

class Switch(Buttons):
    def __init__(self, iButtonType, sButtName=None):
        self.TCT = Tool()
        self.iButtonType = iButtonType
        # sButtName being used below ...
I need to instantiate the Switch class, but I also need to provide dDict so that the Buttons class has the data it is looking for. What is the right way to instantiate Switch? I'm not sure how to handle multiple input parameters with default values.
The other suggested solution is not an exact match: both of my classes take input parameters, and one of them has a default value set for its parameter.
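A minimal sketch of one common pattern, assuming an empty dict is an acceptable default for dDict and that Tool is importable (both assumptions, not stated in the question): Switch accepts dDict alongside its own parameters and forwards it to Buttons via super().__init__().

class Buttons:
    def __init__(self, dDict=None):
        self.TCT = Tool()                 # Tool is assumed to be importable
        self.dDict = dDict if dDict is not None else {}

    def btnName(self):
        # uses both self.TCT and self.dDict
        return self.dDict.get('setup')

class Switch(Buttons):
    def __init__(self, iButtonType, sButtName=None, dDict=None):
        super().__init__(dDict)           # lets Buttons set up TCT and dDict
        self.iButtonType = iButtonType
        self.sButtName = sButtName

sw = Switch(iButtonType=1, dDict={'setup': 'fast'})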
