python: de-serializing objects

We have classes of this form: ClassA version 1, ClassA version 2, ClassA version 3, and so on. It is the same class, modified over time; each modification creates a new version of the class. Each object has a version attribute which records the version of the class it was derived from, e.g. ObjectA.version = 1 means it was derived from ClassA version 1.
Here is my problem: during de-serialization, I would like to use the specific version of the class that was used to create the object. For example, if I am de-serializing ObjectA with version=3, then ClassA version 3 should be used. The source code for all the different versions of the classes is stored.
It looks like I need the object first in order to know which class to use. Any ideas on how to approach this?

You have three options:
1. Custom serialisation/de-serialisation: put the version and class information first, then use it to pick the right class when reading the data back.
2. Create a "union class" which has all of the members of all of the class versions, then use that to create an instance of the appropriate class.
3. Refactor to a common base class, and have each "version" inherit from that base class.
I would recommend option 3, because then your versions can co-exist cleanly.
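Whichever option you pick, the de-serialization side ends up doing the same two-step dance: read the version first, then dispatch to the class that matches it. Here is a minimal sketch of that idea using JSON and toy stand-in classes; the class names, the registry, and the helper functions are placeholders for however your stored sources are actually organised:

import json

# Toy stand-ins for the stored class versions; in practice these would be
# loaded from wherever the versioned source code lives.
class ClassA_v1:
    def __init__(self, name):
        self.version, self.name = 1, name

class ClassA_v3:
    def __init__(self, name, extra):
        self.version, self.name, self.extra = 3, name, extra

# Registry mapping a version number to the class used to rebuild the object.
CLASS_REGISTRY = {1: ClassA_v1, 3: ClassA_v3}

def serialize(obj):
    # Write the version first, then the object's attributes.
    return json.dumps({"version": obj.version, "state": vars(obj)})

def deserialize(payload):
    # Read the version, look up the matching class, then restore the state.
    data = json.loads(payload)
    cls = CLASS_REGISTRY[data["version"]]
    obj = cls.__new__(cls)          # bypass __init__; the saved state is restored below
    obj.__dict__.update(data["state"])
    return obj

restored = deserialize(serialize(ClassA_v3("potion", extra=42)))
print(type(restored).__name__, restored.version)   # ClassA_v3 3

Keeping the version lookup explicit like this also makes it easy to decide what should happen when a payload carries a version you no longer support.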

Related

Puppet passing parameters from profile to module

I have a module "base" with an init.pp class that has some parameters, like this:
class base (
  $listen_ip      = "xx.xx.xx.xx",
  $listen_port    = 3306,
  $admin_username = 'admin',
  $admin_password = 'admin',
) {
  ...
}
Then I have created a profile "base" where I want to set some of the parameters:
class profile::base {
  class { 'base':
    listen_ip   => "xxx.xxx.xx.xx",
    listen_port => 6033,
  }
}
Then there is a secondary profile where I want to set the username and password:
class profile::department::sales::base {
  class { '::profile::base':
    admin_username => "some_user",
    admin_password => "some_pw",
  }
}
However, it's not possible to set the parameters from the "sales" profile.
The idea is that some values will always be the same for the base class and that some differ based on the department.
"However it's not possible to set the parameters from the 'sales' profile."
Not exactly. What is not allowed is using two different resource-like declarations for the same class while building any one catalog. If you use even one, then you must make certain that it is the first (or only) declaration of that class that the catalog builder evaluates.
To understand this, you need to appreciate that assigning parameter values is not the principal purpose of declarations such as the ones you are using. Their principal purpose is to specify that the class in question should be included in the catalog in the first place. In service of that goal, values are bound to all of a class's parameters at the point where its first declaration is evaluated. Thus, your two class declarations do not supplement each other; instead, they conflict with each other.
Moreover, even if the parameter values specified for class base were identical to those declared by class profile::base, Puppet would still object to all uses of class profile::department::sales::base. To simplify evaluation and to be absolutely certain of avoiding inconsistency, it implements a stronger constraint than is strictly required: only the first-evaluated declaration of any given class may be a resource-like one.
Note: the latest docs actually specify an even stronger constraint than that: "Resource-like class declarations require that you declare a given class only once." In practice, however, this is a simplification (in every version of Puppet so far released since the introduction of parameterized classes). It is likely inspired by the fact that the order in which Puppet manifests are evaluated can be difficult to predict, so if you use include-like declarations along with a resource-like declaration of the same class, in different manifests, then it can be hard to ensure that the resource-like one is always evaluated first.
"The idea is that some values will always be the same for the base class and that some differ based on the department."
For most purposes it is best to avoid resource-like class declarations altogether, relying instead on external data (Hiera) for binding values to class parameters. Hiera recognizes a hierarchy of data sources (hence the name) and supports specifying different parameters at different levels, and even overriding data from one level at a higher-priority level.
My suggestion, then, is to leverage Hiera to assign appropriate parameter values to class base. There are many ways the specifics could play out.

PyQt4: QAbstractItemModel Object to QStandardItemModel Object

I have a QtGui.QAbstractItemModel object. I'd like to create a new QtGui.QStandardItemModel object based on the QtGui.QAbstractItemModel. Because QtGui.QStandardItemModel is sub-classed from QtGui.QAbstractItemModel I should be able to copy all data from one object to another. How do you do so?
Usually you would do something like this:
data_model = QtGui.QAbstractItemModel()
new_data_model = QtGui.QAbstractItemModel(data_model)
but the Constructor does not support instantiating with that type of argument.
Any Ideas?
QAbstractItemModel is an abstract class that cannot and should not be instantiated. Its usefulness is mainly for inheritance, since it serves as the basis for any type of model, such as QStandardItemModel.
In addition, passing one model as the parent of another does not imply that the data will be copied. Finally, QAbstractItemModel lives in the QtCore submodule, not in QtGui.
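If what you actually have is a concrete model (your own QAbstractItemModel subclass, a QStringListModel, etc.), you can copy its data into a QStandardItemModel by walking the indexes yourself. Here is a minimal sketch for a flat, table-like model, assuming PyQt4 with sip API v2 so that data() returns plain Python values rather than QVariant; to_standard_item_model is just an illustrative helper name:

from PyQt4 import QtCore, QtGui

def to_standard_item_model(source):
    # Copy the top-level cells of `source` into a new QStandardItemModel.
    rows, cols = source.rowCount(), source.columnCount()
    target = QtGui.QStandardItemModel(rows, cols)
    for row in range(rows):
        for col in range(cols):
            value = source.data(source.index(row, col), QtCore.Qt.DisplayRole)
            target.setItem(row, col, QtGui.QStandardItem("" if value is None else str(value)))
    return target

Tree models would additionally need a recursive walk over each index's children, which is left out here to keep the example short.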

Saving Object State with Pickle (objects containing objects)

I'm trying to figure out how to serialize an object with Pickle to a save file. My example is an object called World and this object has a list (named objects) of potentially hundreds of instantiated objects of different class types.
The problem is that Pickle won't let me serialize the items within the World.objects list because they aren't instantiated as attributes of World.
When I attempt to serialize with:
with open('gsave.pkl', 'wb') as output:
    pickle.dump(world.objects, output, pickle.DEFAULT_PROTOCOL)
I get the following error:
_pickle.PicklingError: Can't pickle <class 'world.LargeHealthPotion'>:
attribute lookup LargeHealthPotion on world failed
So, my question is: what is an alternative way of storing the world.objects list items so that they are attributes of world rather than list items that don't get saved?
UPDATE
I think my issue isn't where the objects are stored; but rather that the class LargeHealthPotion (and many others) are dynamically created within the World class by operations such as this:
def __constructor__(self, n, cl, d, c, h, l):
    # initialize super of class type
    super(self.__class__, self).__init__(name=n, classtype=cl, description=d, cost=c,
                                         hp=h, level=l)

# create the object class dynamically, utilizing __constructor__ for __init__ method
item = type(item_name,
            (eval("{}.{}".format(name, row[1].value)),),
            {'__init__': __constructor__})

# add new object to the global _objects object to be used throughout the world
self._objects[item_name] = item(obj_name, obj_classtype, obj_description, obj_cost,
                                obj_hp, obj_level)
When this finishes, I will have a new object like <world.LargeHealthPotion object at 0x103690ac8>. I do this dynamically because I don't want to explicitly create hundreds of different classes, one for each type of object in my world. Instead, I create the class dynamically while iterating over the item names (with their stats) that I want to create.
This introduces a problem, though, because when pickling, it can't find a static reference to the class in order to deconstruct or reconstruct the object, so it fails.
What else can I do? (Besides creating literal class references for each and every type of object I'm going to instantiate into my world.)
Pickle does not pickle classes; it instead relies on references to classes, which doesn't work if the class was dynamically generated. (This answer has the appropriate excerpt, with bolding, from the documentation.)
So when your object claims to be of a class called world.LargeHealthPotion, pickle checks that that name actually resolves to a class it will be able to use when unpickling. If it doesn't, you won't be able to reinitialize the object, since pickle doesn't know how to reference its class. There are a few ways of getting around this:
Define __reduce__ to reconstruct object
I'm not sure how to demo this method to you; I'd need much more information about your setup to suggest how to implement it, but I can describe it:
First you'd make a function or classmethod that can recreate one object from its arguments (probably the class name, instance variables, etc.). Then define __reduce__ on the object base class so that it returns that function along with the arguments that need to be passed to it when unpickling.
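As a rough, hypothetical sketch (GameObject and _recreate are made-up names, and the real reconstruction function would look the dynamic class up from your stored item definitions rather than returning a bare base-class instance):

import pickle

class GameObject:
    # Hypothetical common base class for all dynamically created game objects.
    def __init__(self, classname, **attrs):
        self.classname = classname
        self.__dict__.update(attrs)

    def __reduce__(self):
        # Tell pickle: "call _recreate(classname, state) to rebuild this object".
        state = {k: v for k, v in self.__dict__.items() if k != "classname"}
        return (_recreate, (self.classname, state))

def _recreate(classname, attrs):
    # Module-level factory that pickle can find by name when unpickling.
    # In real code this would rebuild (or look up) the dynamic class first.
    return GameObject(classname, **attrs)

potion = GameObject("LargeHealthPotion", hp=50, cost=10)
clone = pickle.loads(pickle.dumps(potion))
print(clone.classname, clone.hp)   # LargeHealthPotion 50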
Put the dynamic classes in the global scope
This is the quick and dirty solution. Assuming the class names do not conflict with anything else defined in the world module, you could theoretically insert the classes into the global scope by doing globals()[item_name] = item_type, but I do not recommend this as a long-term solution since it is very bad practice.
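A self-contained toy showing why the registration makes pickle happy (make_item_class and BaseItem are illustrative names; in your code the registration would happen right after the type(...) call inside World):

import pickle

class BaseItem:
    # Stand-in for whatever base class your dynamic item classes inherit from.
    def __init__(self, name):
        self.name = name

def make_item_class(item_name):
    cls = type(item_name, (BaseItem,), {})
    # Register the class under its own name at module level so that pickle's
    # "module.ClassName" lookup succeeds when unpickling.
    globals()[item_name] = cls
    return cls

potion_cls = make_item_class("LargeHealthPotion")
potion = potion_cls("big red potion")
print(pickle.loads(pickle.dumps(potion)).name)   # big red potion

Without the globals() line, the dumps() call fails with the same kind of "attribute lookup ... failed" error as in the question.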
Don't use dynamic classes
This is definitely the way to go in my opinion: instead of using the type constructor, just define your own class, named something like ObjectType, that:
Is not a subclass of type, so its instances will be pickle-able.
When an instance of it is called, constructs a new game object that holds a reference to the object type.
So assuming you have a class called GameObject that takes cls=<ObjectType object>, you could set up the ObjectType class something like this:
class ObjectType:
    def __init__(self, name, description):
        self.item_name = name
        self.base_item_description = description
        # other qualities common to all objects of this type

    def __call__(self, cost, level, hp):
        # other qualities that are specific to each item
        return GameObject(cls=self, cost=cost, level=level, hp=hp)
Here I am using the __call__ magic method so that creating an instance uses the same notation as a class, cls(params). The cls=self argument indicates to the (abstracted) GameObject constructor that the class (type) of the GameObject is based on the ObjectType instance self. It doesn't have to be a keyword argument, but I'm not sure how else to make coherent example code without knowing more about your program.
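For completeness, a usage sketch with a stand-in GameObject (its real definition depends on your program); everything pickles cleanly because every class involved is an ordinary module-level class:

import pickle

class GameObject:
    # Stand-in game object; just records its object type and per-item stats.
    def __init__(self, cls, cost, level, hp):
        self.object_type, self.cost, self.level, self.hp = cls, cost, level, hp

# ObjectType is the class defined just above.
potion_type = ObjectType("LargeHealthPotion", "Restores a lot of HP")
potion = potion_type(cost=10, level=1, hp=50)      # ObjectType.__call__ builds a GameObject

clone = pickle.loads(pickle.dumps(potion))
print(clone.object_type.item_name, clone.hp)       # LargeHealthPotion 50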

One class in same package does not bind to JAXB context

I am using JAXBContext.newInstance("com.jaxbgen") to bind the classes in this package, and then using this context to create a marshaller.
Strangely, one entity class, xxx, in this package cannot be marshalled; it throws a JAXBException saying "xxx nor any of its super class is known to this context".
All the other classes work fine.
If I use JAXBContext.newInstance(xxx.class) to initialize the context instead, it works fine.
But I need to use the package name so that I can marshal all the classes in this package.
Could anyone help me on it?
When the package name is used to create a JAXBContext, the JAXB implementation does one of the following:
Looks for a class called ObjectFactory and then transitively pulls in all referenced classes.
Looks for a text file called jaxb.index, which contains a list of short class names (not package qualified), one per line. These classes and all referenced classes are then processed.

Add constraints to properties of Groovy class (not Grails domain class!)

How can we add some common constraints (e.g. maxLength, nullable) to a property of a Groovy class? I know we can do it in a Grails domain class, but is it possible for a plain Groovy class (I use it as a DTO class for my Grails project)?
Thank you so much!
You can add constraints to command classes. If a command class is in the same .groovy file as a controller (in Groovy you can have more than one public class in each .groovy file), you don't need to do anything special for Grails to recognise it as a command class.
However, if your command class is somewhere else (e.g. under src/groovy), you need to annotate it with @Validateable and add its package name to the grails.validateable.packages parameter in Config.groovy. Here's an example of a command class that's not in the same file as a controller:
package com.example.command

@Validateable
class Person {
    Integer age
    String name

    static constraints = {
        name(blank: false)
        age(size: 0..100)
    }
}
Add the following to Config.groovy
grails.validateable.packages = ['com.example.command']
Command classes have a validate() method added by Grails. After this method is called, any errors will be available in the errors property (as per domain classes).
Using a grails Command Object is probably your best bet. It has constraints and validation, but no database backing. It's normally a value object that controllers use, but you could instantiate one outside of a controller without any problems.
Not sure if this is relevant to your use (I am not familiar with DTOs), but in the current version (2.3.8), you can also add Grails constraints to an abstract class, and they will be inherited by the domains that extend it. Your IDE might not like it though ;)
