I am working with Groovy 1.7.6 and we have a class which uses a basic mixin (simplified for this purpose below):
class MixinClass {
    def self

    def init(me) {
        this.self = me
    }
}
I add this mixin to another class (also simplified) as follows:
@Mixin(MixinClass)
class FunctionEntry {
    FunctionEntry() {
        init(this)
    }
    ..
The above just passes a reference to the FunctionEntry instance to the mixin (although this could be any data).
If I now run a test that loops thousands of times calling new FunctionEntry(), I eventually get a heap OOM error (note: the loop literally just instantiates the object; it doesn't store it anywhere).
Having debugged the heap, I see the problem is MixinInMetaClass. My conclusion is that even though I have discarded the object instance, MixinInMetaClass is retaining the data and not being collected.
Has anyone come across this problem? Is this a known Groovy 1.7.6 issue? I was under the impression that mixins could hold state, but this evidence suggests that MixinInMetaClass is holding on to object data every time I create an object.
Firstly, thank you for taking the time to read and give input. It is greatly appreciated.
Question: what approach can we take to keep the same public API of a class that currently uses multiple mixins, but refactor it internally to be composed of objects that do the same work as the mixins? Autocomplete is a must, so runtime dynamics such as hacking things on via __getattr__ or similar are out. I know this depends on the runtime environment (i.e. IPython vs PyCharm etc.); for the sake of this question, assume PyCharm, which I think cannot fully leverage __dir__.
Accompanying Information:
I am writing a little assertion library in python and I have a core class which is instantiated with a value and subsequently inherits various assertion capabilities against that value via a growing number of mixin classes:
class Asserto(StringMixin, RegexMixin):
    def __init__(self, value: typing.Any, type_of: str = AssertTypes.HARD, description: typing.Optional[str] = None):
        self.value = value
        self.type_of = type_of
        self.description = description
These mixin classes offer various assertion methods for particular types; here is a quick example of one:
from __future__ import annotations


class StringMixin:
    def ends_with(self, suffix: str) -> StringMixin:
        if not self.value.endswith(suffix):
            self.error(f"{self.value} did not end with {suffix}")
        return self

    def starts_with(self, prefix: str) -> StringMixin:
        if not self.value.startswith(prefix):
            self.error(f"{self.value} did not start with {prefix}")
        return self
I would like to refactor the Asserto class to compose itself of various implementations of some sort of Assertable interface, rather than cobble together a god class with mixins; I'm likely to have 10+ mixins by the time I am finished.
Is there a way to achieve the same public-facing API as this mixin setup, so that client code has access to everything through Asserto(value).check_something(...), but using composition internally?
I could define every single method on the Asserto class and just delegate to the appropriate concrete object internally, but then I am making a massive god class anyway, and the composition feels like a pointless endeavour in that case.
For example, in client code I'd like all the current mixin methods to be available on an Asserto instance with autocomplete:
def test_something():
    Asserto("foo").ends_with("oo")
Thank you for your time. Perhaps using the mixin approach is the correct way here, but it feels kind of clunky.
I am working on a shared library for Jenkins, and I want to share some utility methods between some classes, but not all of them, so I have established some constraints:
I would like to avoid static methods, since they cannot access pipeline steps directly, and passing the pipeline instance on every call would be a pain;
I would like to avoid a singleton as well, or prefixing every method call with the util class's instance;
Since it is not supposed to be shared between all classes, I would like to avoid putting every method in its own file in the vars/ special directory, but I would like a similar behavior;
Although extending a class would be an anti-pattern, it would be acceptable, though I would like to avoid the verbose Java syntax of declaring a class with the same name as the file, since that is implicit in Groovy.
This question does solve my problem partially, although there are issues with serialization: I noted that when I use a checkpoint and a build is resumed from some stage, the instance loses all the extra methods.
This other question would have helped me fix the serialization issue; however, the author seems to have solved the root cause of his problem in a way other than what the original question was titled for.
Is there a way to extend the implicit script class in Groovy without using the class NameOfFile extends SomeOtherClass { put everything inside this block } syntax? And without resorting to inner classes?
Or else, is there a way to declare a constructor using the Groovy script syntax, analogous to the previous question?
Or even, is there a way to change the serialization behavior to install the extra methods again after deserializing?
Appendix
The script syntax works more or less like this:
Consider the content of file src/cicd/pipeline/SomePipeline.groovy:
package cicd.pipeline

// there is no need to wrap everything inside class SomePipeline,
// since it is implicit

def method() {
    // instance method; here I can access pipeline steps freely
}

def static otherMethod() {
    // static method; here it is unable to access pipeline steps
    // without an instance
}

@groovy.transform.Field
def field

def call() {
    // if the class is used as a method it will run
    this.method()
    SomePipeline.otherMethod() // or simply otherMethod() should work
    this.field = 'foo'
    println "this instance ${this.getClass().canonicalName} should be cicd.pipeline.SomePipeline"
}

// any code other than methods or variables with the @Field
// annotation will live inside an implicit run method that is
// triggered like the main method, but is not a static one
def localVar = 'foo'
println "It will not execute in the constructor since it is in run: $localVar"
println "Method: ${org.codehaus.groovy.runtime.StackTraceUtils.sanitize(new Throwable()).stackTrace[0].methodName}"
println "this instance ${this.getClass().canonicalName} should be cicd.pipeline.SomePipeline"
If I were to use the verbose Java syntax, I would have to wrap almost everything inside class SomePipeline, which is implicit in Groovy; this script syntax is what I want to keep.
I realised that this.getClass().superclass.canonicalName is groovy.lang.Script outside a Jenkins pipeline and org.jenkinsci.plugins.workflow.cps.CpsScript inside a pipeline, and based on this resource I was able to elaborate the following solution:
abstract class CustomScript extends org.jenkinsci.plugins.workflow.cps.CpsScript {
    public CustomScript() {
        // do something here; it will always execute, regardless of
        // serialization, and before everything else
    }
}

@groovy.transform.BaseScript CustomScript baseScript
That is it; it worked as expected! Of course you can elaborate this solution further to reduce repetition and avoid inner classes, but I will leave that to your imagination.
I'm trying to figure out how to serialize an object with pickle to a save file. My example is an object called World, and this object has a list (named objects) of potentially hundreds of instantiated objects of different class types.
The problem is that pickle won't let me serialize the items within the World.objects list, because they aren't instantiated as attributes of World.
When I attempt to serialize with:
with open('gsave.pkl', 'wb') as output:
    pickle.dump(world.objects, output, pickle.DEFAULT_PROTOCOL)
I get the following error:
_pickle.PicklingError: Can't pickle <class 'world.LargeHealthPotion'>:
attribute lookup LargeHealthPotion on world failed
So, my question is: what is an alternative way of storing the world.objects list items so that they are attributes of world rather than list items that don't get saved?
UPDATE
I think my issue isn't where the objects are stored, but rather that the class LargeHealthPotion (and many others) is dynamically created within the World class by operations such as this:
def __constructor__(self, n, cl, d, c, h, l):
    # initialize super of class type
    super(self.__class__, self).__init__(name=n, classtype=cl, description=d,
                                         cost=c, hp=h, level=l)

# create the object class dynamically, utilizing __constructor__ for the __init__ method
item = type(item_name,
            (eval("{}.{}".format(name, row[1].value)),),
            {'__init__': __constructor__})

# add the new object to the global _objects object to be used throughout the world
self._objects[item_name] = item(obj_name, obj_classtype, obj_description,
                                obj_cost, obj_hp, obj_level)
When this finishes, I will have a new object like <world.LargeHealthPotion object at 0x103690ac8>. I do this dynamically because I don't want to explicitly create hundreds of different classes for each type of object in my world. Instead, I create the class dynamically while iterating over the item names (with their stats) that I want to create.
This introduces a problem though, because when pickling, it can't find the static reference to the class in order to deconstruct (or reconstruct) the object, so it fails.
What else can I do? (Besides creating literal class references for each and every type of object I'm going to instantiate into my world.)
Pickle does not pickle classes; it instead relies on references to classes, which doesn't work if the class was dynamically generated. (This answer has the appropriate excerpt and bolding from the documentation.)
So pickle assumes that if your object is from the class called world.LargeHealthPotion, then that name actually resolves to the class it will be able to use when unpickling; if it doesn't, then you won't be able to reinitialize the object, since pickle doesn't know how to reference the class. There are a few ways of getting around this:
Define __reduce__ to reconstruct object
I'm not sure how to demo this method for you; I'd need much more information about your setup to suggest how to implement it, but I can describe it:
First you'd make a function or classmethod that can recreate one object from its arguments (probably the class name, instance variables, etc.). Then define __reduce__ on the object base class so that it returns that function along with the arguments to pass to it when unpickling.
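Something along these lines, as a minimal sketch (GameObjectBase, make_item_class, and the classtype attribute are assumptions about your setup, not names taken from it):

def _rebuild(classtype, state):
    # Recreate the dynamic class however the world normally does it,
    # then restore the saved attributes onto a fresh instance.
    cls = make_item_class(classtype)  # hypothetical factory you'd provide
    obj = cls.__new__(cls)
    obj.__dict__.update(state)
    return obj


class GameObjectBase:
    def __reduce__(self):
        # Pickle stores the module-level _rebuild function plus its
        # arguments instead of a reference to the dynamic class.
        return (_rebuild, (self.classtype, self.__dict__))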
Put the dynamic classes in the global scope
This is the quick and dirty solution. Assuming the class names do not conflict with other things defined in the world module, you could theoretically insert the classes into the global scope by doing globals()[item_name] = item_type, but I do not recommend this as a long-term solution since it is very bad practice.
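For example (a sketch reusing the type(...) call from your update; BaseItem stands in for the eval'd base class):

# Quick and dirty: make the name pickle looks up actually resolve.
item = type(item_name, (BaseItem,), {'__init__': __constructor__})
globals()[item_name] = item  # now world.LargeHealthPotion can be found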
Don't use dynamic classes
This is definitely the way to go in my opinion: instead of using the type constructor, just define your own class, named something like ObjectType, that:
Is not a subclass of type, so its instances are pickle-able.
When an instance of it is called, constructs a new game object that holds a reference to the object type.
So, assuming you have a class called GameObject that takes cls=<ObjectType object>, you could set up the ObjectType class something like this:
class ObjectType:
    def __init__(self, name, description):
        self.item_name = name
        self.base_item_description = description
        # other qualities common to all objects of this type

    def __call__(self, cost, level, hp):
        # other qualities that are specific to each item
        return GameObject(cls=self, cost=cost, level=level, hp=hp)
Here I am using the __call__ magic method so that it uses the same notation as classes, cls(params), to create instances; the cls=self indicates to the (abstracted) GameObject constructor that the class (type) of the GameObject is based on the ObjectType instance self. It doesn't have to be a keyword argument, but I'm not sure how else to make coherent example code without knowing more about your program.
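To make that concrete, usage might look like this (the names and stats are invented for illustration):

# One shared ObjectType per kind of item; instances are plain GameObjects,
# so pickle only needs a reference to the real GameObject class.
potion_type = ObjectType("LargeHealthPotion", "Restores a large amount of HP")
potion = potion_type(cost=100, level=5, hp=50)  # goes through ObjectType.__call__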
I'm working on this Ensime package for Atom.io (https://github.com/ensime/ensime-atom) and I've been thinking about the possibility of using Scala.js instead of writing CoffeeScript.
Atom is a web-based editor which is scripted with JS and is Node.js based. A plugin/package defines its main entry point by pointing out a JavaScript object with a few specific properties.
I figured I should start out simple and try using Scala.js to replace the simplest CoffeeScript file I have:
{View} = require 'atom-space-pen-views'

# View for the little status messages down there where messages from Ensime server can be shown
module.exports =
class StatusbarView extends View
  @content: ->
    @div class: 'ensime-status inline-block'

  initialize: ->

  serialize: ->

  init: ->
    @attach()

  attach: =>
    statusbar = document.querySelector('status-bar')
    statusbar?.addLeftTile {item: this}

  setText: (text) =>
    @text("Ensime: #{text}").show()

  destroy: ->
    @detach()
As you can see, this exports a Node.js (CommonJS) module and is a class extending a class fetched with require as well.
Sooo.
I'm thinking I'd just use Dynamic for the require dep, as I've seen on SO (How to invoke nodejs modules from scala.js?):
import js.Dynamic.{global => g}
import js.DynamicImplicits._

private[views] object SpacePen {
  private val spacePenViews = require("atom-space-pen-views")
  val view = spacePenViews.view
}
But if I wanted to type the superclass, could I just make a facade trait and use asInstanceOf?
Secondly, I wonder how I can export my class as a Node module. I found this:
https://github.com/rockymadden/scala-node/blob/master/main/src/main/coffeescript/example.coffee
Is this the right way? Do I need to do the sandboxing? Couldn't I just get module imported from global and write module.exports = _some_scala_object_?
I'm also wondering how I could extend existing js classes. The same problem as asked here, but I don't really understand the answer:
https://groups.google.com/forum/#!topic/scala-js/l0gSOSiqubs
My code so far:
private[views] object SpacePen {
private val spacePenViews = js.Dynamic.global.require("atom-space-pen-views")
type View = spacePenViews.view
}
class StatusBarView extends SpacePen.View {
override def content =
super.div()
}
gives me compile errors because I can't extend the sealed trait Dynamic. Of course.
Any pointers highly appreciated!
I'm not particularly expert in Node per se, but to answer your first question: yes, if you have a pointer to a JS object and you know the details of its type, you can pretty much always define a facade trait and use asInstanceOf to treat it as that type. That ought to work.
As for the last bit, you basically can't extend JS classes in Scala.js; it just doesn't work. The way most of us get around that is by defining implicit classes, or using implicit defs, to get the appearance of extending without actually doing so.
For example, given the JS class Foo, I can write
implicit class RichFoo(foo: Foo) {
  def method1() = { ... }
}
This is actually a wrapper around Foo, but calling code can simply call foo.method1() without worrying about that detail.
You can see this approach in action very heavily in jquery-facade, particularly in the relationship between JQuery (the pure facade), JQueryTyped (some tweaked methods over JQuery to make them work better in Scala), and JQueryExtensions (some higher-level functions built around JQuery). These are held together using implicit defs in package.scala. As far as calling code is concerned, all of these simply look like methods on JQuery.
Groovy noob here. I'm working through my first Groovy book, and it has example code where it states roughly:
"If you want a property to be a read-only property, then declare it final. This is not defining a final field but a read-only property: you can change the property from within instance methods of the defining class, but not from outside."
Here is the code I have in question, but I keep getting an error stating:
cannot modify final field 'miles' outside of constructor.
Code:
class Car {
    final miles = 0

    def getMiles() {
        println "getMiles called"
        miles
    }

    def drive(dist) { if (dist > 0) miles += dist }
}
The book says I should be able to modify miles from within the drive instance method, am I doing something wrong?
I think what they meant (I'm not sure what they actually said, if you're paraphrasing) is that there's no setter method defined, so it can't be modified from outside the class.
It is, however, still a final property, which means it can't be modified once it's set, which would be in a constructor or during the declaration.
Property and field rules
That said, see these two issues: 1628, 2752, so more exploration might be necessary, although this appears limited to local script properties.
My guess is you're using Groovy 1.7 or later, while the book targets 1.6 or earlier.
See also this SO question.