Node.js Garbage Collection

Does Node.js do automatic garbage collection?
var objUser = new Object();
objUser.userName = objReq.userName;
userDB.registerUser(objUser, callback);
In the code above, objUser is passed as an argument to another class and is no longer required in the current class. Do I still have to release it explicitly, or will it be collected automatically?
If I have to do it manually, will assigning null help, or is there another mechanism provided by Node?
objUser = null;

Node does garbage collection, but if userDB.registerUser() retains a reference to it, your objUser will not be collected. An object is collected only when no references to it remain. You usually don't need to explicitly release local references by assigning null to the variable: when your function returns, all local references are released automatically. You only need to worry about long-lived (e.g. global) references to your object.
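For example (a minimal sketch, not the asker's actual code; the cache array here just stands in for whatever userDB.registerUser might hold on to):

var cache = []; // a long-lived (e.g. module-level) reference

function registerAndRetain(objReq) {
    var objUser = { userName: objReq.userName };
    cache.push(objUser); // something outside the function still references the object,
                         // so it stays alive after registerAndRetain() returns
}

function registerAndForget(objReq) {
    var objUser = { userName: objReq.userName };
    console.log(objUser.userName);
    // no outside reference is kept; once the function returns, the object is
    // unreachable and V8 collects it on its own - no objUser = null needed
}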

Also worth noting on this subject: it's been my experience that objects of the same type will reuse instances. So, if you truly want a "new instance" of an object, make sure you nullify or reset any attributes in your constructor.

Related

Service Fabric - Stateful Service Persistence

I am new to Service Fabric and started by looking at the MSDN articles covering the topic. I began by implementing the Hello World sample here.
I changed their original RunAsync implementation to:
var myDictionary = await this.StateManager.GetOrAddAsync<IReliableDictionary<int, DataObject>>("myDictionary");

while (!cancellationToken.IsCancellationRequested)
{
    DataObject dataObject;

    using (var tx = this.StateManager.CreateTransaction())
    {
        var result = await myDictionary.TryGetValueAsync(tx, 1);

        if (result.HasValue)
            dataObject = result.Value;
        else
            dataObject = new DataObject();

        dataObject.UpdateDate = DateTime.Now;

        //ServiceEventSource.Current.ServiceMessage(
        //    this,
        //    "Current Counter Value: {0}",
        //    result.HasValue ? result.Value.ToString() : "Value does not exist.");

        await myDictionary.AddOrUpdateAsync(tx, 1, dataObject, ((k, o) => dataObject));
        await tx.CommitAsync();
    }

    await Task.Delay(TimeSpan.FromSeconds(1), cancellationToken);
}
I also introduced a DataObject type and have exposed an UpdateDate property on that type.
[DataContract(Namespace = "http://www.contoso.com")]
public class DataObject
{
    [DataMember]
    public DateTime UpdateDate { get; set; }
}
When I run the app (F5 in Visual Studio 2015), a dataObject instance (keyed as 1) is not found in the dictionary, so I create one, set UpdateDate, add it to the dictionary, and commit the transaction. During the next loop, it finds the dataObject (keyed as 1), sets UpdateDate, updates the object in the dictionary, and commits the transaction. Perfect.
Here's my question: when I stop and restart the service project (F5 in Visual Studio 2015), I would expect the dataObject (keyed as 1) to be found on the first iteration of RunAsync, but it isn't. I would expect all state to have been flushed to its replica.
Do I have to do anything for the stateful service to flush its internal state to its primary replica?
From what I've read, it sounds as though all of this is handled by Service Fabric and that calling Commit on the transaction is sufficient. If I locate the primary replica (in Service Fabric Explorer -> Application View), I can see that the RemoteReplicator_xxx LastACKProcessedTimeUTC is updated once I commit the transaction (when stepping through).
Any help is greatly appreciated.
Thank you!
-Mark
This is a function of the default local development experience in Visual Studio. If you watch the Output window closely after hitting F5, you'll see a message to this effect: the deployment script detects that there's an existing app of the same type and version already registered, so it removes it and deploys the new one. In doing that, the data associated with the old application is removed.
You have a couple of options to deal with this.
In production, you would perform an application upgrade to safely roll out the updated code while maintaining the state. But constantly updating your versions while doing quick iteration on your dev box can be tedious.
An alternative is to flip the project property "Preserve Data on Start" to "Yes". This will automatically bump all versions of the generated application package (without touching the versions in your source) and then perform an app upgrade on your behalf.
Note that because of some of the system checks inherent in the upgrade path, this deployment option is likely to be a bit slower than the default remove-and-replace. However, when you factor in the time it takes to recreate the test data, it's often a wash.
You need to think of a ReliableDictionary as holding collections of objects as opposed to a collection of references. That is, when you add an "object" to the dictionary, you must think of it as handing the object off completely; you must not alter the object's state after that. When you ask the ReliableDictionary for an "object", it gives you back a reference to its internal object. The reference is returned for performance reasons, and you are free to READ the object's state. (It would be great if the CLR supported read-only objects, but it doesn't.) However, you MUST NOT MODIFY the object's state (or call any methods that would modify it), because you would be modifying the internal data structures of the dictionary and corrupting its state.
To modify the object's state, you MUST make a copy of the object pointed to by the returned reference. You can do this by serializing/deserializing the object or by some other means (such as creating a whole new object and copying the old state to the new object). Then, you write the NEW OBJECT into the dictionary. In a future version of Service Fabric, we intend to improve ReliableDictionary's APIs to make this required pattern of use more discoverable.
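For illustration, here is a minimal sketch of that copy-before-write pattern applied to the loop from the question (it assumes the same myDictionary and DataObject as above; it illustrates the advice rather than being official Service Fabric sample code):

using (var tx = this.StateManager.CreateTransaction())
{
    var result = await myDictionary.TryGetValueAsync(tx, 1);

    // Never mutate result.Value directly. Build a NEW object,
    // copying over whatever existing state you want to keep.
    var updated = new DataObject
    {
        UpdateDate = DateTime.Now
        // e.g. SomeOtherMember = result.HasValue ? result.Value.SomeOtherMember : defaultValue
    };

    await myDictionary.AddOrUpdateAsync(tx, 1, updated, (k, old) => updated);
    await tx.CommitAsync();
}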

Storing an object in an Esent PersistentDictionary gives: "Not supported for SetColumn" parameter error

I am trying to save an Object which implements an Interface say IInterface.
private PersistentDictionary<string, IInterface> Object = new PersistentDictionary<string, IInterface>(Environment.CurrentDirectory + @"\Object");
Since many classes implement the same interface (all of which need to be cached), for a generic approach I want to store an object of type IInterface in the dictionary,
so that anywhere I can pull that object out, cast it to IInterface, and use the object's internal implementation of its methods.
But as soon as the Esent cache is initialized, it throws this error:
Not supported for SetColumn
Parameter name: TColumn
Actual value was IInterface.
I have tried using XmlSerializer to do the same, but it is unable to deserialize an interface type. Also, the [Serializable] attribute cannot be applied to an interface, so I am stuck.
As a last-ditch attempt I have also marked all the implementations (classes) of the interface as [Serializable], but to no avail.
Does anyone know a way out? Thanks in advance!
The only reason that only structs are supported (as well as some basic immutable classes such as string) is that the PersistentDictionary is meant to be a drop-in replacement for Dictionary, SortedDictionary and other similar classes.
Suppose I have the following code:
class MyClass
{
    public int val;
}

// ...

var dict = new Dictionary<int, MyClass>();
var x = new MyClass();
x.val = 1;
dict.Add(0, x);
x.val = 2;
var y = dict[0];
Console.WriteLine(y.val);
The output in this case would be 2. But if I'd used the PersistentDictionary instead of the regular one, the output would be 1. The class was created with value 1, and then changed after it was added to the dictionary. Since a class is a reference type, when we retrieve the item from the dictionary, we will also have the changed data.
Since the PersistentDictionary writes the data to disk, it cannot really handle reference types this way. Serializing it, and writing it to disk is essentially the same as treating the object as a value type (an entire copy is made).
Because it's intended to be used instead of the standard dictionaries, and the fact that it cannot handle reference types with complete transparency, the developers instead opted to support only structs, because structs are value types already.
However, if you're aware of this limitation and promise to be careful not to fall into this trap, you can allow it to serialize classes quite easily. Just download the source code and compile your own version of the EsentCollections library. The only change you need to make to it is to change this line:
if (!(type.IsValueType && type.IsSerializable))
to this:
if (!type.IsSerializable)
This will allow classes to be written to the PersistentDictionary as well, provided the class is serializable and its members are serializable too. A nice side benefit is that it also lets you store arrays this way. All you have to keep in mind is that it's not a real in-memory dictionary: when you write an object to it, it stores a copy of the object. Therefore, updating one of your object's members after adding it to the PersistentDictionary will not automatically update the copy in the dictionary; you need to remember to write it again manually.
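A small sketch of what that means in practice (this assumes the patched library described above and a MyClass marked [Serializable] with a public val field; the store name is arbitrary):

using Microsoft.Isam.Esent.Collections.Generic;

var dict = new PersistentDictionary<int, MyClass>("MyClassStore");

var x = new MyClass { val = 1 };
dict[0] = x;                      // a serialized copy of x is written to the store

x.val = 2;                        // changes only the in-memory object, not the stored copy
Console.WriteLine(dict[0].val);   // still prints 1

dict[0] = x;                      // write the object again to persist the change
Console.WriteLine(dict[0].val);   // now prints 2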
PersistentDictionary can only store value-structs and a very limited subset of classes (string, Uri, IPAddress). Take a look at ColumnConverter.cs, at private static bool IsSerializable(Type type) for the full restrictions. You'd be hitting the typeinfo.IsValueType() restriction.
By the way, you can also try posting questions about PersistentDictionary at http://managedesent.codeplex.com/discussions .
-martin

SSJS global variable does not seem to work

I declared and used a global variable in an SSJS library as below:
var backendDoc:NotesDocument = null;

function savedata() {
    print(backendDoc.getItemValueString("fieldname")); // crash here
}
I assigned a document object to it in the Edit button, just after changing the document mode from read to edit:
backendDoc = document1.getDocument(); // get backend document from datasource called document1
The code in the above function returns the error "NotesDocument.getItemValueString("string") null". Apparently, backendDoc is null.
Any ideas on how to assign a value to and use a global variable in an SSJS library? Thanks in advance.
There are 2 problems with your code:
As Michael pointed out, you should use a scoped variable. Global variables in script libraries are actually application-global (think applicationScope) and might be unloaded at any time if memory gets tight (their exact behavior depends on the XPages version).
You can't use Notes objects here. Between calls, the C object that backs the JS object is released and your object becomes invalid.
You can either store the NoteID in a scoped variable and retrieve the NotesDocument every time, or use a JSON structure to keep the values you are interested in and only read/write the document when actually needed (in the load/save events). Hope this helps.
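A minimal SSJS sketch of the first option (the scope, variable name, and the document1 data source are placeholders; it stores the UNID rather than the NoteID, which works the same way):

// In the Edit button: keep only the document's UNID in a scope, never the NotesDocument itself
viewScope.put("backendDocUnid", document1.getDocument().getUniversalID());

// In the script library: re-fetch the backend document from the UNID whenever it is needed
function savedata() {
    var doc:NotesDocument = database.getDocumentByUNID(viewScope.get("backendDocUnid"));
    print(doc.getItemValueString("fieldname"));
}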
I think you have to use a scoped variable in which you store the universal ID of the document. This can then be used in any script to initialize the backend document.
From SSJS you can set a scoped variable using the put method and read it using the get method. Example of setting and reading a scoped variable in the session scope:
sessionScope.put("myvar", "myvalue")
sessionScope.get("myvar")
To learn more about scoped variables, watch this:
http://notesin9.com/index.php/2009/11/07/episode-4-intro-to-scoped-variables/

mongoose and lazy-initialized attributes not executing in the correct scope

I'm using mongoose for my data access layer, and I really like the different features it offers for creating document models (attributes, methods, static methods, etc.).
I use the virtual attribute feature of mongoose to create attributes that will not be persisted to MongoDB. However, these attributes are computationally expensive (and using them many times is not helping me).
Let's take, for example, the same example from the mongoose virtuals documentation:
it persists person.name.first and person.name.last, and uses a virtual attribute for person.name.full.
Let's say I want to compute person.name.full only once per lifetime of the document (and, if I allow setting the attribute or its dependent fields as in the example, then also on every get after a dirty set).
I need an extra variable in the document's scope, so naturally I used a closure for this, but the 'this' scope in the function that computes the attribute is the global object, not the document I'm working on.
Code:
var makeLazyAttribute = function(computeAttribute) {
    var attribute = null;
    return function() {
        if (!attribute) {
            attribute = computeAttribute();
        }
        return attribute;
    };
};

MySchema.virtual('myAttribute').get(makeLazyAttribute(function () {
    // some code that uses this, this should be the document I'm working on
    // error: first is not defined, inspecting what is this gives me the global object
    return this.first + this.last;
}));
Please help!
Well, OK, I've made some progress: the getter returned by makeLazyAttribute does execute in the document scope, so I only needed to change attribute = computeAttribute(); to attribute = computeAttribute.call(this);.
However, now I'm only remembering the first ever computeAttribute() invocation, instead of remembering the first invocation per document.
There must be a way to mitigate this.
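One way to get per-document memoization (a sketch of my own, not something from the original thread) is to cache the computed value on the document itself instead of in the shared closure; the _lazyCache property name is an arbitrary choice:

var makeLazyAttribute = function(name, computeAttribute) {
    return function() {
        // the cache lives on the document instance, so every document memoizes its own value
        this._lazyCache = this._lazyCache || {};
        if (!(name in this._lazyCache)) {
            this._lazyCache[name] = computeAttribute.call(this);
        }
        return this._lazyCache[name];
    };
};

MySchema.virtual('myAttribute').get(makeLazyAttribute('myAttribute', function() {
    return this.first + this.last;
}));

Since _lazyCache is not declared in the schema, it should stay a plain in-memory property and not be persisted with the document.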

Groovy domain objects in a db4o database

I'm using db4o with Groovy (actually Griffon). I'm saving dozens of objects into a db4o ObjectSet and see that the .yarv file size is about 11 MB. I checked its content and found that it stores the metaClass, with all nested fields, in every object. That's a waste of space.
I'm looking for a way to avoid storing the metaClass and thereby reduce the size of the resulting .yarv file, since I'm going to use db4o to store millions of entities.
Should I try the callConstructors(true) db4o configuration? Do you think it would help?
Any help would be highly appreciated.
As an alternative you can just store Groovy-bean instances. Those are compiled down to regular Java-ish classes with no special Groovy-specific code attached to them.
Just like this:
class Customer {
    // properties
    Integer id
    String name
    Address address
}

class Address {
    String street
}

def customer = new Customer(id: 1, name: "Gromit", address: new Address(street: "Fun"))
I don't know Groovy, but based on your description every Groovy object carries metadata, and you want to skip storing those metadata objects.
If that is the case, installing a "null translator" (the TNull class) will cause the "translated" objects to not be stored.
PS: the callConstructors configuration has no effect on what gets stored in the database; it only affects how objects are instantiated when they are read back from the database.
Hope this helps
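A rough sketch of what installing such a translator might look like (this is a guess at the wiring, not code from the answer above: the db4o configuration API differs between versions, and the exact class to translate, e.g. groovy.lang.MetaClassImpl, would need to be verified against what actually ends up in your .yarv file):

import com.db4o.Db4o
import com.db4o.config.TNull

def config = Db4o.newConfiguration()
// "translate" metaclass objects to nothing, i.e. do not store them
config.objectClass("groovy.lang.MetaClassImpl").translate(new TNull())

def container = Db4o.openFile(config, "data.yarv")
try {
    // store and query your domain objects as usual
} finally {
    container.close()
}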
