How to define a generic map in proto file for entries in map<string, Object> in java? - protobuf-java

I'm trying to write a proto file with a generic map so that when I fetch data from the database in JSON format, it can be converted to proto format. Is there any way to do this?
Here is the image of the entries present in the database for which the proto file needs to be written.
For example: I need to write a generic map in the proto file in such a way that all the entries under "PROPERTY_CATEGORY" are converted to proto.

You can use either an Any or a Struct to accomplish this:
map<string, google.protobuf.Any>
Struct https://github.com/protocolbuffers/protobuf/blob/main/src/google/protobuf/struct.proto#L51
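If you go the Struct route, a minimal Java sketch could look like this (assuming the protobuf-java-util artifact is on the classpath; the sample JSON string is hypothetical and stands in for one PROPERTY_CATEGORY row fetched from the database):

import com.google.protobuf.Struct;
import com.google.protobuf.util.JsonFormat;

public class StructDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical JSON for one PROPERTY_CATEGORY entry fetched from the database
        String json = "{\"name\": \"categoryA\", \"enabled\": true, \"priority\": 5}";

        // Struct is essentially a map<string, Value>, so any JSON object fits into it
        Struct.Builder builder = Struct.newBuilder();
        JsonFormat.parser().merge(json, builder);
        Struct properties = builder.build();

        // The parsed fields are available as a generic Map<String, Value>
        System.out.println(properties.getFieldsMap());
    }
}

The map<string, google.protobuf.Any> route works too, but Any expects each value to be a packed protobuf message rather than raw JSON, so Struct is usually the simpler fit for free-form database entries.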

Related

No insertion order is preserved while converting from XML to JSON using org.json.XML.toJSONObject(xmlString)

I am using a dynamic data structure for my project. So instead of a predefined class I am using java.util.LinkedHashMap to store my dynamic data and preserve my insertion order as well.
I am able to convert the map to JSON and get the map back from JSON using Jackson's ObjectMapper:
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;

ObjectMapper mapper = new ObjectMapper();
LinkedHashMap<String, Object> map =
    mapper.readValue(json, new TypeReference<LinkedHashMap<String, Object>>() {});
String jsonAgain = mapper.writeValueAsString(map);
I am trying to do some XSLT transformations on my map data, so I also need to transform from XML to map and from map to XML. As there is no direct method to convert these, I wrote my own utility for map to XML.
To convert from XML to map I used org.json.JSONObject: I first convert the XML to JSON using
org.json.XML.toJSONObject(xmlString)
and then convert the JSON to a map easily using the ObjectMapper.
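For reference, the XML-to-map path described above boils down to something like this (just a sketch; xmlString is the XML input, and mapper/TypeReference come from the snippet above):

// XML -> JSON via org.json (this is the step where the order gets scrambled)
org.json.JSONObject jsonFromXml = org.json.XML.toJSONObject(xmlString);

// JSON -> LinkedHashMap via Jackson
LinkedHashMap<String, Object> mapFromXml = mapper.readValue(
        jsonFromXml.toString(),
        new TypeReference<LinkedHashMap<String, Object>>() {});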
But the problem here is that I am losing the insertion order, which is crucial for my data.
How can I convert my data from XML to JSON so that the insertion order is preserved?
That's a superb idea, using LinkedHashMap for a dynamic data structure.
However, JSONObject internally uses a HashMap to build the JSON, so it loses the insertion order:
public JSONObject() {
    // HashMap is used on purpose to ensure that elements are unordered by
    // the specification.
    // JSON tends to be a portable transfer format to allows the container
    // implementations to rearrange their items for a faster element
    // retrieval based on associative access.
    // Therefore, an implementation mustn't rely on the order of the item.
    this.map = new HashMap<String, Object>();
}
So if you can override JSONObject, your problem will be solved.
Enjoy!!!
You don't need to change the jar. You just need to create a new class with the same name as the class inside the jar, along with new copies of the classes that depend on it.
Copy the code from the jar's class into your new class and tweak it,
then access that class from your code by changing the import statement.
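A rough sketch of that idea (my.json is just a hypothetical package name for your tweaked copies of the org.json classes): copy the source of org.json.JSONObject into your own class and change nothing but the constructor, and also copy org.json.XML so that it instantiates your ordered class instead of the original:

package my.json; // hypothetical package holding your copies of the org.json classes

import java.util.LinkedHashMap;
import java.util.Map;

public class JSONObject {
    private final Map<String, Object> map;

    public JSONObject() {
        // LinkedHashMap instead of HashMap, so insertion order is preserved
        this.map = new LinkedHashMap<String, Object>();
    }

    // ... the rest of the copied org.json.JSONObject source stays unchanged ...
}

Then point your imports (and the copied XML class) at my.json.JSONObject instead of org.json.JSONObject.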

Can I use two ciscoconfparse objects in a single Cisco IOS config file to get specific interfaces

I'm trying to use two of the CiscoConfParse methods, find_objects_w_child and
find_objects_wo_child, on a single file.
I need to find the "interfaces" in a Cisco config file that have a specific QoS "service-policy" command configured.
At the same time, those interfaces should not be part of any EtherChannel.
Using "find_objects_w_child" I can get all the 'interface' objects that have the "service-policy" command configured, and
using "find_objects_wo_child" I can get all the 'interface' objects that do not have the "channel-group" command.
Is it even possible to use these two methods on the same config file?
CiscoConfParse objects do not offer a method that allows you to find objects with specific children, but without other specific children. However, we can utilize a list comprehension to accomplish the same task with the IOSCfgLine object's re_search_children() method, as shown below:
from ciscoconfparse import CiscoConfParse
parse = CiscoConfParse("ios_cfg.txt")
phys_intfs_w_qos = [
    obj
    for obj in parse.find_objects_wo_child(r"^interface", "channel-group")
    if obj.re_search_children(r"service-policy")
]
Because re_search_children() returns a list of matching child objects, and a non-empty list is truthy, the above list comprehension will only return IOSCfgLine objects representing interfaces that do not have channel-group configured but do have service-policy configured.

Golang levelDB struct

I'm trying to use the following DB API: https://godoc.org/github.com/syndtr/goleveldb/leveldb#
(a simple file-based key/value DB)
I was able to put and get keys in the database.
However, I'm wondering whether the value can be a struct such as:
type Thm struct {
    Name string
    Age  int
}
Then,
var Tmp Thm
Tmp.Name = "Gon"
Tmp.Age = 33
db.Put([]byte("test3"), []byte(Tmp), nil)
Right now, the error I'm getting is "cannot convert Tmp (type Thm) to type []byte".
If you have experience with LevelDB, could you tell me how this is normally done?
Or should I convert the struct into bytes in order to make this work?
Thank you
levelDB only supports strings/byte arrays as keys and values. This is actually a pretty smart feature, because it keeps serialization of complex data structures at the application level. To serialize your Thm struct you can try the gob package if you don't need applications in other languages to be able to read the values, or protobufs, json, or msgpack if you need the serialized data to be accessible to other languages.

Storing object in Esent persistent dictionary gives: Not supported for SetColumn Parameter error

I am trying to save an object which implements an interface, say IInterface.
private PersistentDictionary<string, IInterface> Object = new PersistentDictionary<string, IInterface>(Environment.CurrentDirectory + @"\Object");
Since many classes implement the same interface (all of which need to be cached), for a generic approach I want to store an object of type IInterface in the dictionary.
That way, anywhere in the code I can pull out that object, cast it as IInterface, and use that object's internal implementation of methods, etc.
But, as soon as the Esent cache is initialized it throws this error:
Not supported for SetColumn
Parameter name: TColumn
Actual value was IInterface.
I have also tried using XmlSerializer to do the same, but it is unable to deserialize an interface type. Also, the [Serializable] attribute cannot be used on top of an interface, so I am stuck.
As a last-ditch attempt I have also tried marking all the implementations (classes) of the interface as [Serializable], but to no avail.
Does anyone know a way out? Thanks in advance!!!
The only reason that only structs are supported (as well as some basic immutable classes such as string) is that the PersistentDictionary is meant to be a drop-in replacement for Dictionary, SortedDictionary and other similar classes.
Suppose I have the following code:
class MyClass
{
    public int val;
}

// ...

var dict = new Dictionary<int, MyClass>();
var x = new MyClass();
x.val = 1;
dict.Add(0, x);
x.val = 2;
var y = dict[0];
Console.WriteLine(y.val);
The output in this case would be 2. But if I'd used the PersistentDictionary instead of the regular one, the output would be 1. The class was created with value 1, and then changed after it was added to the dictionary. Since a class is a reference type, when we retrieve the item from the dictionary, we will also have the changed data.
Since the PersistentDictionary writes the data to disk, it cannot really handle reference types this way. Serializing it, and writing it to disk is essentially the same as treating the object as a value type (an entire copy is made).
Because it's intended to be used instead of the standard dictionaries, and the fact that it cannot handle reference types with complete transparency, the developers instead opted to support only structs, because structs are value types already.
However, if you're aware of this limitation and promise to be careful not to fall into this trap, you can allow it to serialize classes quite easily. Just download the source code and compile your own version of the EsentCollections library. The only change you need to make to it is to change this line:
if (!(type.IsValueType && type.IsSerializable))
to this:
if (!type.IsSerializable)
This will allow classes to be written to the PersistentDictionary as well, provided that it's Serializable, and its members are Serializable as well. A huge benefit is that it will also allow you to store arrays in there this way. All you have to keep in mind is that it's not a real dictionary, therefore when you write an object to it, it will store a copy of the object. Therefore, updating any of your object's members after adding them to the PersistentDictionary will not update the copy in the dictionary automatically as well, you'd need to remember to update it manually.
PersistentDictionary can only store value-structs and a very limited subset of classes (string, Uri, IPAddress). Take a look at ColumnConverter.cs, at private static bool IsSerializable(Type type) for the full restrictions. You'd be hitting the typeinfo.IsValueType() restriction.
By the way, you can also try posting questions about PersistentDictionary at http://managedesent.codeplex.com/discussions .
-martin

groovy domain objects in Db4O database

I'm using db4o with Groovy (actually Griffon). I'm saving dozens of objects into a db4o ObjectSet and see that the .yarv file size is about 11 MB. I've checked its contents and found that it stores the metaClass, with all its nested fields, for every object. That's a waste of space.
I'm looking for a way to avoid storing the metaClass and thereby reduce the size of the resulting .yarv file, since I'm going to use db4o to store millions of entities.
Should I try the callConstructors(true) db4o configuration? Do you think it would help?
Any help would be highly appreciated.
As an alternative, you can just store Groovy-bean instances. Those are compiled down to regular Java-like classes with no special Groovy-specific code attached to them.
Just like this:
class Customer {
    // properties
    Integer id
    String name
    Address address
}

class Address {
    String street
}
def customer = new Customer(id:1, name:"Gromit", address:new Address(street:"Fun"))
I don't know Groovy, but based on your description every Groovy object carries metadata and you want to skip storing those metadata objects.
If that is the case, installing a "null translator" (the TNull class) will cause the "translated" objects not to be stored.
PS: The callConstructors configuration has no effect on what gets stored in the db; it only affects how objects are instantiated when reading from the db.
Hope this helps
