Create dynamic object in Go operator controller - apache-spark

I am following the blog below, which explains how to create an operator and import another CR into an existing one.
http://heidloff.net/article/accessing-third-party-custom-resources-go-operators/
Here https://github.com/nheidloff/operator-sample-go/blob/aa9fd15605a54f712e1233423236bd152940f238/operator-application/controllers/application_controller.go#L276 , the spec is created with hardcoded properties.
I want to import the Spark operator types into my operator:
https://github.com/GoogleCloudPlatform/spark-on-k8s-operator/blob/master/pkg/apis/sparkoperator.k8s.io/v1beta2/types.go
This Spark operator has, say, 100+ types/properties. By following the above blog I could create the Go object, but it would be hardcoded. I want to create the object dynamically based on the user-provided values in the CR YAML, e.g. a customer might provide 25 attributes for a Spark app, sometimes 50. I need the object to be built dynamically from the user's YAML. Can anybody please help me out?

If you set the spec type to be a JSON object, the Spec can contain arbitrary JSON/YAML. You don't have to have a strongly typed Spec object; your operator can decode it and do whatever you want with it during your reconcile operation, as long as you can serialize and deserialize it from JSON. You should be able to set it to json.RawMessage, I think?
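For illustration, a minimal sketch of that idea, assuming a hypothetical MyAppSpec type and field name (these are not from the linked blog or the Spark operator):

package v1alpha1

import "encoding/json"

// MyAppSpec is a hypothetical spec type: SparkAppTemplate carries whatever the
// user puts under that key in the CR YAML, without a strongly typed schema.
type MyAppSpec struct {
    // Marker so unknown fields are not pruned by the API server (may be needed
    // depending on your CRD generation setup).
    // +kubebuilder:pruning:PreserveUnknownFields
    SparkAppTemplate json.RawMessage `json:"sparkAppTemplate,omitempty"`
}

// DecodeTemplate unmarshals the raw JSON into a generic map so the reconciler
// can work with only the attributes the user actually provided.
func DecodeTemplate(spec MyAppSpec) (map[string]interface{}, error) {
    var values map[string]interface{}
    if err := json.Unmarshal(spec.SparkAppTemplate, &values); err != nil {
        return nil, err
    }
    return values, nil
}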

What do you mean by hardcoded properties?
If I understood correctly, you want to define an API for a resource which uses both types from an external operator and your own custom types. You can extend your API with specific types from it, such as ScheduledSparkApplicationSpec from the Spark operator's v1beta2 API package. Here is an example API definition in Go:
type MyKindSpec struct {
    // using external third party api (you need to import it)
    SparkAppTemplate v1beta2.ScheduledSparkApplicationSpec `json:"sparkAppTemplate,omitempty"`

    // using kubernetes core api (you need to import it)
    Container v1.Container `json:"container,omitempty"`

    // using custom types
    MyCustomType MyCustomType `json:"myCustomType,omitempty"`
}

type MyCustomType struct {
    FirstField  string `json:"firstField,omitempty"`
    SecondField []int  `json:"secondField,omitempty"`
}

Related

Why does TypeORM advise creating an empty object via `new EntityClass()` and only then setting field values?

I'm using the NestJS (not Next) Node.js framework.
When creating new objects I usually use new ObjectClass({...fieldsValues});
It's great, especially when you use transform pipes from class-transformer.
Besides, this approach is used for entity creation:
https://docs.nestjs.com/techniques/database#separating-entity-definition
But as far as I can see in various TypeORM usage guides,
here: https://typeorm.io/#/
and here: https://orkhan.gitbook.io/typeorm/docs/entities ,
they show creating an empty object first and only then setting the fields with values:
const object = new EntityObject();
object.field = 'value';
Why? Does it make sense?
Does Node.js create a redundant hidden class for properties passed via an object into the entity class constructor? If so, we could just pass comma-separated arguments instead.
I believe it's just because that's how the docs are written. Looking at the code for BaseEntity, it does not look like having a constructor that assigns the fields would be a problem.
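As a minimal sketch (the entity and field names here are made up, not from the question), an optional constructor parameter keeps TypeORM's internal no-argument instantiation working while still allowing the new Entity({...}) style:

import { BaseEntity, Column, Entity, PrimaryGeneratedColumn } from "typeorm";

@Entity()
export class User extends BaseEntity {
  @PrimaryGeneratedColumn()
  id: number;

  @Column()
  name: string;

  // Optional so TypeORM can still call `new User()` internally when hydrating rows.
  constructor(fields?: Partial<User>) {
    super();
    Object.assign(this, fields);
  }
}

// Usage: const user = new User({ name: "Alice" });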

Azure Cosmos DB SQL API - read/query document with unknown data structure

I have a Cosmos DB with a container that contains documents with varying structure.
I am using the Java SQL API for reading the documents from this container.
The issue I am having is that the API methods for querying/reading the container expect a model class as an input parameter and return instances of that model class. Because my container contains documents with varying fields and depth, it is not possible for me to create a model class to represent them.
I need to be able to read/query the documents and then parse it myself and extract the values that I am looking for.
Any ideas? I have used Object in the API methods, e.g. queryItems, and it then returns a LinkedHashMap that I can parse myself. Is this the way to do it? It feels a bit "raw", but I have not found a better way.
Below is a typical example from the SDK docs. I cannot create a "Family" model class in my code, because the structure can vary from document to document, both in which fields are stored and in depth.
private void queryItems() {
    CosmosQueryRequestOptions queryOptions = new CosmosQueryRequestOptions();
    queryOptions.setQueryMetricsEnabled(true);

    CosmosPagedIterable<Family> familiesPagedIterable = container.queryItems(
        "SELECT * FROM Family WHERE Family.lastName IN ('Andersen', 'Wakefield', 'Johnson')",
        queryOptions, Family.class);

    familiesPagedIterable.iterableByPage(10).forEach(cosmosItemPropertiesFeedResponse -> {
        logger.info("Got a page of query result with {} items(s) and request charge of {}",
            cosmosItemPropertiesFeedResponse.getResults().size(),
            cosmosItemPropertiesFeedResponse.getRequestCharge());
        logger.info("Item Ids {}", cosmosItemPropertiesFeedResponse
            .getResults()
            .stream()
            .map(Family::getId)
            .collect(Collectors.toList()));
    });
}
Per my understanding, this is determined by the SDK method's input parameters and output data type; indeed, the sample code for both Java and Spring depends on a data model. So using Object in your code is a reasonable choice given the varying documents.
It's true that we can't design a data model that contains every property in the documents, but I think it's also a good idea to define a model that contains only the properties you actually need; if a property is not used by a query, the query model should exclude it.
I think I found the proper solution:
Create a model class and define the members with unknown depth and structure as JsonNode.
The model class can then be used as usual, and the values inside the JsonNode can be accessed through its convenient accessor methods.
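A minimal sketch of that approach, assuming Jackson's JsonNode and a made-up free-form property called details (the class and field names are illustrative):

import com.fasterxml.jackson.databind.JsonNode;

public class FamilyDocument {
    private String id;
    private String lastName;
    private JsonNode details; // unknown structure and depth

    public String getId() { return id; }
    public void setId(String id) { this.id = id; }

    public String getLastName() { return lastName; }
    public void setLastName(String lastName) { this.lastName = lastName; }

    public JsonNode getDetails() { return details; }
    public void setDetails(JsonNode details) { this.details = details; }
}

// Usage after querying with FamilyDocument.class:
// String city = doc.getDetails().path("address").path("city").asText();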

How to update a Cosmos DB item from a Node JS app

I have a Cosmos Document DB with many items in it and I'd like to update a single item.
I've tried doing it like this:
const { resources: updatedItem } = await container.item(existingType).replace(newItem)
But get this error:
Illegal characters ['/', '\', '?', '#'] cannot be used in resourceId
Upon further research, it's because the _rid is being read from existingType and contains patterns like AAAAA==/
What is the best practice for dealing with this?
According to the API document:
"Read the item's definition. Any provided type, T, is not necessarily enforced by the SDK. You may get more or less properties and it's up to your logic to enforce it. If the type, T, is a class, it won't pass typeof comparisons, because it won't have a match prototype. It's recommended to only use interfaces."
"There is no set schema for JSON items. They may contain any number of custom properties."
You can create an entity class so that you can strip out the system properties before replacing the item, as sketched below.
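For illustration, a minimal sketch with the @azure/cosmos JavaScript SDK; the endpoint, key, database, and container names are placeholders, and it assumes the document's id and partition key value are known:

import { CosmosClient } from "@azure/cosmos";

async function updateItem(id: string, partitionKeyValue: string) {
  const client = new CosmosClient({ endpoint: "<endpoint>", key: "<key>" });
  const container = client.database("<db>").container("<container>");

  // Read the current document, drop the system properties, then replace it.
  const { resource: existing } = await container.item(id, partitionKeyValue).read();
  const { _rid, _self, _etag, _attachments, _ts, ...plain } = existing;
  const newItem = { ...plain, status: "updated" }; // apply your own changes here

  const { resource: updated } = await container.item(id, partitionKeyValue).replace(newItem);
  return updated;
}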

Storing object in Esent persistent dictionary gives: Not supported for SetColumn Parameter error

I am trying to save an object which implements an interface, say IInterface.
private PersistentDictionary<string, IInterface> Object = new PersistentDictionary<string, IInterface>(Environment.CurrentDirectory + @"\Object");
Since many classes implement the same interface (all of which need to be cached), I want a generic approach: store an object of type IInterface in the dictionary,
so that anywhere I can pull that object out, cast it to IInterface, and use the object's own implementation of the methods, etc.
But, as soon as the Esent cache is initialized it throws this error:
Not supported for SetColumn
Parameter name: TColumn
Actual value was IInterface.
I have tried using XmlSerializer to do the same, but it is unable to deserialize an interface type. Also, the [Serializable] attribute cannot be applied to an interface, so I am stuck.
I have also tried marking all the implementations (classes) of the interface as [Serializable] as a last-ditch attempt, but to no avail.
Does anyone know a way out? Thanks in advance!
The only reason that only structs are supported (as well as some basic immutable classes such as string) is that the PersistentDictionary is meant to be a drop-in replacement for Dictionary, SortedDictionary and other similar classes.
Suppose I have the following code:
class MyClass
{
    int val;
}

// ...

var dict = new Dictionary<int, MyClass>();
var x = new MyClass();
x.val = 1;
dict.Add(0, x);
x.val = 2;
var y = dict[0];
Console.WriteLine(y.val);
The output in this case would be 2. But if I'd used the PersistentDictionary instead of the regular one, the output would be 1. The class was created with value 1, and then changed after it was added to the dictionary. Since a class is a reference type, when we retrieve the item from the dictionary, we will also have the changed data.
Since the PersistentDictionary writes the data to disk, it cannot really handle reference types this way. Serializing it, and writing it to disk is essentially the same as treating the object as a value type (an entire copy is made).
Because it's intended to be used instead of the standard dictionaries, and the fact that it cannot handle reference types with complete transparency, the developers instead opted to support only structs, because structs are value types already.
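For illustration, a small sketch of what does work out of the box under those restrictions; the struct and field names are made up, and it assumes a value type containing only basic serializable members:

using System;
using Microsoft.Isam.Esent.Collections.Generic;

[Serializable]
struct CacheEntry
{
    public int Id;
    public string Name;
}

class Program
{
    static void Main()
    {
        // A serializable struct value is copied into the store when added;
        // later changes to a local variable do not affect the stored copy.
        using (var dict = new PersistentDictionary<string, CacheEntry>("CacheDir"))
        {
            dict["a"] = new CacheEntry { Id = 1, Name = "first" };
            Console.WriteLine(dict["a"].Name);
        }
    }
}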
However, if you're aware of this limitation and promise to be careful not to fall into this trap, you can allow it to serialize classes quite easily. Just download the source code and compile your own version of the EsentCollections library. The only change you need to make to it is to change this line:
if (!(type.IsValueType && type.IsSerializable))
to this:
if (!type.IsSerializable)
This will allow classes to be written to the PersistentDictionary as well, provided that it's Serializable, and its members are Serializable as well. A huge benefit is that it will also allow you to store arrays in there this way. All you have to keep in mind is that it's not a real dictionary, therefore when you write an object to it, it will store a copy of the object. Therefore, updating any of your object's members after adding them to the PersistentDictionary will not update the copy in the dictionary automatically as well, you'd need to remember to update it manually.
PersistentDictionary can only store value-structs and a very limited subset of classes (string, Uri, IPAddress). Take a look at ColumnConverter.cs, at private static bool IsSerializable(Type type) for the full restrictions. You'd be hitting the typeinfo.IsValueType() restriction.
By the way, you can also try posting questions about PersistentDictionary at http://managedesent.codeplex.com/discussions .
-martin

How to auto-generate early bound properties for Entity specific (ie Local) Option Set text values?

After spending a year working with the Microsoft.Xrm.Sdk namespace, I just discovered yesterday that the Entity.FormattedValues property contains the text value for entity-specific (i.e. local) option set texts.
The reason I didn't discover it before is that there is no early-bound way of getting the value, i.e. entity.new_myOptionSet is of type OptionSetValue, which only contains the int value. You have to call entity.FormattedValues["new_myoptionset"] to get the string text value of the OptionSetValue.
Therefore, I'd like to get CrmSvcUtil to auto-generate a text property for local option sets, i.e. along with Entity.new_myOptionSet being generated as it currently is, Entity.new_myOptionSetText would be generated as well.
I've looked into the Microsoft.Crm.Services.Utility.ICodeGenerationService, but that looks like it is mostly for specifying what CodeGenerationType something should be...
Is there a supported way using CrmSvcUtil to add these properties, or am I better off writing a custom app that I can run to generate these properties as a partial class alongside the auto-generated ones?
Edit - Example of the code that I would like to be generated
Currently, whenever I need to access the text value of a OptionSetValue, I use this code:
var textValue = OptionSetCache.GetText(service, entity, e => e.New_MyOptionSet);
The option set cache will use the entity.LogicalName and the property expression to determine the name of the option set I'm asking for. It will then query the SDK using a RetrieveAttributeRequest to get the list of the option set's int and text values, which it caches so it doesn't have to hit CRM again. It then looks up the int value of the entity's New_MyOptionSet and cross-references it with the cached list to get the text value of the option set.
Instead of doing all of that, I can just do this (assuming that the entity has been retrieved from the server, and not just populated client side):
var textValue = entity.FormattedValues["new_myoptionset"];
but the "new_myoptionset" is no longer early bound. I would like the generated early-bound entity classes to also include an extra "Text" property for OptionSetValue properties that calls the line above, so my entity would have this added to it:
public string New_MyOptionSetText
{
    get
    {
        // GetFormattedAttributeValue is a protected method on the Entity class itself
        return this.GetFormattedAttributeValue("new_myoptionset");
    }
}
Could you utilize the CrmSvcUtil extension that generates enums for your option sets, and then add your New_MyOptionSetText property to a partial class that compares the int value to the enums and returns the enum name as a string?
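A rough sketch of that suggestion (the entity name, option set, and generated enum name are hypothetical, and it assumes the enum's underlying values match the option set's int values):

// Partial class sitting alongside the generated early-bound MyEntity class.
public partial class MyEntity
{
    public string New_MyOptionSetText
    {
        get
        {
            if (this.New_MyOptionSet == null)
            {
                return null;
            }
            // Cast the option set's int value to the generated enum and return its name.
            return ((New_MyOptionSetEnum)this.New_MyOptionSet.Value).ToString();
        }
    }
}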
Again, I think specifically for this case, getting CrmSvcUtil.exe to generate the code you want is a great idea, but more generally you can access the property name via reflection, using an approach similar to the accepted answer to "workarounds for nameof() operator in C#: typesafe databinding".
var textValue = entity.FormattedValues["new_myoptionset"];
// becomes
var textValue = entity.FormattedValues
[
    // renamed the class from Nameof to NameOf
    NameOf(Xrm.MyEntity).Property(x => x.new_MyOptionSet).ToLower()
];
The latest version of the CRM Early Bound Generator includes a Fields struct that contains the field names. This allows accessing the FormattedValues to be as simple as this:
var textValue = entity.FormattedValues[MyEntity.Fields.new_MyOptionSet];
You could create a new property via an interface for the CrmSvcUtil, but that's a lot of work for a fairly simple call, and I don't think it justifies creating additional properties.
