Hyperledger Composer modeling language: how to import concepts into another model file - hyperledger-fabric

In my project, I have two asset namespaces:
namespace org.example.grid
namespace org.example.workload
Both of them use an abstract structure called Metrics, so I want to define a single concept in a separate file and have both assets use it.
So I made a file like this:
namespace org.example.concepts

concept Metrics {
  o Integer metric1
  o Integer metric2
  o Integer metric3
}
Then I try to import the Metrics concept into the asset like so:
namespace org.example.grid

import org.example.concepts.Metrics

asset Grid identified by gridId {
  o String gridId
  o Metrics capacity
}
However, when trying to create a new Grid asset, I get this error:
Error: transaction returned with failure: TypeNotFoundException: Type Metrics is not defined in namespace org.example.grid
Are concept imports not supported? Or is there a proper way to do this?

As per my understanding, I ran your code and it successfully gave me an output:
1) First model file: org.example.concepts
2) Second model file: org.example.workload
3) Third model file: org.example.grid, which contains the Grid asset and imports the Metrics concept from the org.example.concepts file.
4) The Grid asset was created successfully.
So concept imports are supported; hopefully you will find the error in your structure. :)

Related

create dynamic object in GO operator controller

I am following the blog below, which explains how to create an operator and import another CR into an existing one.
http://heidloff.net/article/accessing-third-party-custom-resources-go-operators/
Here, https://github.com/nheidloff/operator-sample-go/blob/aa9fd15605a54f712e1233423236bd152940f238/operator-application/controllers/application_controller.go#L276, the spec is created with hardcoded properties.
I want to import the Spark operator types into my operator:
https://github.com/GoogleCloudPlatform/spark-on-k8s-operator/blob/master/pkg/apis/sparkoperator.k8s.io/v1beta2/types.go
This Spark operator has 100+ types/properties. By following the above blog, I could create the Go object, but it would be hardcoded. I want to create a dynamic object based on the values the user provides in the CR YAML. For example, a customer might provide 25 attributes for a Spark app, sometimes 50. I need the object built dynamically from the user's YAML. Can anybody please help me out?
If you set the spec type to be a JSON object, the Spec can contain arbitrary JSON/YAML. You don't have to have a strongly typed Spec object; your operator can decode it and do whatever you want with it during your reconcile operation, as long as you can serialize and deserialize it from JSON. You should be able to set it to json.RawMessage, I think.
What do you mean by hardcoded properties?
If I understood it correctly, you want to define an API for a resource which uses both types from an external operator and your own custom types. You can extend your API using types from the external operator for specific properties, such as ScheduledSparkApplicationSpec from the linked repository. Here is an example API definition in Go:
type MyKindSpec struct {
    // using an external third-party API (you need to import it)
    SparkAppTemplate v1beta2.ScheduledSparkApplicationSpec `json:"sparkAppTemplate,omitempty"`
    // using the Kubernetes core API (you need to import it)
    Container v1.Container `json:"container,omitempty"`
    // using custom types
    MyCustomType MyCustomType `json:"myCustomType,omitempty"`
}

type MyCustomType struct {
    FirstField  string `json:"firstField,omitempty"`
    SecondField []int  `json:"secondField,omitempty"`
}

Azure Cosmos DB SQL API - read/query document with unknown data structure

I have a Cosmos DB with a container that contains documents with varying structure.
I am using the Java SQL API for reading the documents from this container.
The issue I am having is that the API methods for querying/reading the container expect a model class as an input parameter and return instances of that model class. Because my container holds documents with varying fields and depth, it is not possible for me to create one model class to represent them all.
I need to be able to read/query the documents and then parse them myself, extracting the values I am looking for.
Any ideas? I have used Object in the API methods, e.g. in queryItems, and it then returns a LinkedHashMap that I can parse myself. Is this the way to do it? It looks a bit "raw", but I have not found a better way.
Below is a typical example from the SDK docs. I cannot create a "Family" model class in my code, because the structure can vary from document to document - both in which fields are stored and in their depth.
private void queryItems() {
    CosmosQueryRequestOptions queryOptions = new CosmosQueryRequestOptions();
    queryOptions.setQueryMetricsEnabled(true);

    CosmosPagedIterable<Family> familiesPagedIterable = container.queryItems(
        "SELECT * FROM Family WHERE Family.lastName IN ('Andersen', 'Wakefield', 'Johnson')",
        queryOptions, Family.class);

    familiesPagedIterable.iterableByPage(10).forEach(cosmosItemPropertiesFeedResponse -> {
        logger.info("Got a page of query result with {} items(s) and request charge of {}",
            cosmosItemPropertiesFeedResponse.getResults().size(),
            cosmosItemPropertiesFeedResponse.getRequestCharge());
        logger.info("Item Ids {}", cosmosItemPropertiesFeedResponse
            .getResults()
            .stream()
            .map(Family::getId)
            .collect(Collectors.toList()));
    });
}
Per my understanding, this is determined by the SDK functions' input parameters and output data types, and indeed both the Java and the Spring sample code depend on a data model. So given your varying documents, using Object in your code is a good choice.
It's true that we can't design one data model that contains every property found in the documents, but it can still be a good idea to define a model containing exactly the properties a query requires; if a property is useless for a query, that query's model should exclude it.
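For illustration, here is a minimal sketch of that Object-based approach (the method name is made up, and an initialized CosmosContainer is assumed). With Object as the target type, each document comes back as the LinkedHashMap mentioned in the question, which can be walked generically:

import java.util.Map;
import com.azure.cosmos.CosmosContainer;
import com.azure.cosmos.models.CosmosQueryRequestOptions;
import com.azure.cosmos.util.CosmosPagedIterable;

public class UntypedQuery {
    // 'container' is assumed to be an already initialized CosmosContainer.
    static void dumpIds(CosmosContainer container) {
        CosmosPagedIterable<Object> results = container.queryItems(
                "SELECT * FROM c", new CosmosQueryRequestOptions(), Object.class);

        results.forEach(item -> {
            // Each untyped JSON document is deserialized into a LinkedHashMap,
            // so its fields can be inspected without a model class.
            @SuppressWarnings("unchecked")
            Map<String, Object> doc = (Map<String, Object>) item;
            System.out.println(doc.get("id"));
        });
    }
}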
I think I found the proper solution:
Create a model class, and define the members with unknown depth and structure as JsonNode.
The model class can then be used as usual, and the values inside the JsonNode members can be accessed through its convenient methods.
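For illustration, a minimal sketch of such a model class (the class and field names are made up): the fields every document shares are typed normally, while the varying part is declared as Jackson's JsonNode, which can hold any structure and depth.

import com.fasterxml.jackson.databind.JsonNode;

public class FlexibleDocument {
    private String id;        // field every document is known to have
    private JsonNode payload; // part whose fields and depth vary

    public String getId() { return id; }
    public void setId(String id) { this.id = id; }
    public JsonNode getPayload() { return payload; }
    public void setPayload(JsonNode payload) { this.payload = payload; }
}

Querying with container.queryItems(query, options, FlexibleDocument.class) then returns objects whose payload can be navigated with JsonNode methods such as path(), get() and asText().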

How to add a specific generation phase that would do Model-To-MS Word?

Assume that I have developed a set of behaviours in MPS that allow me to convert an instance of a WordDocument concept (and its children), which describes a word-processor document, into an MS Word document using POI, and that I have implemented an action in an MPS plugin that generates the desired MS Word document via a right click on my root node.
I would like to add this as a phase in the generation process, so that after the model-to-model phases, MPS performs model-to-MS-Word generation instead of model-to-text.
Is MPS customizable in that way, and what would be the set of concepts to use?
When you run make or rebuild on models in MPS, MPS starts a so-called MakeSession; in this session MPS executes multiple steps. One step in the make session is, for instance, "generate", which runs the model-to-model transformation, and another is "textgen", which writes the resulting model of the generate step to disk by executing the textgen definitions of the languages.
These individual steps are called "facets". You can contribute your own facets to the overall make process. To do so, you need to create a plugin aspect in your language and then create a facet in there. In the facet you can declare its dependencies and priorities. In your case you want to run before textgen but after generation, so that you can access the result of the generation.
Facets can state their input data in a declarative way. In your case you need the GResource, which represents the output of the generator facet. You can then access the model(s) on it and run your POI code on them.
A minimal example would look like this:
facet RunPoi extends <none> {
  Required: Generate, TextGen
  <no optional facets>
  Targets:
    target genWord overrides <none> weight default {
      resources policy: transform GResource -> <no output>
      Dependencies:
        after generate
        before textGen
        before textGenToMemory
      <no properties>
      <no queries>
      <no config>
      (progressMonitor, input)->void {
        foreach resource in input {
          SModel mdl = resource.model;
          // run POI code with mdl
        }
      }
    }
}

How to compare 2 XSD schema files for equivalent functionality

I would like to compare two XSD schemas, A and B, to determine that all instance documents valid against schema A would also be valid against schema B. I hope to use this to prove that even though schemas A and B are "different", they are effectively the same. An example of a difference this should not trigger: schema A uses named types while schema B declares all of its elements inline.
I have found lots of people talking about "smart" diff-type tools, but these would claim the two files are different because they have different text, even when the resulting structure is the same. I found some references to XSOM, but I'm not sure if that will help or not.
Any thoughts on how to proceed?
Membrane SOA Model - Java API for WSDL and XML Schema
package sample.schema;

import java.util.List;
import com.predic8.schema.Schema;
import com.predic8.schema.SchemaParser;
import com.predic8.schema.diff.SchemaDiffGenerator;
import com.predic8.soamodel.Difference;

public class CompareSchema {

    public static void main(String[] args) {
        compare();
    }

    private static void compare() {
        SchemaParser parser = new SchemaParser();
        Schema schema1 = parser.parse("resources/diff/1/common.xsd");
        Schema schema2 = parser.parse("resources/diff/2/common.xsd");
        SchemaDiffGenerator diffGen = new SchemaDiffGenerator(schema1, schema2);
        List<Difference> lst = diffGen.compare();
        for (Difference diff : lst) {
            dumpDiff(diff, "");
        }
    }

    private static void dumpDiff(Difference diff, String level) {
        System.out.println(level + diff.getDescription());
        for (Difference localDiff : diff.getDiffs()) {
            dumpDiff(localDiff, level + "  ");
        }
    }
}
After executing it you get the output shown below: a list of the differences between the two schema documents.
ComplexType PersonType has changed:
  Sequence has changed:
    Element id has changed:
      The type of element id has changed from xsd:string to tns:IdentifierType.
http://www.service-repository.com/ offers an online XML Schema Version Comparator tool that displays a report of the differences between two XSDs; it appears to be built on the Membrane SOA Model.
My approach to this was to canonicalize the representation of the XML Schemas.
Unfortunately, I can also tell you that, unlike canonicalization of XML documents (used, for example, to calculate a digital signature), it is neither simple nor standardized.
So basically, you have to transform both XML Schemas to a "canonical form" - whatever the tool you build or use considers that form to be - and then do the compare.
My approach was to create an XML Schema set (possibly more than one file if you have multiple namespaces) for each root element I needed, since I found it easier to compare XSDs authored in the Russian Doll style, starting from the PSVI model.
I then used options such as automatically matching substitution group members coupled with replacement of substitution groups by a choice; removal of "superfluous" XML Schema sequences; collapsing of single-option choices; and moving minOccurs/maxOccurs around for single-item compositors.
Depending on the features of the XSD-aware comparison tool you use or build, you might also have to rearrange particles under compositors such as xsd:choice or xsd:all.
Anyway, what I learned from all of this was that it is extremely hard to build a tool that works nicely for all the "cool" XSD features out there... One test case I remember fondly was dealing with various kinds of xsd:any content.
I do wonder though if things have changed since...
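To make the starting point concrete, here is a minimal, hypothetical sketch that uses XSOM (mentioned in the question) to load two schemas into an object model and compare just their sets of global element declarations - a first, very partial step toward the canonical comparison described above. The file names are placeholders, and real equivalence checking would have to recurse into types, compositors, and so on.

import java.io.File;
import java.util.Set;
import java.util.TreeSet;
import javax.xml.parsers.SAXParserFactory;
import com.sun.xml.xsom.XSSchemaSet;
import com.sun.xml.xsom.parser.XSOMParser;

public class SchemaModelCompare {

    static XSSchemaSet load(String path) throws Exception {
        XSOMParser parser = new XSOMParser(SAXParserFactory.newInstance());
        parser.parse(new File(path));
        return parser.getResult();
    }

    static Set<String> globalElementNames(XSSchemaSet set) {
        // Collect only the names of global element declarations; this is
        // deliberately shallow and serves just as a starting point.
        Set<String> names = new TreeSet<>();
        set.iterateElementDecls().forEachRemaining(e -> names.add(e.getName()));
        return names;
    }

    public static void main(String[] args) throws Exception {
        XSSchemaSet a = load("schemaA.xsd");
        XSSchemaSet b = load("schemaB.xsd");
        System.out.println("Same global elements: "
                + globalElementNames(a).equals(globalElementNames(b)));
    }
}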

WCF DataService EF entities not found

I have an EDMX model with a generated context.
I generated a Self-Tracking Entities library in a separate project and referenced it from the EDMX model.
I also set the namespace in the context to the same namespace as the entities.
Everything works, except when I try to create a WCF Data Service with this context. Creating a new ObjectContext and working with it directly works fine, but with the context + model lib and the entities lib referenced, I get the following error when loading the service:
The server encountered an error processing the request. The exception message is 'Value cannot be null. Parameter name: key'. See server logs for more details. The exception stack trace is:
I found that this can happen when using a Data Service with an external entity lib, and that the fix is to override the context creation with this code:
System.Data.Metadata.Edm.ItemCollection itemCollection;
if (!context.MetadataWorkspace.TryGetItemCollection(
        System.Data.Metadata.Edm.DataSpace.CSSpace, out itemCollection))
{
    var tracestring = context.CreateQuery<ClientDataStoreContainer>("ClientDataStoreContainer.DataSet").ToTraceString();
}
return context;
Now that error is gone, but I get the next one:
Object mapping could not be found for Type with identity 'ClientDataStoreEntities.Data'.
This error occurs on the .ToTraceString() call in the context-creation override.
The mapping file defines the type:
<EntitySetMapping Name="DataSet">
  <EntityTypeMapping TypeName="IsTypeOf(ClientDataStoreEntities.Data)">
So it has to load the ClientDataStoreEntities.Data type, which is the namespace and type of the STE library that I generated from the model.
EDIT: With
var tracestring = context.CreateQuery<Data>("ClientDataStoreContainer.DataSet").ToTraceString();
it does seem to load all the types. However, the service still does not expose any entity sets. There should be two, DataSet and PublishedDataSet, but this is all I get:
<service xml:base="http://localhost:1377/WcfDataService1.svc/">
  <workspace>
    <atom:title>Default</atom:title>
  </workspace>
</service>
I ran into the same issue (the first one you mention). I worked around it using the suggestion by Julie Lerman in the thread below. The other suggestions didn't work for me, although I will experiment with them more, since Julie's solution may have performance implications: it is executed (and has some cost) for every query.
MSDN: Fail to work with POCO ModelContainer which entities are located in other assembly
Edit: Sorry, just realized you utilized the other solution mentioned in this thread.
