Is there a way to convert a CICS BMS file to an EPIScreenRecord Java class? - cics

Is there a way to convert a CICS BMS file to an EPIScreenRecord Java class, similar to how BMSMapConvert converts a BMS file to xxxMap and xxxScreenHandler Java classes?
Thanks.

Sorry, there is no utility to convert BMS maps to records for use with the EPI JCA resource adapter, as BMSMapConvert only works with the EPI support classes.

Related

How to convert WebAnno Name Entity annotation to use in OpenNLP?

Based on this issue, I need to export in XMI format and use DKPro Core to convert to the Brat format:
https://github.com/webanno/webanno/issues/328
I tried this code, but it did not work:
import org.apache.uima.fit.factory.AnalysisEngineFactory;
import org.apache.uima.fit.factory.CollectionReaderFactory;
import org.apache.uima.fit.pipeline.SimplePipeline;
import de.tudarmstadt.ukp.dkpro.core.io.brat.BratWriter;
import de.tudarmstadt.ukp.dkpro.core.io.xmi.XmiReader;

public void convert() throws Exception {
    SimplePipeline.runPipeline(
        CollectionReaderFactory.createReaderDescription(XmiReader.class,
            XmiReader.PARAM_SOURCE_LOCATION, "/tmp",
            XmiReader.PARAM_PATTERNS, XmiReader.INCLUDE_PREFIX + "*.xmi"),
        AnalysisEngineFactory.createEngineDescription(BratWriter.class,
            BratWriter.PARAM_TARGET_LOCATION, "/tmp"));
}
The dialect of the Brat format that the DKPro Core BratWriter produces may differ from the one OpenNLP expects - the Brat file format is quite flexible.
If you are using the built-in Named Entity layer in WebAnno, then I would propose an alternative route:
Stay with the XMI export
Load the XMI with DKPro Core 1.9.0-SNAPSHOT and feed it to the OpenNlpNamedEntityRecognizerTrainer component
That should avoid the need for the additional conversion step.
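The route above might be sketched roughly like this, assuming DKPro Core 1.9.0-SNAPSHOT on the classpath. The trainer parameter names (PARAM_TARGET_LOCATION, PARAM_LANGUAGE) and the model file name are my assumptions, not something confirmed above - check the component's Javadoc:

```java
import org.apache.uima.fit.factory.AnalysisEngineFactory;
import org.apache.uima.fit.factory.CollectionReaderFactory;
import org.apache.uima.fit.pipeline.SimplePipeline;
import de.tudarmstadt.ukp.dkpro.core.io.xmi.XmiReader;
import de.tudarmstadt.ukp.dkpro.core.opennlp.OpenNlpNamedEntityRecognizerTrainer;

public class TrainNer {
    public static void main(String[] args) throws Exception {
        SimplePipeline.runPipeline(
            // read the XMI files exported from WebAnno
            CollectionReaderFactory.createReaderDescription(XmiReader.class,
                XmiReader.PARAM_SOURCE_LOCATION, "/tmp",
                XmiReader.PARAM_PATTERNS, "*.xmi"),
            // train an OpenNLP NER model directly from the annotations,
            // skipping the Brat conversion entirely
            AnalysisEngineFactory.createEngineDescription(
                OpenNlpNamedEntityRecognizerTrainer.class,
                OpenNlpNamedEntityRecognizerTrainer.PARAM_TARGET_LOCATION, "/tmp/ner-model.bin",
                OpenNlpNamedEntityRecognizerTrainer.PARAM_LANGUAGE, "en"));
    }
}
```

The resulting model file should then be loadable with OpenNLP's own TokenNameFinderModel.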
Disclosure: I am a WebAnno and DKPro Core developer.
Suggestions that didn't work:
Export as CoNLL 2002 in WebAnno
OpenNLP supports training the NER tool directly from CoNLL 2002 files.
=> The Conll02NameSampleStream supports only certain languages and named entity types... sigh

C# class generated from the proto file using custom tool ProtoBufTool does not have the parseFrom and mergeFrom methods

I am using the custom tool 'ProtoBufTool' in Visual Studio to generate the C# class files from the .proto files. However, the generated output class does not have the parseFrom and mergeFrom methods. Am I missing some option in the proto file or in the tool settings? I did not find anything online that would give me any clues to solve this. Also, apart from the messages, my proto file just has option *optimize_for = SPEED;* at the beginning of the file. I don't have any build action on the proto file.
Any help on this will be greatly appreciated.
From the name of the tool, it sounds like you are using protobuf-net. That is just one of several protobuf implementations for C#/.NET, but it is not a direct port and has a different API - instead it tries to be idiomatic .NET first, and a serializer second (for example, you don't even need a .proto - you can use regular POCO types). For example, typical usage might be:
var obj = Serializer.Deserialize<YourType>(inputStream);
If you want an implementation with the same API as the Java (and other) implementations, then protobuf-csharp-port may be more to your liking. It is a more direct port of the Java API.

Launching ATL files using the ATL plugin

I am using the ATL plugin to launch ATL files from a Java class.
Previously, I ran the ATL files using the ATL configuration wizard.
The inputs I gave in the configuration were:
ATL Module: sample.atl
Metamodel UML: sampleprofile.uml
Source Model system: samplemodel.uml
Target: output.uml
After running, the output was correct and the one I wanted.
The problem is that when I use the ATL plugin to launch the ATL files, it only asks me for the name of the ATL file and the name of the metamodel. I don't know where to specify samplemodel.uml, which should also be an input; as a result, the output.uml I am getting is not the one I expect.
Does anyone know how I can specify this second file inside the generated Java class?
Thank you in advance!
You don't need to change the generated Java class. Just import the generated class (for instance Families2Persons) from your Java program and launch the transformation like this:
Families2Persons runner = new Families2Persons();
runner.loadModels("/pathto/samplemodel.uml");
runner.doFamilies2Persons(new NullProgressMonitor());
runner.saveModels("/pathto/output.uml");
If you want, you can also launch the transformation from the command line, passing the two paths as arguments.

Complex JAXB scenario on generation of Java objects

I have a project that does JAXB generation from framework.xsd. This generates a jar containing the XSD, the JAXB objects, and other classes around that.
Then two other groups will extend framework.xsd, using the schema extension mechanism to extend objects defined in framework.xsd in their own sub-XML. They also want to generate JAXB objects, BUT they want their SomeClass.java to extend my Framework.java rather than ending up with a whole new hierarchy.
Is this even possible?
How can I do something like this? The solution would need to:
tell the JAXB compiler that the namespace yy is already generated, so it should not be generated again
tell the JAXB compiler to refer to the classes in the package zzzzzz, or to look at the xjb file from the framework jar file, or something similar
Is this possible?
thanks,
Dean
You want to use an episode file (http://weblogs.java.net/blog/kohsuke/archive/2006/09/separate_compil.html) when generating JAXB classes for your first schema:
$ xjc -episode framework.episode framework.xsd
Then the other group that consumes your framework.jar should:
1) import your schema in their own schema e.g.:
<xsd:import namespace="http://www.myorg.com/framework" schemaLocation="framework.xsd"/>
2) generate their JAXB classes
$ xjc extend.xsd -b framework.episode
(they'll need a copy of your xsd and episode file at xjc time, as well as the framework.jar in the classpath)
Note that according to the blog post above, you can also place the framework.episode file inside your jar (e.g. /META-INF/sun-jaxb.episode for the JAXB RI at least - other JAXB implementations may have other ways of accomplishing the same thing), so that the -b framework.episode option can be omitted. I personally find that a bit impractical, since you still need the XSD anyway.
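For the second group's schema, the extension might look something like this sketch - only the framework namespace comes from the question; the extend namespace and the SomeClass/extraField names are illustrative:

```xml
<!-- extend.xsd: hypothetical extending schema (names are illustrative) -->
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
            xmlns:fw="http://www.myorg.com/framework"
            targetNamespace="http://www.myorg.com/extend">
  <!-- pull in the framework types; xjc maps them to the existing jar via the episode file -->
  <xsd:import namespace="http://www.myorg.com/framework" schemaLocation="framework.xsd"/>
  <xsd:complexType name="SomeClass">
    <xsd:complexContent>
      <!-- SomeClass.java will extend the already-generated Framework.java -->
      <xsd:extension base="fw:Framework">
        <xsd:sequence>
          <xsd:element name="extraField" type="xsd:string"/>
        </xsd:sequence>
      </xsd:extension>
    </xsd:complexContent>
  </xsd:complexType>
</xsd:schema>
```

Compiled with the -b framework.episode option shown above, xjc should generate only the classes for the extend namespace.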

Importing CSV files using Groovy

I have developed a Groovy application. Now it is required that a CSV interface be provided for feeding the DB. That is, I have to load a CSV file, parse it, and insert the records into the DB in a transactional way.
The question is whether there exists for Groovy something like the Ostermiller utils (a configurable CSV parser).
Thanks in advance,
Luis
Kelly Robinson just wrote a nice blog post about the different options available for working with CSV files in Groovy.
Groovy and Java are interoperable. Take a look at the documentation for mixed Java and Groovy applications. Any Java class can easily be used from Groovy with no changes (plus you get the Groovy syntax). If you are interested in the Ostermiller utils for your CSV parsing, you can use them directly from Groovy.
If the Ostermiller library does what you want, you can call it directly from Groovy. Just put the necessary jars in your groovy\lib directory and you should be ready to go.
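If you'd rather avoid the extra dependency, a minimal field splitter is easy to write in plain Java (and therefore callable unchanged from Groovy). This is only a sketch handling quoted fields with embedded commas - a real parser like the Ostermiller utils also covers escaped quotes, newlines inside fields, and configurable separators:

```java
import java.util.ArrayList;
import java.util.List;

public class CsvDemo {
    // Split one CSV line into fields, honoring double-quoted fields
    // so that commas inside quotes do not act as separators.
    static List<String> splitCsvLine(String line) {
        List<String> fields = new ArrayList<>();
        StringBuilder cur = new StringBuilder();
        boolean inQuotes = false;
        for (int i = 0; i < line.length(); i++) {
            char c = line.charAt(i);
            if (c == '"') {
                inQuotes = !inQuotes;        // toggle quoted state, drop the quote itself
            } else if (c == ',' && !inQuotes) {
                fields.add(cur.toString());  // field boundary
                cur.setLength(0);
            } else {
                cur.append(c);
            }
        }
        fields.add(cur.toString());          // last field
        return fields;
    }

    public static void main(String[] args) {
        System.out.println(splitCsvLine("a,\"b,c\",d"));
    }
}
```

For the transactional part, you would wrap the per-record inserts in a single JDBC transaction (setAutoCommit(false), then commit() after the last row, rollback() on failure) - Groovy's groovy.sql.Sql class also offers a withTransaction block for exactly this.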
