Does xs:key/xs:keyref work with JAXB?

I have an XSD file from which lots of classes are generated (for unmarshalling). Up to now, there was one relationship between two of the generated classes, modelled using ID and IDREF. This works perfectly well.
I now have to change the app: there are two independent relationships, and ID/IDREF doesn't work anymore.
I guess I have to use xs:key/xs:keyref.
But everything I have found on JAXB2 suggests that unmarshalling with xs:key/xs:keyref only validates and doesn't generate the relationship in the Java code.
Is this still true?
Regards, Claus
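For context, the two modelling styles the question contrasts look roughly like this (a sketch with hypothetical element names). JAXB maps xs:ID/xs:IDREF to @XmlID/@XmlIDREF fields, producing real object references after unmarshalling; xs:key/xs:keyref, by contrast, is scoped to the declaring element, which is what makes independent relationships possible, but XJC appears to treat it as validation-only:

```xml
<!-- ID/IDREF: one flat, document-wide ID space -->
<xs:attribute name="id"  type="xs:ID"/>
<xs:attribute name="ref" type="xs:IDREF"/>

<!-- key/keyref: scoped to the element that declares them, so two
     independent relationships can each define their own key -->
<xs:element name="company">
  <!-- content model omitted -->
  <xs:key name="employeeKey">
    <xs:selector xpath="employee"/>
    <xs:field xpath="@id"/>
  </xs:key>
  <xs:keyref name="managerRef" refer="employeeKey">
    <xs:selector xpath="manager"/>
    <xs:field xpath="@worksFor"/>
  </xs:keyref>
</xs:element>
```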


Why is DITA's schema split into topics and maps?

I'm new to DITA, so I apologize for any ignorance.
I'm using XJC to compile the base (and only the base) DITA 1.3 schema into Java classes. When I attempted to compile all the XSD files, I received errors about elements and groups being redefined. None of the XJC bindings I tried to write would fix it.
After digging through the schema, I found that mapGrp.xsd/mapMod.xsd and topicGrp.xsd/topicMod.xsd contain the same group and element definitions. This explains why XJC fails when all of the XSD files are included: the XSD parser itself cannot handle these duplicate entries.
So I generated basemap.xsd and basetopic.xsd separately and cleaned up the generated code so I could run a diff against the two directories.
I found that the two schemas have some elements specific to maps and topics. For example, the map schema has DitavalmetaClass and DvrKeyscopePrefixClass while the topic schema doesn't, and the topic schema contains AbstractClass and BodyClass while the map schema doesn't. But the majority of classes are shared between the two schemas.
As for the shared classes, only three differ between the two schemas (LinktextClass, MetadataClass, and SearchtitleClass). Even then, the changes aren't big, just some differences in what they can contain.
My question is: why couldn't the shared classes go under one common Grp/Mod schema that's shared between topics and maps, with those three classes redefined? Can I change the two schemas so they share the same elements and groups without breaking any of the other schemas that extend the base schema?
Maps and topics are two distinct document types. They share some element types in common but are otherwise completely independent document types.
Note also that DITA does not have "a single grammar" in the way that other XML applications do (or appear to do).
DITA is explicitly architected to allow controlled extension from the base grammars so that you can do any of the following:
Configure a given map or topic type to include or exclude specific elements, either other topic types (in the case of topics) or specific element "domains" (sets of "mix-in" elements). Thus there can be two different working grammars for "topic" documents that allow different sets of elements.
Add "constraint" modules that restrict existing content models or attribute lists in some way (for example, disallowing a base element type in a specific context).
Define your own new element types and attributes via "specialization".
Thus any attempt to generate things like Java classes or database schemas for DITA in any sort of static way is doomed to fail in the general case.
If you are implementing code that needs to operate on any conforming DITA document then it needs to be more flexible and operate on elements in terms of their #class values, not their tag names.
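Operating on #class values rather than tag names can be sketched like this (a minimal illustration; the element and token values are examples, not taken from a real document):

```java
// Sketch: match DITA elements by @class ancestry token rather than tag name.
// A DITA @class value is a space-separated token list like "- topic/li task/step ".
public class DitaClassMatcher {
    /** Returns true if the @class attribute value contains the given module/name token. */
    static boolean hasClassToken(String classAttr, String token) {
        if (classAttr == null) {
            return false;
        }
        for (String part : classAttr.trim().split("\\s+")) {
            if (part.equals(token)) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        // A specialized element still carries its base ancestry in @class,
        // so code keyed on "task/step" also matches specializations of it.
        String cls = "- topic/li task/step ";
        System.out.println(hasClassToken(cls, "task/step"));    // true
        System.out.println(hasClassToken(cls, "map/topicref")); // false
    }
}
```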
If it is sensible for you to generate Java classes from the XSDs, you must treat each top-level map and topic type (map, bookmap, subjectScheme, learningMap, topic, concept, task, general-task, reference, glossentry, etc.) as a distinct class hierarchy--you cannot combine them into a single hierarchy because they will have different content model rules for the same element types.
You should definitely read the DITA Architecture specification:
http://docs.oasis-open.org/dita/v1.2/os/spec/architectural_specification.html#architectural_specification
Cheers,
Eliot
The DITA 1.3 XML Schemas are no longer put together manually. They are generated automatically from the Relax NG schemas, which are the officially supported schemas per the specification. You might also take a look at the DITA 1.2 XML Schemas; those were written manually, and they may be better modelled to reuse more element definitions.

Jaxb Generates Objects for Unused Elements from Imported Schema

I have several schemas that inherit one or more elements from a collection of 'common' schemas. In this particular instance, I'm importing one of these schemas to make use of a single complex type defined in it.
When I generate the java objects from the schema, I get my schema types, and the element I referenced as expected, however I also get objects generated for the 30+ other types from the common schema.
I want to use the common schema, because I want to rely on automated builds for updating my schema when the common schema changes, but I do not want the extra java classes generated.
Suggestions?
There's no out-of-the-box approach to achieve what you want. The reason I am offering an opinion here is rather to point out (maybe for others) some issues one needs to take into account no matter which route one goes.
The 'extra' label is not always straightforward. Substitution group members are interesting: in Java, think of a class (A) using an interface (I), and a class (B) implementing (I). Some would say there's no dependency between A and B, while others would require B in the distribution. If you replace (I) with a concrete class, things become even less clear -- consider that the substitution group head doesn't need to be abstract, or that the type of the substitution group head may be anyType (Object in Java).
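The A/I/B analogy can be made concrete with a tiny sketch (hypothetical names, mirroring the text above):

```java
// Sketch of the dependency question: does A "depend on" B?
interface I {}             // substitution group head (abstract case)

class B implements I {}    // substitution group member

class A {                  // schema component that references the head
    I value;               // statically, A only depends on I...
    A(I value) {
        this.value = value;
    }
}

public class SubstitutionAnalogy {
    public static void main(String[] args) {
        // ...but at runtime the only available I may be a B, so a
        // distribution that drops B as "extra" can be unusable in practice.
        A a = new A(new B());
        System.out.println(a.value instanceof B); // true
    }
}
```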
More so, if the XML processing was designed to accommodate xsi:type then it is even harder to tell (by looking at the schema) what is expected to work where.
Tools such as QTAssistant (I am associated with it) have a default setting that pulls in all strict dependencies (A and I above), and then either ALL that might work (B above) or nothing else. For anything in between, the user needs to define manually what goes into the release. This is called automatic XSD refactoring and could be used easily in your scenario.

CoreData: Mogenerator vs. Categories

I have recently inherited a CoreData project from an outside developer. Where I expected to find all my auto-generated NSManagedObject sub-classes, I instead have (what some googling reveals to be) classes generated by Mogenerator, a tool that I have no experience with.
I understand that the purpose of this tool is to let you add custom code to the classes corresponding to the Core Data entities without worrying about it being lost when the model changes and the classes are regenerated... but I can do this anyway by using categories.
I currently do not see a real advantage to using Mogenerator over categories.
What are the advantages/disadvantages of using Mogenerator vs. categories? Does Mogenerator serve any additional purposes?
An advantage of using subclasses vs. categories is that you can extend functionality by subclassing and overriding.
For instance, if your model has sub-entities, they can inherit functionality from a common master class, and subclasses can define specific behavior by overriding the desired methods. It is technically possible to override methods defined in categories, but it is not recommended, which means logic implemented in categories would have to be repeated in every subclass.
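The override point is the key difference. In language-neutral terms (sketched here in Java, with hypothetical entity names):

```java
// Sketch: a common master class gives each sub-entity an override point.
class ManagedObject {
    String displayName() {          // overridable hook shared by all entities
        return "generic";
    }
}

class Invoice extends ManagedObject {
    @Override
    String displayName() {          // entity-specific behavior via overriding
        return "Invoice";
    }
}

// A category, by contrast, bolts methods onto one class directly -- there is
// no per-subclass override point, so category-style shared logic would have
// to be duplicated in every entity that needs a variation.
public class OverrideDemo {
    public static void main(String[] args) {
        ManagedObject m = new Invoice();
        System.out.println(m.displayName()); // Invoice
    }
}
```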
A lot of code in managed objects is boilerplate, so it's really nice to have mogenerator do it automatically.
From their site, http://rentzsch.github.com/mogenerator/ :
mogenerator generates Objective-C code for your Core Data custom classes. Unlike Xcode, mogenerator manages two classes per entity: one for machines, one for humans. The machine class can always be overwritten to match the data model, with humans' work effortlessly preserved.
So basically it has nothing to do with categories. Mogenerator (Model Object Generator) generates code, whose results you've seen in the project that was handed over to you.

Castor Generated Classes (XML Marshalling) - XSD Unavailable

I recently moved to a project where I noticed there have a specific requirement to store some data as XML.
The prior team used Castor generated classes to Marshall and Unmarshall the data.
I have a new requirement now that requires me to add some additional (yet optional) fields to this XML. However I realized the prior team supposedly never checked in the XSD at all and I have no way to reach out to them.
The XSD was certainly large and complex, since it is responsible for generating around 50 classes. So writing the XSD again would be error-prone, with the risk that I might end up creating XMLs that are incompatible with the old XML.
The other alternative I thought of was using a tool like XMLSpy to reverse-engineer the XSD from the XML. However, that sounds a bit difficult too, since I would need to reverse-engineer 20-odd XMLs to generate XSDs and then merge all these XSDs into one, because the XML had several optional sections. There is still room for error in this approach.
The best option I can think of is reverse-engineering the classes to an XSD -- however, Castor supposedly does not support this feature, so I don't have the means to convert these Castor-generated classes back to an XSD. While the classes generated by Castor do have some Castor-specific methods, in essence they are POJOs if the Castor-specific methods are ignored.
Do we have any suggestions here for getting or generating the XSD from java classes? Do we have any other suggestions to solve the issues I discussed?
Thank you.
Just an update, while I have not achieved 100% of what I was looking for, I was able to successfully reverse engineer the XSD using JAXB's schemagen tool.
Just note that Castor generates an XXXDescriptor class alongside each class; since these do NOT map to the actual XSD, do not pass the XXXDescriptor classes as input to the schemagen tool.
The schemagen tool works off the getter methods and ignores methods like Castor's validate, marshal, and unmarshal.
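Selecting the schemagen inputs can be sketched as a simple filename filter (a minimal sketch; the file names are hypothetical, and the surviving list would then be passed on the schemagen command line):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: collect Castor-generated sources for schemagen, skipping the
// XXXDescriptor companions that do not correspond to anything in the XSD.
public class SchemagenInputFilter {
    static List<String> filterSources(List<String> fileNames) {
        List<String> keep = new ArrayList<>();
        for (String name : fileNames) {
            if (!name.endsWith("Descriptor.java")) {
                keep.add(name);
            }
        }
        return keep;
    }

    public static void main(String[] args) {
        List<String> all = List.of("Order.java", "OrderDescriptor.java", "Item.java");
        System.out.println(filterSources(all)); // [Order.java, Item.java]
    }
}
```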
So things look quite hopeful at this point, compared to the situation I was in when I first posted the question.
Thanks.

Repository Unit of work with ObjectContext

I am learning about the Unit of Work pattern by searching Google,
but most of the examples I find are about Unit of Work with DbContext.
So could anyone please give me an example of Unit of Work with ObjectContext?
[Just my thinking; may not be correct.]
I don't like the code-first pattern because it requires me to write the property classes (with get/set methods) myself.
But by using an edmx file with ObjectContext, I don't need to create the property classes by hand, which takes time.
There are plenty of articles on this.
Entity Framework POCO (EF4): Generic Repository and Unit of Work
Prototype
Entity Framework 4 POCO, Repository and Specification Pattern
These abstractions may not fit your problem exactly, so understand the concepts behind Entity Framework before building abstractions on top of it.
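The core idea behind the pattern (independent of DbContext vs. ObjectContext) is that the context tracks changes and a single commit flushes them together. A minimal language-neutral sketch, written here in Java with hypothetical names (SaveChanges() is the EF equivalent of commit()):

```java
import java.util.ArrayList;
import java.util.List;

// Minimal Unit of Work sketch: register work, then flush it all in one commit.
class UnitOfWork {
    private final List<Runnable> pending = new ArrayList<>();

    void registerNew(Runnable insertAction) {
        pending.add(insertAction);   // tracked, but not yet applied
    }

    void commit() {                  // the SaveChanges() moment
        for (Runnable action : pending) {
            action.run();
        }
        pending.clear();
    }
}

public class UnitOfWorkDemo {
    public static void main(String[] args) {
        List<String> db = new ArrayList<>();   // stand-in for the data store
        UnitOfWork uow = new UnitOfWork();
        uow.registerNew(() -> db.add("order#1"));
        uow.registerNew(() -> db.add("order#2"));
        System.out.println(db.size());  // 0 -- nothing written before commit
        uow.commit();
        System.out.println(db);         // [order#1, order#2]
    }
}
```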
