Automapper with different Configurations

As suggested in the answer to "Using Profiles in Automapper to map the same types with different logic", can anybody provide an example of using AutoMapper with different Configuration objects?
I need two or more configurations for mapping between the same two types. How can I register multiple configuration objects, and how do I use them?
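For reference, here is a minimal sketch of one way to do this with AutoMapper's instance-based MapperConfiguration API (Person and PersonDto are invented types): each MapperConfiguration is independent, so the same pair of types can be mapped differently by different mapper instances.

using AutoMapper;

public class Person { public string FirstName { get; set; } public string LastName { get; set; } }
public class PersonDto { public string Name { get; set; } }

public static class MappingDemo
{
    public static void Run(Person person)
    {
        // First configuration: map only the first name.
        var shortConfig = new MapperConfiguration(cfg =>
            cfg.CreateMap<Person, PersonDto>()
               .ForMember(d => d.Name, o => o.MapFrom(s => s.FirstName)));

        // Second configuration: same types, different mapping logic.
        var fullConfig = new MapperConfiguration(cfg =>
            cfg.CreateMap<Person, PersonDto>()
               .ForMember(d => d.Name, o => o.MapFrom(s => s.FirstName + " " + s.LastName)));

        // Each configuration produces its own mapper; use whichever one you need.
        var shortDto = shortConfig.CreateMapper().Map<PersonDto>(person);
        var fullDto = fullConfig.CreateMapper().Map<PersonDto>(person);
    }
}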

Why is DITA's schema split into topics and maps?

I'm new to DITA, so I apologize for any ignorance.
I'm using XJC to compile the base (and only the base) Dita 1.3 schema into Java classes. When I attempted to compile all the XSD files, I received errors with elements and groups being redefined. None of the XJC bindings I attempted to write would fix it.
After digging through the schema, I found that mapGrp.xsd/mapMod.xsd and topicGrp.xsd/topicMod.xsd contained the same group and element definitions. This explains why XJC would fail when including all of the XSD files. The XSD parser itself cannot handle these duplicate entries.
So I generated basemap.xsd and basetopic.xsd separately and cleaned up the generated code so I could run a diff against the two directories.
I found that the two schemas each have some elements specific to maps or topics. For example, the map schema has DitavalmetaClass and DvrKeyscopePrefixClass while the topic schema doesn't, and the topic schema contains AbstractClass and BodyClass while the map schema doesn't. But the majority of classes are shared between the two schemas.
As for the classes that are shared, there are only three that differ between the two schemas (LinktextClass, MetadataClass, and SearchtitleClass). Even then, they aren't big changes, just some differences in what they can contain.
My question is, why couldn't the shared classes go under one common Grp/Mod schema that's shared between topics and maps, with those three classes redefined? Can I change the two schemas so they share the same elements and groups without breaking any of the other schemas that extend the base schema?
Maps and topics are two distinct document types. They share some element types in common but are otherwise completely independent document types.
Note also that DITA does not have "a single grammar" in the way that other XML applications do (or appear to do).
DITA is explicitly architected to allow controlled extension from the base grammars so that you can do any of the following:
Configure a given map or topic type to include or exclude specific elements, either other topic types (in the case of topics) or specific element "domains" (sets of "mix-in" elements). Thus there can be two different working grammars for "topic" documents that allow different sets of elements.
Add "constraint" module that restrict existing content models or attribute lists in some way (for example, disallow a base element type in a specific context).
Define your own new element types and attributes via "specialization"
Thus any attempt to generate things like Java classes or database schemas for DITA in any sort of static way is doomed to fail in the general case.
If you are implementing code that needs to operate on any conforming DITA document then it needs to be more flexible and operate on elements in terms of their @class values, not their tag names.
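For example, here is a minimal sketch (not DITA Open Toolkit code; names are invented) of what matching by @class token rather than tag name can look like:

using System;
using System.Linq;
using System.Xml.Linq;

static class DitaClass
{
    // True if the element's @class attribute contains the given module/type token,
    // e.g. "topic/p" matches <p> and every specialization of it, whatever its tag name.
    public static bool Matches(XElement element, string token)
    {
        var cls = (string)element.Attribute("class") ?? "";
        return cls.Split(new[] { ' ' }, StringSplitOptions.RemoveEmptyEntries)
                  .Contains(token);
    }
}

// Usage: doc.Descendants().Where(e => DitaClass.Matches(e, "topic/p"))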
If it is sensible for you to generate Java classes from the XSDs, you must treat each top-level map and topic type (map, bookmap, subjectScheme, learningMap, topic, concept, task, general-task, reference, glossentry, etc.) as a distinct class hierarchy; you cannot combine them into a single hierarchy because they will have different content model rules for the same element types.
You should definitely read the DITA Architecture specification:
http://docs.oasis-open.org/dita/v1.2/os/spec/architectural_specification.html#architectural_specification
The DITA 1.3 XML Schemas are no longer put together manually. They are generated automatically from the Relax NG schemas, which are the grammars officially supported by the specification. Maybe you should also take a look at the DITA 1.2 XML Schemas; those were written manually, and they might be better modeled to reuse more element definitions.

Are the multiple copies of schemas_microsoft_com_2003_10_Serialization needed?

Working on a BTDF deployment package (starting with the schemas), I'm getting lots of warnings that ...
a previously deployed schema "" have the same target namespace "http://schemas.microsoft.com/2003/10/Serialization/".
These warnings trace back to the various copies of ..._schemas_microsoft_com_2003_10_Serialization.xsd, presumably added by referencing multiple web services.
Are these all needed? ... especially as the content is identical
Is even one needed?
Similarly, there are multiple copies of the ...Serialization_Arrays.xsd
No, multiple copies of these schemas do not have to be Deployed. But...
These schemas are included in projects when you generate schemas for WCF adapter services, though with different .NET type names and namespaces. The only thing they contain is element type definitions, and they are used by the referring schema.
But, not every definition is used and yes, it does tend to clutter the Schemas list in BT Admin.
So, there are some ways to mitigate this:
Have one "master" Serialization.xsd, and reference it from any other WCF schema. One per Solution is an option too, whichever is most practical.
Remove the reference completely by changing to native XML types. Most of the definitions are redeclarations of native types and likely aren't used. Checking a couple of schemas I have at hand, I see only the "guid" type is used. It still works fine if I change it to xs:string and remove ...Serialization.xsd from the Includes list.

How to generate POCO proxies from an existing database

I recently switched to Entity Framework 5. Now I want to generate the POCO classes from an existing database, and I need both lazy loading and change tracking. So all the scalar properties should be virtual, as well as the navigation properties.
Adding a new ADO.NET Entity Data Model results in an .edmx file and some other .cs and .tt files.
Firstly, I wonder why the generated POCO classes by default do not meet the requirements of a change tracking proxy, i.e. scalar properties are not virtual.
Secondly, how can I generate proxy-enabled POCO classes?
PS: I accepted Slauma's answer as the best and only answer so far, but I don't agree with the first part of it. Here is my argument:
Slauma talks about two problems with proxies: restrictions and performance.
About the restrictions on the proxy-enabled entities:
When the classes are generated by Entity Framework in the DB-First approach, the rules that the classes must follow to enable change-tracking proxies are not that important, because they are not restrictive at all. Who really cares whether the navigation collections are IList or HashSet? Talking about the restrictions makes sense only when there are previously designed classes in the application and tables are to be generated from them.
Complex properties are not supported in DB-First, so we can exclude them from the discussion.
About the performance:
In the referenced article, and in the other experiments I have studied so far, the results are not convincing enough to reject proxies in favor of snapshots. First, the experiments were done on a large number of entities (10,000). It is not improbable that a batch process in your application (not in the database) works on a large number of entities, but better approaches exist for that, such as stored procedures.
Second, depending on the type of application and its needs, we usually deal with a small number of entities, for example when the Repository pattern is implemented and used; then there is no difference between the performance of proxies and snapshots.
Interestingly, in the referenced experiment, re-assigning the same value to the properties was the only case where proxy performance dramatically failed. But who really does this? It is easy to be careful and avoid repeatedly notifying the change tracker. Again, even in this case a significant problem arises only when a large number of entities is involved.
Firstly, I wonder why the generated POCO classes by default do not meet the requirements of a change tracking proxy, i.e. scalar properties are not virtual.
Using change tracking proxies is not recommended as the default change tracking strategy. It is explained in more detail in this blog post. In essence, the main reason to use change tracking proxies, namely better performance compared to snapshot-based change tracking, is not always guaranteed (sometimes it's even worse), and the list of disadvantages is longer than for snapshot-based change tracking.
In the past, the T4 templates that generated POCO entities indeed marked all properties, including scalar properties, as virtual and prepared the entities for proxy-based change tracking. For the reasons described in the blog this was changed for the newer templates, including the DbContext Generator for EF 5, as mentioned in this comment below the blog post linked above. Now only navigation properties are marked as virtual, but not scalar properties, which allows lazy loading but is not sufficient for change tracking proxies.
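To make the difference concrete, here is a rough sketch (entity names are invented, not template output): the first class has the shape the current templates produce, the second the shape change tracking proxies require, where every mapped property must be virtual with a public getter and setter, and collection navigation properties must be typed as ICollection<T>.

public class Customer
{
    public int CustomerId { get; set; }
    public string Name { get; set; }
}

// Shape of the default EF 5 DbContext Generator output: only the navigation property
// is virtual, so lazy loading works but change tracking proxies cannot be created.
public class Order
{
    public int OrderId { get; set; }                // scalar, not virtual
    public decimal Total { get; set; }              // scalar, not virtual
    public virtual Customer Customer { get; set; }  // navigation, virtual
}

// Proxy-capable variant: every mapped property is virtual.
public class ProxyFriendlyOrder
{
    public virtual int OrderId { get; set; }
    public virtual decimal Total { get; set; }
    public virtual Customer Customer { get; set; }
}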
Secondly, how can I generate proxy-enabled POCO classes?
I am not aware of any available T4 template that would do this, but it is quite easy to modify the default template so that the scalar properties are also marked as virtual:
In your project you should have two files with a .tt extension: YourModelContainer.tt and YourModelContainer.Context.tt. Open the YourModelContainer.tt file.
In this file you'll find a method called Property:
public string Property(EdmProperty edmProperty)
{
    return string.Format(
        CultureInfo.InvariantCulture,
        "{0} {1} {2} {{ {3}get; {4}set; }}",
        Accessibility.ForProperty(edmProperty),
        _typeMapper.GetTypeName(edmProperty.TypeUsage),
        _code.Escape(edmProperty),
        _code.SpaceAfter(Accessibility.ForGetter(edmProperty)),
        _code.SpaceAfter(Accessibility.ForSetter(edmProperty)));
}
Change the line with...
Accessibility.ForProperty(edmProperty),
...to...
AccessibilityAndVirtual(Accessibility.ForProperty(edmProperty)),
That's it.
Just to mention it, in case you are not familiar with it: there is a second kind of Database-First approach available, namely reverse engineering an existing database to a Code-First model. This approach doesn't use a T4 template at all but creates a Code-First model and a context with Fluent API mapping. It is useful if you want to customize and extend the model classes (you could then also add the virtual modifiers manually) and proceed with the Code-First workflow (and Code-First Migrations) in the future to update and evolve your database schema.
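To give a rough idea, here is a hand-written approximation of the kind of Fluent API mapping class such reverse-engineering tooling produces (table and column names are assumptions; Order and Customer are the invented entities from the sketch above). Because the entities are plain code rather than regenerated from an .edmx, virtual modifiers added by hand are not overwritten.

using System.Data.Entity.ModelConfiguration;

public class OrderMap : EntityTypeConfiguration<Order>
{
    public OrderMap()
    {
        ToTable("Orders");                              // assumed table name
        HasKey(o => o.OrderId);
        Property(o => o.Total).HasColumnName("Total");  // assumed column name
        HasRequired(o => o.Customer).WithMany();        // required reference to Customer
    }
}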

Automapper map from System.Dynamic.DynamicObject

I'm having a really difficult time mapping from a DynamicObject using Automapper -- the properties on my destination type always end up null even though they do map to a property of the same name. I assume this has to do with reflection issues on DynamicObjects... Is Automapper able to do this?
Similar question to Allow mapping of dynamic types using AutoMapper or similar?
In summary, AutoMapper does not support this. It's easy to write your own mapper to do this using reflection, though. The accepted answer to Allow mapping of dynamic types using AutoMapper or similar? has an example of this.
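For illustration, here is a minimal hand-rolled sketch (not taken from the linked answer), assuming the dynamic source can be viewed as an IDictionary<string, object>, which is true for ExpandoObject; for an arbitrary DynamicObject you would enumerate GetDynamicMemberNames() instead.

using System;
using System.Collections.Generic;
using System.Linq;

static class DynamicMapper
{
    // Copies entries of the dictionary view onto same-named writable properties of T.
    public static T Map<T>(IDictionary<string, object> source) where T : new()
    {
        var target = new T();
        foreach (var prop in typeof(T).GetProperties().Where(p => p.CanWrite))
        {
            object value;
            if (source.TryGetValue(prop.Name, out value) && value != null)
            {
                prop.SetValue(target, Convert.ChangeType(value, prop.PropertyType), null);
            }
        }
        return target;
    }
}

// Usage (ProductDto is an invented destination type with Name and Price properties):
// dynamic src = new ExpandoObject();
// src.Name = "Widget";
// src.Price = 9.99m;
// var dto = DynamicMapper.Map<ProductDto>((IDictionary<string, object>)src);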

Subsonic Simple Repository String Lengths

I'm playing around with the SimpleRepository provider (with automigrations) in SubSonic 3 and I have an annoying problem:
The only way I can control the string length in my database tables is by adding the SubSonicStringLength or SubSonicLongString attributes to the properties of the objects that need to be persisted.
I don't really want a dependency on SubSonic anywhere except in my repository class, and certainly not in my model objects if I can avoid it.
Are there anyways to get round this or am I stuck using SubSonicStringLength and the other attributes?
Basically, the only way around this would be to have a DTO that you map to and from your SimpleRepository classes inside your repository. You could use a mapping tool like AutoMapper to convert between your DTOs and your SimpleRepo objects.
This would isolate your application from SubSonic dependencies outside of your repo but would obviously involve a non-trivial amount of work.
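As a rough sketch of that layout (all names, the connection string, and the SubSonicStringLength usage are illustrative assumptions; the static Mapper API shown is the classic AutoMapper style of that era):

using AutoMapper;
using SubSonic.Repository;
using SubSonic.SqlGeneration.Schema;

// Persistence class used only inside the repository assembly; the SubSonic attribute stays here.
public class CustomerRecord
{
    public int Id { get; set; }

    [SubSonicStringLength(100)]
    public string Name { get; set; }
}

// Clean model class exposed to the rest of the application; no SubSonic dependency.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class CustomerRepository
{
    private readonly SimpleRepository _repo =
        new SimpleRepository("MyConnectionString", SimpleRepositoryOptions.RunMigrations);

    static CustomerRepository()
    {
        Mapper.CreateMap<CustomerRecord, Customer>();
        Mapper.CreateMap<Customer, CustomerRecord>();
    }

    public Customer GetById(int id)
    {
        var record = _repo.Single<CustomerRecord>(id);         // SubSonic stays behind this boundary
        return Mapper.Map<CustomerRecord, Customer>(record);
    }

    public void Save(Customer customer)
    {
        _repo.Add(Mapper.Map<Customer, CustomerRecord>(customer));
    }
}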
