Using enums in fluent API - entity-framework-5

This SO question asks about using enums with fluent API in Entity Framework 4.1, and it turns out that this wasn't supported. Is this now supported in Entity Framework 5?

Enums are supported in EF5 for .NET Framework 4.5, and in EF6 for both .NET Framework 4 and .NET Framework 4.5 - you can find more details here: Enum type not being mapped to DB table. Enums are used in EF only for properties, so you configure them the same way you would configure primitive properties. Under the hood, enum values are converted to the enum's underlying type and stored in the database as values whose store type corresponds to that underlying type (i.e. if an enum type has an underlying type of byte, the values of properties using this enum type will be stored in SQL Server as tinyint).
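For example, here is a minimal sketch of configuring such a property with the fluent API; the Order entity, OrderStatus enum and column name are made up for illustration:

using System.Data.Entity;

public enum OrderStatus : byte { Pending = 0, Shipped = 1, Cancelled = 2 }

public class Order
{
    public int Id { get; set; }
    public OrderStatus Status { get; set; }
}

public class OrdersContext : DbContext
{
    public DbSet<Order> Orders { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // The enum property is configured exactly like a primitive property.
        modelBuilder.Entity<Order>()
            .Property(o => o.Status)
            .IsRequired()
            .HasColumnName("StatusCode");
    }
}

Because OrderStatus has an underlying type of byte here, the StatusCode column ends up as a tinyint in SQL Server.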

Related

How to generate Inference Schema for Dictionary with nested structure using Azure InferenceSchema package?

In an advanced scoring script for an AzureML web service, to automatically generate a schema for the web service, we provide a sample of the input and/or output in the constructor of one of the defined type objects. The type and sample are used to automatically create the schema.
To use schema generation, we include the open-source inference-schema package, version 1.1.0 or above. The types I can find include the Numpy type, Pandas type, and Abstract Parameter type.
How do we define the schema for a nested dictionary of the (generalized) format:
{
    "top_level_key": [
        {
            "nested_key_1": "string_1",
            "nested_key_2": <float_number>,
            "nested_key_3": <True/False>
        }
    ]
}
We don't currently have a good way to extend the handling for generic Python class objects. However, we are planning to add support for that, basically by providing more information on the necessary hooks and allowing users to extend a base class to implement the hook that matches the desired class structure.
These types are currently supported:
pandas
numpy
pyspark
Standard Python object
https://learn.microsoft.com/en-us/azure/machine-learning/how-to-deploy-advanced-entry-script#automatically-generate-a-swagger-schema

How to generate extra annotations on JAXB2-generated Java from XSD

I have the following type of structures in my XSDs...
And this generates code with enumerations that serialize correctly back and forth to and from XML.
Now, those same XML/SOAP services are being implemented as REST using Jackson and RESTEasy. It all works correctly with no further settings, except for the enumerations, which are constrained values that we want to serialize to those generated Java enums. However, the classes are devoid of JSON annotations, causing them to serialize and deserialize the string values of the generated enumerations instead.
What I need is to generate the code with @JsonValue and @JsonCreator so that we can process the enumerations based on their assigned values instead...
How can I modify the JAXB2 generator to have it add the JSON annotations as well? There are several hundred enumerations in the system, so a systematic solution is called for.

Sharp Architecture Value Objects

I'm checking out Sharp Architecture's code. So far it's cool, but I'm having problems getting my head around how to implement DDD value objects in the framework (doesn't seem to be anything mentioning this in the code). I'm assuming the base Entity class and Repository base are to be used for entities only. Any ideas on how to implement value objects in the framework?
In Sharp Arch there is a class ValueObject in the namespace SharpArch.Domain.DomainModel. This object inherits from BaseObject and overrides the == and != operators as well as the Equals() and GetHashCode() methods. The overrides just call the BaseObject versions of those methods, which in turn use the GetTypeSpecificSignatureProperties() method to get the properties to use in the equality comparison.
Bottom line is that an Entity's equality is determined by:
Reference equality
Same type
Ids are the same
Comparison of all properties decorated with the [DomainSignature] attribute
For ValueObjects, the BaseObject's Equals() method is used:
Reference equality
Same type
Comparison of all public properties
This is a little bit simplified; I suggest you get the latest code from GitHub and read through the code in the three classes mentioned yourself.
Edit: Regarding persistence, this SO question might help. Other than that, refer to the official NHibernate and Fluent NHibernate documentation.
Value objects are simple objects that don't require a base class. (The only reason entities have base classes is to provide equality based on identity.) Implementing a value object just means creating a class that represents a value from your domain. Value objects should usually be immutable and provide equality comparison methods to determine equality with other value objects of the same type. Take a look here.
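As a rough sketch of that idea (the Money class and its members are made up for illustration), a value object can be a plain, immutable class with value-based equality and no base class:

// Immutable: state is set in the constructor and never changes.
public class Money
{
    public Money(string currency, decimal amount)
    {
        Currency = currency;
        Amount = amount;
    }

    public string Currency { get; private set; }
    public decimal Amount { get; private set; }

    // Equality is based on the values, not on object identity.
    public override bool Equals(object obj)
    {
        var other = obj as Money;
        return other != null && Currency == other.Currency && Amount == other.Amount;
    }

    public override int GetHashCode()
    {
        return (Currency ?? string.Empty).GetHashCode() ^ Amount.GetHashCode();
    }
}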

Why .NET 4 variance for generic type arguments not also for classes? [duplicate]

Possible Duplicates:
Why isn't there generic variance for classes in C# 4.0?
Why does C# (4.0) not allow co- and contravariance in generic class types?
The new .NET 4.0 co- and contravariance for generic type arguments only works for interfaces and delegates. What is the reason for not supporting it for classes too?
For type safety, C# 4.0 supports covariance/contravariance ONLY for type parameters marked with in or out.
If this were extended to classes, you'd also have to mark type parameters with in or out, and this would end up being very restrictive. This is most likely why the designers of the CLR chose not to allow it. For instance, consider the following class:
public class Stack<T>
{
    int position;
    T[] data = new T[100];
    public void Push(T obj) { data[position++] = obj; }
    public T Pop() { return data[--position]; }
}
It would be impossible to annotate T as either in or out, because T is used in both input and output positions. Hence this class could never be covariant or contravariant - even if C# supported covariant/contravariant type parameters for classes.
Interfaces solve the problem nicely. We can define two interfaces as follows, and have Stack implement both:
public interface IPoppable<out T> { T Pop(); }
public interface IPushable<in T> { void Push (T obj); }
Note that T is covariant for IPoppable and contravariant for IPushable. This means T can be either covariant or contravariant - depending on whether you cast to IPoppable or IPushable.
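For example, here is a small sketch of how the variance is then used, assuming Stack<T> implements both interfaces:

// Assuming: public class Stack<T> : IPoppable<T>, IPushable<T> { ... }
Stack<string> strings = new Stack<string>();
strings.Push("hello");

// Covariance: an IPoppable<string> can be used as an IPoppable<object>,
// because values only come out of an IPoppable.
IPoppable<object> poppable = strings;
object popped = poppable.Pop();

// Contravariance: an IPushable<object> can be used as an IPushable<string>,
// because something that accepts any object can safely accept strings.
Stack<object> objects = new Stack<object>();
IPushable<string> pushable = objects;
pushable.Push("world");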
Another reason that covariance/contravariance would be of limited use with classes is it would rule out using type parameters as fields - because fields effectively allow both input and output operations. In fact, it would be hard to write a class that does anything useful at all with a type parameter marked as in or out. Even the simplest case of writing a covariant Enumerable implementation would present a challenge - how would you get source data into the instance to begin with?
The .NET team, along with the C# and VB.NET teams, has limited resources, and the work they have done on co- and contravariance solves most of the real-world problems. Type systems are very complex to get right - a solution that works in 99.9999% of cases is not good enough if it leads to unsafe code in the other cases.
I don't think the cost/time of supporting co- and contravariance specifications (e.g. "in"/"out") on class methods would provide enough value. I can see very few cases where they would be usable - due to the lack of multiple class inheritance.
Would you rather have waited another 6 months for .NET so as to get this support?
Another way to think of this is that in .NET:
Interfaces / delegates are used to model the conceptual type system of an application
Classes are used to implement the above types
Class inheritance is used to reduce code duplication while doing the above
Co- and contravariance are about the conceptual type system of an application

How to Marshall C++ Native Objects to Managed C++ CLI

I have a bunch of native C++ objects and classes containing STL maps, maps of maps, lists and vectors.
I need to call managed C++ functions from native C++ code and pass these native objects and STL containers (lists, maps, maps of maps) to C++/CLI. This requires marshaling, or somehow serializing, these objects. How can I do that without any problems, so that after marshaling to managed C++/CLI, maps end up as dictionaries (and maps of maps as dictionaries of dictionaries), STL lists as managed List<>, and so on?
How can I achieve this for all cases? Does it require complex handling of marshaling issues?
Regards
Usman
STL memory layout is implementation-specific. E.g. sizeof(std::vector) is 16 in release and 20 in debug mode when you use the implementation that comes with Visual C++. And STL classes contain pointers that can't be marshaled meaningfully to managed memory. You can switch to platform-independent C or COM types in the interface (e.g. pass an array with a count parameter, or a safe array) if you want to do marshaling, as .NET has a better understanding of these types. I recommend COM because it has richer types and supports other languages in case you need them.
Alternatively, if you need speed, you can write a marshal_as template function to do the conversion, so you can reuse the marshaling code or even the marshaling buffer, or write a managed wrapper for your C++ objects.
If the data being marshaled is too large to fit in memory, you can also serialize it to a temp file or database and read it back from managed code in chunks.
