Postgres enums conversion with manually created Record objects - jooq

We are using jOOQ with Vert.x in a setup where jOOQ builds the queries and the DB calls themselves are executed through the Vert.x PostgreSQL client. The legacy system uses jOOQ 3.11.12 and we are now bumping it to 3.15.5, which is where we hit the following backwards-incompatible behavior.
We have an enum defined in DB:
CREATE TYPE model.days_of_week_enum AS ENUM (
    'Sunday',
    'Monday',
    'Tuesday',
    'Wednesday',
    'Thursday',
    'Friday',
    'Saturday');
and a table column defined as an array of these enums:
...ALTER COLUMN days_of_week
    TYPE model.days_of_week_enum[]
    USING days_of_week::model.days_of_week_enum[]
jOOQ autogenerates the enum class and the table record as expected, and all seems fine.
When we read the query response from the Vert.x client, the array of enums arrives as an array of Strings, e.g.
"Monday", "Sunday"
We manually create a Record object and fill in the value that arrived from vertx (as an array of strings)
R record = dslContext.newRecord(table);
record.set(fieldDefinedAsOurEnumArray, valueFromVertxAsStringArray);
Up to this point all works OK. But later we retrieve the values of this field from the Record object using
res = (T) Convert.convert(val, field.getConverter());
jOOQ defines the converter of the field as IdentityConverter [ [Lcom.xyz.jooq.model.enums.DaysOfWeekEnum; ]
but with 3.11.12 the method flow continues to:
public static final <U> U convert(Object from, Converter<?, ? extends U> converter) throws DataTypeException {
    return convert0(from, converter);
}

private static final <T, U> U convert0(Object from, Converter<T, ? extends U> converter) throws DataTypeException {
    Convert.ConvertAll<T> all = new Convert.ConvertAll(converter.fromType());
    return converter.from(all.from(from));
}
which ends up doing the right conversion, while with 3.15.5 it ends up calling
public static final <U> U convert(Object from, Converter<?, ? extends U> converter) throws DataTypeException {
    return converter instanceof IdentityConverter ? from : convert0(from, converter);
}
returning the array of Strings as is, which causes a casting error in the Record's getter:
public DaysOfWeekEnum[] getDaysOfWeek() {
    return (DaysOfWeekEnum[]) get(4); // get(4) retrieves a String[]!
}
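The difference is easy to reproduce outside jOOQ. The sketch below is a simplified toy model of the 3.15 code path (the names are ours, not jOOQ's): when the converter is an identity converter, the input is returned untouched, so a manually-set String[] stays a String[] and the generated getter's cast fails.

```java
import java.util.function.Function;

public class IdentityShortCircuit {
    enum DaysOfWeekEnum { Sunday, Monday }   // stand-in for the generated enum class

    // Toy model of Convert.convert in 3.15: an identity converter passes the
    // value through unchanged instead of running the ConvertAll fallback.
    static Object convert(Object from, Function<Object, Object> converter, boolean isIdentity) {
        return isIdentity ? from : converter.apply(from);
    }

    public static void main(String[] args) {
        Object val = new String[] { "Monday", "Sunday" };   // what the Vert.x client delivered
        Object out = convert(val, f -> f, true);
        System.out.println(out instanceof String[]);        // still a String[]
        try {
            DaysOfWeekEnum[] days = (DaysOfWeekEnum[]) out; // the generated getter's cast
        } catch (ClassCastException e) {
            System.out.println("ClassCastException, as seen in getDaysOfWeek()");
        }
    }
}
```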
Is there any workaround for this issue?
TIA
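One possible workaround, sketched in plain Java (not jOOQ API; DaysOfWeekEnum here is a stand-in for the generated class): convert the String[] from the Vert.x client into the generated enum type before calling record.set, so the field already holds the type the getter casts to and the IdentityConverter short-circuit no longer matters. Enum.valueOf works here on the assumption that the DB literals match the generated constant names.

```java
import java.lang.reflect.Array;
import java.util.Arrays;

public class EnumArrayWorkaround {
    // Stand-in for the jOOQ-generated enum class.
    enum DaysOfWeekEnum { Sunday, Monday, Tuesday, Wednesday, Thursday, Friday, Saturday }

    // Map the String[] delivered by the Vert.x client to the enum array type
    // expected by the generated record, before record.set(...).
    static <E extends Enum<E>> E[] toEnumArray(Class<E> type, String[] literals) {
        @SuppressWarnings("unchecked")
        E[] result = (E[]) Array.newInstance(type, literals.length);
        for (int i = 0; i < literals.length; i++)
            result[i] = Enum.valueOf(type, literals[i]);
        return result;
    }

    public static void main(String[] args) {
        DaysOfWeekEnum[] days = toEnumArray(DaysOfWeekEnum.class, new String[] { "Monday", "Sunday" });
        System.out.println(Arrays.toString(days));   // [Monday, Sunday]
    }
}
```

With the array converted up front, record.set(fieldDefinedAsOurEnumArray, days) stores the enum type directly, and the later Convert.convert call returning the value unchanged is then correct.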

Related

Spring Data Cassandra Custom Read Converter for Enum called for non-enum fields

I have a custom read converter implemented as follows:
public class ActivityTraceStageReadConverter implements Converter<String, ActivityTraceStage> {
    @Override
    public ActivityTraceStage convert(String stage) {
        return ActivityTraceStage.valueOf(stage.toUpperCase());
    }
}
The values for the specific column are stored in lowercase, hence the .toUpperCase() in the converter. I have the converter registered as follows:
@Override
public CustomConversions customConversions() {
    return new CustomConversions(Arrays.asList(
        new ActivityTraceStageReadConverter()
    ));
}
The problem I'm running into is that this converter is being executed for every varchar column on the table, not just for columns that map to ActivityTraceStage on the specific Table. Is there something I need to do to get this converter to only run for one specific field? I figured it would be smart enough to only execute it for ActivityTraceStage fields, but I must be missing something here.

How to reuse ValueResolver across different mappings in AutoMapper 5?

I just tried upgrading AutoMapper to 5.0.2 but hit a road block.
According to the migration docs, value resolvers have now access to the destination object:
The signature of a value resolver has changed to allow access to the source/destination models.
This has the consequence that each value resolver is tied to exactly one destination type.
However, some of our value resolvers are used for multiple destination types. We have e.g. a resolver that is used during the mapping of all the ID properties of our DTOs. The resolver modifies the ID by means of a service that is injected into the resolver.
How would I define reusable value resolvers in AutoMapper 5, so that I don't have to create dedicated resolvers for each destination type with exactly the same implementation?
Note: The main reason to use a value resolver instead of directly manipulating the values is dependency injection. As per this answer, value resolvers are the best way to use a dependency-injected service during the mapping.
The destination type can just be "object":
public class FooResolver : IValueResolver<object, object, string> {}
Or it can be more specific:
public class FooResolver : IValueResolver<IEntity, object, string> {}
Because of the variance defined for IValueResolver, you can put base types in the first two generic arguments.
Good day, I think the best way would be to use Generics as follows:
public class FooResolver<TSource, TDestination> : IValueResolver<TSource, TDestination, string>
{
    private readonly Dictionary<Type, int> typeDictionary;

    public FooResolver()
    {
        typeDictionary = new Dictionary<Type, int>
        {
            {typeof(FooA), 0},
            {typeof(FooB), 1}
        };
    }

    public string Resolve(TSource source, TDestination destination, string destMember,
        ResolutionContext context)
    {
        string stringValue = null;
        switch (typeDictionary[source.GetType()])
        {
            case 0:
                var fooA = (FooA)Convert.ChangeType(source, typeof(FooA));
                // custom code
                break;
            case 1:
                var fooB = (FooB)Convert.ChangeType(source, typeof(FooB));
                // custom code
                break;
        }
        return stringValue;
    }
}
During Mapping you simply provide the actual types for source and destination e.g.
act.MapFrom<FooResolver<FooA, FooADestination>>

Using Dapper.Net ORM, how do I cast stored procedure output to a concrete type?

Using Entity Framework I can create concrete classes from most of the sprocs in the database of a project I'm working on. However, some of the sprocs use dynamic SQL and as such no metadata is returned for the sproc.
So for a that sproc, I manually created a concrete class and now want to map the sproc output to this class and return a list of this type.
Using the following method I can get a collection of objects:
var results = connection.Query<object>("get_buddies",
    new { RecsPerPage = 100,
          RecCount = 0,
          PageNumber = 0,
          OrderBy = "LastestLogin",
          ProfileID = profileID,
          ASC = 1 },
    commandType: CommandType.StoredProcedure);
My concrete class contains
[DataContractAttribute(IsReference=true)]
[Serializable()]
public partial class LoggedInMember : ComplexObject
{
    /// <summary>
    /// No Metadata Documentation available.
    /// </summary>
    [EdmScalarPropertyAttribute(EntityKeyProperty=false, IsNullable=false)]
    [DataMemberAttribute()]
    public global::System.Int16 RowID
    {
        get
        {
            return _RowID;
        }
        set
        {
            OnRowIDChanging(value);
            ReportPropertyChanging("RowID");
            _RowID = StructuralObject.SetValidValue(value);
            ReportPropertyChanged("RowID");
            OnRowIDChanged();
        }
    }
    private global::System.Int16 _RowID;
    partial void OnRowIDChanging(global::System.Int16 value);
    partial void OnRowIDChanged();

    [EdmScalarPropertyAttribute(EntityKeyProperty=false, IsNullable=false)]
    [DataMemberAttribute()]
    public global::System.String NickName
    {
        get
        {
            return _NickName;
        }
        set
        {
            OnNickNameChanging(value);
            ReportPropertyChanging("NickName");
            _NickName = StructuralObject.SetValidValue(value, false);
            ReportPropertyChanged("NickName");
            OnNickNameChanged();
        }
    }
    private global::System.String _NickName;
    partial void OnNickNameChanging(global::System.String value);
    partial void OnNickNameChanged();
    ...
Without having to iterate through the results and add the output parameters to the LoggedInMember object, how do I map these on the fly so I can return a list of them through a WCF service?
If I try var results = connection.Query<LoggedInMember>("sq_mobile_get_buddies_v35", ... I get the following error:
System.Data.DataException: Error parsing column 0 (RowID=1 - Int64)
---> System.InvalidCastException: Specified cast is not valid. at Deserialize...
At a guess, your SQL column is a bigint (i.e. Int64, a.k.a. long) but your .NET type has an Int16 property.
You could play around with the conversion and ignore the stored procedure by doing something like:
var results = connection.Query<LoggedInMember>("select cast(9 as smallint) [RowID] ...");
Where you are just selecting the properties and types you want to return your object. (smallint is the SQL equivalent of Int16)
The solution to this was to create a complex object derived from the sproc with EF:
public ProfileDetailsByID_Result GetAllProfileDetailsByID(int profileID)
{
    using (IDbConnection connection = OpenConnection("PrimaryDBConnectionString"))
    {
        try
        {
            var profile = connection.Query<ProfileDetailsByID_Result>("sproc_profile_get_by_id",
                new { profileid = profileID },
                commandType: CommandType.StoredProcedure).FirstOrDefault();
            return profile;
        }
        catch (Exception ex)
        {
            ErrorLogging.Instance.Fatal(ex); // use singleton for logging
            return null;
        }
    }
}
In this case, ProfileDetailsByID_Result is the object that I manually created using Entity Framework through the Complex Type creation process (right-click on the model diagram, select Add/Complex Type..., or use the Complex Types tree on the RHS).
A WORD OF CAUTION
Because this object's properties are derived from the sproc, EF has no way of knowing whether a property is nullable. For any nullable property types, you must configure these manually by selecting the property and setting its Nullable property to true.

Faking enums in Entity Framework 4.0

There are a lot of workarounds for the missing support of enumerations in the Entity Framework 4.0. From all of them I like this one at most:
http://blogs.msdn.com/b/alexj/archive/2009/06/05/tip-23-how-to-fake-enums-in-ef-4.aspx?PageIndex=2#comments
This workaround allows you to use enums in your LINQ queries, which is exactly what I need. However, I have a problem with it: for every complex type I use, a new partial autogenerated class is created. The code therefore no longer compiles, because I already have a wrapper class with this name in the same namespace, which converts between the backing integer in the database and the enum in my POCO classes. If I make my wrapper a partial class, the code still does not compile, as it now contains two properties with the same name, "Value". The only possibility is to remove the Value property by hand every time I regenerate the POCO classes because the DB model changed (which happens very often during the development phase).
Do you know how to prevent a partial class from being generated out of a complex property every time the EF model changes?
Can you recommend some other workarounds supporting enumerations in LINQ queries?
That workaround is based on the fact that you are writing your POCO classes yourself, i.e. no autogeneration. If you want to use it with autogeneration, you must heavily modify the T4 template itself.
Another workaround is wrapping the enum conversion in custom extension methods:
public static IQueryable<MyEntity> FilterByMyEnum(this IQueryable<MyEntity> query, MyEnum enumValue)
{
    int val = (int)enumValue;
    return query.Where(e => e.MyEnumValue == val);
}
You will then call just:
var data = context.MyEntitites.FilterByMyEnum(MyEnum.SomeValue).ToList();
I am using an approach based on the one described in your link without any modifications of the T4 templates. The contents of my partial wrapper classes are as follows:
public partial class PriorityWrapper
{
    public Priority EnumValue
    {
        get
        {
            return (Priority)Value;
        }
        set
        {
            Value = (int)value;
        }
    }

    public static implicit operator PriorityWrapper(Priority value)
    {
        return new PriorityWrapper { EnumValue = value };
    }

    public static implicit operator Priority(PriorityWrapper value)
    {
        if (value == null)
            return Priority.High;
        else
            return value.EnumValue;
    }
}
The only change is that instead of a backing variable holding the enum value, I use the autogenerated int-typed Value property. Consequently, Value can be an auto-implemented property, and the EnumValue property does the conversion in its getter and setter.

Problem using Lazy<T> from within a generic abstract class

I have a generic class that all my DAO classes derive from, defined below. I also have a base class for all my entities, but that one is not generic.
The method GetIdOrSave works with a different type than the one SabaAbstractDAO is defined with: to fulfill foreign-key relationships it needs the primary key, so it either retrieves the primary key or saves the entity and then retrieves it.
The last code snippet shows that it works if I get rid of the generic part, so I think this can be solved with variance, but I can't figure out how to write an interface that will compile.
public abstract class SabaAbstractDAO<T> : ISabaDAO<T> where T : BaseModel
{
    ...
    public K GetIdOrSave<K>(K item, Lazy<ISabaDAO<BaseModel>> lazyitemdao)
        where K : BaseModel
    {
        ...
    }
I am getting this error, when I try to compile:
Argument 2: cannot convert from 'System.Lazy<ORNL.HRD.LMS.Dao.SabaCourseDAO>' to 'System.Lazy<ORNL.HRD.LMS.Dao.SabaAbstractDAO<ORNL.HRD.LMS.Models.BaseModel>>'
I am trying to call it this way:
GetIdOrSave(input.OfferingTemplate,
    new Lazy<ISabaDAO<BaseModel>>(
        () =>
        {
            return (ISabaDAO<BaseModel>)new SabaCourseDAO() { Dao = Dao };
        })
);
If I change the definition to this, it works.
public K GetIdOrSave<K>(K item, Lazy<SabaCourseDAO> lazyitemdao) where K : BaseModel
{
So, how can I get this to compile using variance (if needed) and generics, so I can have a very general method that works only with BaseModel and AbstractDAO<BaseModel>? I expect I should only need to change the method and perhaps the abstract class definition; the usage should be fine.
UPDATE:
With a very helpful response I have a slightly improved example, but an interesting dilemma:
I now have this defined, without any in or out on T, because I got contradictory errors: with out T it complains that T must be contravariantly valid, and with in T that it must be covariantly valid. So I made it invariant, since it appears VS2010 can't figure it out.
public interface ISabaDAO<T> where T : BaseModel
{
    string retrieveID(T input);
    T SaveData(T input);
}
I get this error, though it did compile:
System.InvalidCastException: Unable to cast object of type 'ORNL.HRD.LMS.Dao.SabaCourseDAO' to type 'ORNL.HRD.LMS.Dao.ISabaDAO`1[ORNL.HRD.LMS.Models.BaseModel]'.
I fixed two code snippets above, but it appears that variance won't work as I hoped here.
I had tried this:
public delegate K GetIdOrSave<out K>(K item, Lazy<ISabaDAO<BaseModel>> lazyitemdao)
    where K : BaseModel;
but I get the same problem as with the interface: if I put out it complains, and if I put in I get the opposite complaint.
I think I could get this to work if this was legal:
public delegate K GetIdOrSave<K>(in K item, out Lazy<ISabaDAO<BaseModel>> lazyitemdao)
    where K : BaseModel;
C# 4.0 support for covariance and contravariance when working with delegates and interfaces.
How is Generic Covariance & Contra-variance Implemented in C# 4.0?
So if you can use the generic delegate Lazy with an interface as a parameter, try something like this:
// covariance: you can use an ISabaDAO<Course> where an ISabaDAO<BaseModel> is expected;
// a contra-variant realization is not possible in your case (just theoretically)
public interface ISabaDAO<out T> where T : BaseModel
{
    string retrieveID(BaseModel input);
    T SaveData(BaseModel input);
}

public abstract class SabaAbstractDAO<T> : ISabaDAO<T> where T : BaseModel
{
    ...
    // in this case Lazy should be a covariant delegate too
    // delegate T Lazy<out T>();
    public K GetIdOrSave<K>(K item, Lazy<ISabaDAO<BaseModel>> lazyitemdao) where K : BaseModel
    {
        ...
        return (K)itemdao.SaveData(item); // is not safe
        ...
    }
}
public class Course : BaseModel {}
public class SabaCourseDAO : SabaAbstractDAO<Course> {}

// so you can cast SabaCourseDAO to ISabaDAO<Course>, and ISabaDAO<Course> to ISabaDAO<BaseModel>;
// the next invocation should then be valid
GetIdOrSave(new Course(), new Lazy<ISabaDAO<Course>>(() =>
{
    return new SabaCourseDAO() { Dao = Dao };
}));
Cannot check it. I have not got VS2010.
