MissingFieldException when querying a table with ServiceStack.OrmLite

I'm getting a MissingFieldException for multiple OrmLite operations:
using (var db = DbFactory.Open())
{
    var exp = db.From<Product>();
    if (filter.Field1 != null)
        exp.Where(w => w.Field1 == filter.Field1);
    if (filter.Field2 != null)
        exp.Where(w => w.Field2 == filter.Field2);
    return db.LoadSelect(exp);
}
Also occurs with a simple AutoQuery RDBMS service.
[Api("Query.")]
[Route("/query", "GET")]
public class QueryTransaction : QueryDb<Transaction, TransactionQueryRecord>,
    IJoin<Transaction, Application>
{
    [ApiMember(IsRequired = false, ParameterType = "query")]
    public string TimeZoneId { get; set; }
}
The stack trace is the following:
System.MissingFieldException: Field not found: 'ServiceStack.OrmLite.OrmLiteConfig.UseParameterizeSqlExpressions'.
at ServiceStack.OrmLite.SqlServer.SqlServerOrmLiteDialectProvider.SqlExpression[T]()
at ServiceStack.OrmLite.OrmLiteExecFilter.SqlExpression[T](IDbConnection dbConn)
at ServiceStack.OrmLite.OrmLiteReadExpressionsApi.From[T](IDbConnection dbConn)
at ServiceStack.TypedQuery`2.CreateQuery(IDbConnection db, IQueryDb dto, Dictionary`2 dynamicParams, IAutoQueryOptions options)
at ServiceStack.AutoQuery.CreateQuery[From,Into](IQueryDb`2 dto, Dictionary`2 dynamicParams, Request req
I think OrmLite is trying to find the OrmLiteConfig.UseParameterizeSqlExpressions configuration property, but it doesn't exist in v4.0.60.
When I run my integration tests with AppSelfHostBase everything is OK, but when I call the service from the browser it sometimes works and other times throws the exception.

Missing method or field exceptions like this are an indication that you're mixing and matching dirty .dlls from different versions. OrmLiteConfig.UseParameterizeSqlExpressions was removed a while ago after OrmLite switched to parameterized queries; this error indicates that you have an old .dll that still references it.
When you upgrade your ServiceStack projects you need to upgrade all dependencies and make sure every ServiceStack dependency references the same version (e.g. v4.0.60 or the current latest, v4.5.0). You can check the NuGet /packages folder to see the different versions your Solution uses. Deleting all but the latest version and rebuilding your solution will produce build errors showing which projects were still referencing the older packages; update those projects so that everything uses the same version.
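If it helps to confirm the mismatch at runtime, here is a minimal diagnostic sketch (my own helper, not part of ServiceStack) that dumps the version of every loaded ServiceStack assembly; they should all report the same number:
using System;
using System.Linq;

public static class AssemblyVersionCheck
{
    public static void DumpServiceStackVersions()
    {
        // Every loaded ServiceStack assembly should report the same version.
        var assemblies = AppDomain.CurrentDomain.GetAssemblies()
            .Where(a => a.GetName().Name.StartsWith("ServiceStack"));

        foreach (var asm in assemblies)
        {
            var name = asm.GetName();
            Console.WriteLine(name.Name + " " + name.Version);
        }
    }
}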

Related

Automapper 8.1.1 using reflection not working after update to Abp 4.8.1

I upgraded Abp to 4.8.1. After this update my AutoMapper throws this error:
- Missing type map configuration or unsupported mapping.
The issue arises when I try to map a DTO to an Entity.
Here is how I'm configuring AbpAutoMapper:
var thisAssembly = typeof(dsimApplicationModule).GetAssembly();
IocManager.RegisterAssemblyByConvention(thisAssembly);
Configuration.Modules.AbpAutoMapper().Configurators.Add(
    // Scan the assembly for classes which inherit from AutoMapper.Profile
    cfg => cfg.AddMaps(thisAssembly)
);
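For reference, cfg.AddMaps only discovers classes inheriting from AutoMapper.Profile, so a minimal sketch of what that scan would pick up looks like this (ServiceTemplateProfile is an illustrative name of mine, not something Abp generates):
using AutoMapper;

// A minimal Profile that the AddMaps scan above would discover.
public class ServiceTemplateProfile : Profile
{
    public ServiceTemplateProfile()
    {
        CreateMap<UpdateServiceTemplateDto, ServiceTemplate>();
    }
}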
My DTO:
[AutoMapTo(typeof(ServiceTemplate))]
public class UpdateServiceTemplateDto : EntityDto, IShouldNormalize, ICustomValidate
{
    // properties
}
In AppService, Update Method:
var serviceTemplate = ObjectMapper.Map<ServiceTemplate>(input);
This worked fine before the update. I read that dynamic mapping is deprecated in AutoMapper 8.1.1; I'm not sure this counts as dynamic mapping, though, since I use reflection.

Azure Storage Tables - Update Condition Not Satisfied

I'm getting a random exception when I try to update an entity in a storage table. The exception I get is:
System.Data.Services.Client.DataServiceRequestException: An error occurred while processing this request. ---> System.Data.Services.Client.DataServiceClientException: {"odata.error":{"code":"UpdateConditionNotSatisfied","message":{"lang":"en-US","value":"The update condition specified in the request was not satisfied.\nRequestId:2a205f10-0002-013b-028d-0bbec8000000\nTime:2015-10-20T23:17:16.5436755Z"}}} ---
I know that this might be a concurrency issue, but the thing is that there's no other process accessing that entity.
From time to time I get dozens of these exceptions; I restart the server and it starts working fine again.
public static class StorageHelper
{
    static TableServiceContext tableContext;
    static CloudStorageAccount storageAccount;
    static CloudTableClient CloudClient;

    static StorageHelper()
    {
        storageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));
        CloudClient = storageAccount.CreateCloudTableClient();
        tableContext = CloudClient.GetTableServiceContext();
        tableContext.IgnoreResourceNotFoundException = true;
    }

    public static void Save(int myId, string newProperty, string myPartitionKey, string myRowKey)
    {
        var entity = (from j in tableContext.CreateQuery<MyEntity>("MyTable")
                      where j.PartitionKey == myPartitionKey
                      select j).FirstOrDefault();
        if (entity != null)
        {
            entity.MyProperty = newProperty;
            tableContext.UpdateObject(entity);
            tableContext.SaveChanges();
        }
        else
        {
            entity = new MyEntity();
            entity.PartitionKey = myPartitionKey;
            entity.RowKey = myRowKey;
            entity.MyProperty = newProperty;
            tableContext.AddObject("MyTable", entity);
            tableContext.SaveChanges();
        }
    }
}
The code you've posted uses the very old table layer, which is now obsolete. We strongly recommend you update to a newer version of the storage library and use the new table layer. See this StackOverflow question for more information. Also note that if you're using a very old version of the storage library, it will eventually stop working, as the service version it targets is going to be deprecated server-side.
We do not recommend that customers reuse TableServiceContext objects as has been done here. They contain a variety of tracking that can cause performance issues as well as other adverse effects. These kinds of limitations are part of the reason we recommend (as described above) moving to the newer table layer. See the how-to for more information.
On table entity update operations you must send an If-Match header indicating an etag; the library will set this for you if you set the entity's etag value. To update regardless of the etag of the entity on the service, use "*".
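As a rough sketch, the same save in the newer table layer looks something like this (assuming the Microsoft.WindowsAzure.Storage package and a MyEntity class derived from TableEntity; connectionString and the key variables are placeholders):
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

var account = CloudStorageAccount.Parse(connectionString);
var table = account.CreateCloudTableClient().GetTableReference("MyTable");

// Look the entity up by its keys.
var retrieve = TableOperation.Retrieve<MyEntity>(myPartitionKey, myRowKey);
var entity = (MyEntity)table.Execute(retrieve).Result;

entity.MyProperty = newProperty;
entity.ETag = "*"; // overwrite regardless of the etag on the service
table.Execute(TableOperation.Replace(entity));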
I suggest considering the Transient Fault Handling Application Block from Microsoft's Enterprise Library to retry when your application encounters such transient faults in Azure, rather than restarting the server every time the exception occurs.
https://msdn.microsoft.com/en-us/library/hh680934(v=pandp.50).aspx
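A minimal retry sketch with the block might look like this (the exact namespaces vary by Enterprise Library version; the ones below assume the Azure integration pack the linked article covers):
using System;
using Microsoft.Practices.TransientFaultHandling;
using Microsoft.Practices.EnterpriseLibrary.WindowsAzure.TransientFaultHandling.AzureStorage;

// Retry up to 3 times, 1 second apart, instead of restarting the server.
var retryPolicy = new RetryPolicy<StorageTransientErrorDetectionStrategy>(
    new FixedInterval(3, TimeSpan.FromSeconds(1)));

retryPolicy.ExecuteAction(() =>
{
    tableContext.UpdateObject(entity);
    tableContext.SaveChanges();
});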
While updating your entity, use a wildcard "*" etag (with the old TableServiceContext this means re-attaching the entity with an etag of "*").
Your modified code should look something like this:
if (entity != null)
{
    // Re-attach the entity with a wildcard etag so the update succeeds
    // regardless of the etag currently stored on the service.
    tableContext.Detach(entity);
    tableContext.AttachTo("MyTable", entity, "*");
    entity.MyProperty = newProperty;
    tableContext.UpdateObject(entity);
    tableContext.SaveChanges();
}

ServiceStack.OrmLite equivalent of Single/SingleOrDefault from Entity Framework

Currently, when using the OrmLite library from ServiceStack, if I want a single entity selected I do:
AppUser user = db.First<AppUser>(q => q.Id == id);
However, since Single is more precise (obviously I want an exception thrown if multiple users with the same id somehow ended up in the database), I was wondering if there is an overload I can use. Currently when I type db.Single I only get the overload with manual string filtering:
public static T SingleOrDefault<T>(this IDbConnection dbConn, string filter);
OK, I found what the issue is: the version I'm using (3.9.71) doesn't have that overload; it was added later:
https://github.com/ServiceStack/ServiceStack.OrmLite/commit/f2f5f80f150f27266bdcaf81b77ca60b62897719#diff-e9a84724e6a8315ec7f7fc5a5512a44b
Seems I'll need to extend that class from within my code.
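For anyone else stuck on 3.9.71, here's a minimal sketch of such an extension (my own naming, leaning on LINQ's Single to throw unless exactly one row matches):
using System;
using System.Data;
using System.Linq;
using System.Linq.Expressions;
using ServiceStack.OrmLite;

public static class OrmLiteSingleExtensions
{
    public static T Single<T>(this IDbConnection dbConn,
        Expression<Func<T, bool>> predicate)
    {
        // Select returns a List<T>; LINQ's Single throws if zero or
        // more than one row matches the predicate.
        return dbConn.Select(predicate).Single();
    }
}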

Azure caching and entity framework deserialization issue

I have a web project deployed in azure using colocated caching. I have 2 instances of this web role.
I am using Entity framework 5 and upon fetching some entities from the db, I cache them using colocated caching.
My entities are defined in class library called Drt.BusinessLayer.Entities
However when I visit my web app, I get the error:
The deserializer cannot load the type to deserialize because type 'System.Data.Entity.DynamicProxies.Country_4C17F5A60A033813EC420C752F1026C02FA5FC07D491A3190ED09E0B7509DD85' could not be found in assembly 'EntityFrameworkDynamicProxies-Drt.BusinessLayer.Entities, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null'. Check that the type being serialized has the same contract as the type being deserialized and the same assembly is used.
Also sometimes I get this too:
Assembly 'EntityFrameworkDynamicProxies-Drt.BusinessLayer.Entities, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null' is not found.
It appears that there is an error getting the entities out/deserialized. Since there are 2 instances of my web role, instance1 might place some entity objects in the cache and instance2 might get them out. I was expecting this to work, but I am unsure why I am getting this error....
Can anyone help/advise?
I ran into the same issue. At least in my case, the problem was the DynamicProxies with which EF wraps all the model classes. In other words, you might think you're retrieving a Country class, but under the hood EF is actually dynamically generating a class called something like Country_4C17F5A60A033813EC420C752F1026C02FA5FC07D491A3190ED09E0B7509DD85. The last part of the name is generated at run-time, and it can be expected to remain static throughout the life of your application - but (and this is the key) only on the same instance of the app domain.
If you've got two machines accessing the same out-of-process cache, one will be storing an object of the type Country_4C17F5A60A033813EC420C752F1026C02FA5FC07D491A3190ED09E0B7509DD85, but that type simply won't exist on the other machine. Its dynamic Country class will be something like Country_JF7ASDF8ASDF8ADSF88989ASDF8778802348JKOJASDLKJQAWPEORIU7879243AS, so there won't be any type into which it can deserialize the serialized object. The same thing will happen if you restart the app domain your web app is running in.
I'm sure the big brains at MS could come up with a better solution, but the one I've been using is to do a "shallow clone" of my EF objects before I cache them. The C# method I'm using looks like this:
public static class TypeHelper
{
    public static T ShallowClone<T>(this T obj) where T : class
    {
        if (obj == null) return null;
        var newObj = Activator.CreateInstance<T>();
        var fields = typeof(T).GetFields();
        foreach (var field in fields)
        {
            if (field.IsPublic && (field.FieldType.IsValueType || field.FieldType == typeof(string)))
            {
                field.SetValue(newObj, field.GetValue(obj));
            }
        }
        var properties = typeof(T).GetProperties();
        foreach (var property in properties)
        {
            if ((property.CanRead && property.CanWrite) &&
                (property.PropertyType.IsValueType || property.PropertyType == typeof(string)))
            {
                property.SetValue(newObj, property.GetValue(obj, null), null);
            }
        }
        return newObj;
    }
}
This takes care of two problems at once: (1) It ensures that only the EF object I'm specifically interested in gets cached, and not the entire object graph - sometimes huge - to which it's attached; and (2) The object that it caches is of a common type, and not the dynamically generated type: Country and not Country_4C17F5A60A033813EC420C752F1026C02FA5FC07D491A3190ED09E0B7509DD85.
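Usage before the cache put then looks something like this (the DataCache API here is illustrative of the colocated cache client; country is a hypothetical entity fetched via EF):
using Microsoft.ApplicationServer.Caching;

var cache = new DataCacheFactory().GetDefaultCache();
// Store a plain shallow copy rather than the EF dynamic-proxy instance.
cache.Put("country:" + country.Id, country.ShallowClone());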
It's certainly not perfect, but it does seem a reasonable workaround for many scenarios.
It would in fact be nice, though, if the good folks at MS were to come up with a way to cache EF objects without this.
I'm not familiar with azure-caching in particular, but I'm guessing you need to hydrate your entities completely before passing them to anything that does serialization, which is something a distributed or out-of-process cache would do.
So, just do .Include() on all relationships when you're fetching an entity, or disable lazy loading, and you should be fine.
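In code, either approach looks roughly like this (sketched against EF5's DbContext API; Countries, Regions, and id are hypothetical names):
using System.Data.Entity; // for the lambda overload of Include
using System.Linq;

// Option 1: eagerly load the relationships you need before caching.
var country = db.Countries
    .Include(c => c.Regions)
    .First(c => c.Id == id);

// Option 2: disable proxies and lazy loading so EF materializes
// plain entity instances that serialize cleanly.
db.Configuration.ProxyCreationEnabled = false;
db.Configuration.LazyLoadingEnabled = false;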

Nuget autoupdate in website

For a customer we are trying to build a web application for which they can build patches (new versions), and their customers can then update themselves with a single click in the app.
I have made some minor experiments with NuGet before, using this as a reference:
http://haacked.com/archive/2011/01/15/building-a-self-updating-site-using-nuget.aspx
Unfortunately, some of the NuGet packages installed and used in this project were too new and not compatible with the AutoUpdate 0.2.1 package, which uses NuGet.Core 1.3.20419.9005.
So I took the AutoUpdate code, upgraded NuGet to 2.5, and fixed all the issues introduced by the new NuGet core (changes in functions/parameters, etc.).
Now it works to the point where I can see which package is installed, and I can see that there is a new version on the remote server. However, when I try to upgrade the local package to the version on the server I get an error:
System.EntryPointNotFoundException: Entry point was not found.
This is where the code goes wrong:
public IEnumerable<string> UpdatePackage(IPackage package)
{
    return this.PerformLoggedAction(delegate
    {
        bool updateDependencies = true;
        bool allowPrereleaseVersions = true;
        this._projectManager.UpdatePackageReference(package.Id, package.Version, updateDependencies, allowPrereleaseVersions);
    });
}
[EntryPointNotFoundException: Entry point was not found.]
NuGet.IProjectSystem.get_ProjectName() +0
NuGet.ProjectManager.UpdatePackageReference(String packageId, Func`1 resolvePackage, Boolean updateDependencies, Boolean allowPrereleaseVersions, Boolean targetVersionSetExplicitly) +1014
NuGet.ProjectManager.UpdatePackageReference(String packageId, SemanticVersion version, Boolean updateDependencies, Boolean allowPrereleaseVersions) +233
The package parameter is the package I want to upgrade to.
In my web app the folder \App_Data\packages holds the installed .nupkg file. My remote folder holds all installed packages plus my new version of the package.
I don't understand what the entry point is or how to solve this issue.
After a lot of time googling, inspecting code, and re-engineering, it looks like the problem is in Microsoft.AspNet.WebPages.Administration, which is only compatible with NuGet.Core (≥ 1.6.2 && < 1.7). I re-engineered the WebProjectSystem class and now it mostly works:
string webRepositoryDirectory = WebProjectManager.GetWebRepositoryDirectory(siteRoot);
IPackageRepository sourceRepository = PackageRepositoryFactory.Default.CreateRepository(remoteSource);
IPackagePathResolver pathResolver = new DefaultPackagePathResolver(webRepositoryDirectory);
IPackageRepository localRepository = PackageRepositoryFactory.Default.CreateRepository(webRepositoryDirectory);
IProjectSystem project = new WebProjectSystem(siteRoot);
this._projectManager = new ProjectManager(sourceRepository, pathResolver, project, localRepository);
The UpdatePackage method also doesn't seem to do the job; it only updates references or something similar. When upgrading to a new version of my package (e.g. deploying new images or HTML files) it seems I need to use the following method instead:
public IEnumerable<string> InstallPackage(IPackage package)
{
    return this.PerformLoggedAction(delegate
    {
        bool ignoreDependencies = false;
        bool allowPrereleaseVersions = true;
        this._projectManager.AddPackageReference(package.Id, package.Version, ignoreDependencies, allowPrereleaseVersions);
    });
}
This goes through all the files in the package and seems to replace them.
However, I have some weird issues where some files end up 0 bytes after the upgrade (only binary .dll files, from what I have seen so far).
This needs some more research.
