Default values for properties in Azure Table Storage - azure

I am using Azure table storage and have questions about how nulls and default values for entities work.
Specifically, I have a class that extends TableServiceEntity. The default constructor for this class is setting default values for various properties like so:
public class MyEntity : TableServiceEntity
{
    public MyEntity() : this("invalid", "invalid") {}

    public MyEntity(string f1, string f2)
    {
        Field1 = f1;
        Field2 = f2;
    }

    public string Field1 { get; set; }
    public string Field2 { get; set; }
}
I tested this class locally (on the emulator) by constructing the following entity:
MyEntity e = new MyEntity("hello", null);
I uploaded the entity and then retrieved it locally and the two fields were set to "hello" and null, respectively, as expected.
However, when I uploaded the same entity to the Azure cloud, what I received back was "hello" and "invalid", respectively, for the two properties.
My code that saves the entity is below:
public class MyTable : TableServiceContext
{
    ...

    public void AddEntry(MyEntity e)
    {
        this.AddObject("MyTable", e);
        this.SaveChangesWithRetries(SaveChangesOptions.ReplaceOnUpdate);
    }
}
I was able to fix this by making the default constructor take no arguments, but now I feel like I have a fundamental misunderstanding of how table storage works. Is it true that when you specify defaults for properties of a TableServiceEntity, those become the defaults for each row in the table in the cloud but not in the emulator (i.e. cloud vs. SQL Express)? If so, why can't I override those defaults with null in the cloud? Is there any documentation that explains how default constructors and nulls work in Azure table storage?

Yes, there is a difference between how table storage behaves in the emulator and in the cloud. The emulator, which is implemented on SQL Server, returns all columns defined for a table, even if a column is not set for a given row, irrespective of the column's value (null / non-null). In the cloud, a property set to null is neither stored nor returned in the REST call.
A quick fix would be to check for null in the property setter, and only mutate the property if the value passed in is not null.
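For example, a minimal sketch of that idea (the backing-field name is illustrative):

private string _field2 = "invalid";

public string Field2
{
    get { return _field2; }
    // ignore null assignments so the constructor default survives
    set { if (value != null) _field2 = value; }
}

With this guard an explicit null assignment is simply ignored, so the emulator (which returns every column, including nulls) and the cloud should end up producing the same state.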

Devstorage and real storage behave differently in some cases, but I've never seen them handle NULL values differently. And I've certainly never seen it change a value from NULL to "invalid", as you seem to be implying. Are you sure you didn't accidentally upload the wrong values to the cloud? You may want to try again, and use Fiddler to look at the actual request and response values.

Related

Override JsonDeserializer Behavior

The generator has created a field of type OffsetDateTime:
@Nullable
@ElementName("DocDate")
private OffsetDateTime docDate;
But the server actually returns dates in the format yyyy-MM-dd, e.g. 2021-03-07.
When using the generated code I get the following warnings:
WARN - Not deserializable: 2021-03-07
What is the correct way to override the deserialization of these fields, or otherwise get them to deserialize correctly?
An OffsetDateTime should have both a date and a time. The data your service responds with is lacking the time part. As per the OData V4 ABNF, this is not allowed (assuming your service is a V4 service):
dateTimeOffsetValue = year "-" month "-" day "T" hour ":" minute [ ":" second [ "." fractionalSeconds ] ] ( "Z" / SIGN hour ":" minute )
One way to solve this is to change the property type. You could either:
Change it to Edm.Date in the specification
Or change it to LocalDate in the generated code.
Of course this only makes sense if the service will always respond with a date.
Edit: If you really need to register a custom type adapter (e.g. because the service violates the JSON format) you could override the GsonVdmAdapterFactory:
public class CustomVdmAdapterFactory extends GsonVdmAdapterFactory
{
    @Override
    public <T> TypeAdapter<T> create( @Nonnull final Gson gson, @Nonnull final TypeToken<T> type )
    {
        // handle LocalDate with the custom adapter, delegate everything else
        if( LocalDate.class.isAssignableFrom(type.getRawType()) ) {
            return (TypeAdapter<T>) new CustomLocalDateTypeAdapter();
        } else {
            return super.create(gson, type);
        }
    }
}
However, this also requires changing the generated code, because there is currently no convenient way to pass a custom type adapter as a parameter. Change @JsonAdapter(com.sap.cloud.sdk.datamodel.odatav4.adapter.GsonVdmAdapterFactory.class) to reference your custom factory instead.
Still, I'd recommend using one of the above workarounds until the service is fixed.

The performance issue of validating entity using value object

I have the following value object code which validates CustCode by some expensive database operations.
public class CustCode : ValueObject<CustCode>
{
    private CustCode(string code) { Value = code; }

    public static Result<CustCode> Create(string code)
    {
        if (string.IsNullOrWhiteSpace(code))
            return Result.Failure<CustCode>("Code should not be empty");

        // validate if the value is still valid against the database. Expensive and slow
        if (!ValidateDB(code)) // Or web api calls
            return Result.Failure<CustCode>("Database validation failed.");

        return Result.Success<CustCode>(new CustCode(code));
    }

    public string Value { get; }

    // other methods omitted ...
}
public class MyEntity
{
    CustCode CustCode { get; }
    ....
It works fine when there are only one or a few entity instances of the type. However, it becomes very slow for methods like GetAll(), which return a lot of entities of the type.
public async IAsyncEnumerable<MyEntity> GetAll()
{
    string line;
    using var sr = File.OpenText(_config.FileName);
    while ((line = await sr.ReadLineAsync()) != null)
    {
        yield return new MyEntity(CustCode.Create(line).Value); // CustCode.Create called many times
    }
}
Since the data in the file was already validated before saving, it isn't actually necessary to validate it again. Should another Create function be added that doesn't validate the value? What's the idiomatic DDD way to do this?
I generally attempt not to have the domain call out to retrieve any additional data. Everything the domain needs to do its job should be passed in.
Since value objects represent immutable state, it stands to reason that once one has been created its values are fine. To this end, the initial database validation can be performed in the integration/application "layer", and the CustCode then created using only the value(s) provided.
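A minimal sketch of that suggestion: keep the validating Create for fresh input, and add a trusted path inside CustCode for rehydrating data that was already validated before it was saved (the factory name is illustrative):

// Rehydration path: structural check only, no expensive database call.
public static CustCode FromTrustedSource(string code)
{
    if (string.IsNullOrWhiteSpace(code))
        throw new ArgumentException("Code should not be empty", nameof(code));
    return new CustCode(code);
}

GetAll() would then call CustCode.FromTrustedSource(line) instead of Create, keeping the expensive ValidateDB call out of the bulk read path.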
Just wanted to add an additional point to @Eben Roux's answer:
In many cases the validation result from a database query depends on when you run the query.
For example, when you want to check whether a phone number exists or whether some product is in stock: the answers to those queries can change at any second, and thus are not suited to allowing or preventing the creation of a value object.
You may create a "valid" object that (unknowingly) becomes invalid the very next second (or the other way around). So why bother running an expensive validation if the result is not reliable?

How to hardcode the entity varchar value?

The requirement is to store a hardcoded value for a varchar column defined in an entity file (.eti). I tried adding it to the default option but it is not reflected.
The default option works well with boolean values (true/false), typelists (you can choose a default typecode), and monetary amounts too, but it looks like it is not allowed to specify a default varchar.
Therefore the easiest way would be to create a preupdate rule which inserts that default value every time a new record is created in the database.
Preupdate rule example:
@gw.rules.RuleName("YourEntityAssignDefaultValue")
internal class YourEntityAssignDefaultValueRule {
    static function doCondition(yourEntity : entity.YourEntity) : boolean {
        return yourEntity.New
    }

    static function doAction(yourEntity : entity.YourEntity, actions : gw.rules.Action) {
        yourEntity.yourColumn = "defaultValue"
    }
}
Alternatively, you can achieve this through getter and setter properties in an appropriate enhancement class:

public property get PolicyNumber() : String {
    return this.PolicyNumber
}

and somewhere in a class you must assign the value to the PolicyNumber field; then it will be reflected.

ServiceStack AutoQuery into custom DTO

So, I'm working with ServiceStack and know my way around it a bit. I've used AutoQuery and find it indispensable when calling for straight 'GET' messages. I'm having an issue though, and I have been looking at this for a couple of hours. I hope it's just something I'm overlooking.
I have a simple class set up for my AutoQuery message:
public class QueryCamera : QueryDb<db_camera>
{
}
I have an OrmLite connection that is used to retrieve db_camera entries from the database. This all works just fine. I don't want to return a model from the database as a result, though; I'd like to return a DTO, which I have defined as another class. So, using the two-type version of QueryDb, my request message is now this:
public class QueryCamera : QueryDb<db_camera, Camera>
{
}
Where the Camera class is my DTO. The call still executes, but I get no results. I have a mapper extension method ToDto() set up on the db_camera class to return a Camera instance.
Maybe I'm just used to ServiceStack making things so easy... but how do I get the AutoQuery request above to perform the mapping for my request? Is the data retrieval now a manual operation for me since I'm specifying the conversion I want? Where's the value in this type being offered then? Is it now my responsibility to query the database, then call .ToDto() on my data model records to return DTO objects?
EDIT: something else I just observed... I'm still getting the row count from the returned dataset in AutoQueryViewer, but the field names are of the data model class db_camera and not Camera.
The QueryDb<From, Into> isn't able to use your custom DTO extension method; it's used to select a curated set of columns from the executed AutoQuery, which can also be used to reference columns on joined tables.
If you want to have different names on the DTO than on your Data Model, you can use an [Alias] attribute to map back to your DB column name which will let you name your DTO Property anything you like. On the other side you can change what property the DTO property is serialized as, e.g:
[DataContract]
public class Camera
{
    [DataMember(Name = "Id")]               // serialized as `Id`
    public int camera_id { get; set; }      // populated with db_camera.camera_id

    [DataMember]
    [Alias("model")]                        // populated with db_camera.model
    public string CameraModel { get; set; } // serialized as `CameraModel`
}

WCF Data Service with EF complex type

I'm just playing with EF5 and Data Services. I decided to test exposing a stored procedure and mapped it to the FirmInfo complex type. I'm running into this error and cannot seem to figure it out.
The .tt template created this complex type for me:
public partial class FirmInfo
{
    public int FirmID { get; set; }
    public string Name { get; set; }
}
I added this to the MyDataService.svc.cs class to expose it:
[WebGet]
public IQueryable<FirmInfo> pSPTest(int id)
{
    return CurrentDataSource.pSPTest(id).AsQueryable();
}
I can see it in the browser as such:
<pSPTest xmlns="http://schemas.microsoft.com/ado/2007/08/dataservices" xmlns:m="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata">
  <element m:type="DB.FirmInfo">
    <FirmID m:type="Edm.Int32">1</FirmID>
    <Name>Firm Name</Name>
  </element>
</pSPTest>
but when consuming it from a C# client app I keep getting this error:
The property 'element' does not exist on type 'Client.ServiceReference.FirmInfo'. Make sure to only use property names that are defined by the type.
Any help appreciated.
How are you consuming the result with the C# client app? If you're using the WCF Data Services client, you should be calling Execute<T>() on the DataServiceContext.
For guidance on how to use the WCF Data Services client to call service operations, check out this documentation: http://msdn.microsoft.com/en-us/library/hh230677.aspx
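For example, a minimal sketch, assuming the generated context class is named MyDataServiceContext (the actual name depends on your service reference):

// Hypothetical generated context; base URI points at the service root
var context = new MyDataServiceContext(new Uri("http://localhost/MyDataService.svc/"));

// Execute<T> issues the GET and materializes each <element> into a FirmInfo
var firms = context.Execute<FirmInfo>(new Uri("pSPTest?id=1", UriKind.Relative)).ToList();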
You could also achieve this by specifying the result of the operation as the collection type that you expect, like this:
var query = context.CreateQuery<ObservableCollection<wsAccountView.organisation>>("GetOrganisationsByUserName")
                   .AddQueryOption("UserName", @"'SFN\AO'");
var Organisations = query.ToList();
