Azure Table Storage - GUID coming back empty

I am trying to retrieve some entities from a table. I am successfully able to get back strings but when I try to get GUID, it comes back empty (all zeros).
[DataContract]
public class myEntity : TableEntity
{
    [DataMember(Name = "ID")]
    public Guid Id { get; set; }

    [DataMember(Name = "Name")]
    public string Name { get; set; }

    [DataMember(Name = "City")]
    public string City { get; set; }
}
...
var storageAccount = CloudStorageAccount.Parse(conStr);
var tableClient = storageAccount.CreateCloudTableClient();
var table = tableClient.GetTableReference(tblName);
TableQuery<myEntity> query = new TableQuery<myEntity>().Where(string.Empty);
var entities = table.ExecuteQuery(query); // execute the query
How do I get the correct GUID values out of table storage? Is it related to TableEntity.ReadEntity?

I don't think any extra operation is required to read GUID properties. Have you cross-checked with other tools whether values are indeed stored in the GUID properties of that table?

@RotemVaron, you can use the TableEntity.ReadEntity(IDictionary<String, EntityProperty>, OperationContext) method to deserialize the entity using the specified IDictionary<TKey, TValue> that maps property names to typed EntityProperty values, and then read the Guid value from the EntityProperty object.
There is a blog post showing a sample, Reading existing entities, which traverses an entity's properties, including GuidValue.
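A minimal sketch of that approach, assuming the classic Microsoft.WindowsAzure.Storage.Table SDK: override ReadEntity and pull the value out of EntityProperty.GuidValue yourself. It assumes the stored column is named "ID" (as in the DataMember attribute above), which would explain the empty GUID, since the default mapping matches on the C# property name "Id", not on DataMember names.
// Requires Microsoft.WindowsAzure.Storage.Table and System.Collections.Generic.
public class myEntity : TableEntity
{
    public Guid Id { get; set; }
    public string Name { get; set; }
    public string City { get; set; }

    public override void ReadEntity(IDictionary<string, EntityProperty> properties, OperationContext operationContext)
    {
        // Let the base class map everything it can by property name first.
        base.ReadEntity(properties, operationContext);

        // Assumption: the column is stored as "ID", which does not match the
        // property name "Id", so map it by hand from the typed EntityProperty.
        if (properties.TryGetValue("ID", out EntityProperty idProperty) && idProperty.GuidValue.HasValue)
        {
            Id = idProperty.GuidValue.Value;
        }
    }
}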

Related

Azure TableStorage entity with a Complex property

I'm trying to get a HashSet<string> into Azure Table Storage. I want to model a project which has members:
public class DbProject : ITableEntity
{
    public string Name { get; set; } = default!;
    public string Owner { get; set; } = default!;
    public HashSet<string> Members { get; set; } = default!;
    public DateTime CreatedOn { get; set; } = default!;
    public string PartitionKey { get; set; } = default!;
    public string RowKey { get; set; } = default!;
    public DateTimeOffset? Timestamp { get; set; } = default!;
    public ETag ETag { get; set; } = default!;
}
The 'insert logic' looks something like:
var dbProject = new DbProject
{
    Name = projectName,
    Owner = owner,
    CreatedOn = DateTime.SpecifyKind(DateTime.Now, DateTimeKind.Utc),
    Members = new HashSet<string> { owner },
    PartitionKey = $"{owner}__{projectName}",
    RowKey = $"{owner}__{projectName}"
};
try
{
    this.tableClient.AddEntity<DbProject>(dbProject);
}
catch (Exception ex)
{
    Console.WriteLine(ex.Message);
}
The error I'm getting is:
ErrorCode: InvalidInput
Content:
{"odata.error":{"code":"InvalidInput","message":{"lang":"en-US","value":"An error occurred while processing this request.\nRequestId:6019edb7-c002-001c-1be9-bd1379000000\nTime:2021-10-10T15:15:24.5069601Z"}}}
If I remove the HashSet it works like a charm, so I guess there is something wrong with using complex types when creating a record in Azure Table Storage.
I would like to run simple queries like:
public bool IsMember(string owner, string projectName)
{
    var query = TableClient.CreateQueryFilter<DbProject>(u => u.Name == projectName && u.Members.Contains(owner));
    return tableClient.Query<DbProject>(query).Any();
}
Since you are asking for alternative solutions in your comment: The most obvious choice would be to use CosmosDB instead of Table Storage. Table storage is extremely limited in many scenarios. If you don't need to use table storage for a specific reason, using CosmosDB with SQL API is the recommended way for new projects.
As mentioned in Jeremy's comment, you can use HashSet just fine there: How to search on Cosmos DB in a complex JSON Object
If you want to keep Table Storage and don't need to do queries on your HashSet: You can serialize your HashSet to a string like Gaurav mentioned. Maybe something like this:
[IgnoreProperty]
public HashSet<string> Members { get; set; }

public override void ReadEntity(IDictionary<string, EntityProperty> properties, OperationContext operationContext)
{
    // Let the base class populate the simple properties first.
    base.ReadEntity(properties, operationContext);
    Members = JsonConvert.DeserializeObject<HashSet<string>>(properties[nameof(Members)].StringValue);
}

public override IDictionary<string, EntityProperty> WriteEntity(OperationContext operationContext)
{
    // Start from the default serialization so the simple properties are preserved,
    // then append the HashSet serialized as a string column.
    var properties = base.WriteEntity(operationContext);
    properties.Add(nameof(Members), new EntityProperty(JsonConvert.SerializeObject(Members)));
    return properties;
}
You could also look at this: Adding Complex Properties of a TableEntity to Azure Table Storage
The reason you're running into this issue is because Azure Table Storage does not support complex data types (HashSet is one of them).
For a list of supported data types, please see this link: https://learn.microsoft.com/en-us/rest/api/storageservices/understanding-the-table-service-data-model#property-types.
What you will have to do is serialize the complex data type into one of the supported data types (string is the best fit) and save that. When you read the data back, you will have to deserialize it. Please note that in doing so you lose the ability to query on this particular attribute.
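For the newer Azure.Data.Tables SDK used in the question (which has no ReadEntity/WriteEntity overrides), a minimal sketch of the same idea: keep a JSON shadow property and hide the HashSet from serialization. This relies on the SDK skipping properties marked [IgnoreDataMember], and the MembersJson property name is made up here for illustration.
// Requires Azure (ETag), Azure.Data.Tables, System.Runtime.Serialization, System.Text.Json.
public class DbProject : ITableEntity
{
    public string Name { get; set; } = default!;
    public string Owner { get; set; } = default!;

    // Persisted column holding the serialized HashSet.
    public string MembersJson { get; set; } = "[]";

    // Not persisted: [IgnoreDataMember] keeps this out of the entity's columns.
    [IgnoreDataMember]
    public HashSet<string> Members
    {
        get => JsonSerializer.Deserialize<HashSet<string>>(MembersJson) ?? new HashSet<string>();
        set => MembersJson = JsonSerializer.Serialize(value);
    }

    public string PartitionKey { get; set; } = default!;
    public string RowKey { get; set; } = default!;
    public DateTimeOffset? Timestamp { get; set; }
    public ETag ETag { get; set; }
}
As noted above, a serialized column can't be used in a server-side query filter, so a membership check like IsMember would have to fetch candidates and test Members client-side.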

CloudFx equivalent of SkinnyEntity

We're switching over from Lokad to CloudFx to handle putting things in/out of table storage.
With Lokad, we had 3 entities:
1) Object 1, which added the partition key and row key to
2) Object 2, which had a bunch of properties as well as an instance of
3) Object 3, which had its own set of properties
As far as I can tell, the only way for CloudFx to put that info into table storage is to flatten the whole thing into one massive object that has all the properties of the previous three objects. With Lokad, we could just use Skinny=true.
Thoughts?
This is the method we ended up implementing. Short version: serialize Object 2 and stuff it into the Value property of Object 1, then put that in table storage.
Object 1:
public class CloudEntity<T>
{
    public CloudEntity()
    {
        Timestamp = DateTime.UtcNow;
    }

    public string RowKey { get; set; }
    public string PartitionKey { get; set; }
    public DateTime Timestamp { get; set; }
    public T Value { get; set; }
}
Object 2:
public class Store
{
    public string StoreId { get; set; }
    public string StoreName { get; set; }
    public string StoreType { get; set; }
    public Address MailingAddress { get; set; }
    public PhoneNumber TelephoneNumber { get; set; }
    public StoreHours StoreHours { get; set; }
}
Object 3 can be whatever (the Address in this case, perhaps); it all gets serialized.
So in code you can get the table storage client as follows (there is more than one way to do this):
var tableStorage = new ReliableCloudTableStorage(connectionString); // the connection string you're using
Then let's say you have an instance of Store you want to put in table storage:
var myStore = new Store
{
    StoreId = "9832",
    StoreName = "Headquarters"
    // ...
};
You can do so in the following way:
var cloudEntity = new CloudEntity<string>
{
    PartitionKey = partitionKey, // whatever you want your partition key to be
    RowKey = rowKey,             // whatever you want your row key to be
    Value = JsonConvert.SerializeObject(myStore) // THIS IS THE MAGIC
};
tableStorage.Add<CloudEntity<string>>(tableName, cloudEntity); // name of the table in table storage
The entity put in table storage will have all the properties of the CloudEntity class (row key, partition key, etc.), and the "Value" column will contain the JSON of the object you wanted to store. It's easily readable through Azure Storage Explorer, which is nice too.
To get the object back out, use something like this:
var cloudEntity = tableStorage.Get<CloudEntity<string>>(tableName, partitionKey: partitionKey); // your table name and partition key
Then you can deserialize the "Value" field of those into the object you're expecting:
var myStore = JsonConvert.DeserializeObject<Store>(cloudEntity.Value);
If you're getting a bunch back, just make myStore a list and loop through the returned cloud entities, deserializing each Value and adding it to your list.
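A sketch of that loop, assuming the Get overload above returns a sequence of entities for the partition:
var stores = new List<Store>();
foreach (var entity in tableStorage.Get<CloudEntity<string>>(tableName, partitionKey: partitionKey))
{
    // Each entity's Value column holds the JSON produced at insert time.
    stores.Add(JsonConvert.DeserializeObject<Store>(entity.Value));
}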
Hope this helps!

Option for Include to only return foreign keys

Does Entity Framework provide an option to retrieve child objects that are only populated with fields that are foreign keys to the parent object?
Sample code might illustrate this better.
Assuming you have the following POCO classes...
public abstract class Base
{
    public Guid Id { get; set; }
}

public class User : Base
{
    public string FirstName { get; set; }
    public string LastName { get; set; }
}

public class Photo : Base
{
    public string Description { get; set; }
    public User UploadedBy { get; set; }
}
... and assuming you've configured a DbContext correctly, how do you query for a list of all Photos including the UploadedBy object, but where that UploadedBy object only contains the Id property?
I know I can do this...
return await _dbContext.Photos.Include(p => p.UploadedBy).ToListAsync();
... but that returns the entire User object.
I'd like to do something like this...
return await _dbContext.Photos.Include(p => p.UploadedBy.Id).ToListAsync();
... to indicate that I only want the Id property back.
If we could chain those includes we would be able to pick each property on the child object that we want returned.
Or even better, I'd love to be able to configure a setting at a more global level that would make it so that anytime I ask for Photos, give me all members of photos, even child objects, but only populate their foreign keys and nothing more.
The last request is less important though because I could just create the following extension method for each POCO object...
public static IQueryable<Photo> IncludeForeignKeys(this PhotoAlbumDbContext context)
{
    return context.Photos
        .Include(photo => photo.UploadedBy.Id);
}
As far as I understand, there is no way to partially load a navigation property.
However, for foreign keys the standard way of accessing them without loading the navigation property is to include the actual key in your model, e.g.:
public class Photo : Base
{
    public string Description { get; set; }
    public Guid UploadedById { get; set; } // Guid to match Base.Id
    public User UploadedBy { get; set; }
}
This Id will be populated even if you don't load the whole navigation property.
In the case where you load both, you can update either the local foreign key value or the remote end of the navigation property, and that update will be persisted to the database on save. In my experience EF is very clever about this. The only scenario where it becomes a little trickier is in unit tests, where EF is not maintaining this state.
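If you only need the keys and not a materialized child entity at all, a projection is another option. A minimal sketch, assuming EF Core's async extensions and the explicit UploadedById property above:
var photos = await _dbContext.Photos
    .Select(p => new
    {
        p.Id,
        p.Description,
        p.UploadedById // only the foreign key; no join to the Users table needed
    })
    .ToListAsync();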

OrmLite: executing stored procedure and mapping the result onto model does not work when attributes are used

What are the actual requirements for OrmLite to project the result of a stored procedure call onto a model? I have a class with some attributes, and OrmLite will not map the output of the SP correctly. If I remove the attributes, it maps correctly. For example:
public class Test
{
    [Alias("InsuredId")]
    public string Id { get; set; }

    public string LastName { get; set; }
    public string FirstName { get; set; }
    public string MiddleInitial { get; set; }
}
The SP returns these columns: InsuredId, LastName, FirstName, MiddleInitial, and some more.
With the Alias attribute present, all properties are populated with null. If I remove the attribute, all are fine except Id. The following is the actual code:
var test = db.SqlList<Test>(
    "EXEC up_InsuredSearchTest @ItemId, @FirstName, @LastName, @DateOfBirth, @Max_Search_Records",
    new
    {
        ItemId = memberId,
        FirstName = firstName,
        LastName = lastName,
        DateOfBirth = dateOfBirth.HasValue ? dateOfBirth.Value.ToShortDateString() : "",
        Max_Search_Records = MAX_SEARCH_RECORDS
    });
Not really an issue with ServiceStack in the end. The value returned for the column was an int but the model mapped it as a string, so the mapping was failing silently (the error was logged, but it was not very informative), and I had to debug through the OrmLite source to figure out what the problem was.
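Based on that diagnosis, the fix is simply to make the property type match the column type. A sketch, assuming InsuredId really does come back as an int:
public class Test
{
    [Alias("InsuredId")]
    public int Id { get; set; } // was string; the int column then failed to map silently

    public string LastName { get; set; }
    public string FirstName { get; set; }
    public string MiddleInitial { get; set; }
}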

Azure Table Insert issue

I am facing an issue while inserting an entity, and I am not sure what is wrong.
On insert, I get a StorageClientException stating "value out of range".
My Table Service Entity looks like
public class Itinerary : TableServiceEntity
{
    public string Name { get; set; }
    public DateTime DOB { get; set; }
    public int Sex { get; set; }
    public string ToPNR { get; set; }
    public string ReturnPNR { get; set; }
    public string ContactNumber { get; set; }
    public DateTime TravelDate { get; set; }
    public DateTime ReturnDate { get; set; }
}
The entity gets inserted when complete itinerary details are provided, but for an itinerary with only one-way details, the insert fails with the given exception.
Any help would be appreciated.
The problem, I'd guess, lies with your DateTime fields. If you haven't initialized them before storing your data in Table Storage, they will hold .NET's DateTime.MinValue, which is out of the range supported by Azure Table Storage. Hence it is always advisable to assign some value to every DateTime field you store. If you want a default, use the CloudTableClient.MinSupportedDateTime property, which gives the minimum value supported by Azure Storage:
http://msdn.microsoft.com/en-us/library/microsoft.windowsazure.storageclient.cloudtableclient.minsupporteddatetime.aspx
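A minimal sketch of that default, assuming the classic StorageClient SDK from the question: initialize the optional dates in the constructor so a one-way itinerary never carries DateTime.MinValue.
public class Itinerary : TableServiceEntity
{
    public Itinerary()
    {
        // Azure Table Storage rejects .NET's DateTime.MinValue (year 1);
        // MinSupportedDateTime is the earliest value the service accepts.
        DOB = CloudTableClient.MinSupportedDateTime;
        TravelDate = CloudTableClient.MinSupportedDateTime;
        ReturnDate = CloudTableClient.MinSupportedDateTime;
    }

    // ... remaining properties as above ...
}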
Hope this helps
