Azure Easy Tables - Load only one column

Is there a way to retrieve only one column for a single row from Azure Easy Tables?
For example, a Xamarin.Forms app would send the name of an item to Azure and get back only that item's creation DateTime.

Here's an example that selects just the Name column from a Dog table.
This sample uses the Azure Mobile Client and Azure Mobile Client SQLiteStore NuGet packages.
Model
using System;
using Microsoft.WindowsAzure.MobileServices;
using Newtonsoft.Json;

namespace SampleApp
{
    public class Dog
    {
        public string Name { get; set; }
        public string Breed { get; set; }
        public int Age { get; set; }

        [JsonProperty(PropertyName = "id")]
        public string Id { get; set; }

        [CreatedAt]
        public DateTimeOffset CreatedAt { get; set; }

        [UpdatedAt]
        public DateTimeOffset UpdatedAt { get; set; }

        [Version]
        public string AzureVersion { get; set; }

        [Deleted]
        public bool IsDeleted { get; set; }
    }
}
Logic
using System;
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using System.Collections.Generic;
using Microsoft.WindowsAzure.MobileServices;
using Microsoft.WindowsAzure.MobileServices.Sync;
using Microsoft.WindowsAzure.MobileServices.SQLiteStore;

namespace SampleApp
{
    public class MobileClientService
    {
        bool isMobileClientInitialized;
        MobileServiceClient mobileClient;

        public async Task<string> GetDogName(string id)
        {
            await InitializeMobileClient();

            var dog = await mobileClient.GetSyncTable<Dog>().LookupAsync(id);
            return dog.Name;
        }

        public async Task<IEnumerable<string>> GetDogNames()
        {
            await InitializeMobileClient();

            // Select projects the query down to the Name column only
            var dogNameList = await mobileClient.GetSyncTable<Dog>().Select(x => x.Name).ToEnumerableAsync();
            return dogNameList;
        }

        async Task InitializeMobileClient()
        {
            if (isMobileClientInitialized)
                return;

            mobileClient = new MobileServiceClient("Your Azure Mobile Client Url");

            var path = Path.Combine(MobileServiceClient.DefaultDatabasePath, "app.db");
            var store = new MobileServiceSQLiteStore(path);
            store.DefineTable<Dog>();
            //ToDo Define all remaining tables

            // SyncContext is an instance member of MobileServiceClient
            await mobileClient.SyncContext.InitializeAsync(store, new MobileServiceSyncHandler());

            isMobileClientInitialized = true;
        }
    }
}
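For reference, here's a quick usage sketch; the calling code is hypothetical and not part of the original sample:
// Hypothetical caller, e.g. invoked from a Xamarin.Forms page or view model
var service = new MobileClientService();

// Returns only the projected Name column for every Dog row
IEnumerable<string> dogNames = await service.GetDogNames();

// Or look up a single row and read one property from it
string singleName = await service.GetDogName("some-dog-id");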

Related

CRUD operations for Table Storage from Function App

I have a small problem. I am trying to migrate from Cosmos DB to Table Storage in Azure, but I haven't dealt with Table Storage for a long time. It turns out that many of the packages I used before are now out of date, and the official documentation does not provide the information I need.
Currently I use this method to add an item:
public static async Task AddDomain(CloudTable table, DomainEntity domain)
{
    TableOperation insertOperation = TableOperation.Insert((Microsoft.WindowsAzure.Storage.Table.ITableEntity)domain);
    await table.ExecuteAsync(insertOperation);
}
and this function app to add a domain:
[FunctionName("AddDomain")]
public static async Task<IActionResult> Run(
    [HttpTrigger(AuthorizationLevel.Function, "post", Route = null)] HttpRequest req,
    [Table("Domains")] CloudTable domainsTable,
    ILogger log)
{
    DomainEntity domain = CreateDomain();
    await AddDomain(domainsTable, domain);
    return new OkResult();
}
But when I run the function, it throws an exception:
Error indexing method 'AddDomain'
and
System.InvalidOperationException at Microsoft.Azure.WebJobs.Extensions.Tables.TableAttributeBindingProvider.TryCreate.
I have the same problem with other CRUD operations too. Any idea what is going on?
Here is how I did it last month.
My service class
using Azure;
using Azure.Data.Tables;
using VPMS.FuncApp.EFCore.AuditSetup;

namespace VPMS.Persistence.AuditSetup;

internal class TableStorageService : ITableStorageService
{
    private readonly TableServiceClient _tableServiceClient;

    public TableStorageService(TableServiceClient tableServiceClient)
    {
        _tableServiceClient = tableServiceClient;
    }

    public async Task BulkInsert(string tableName, string partitionKey, List<AuditTableStorageTable> audit)
    {
        var client = _tableServiceClient.GetTableClient(tableName);

        List<TableTransactionAction> addEntitiesBatch = new List<TableTransactionAction>();
        addEntitiesBatch.AddRange(audit.Select(e => new TableTransactionAction(TableTransactionActionType.Add, e)));

        await client.SubmitTransactionAsync(addEntitiesBatch).ConfigureAwait(false);
    }

    public async Task CreateTableIfNotExistsAsync(string tableName)
    {
        var client = _tableServiceClient.GetTableClient(tableName);
        await client.CreateIfNotExistsAsync();
    }
}
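One caveat worth noting: an Azure Table batch transaction accepts at most 100 operations, and every entity in the batch must share the same partition key. The grouping and page size used in the function below keep each call within those limits, but a more defensive BulkInsert could chunk the list explicitly. A minimal sketch under those assumptions (not part of the original answer):
public async Task BulkInsert(string tableName, string partitionKey, List<AuditTableStorageTable> audit)
{
    var client = _tableServiceClient.GetTableClient(tableName);

    // Azure Table transactions allow at most 100 operations per batch,
    // and every entity in a batch must share one partition key.
    const int maxBatchSize = 100;

    foreach (var partitionGroup in audit.GroupBy(a => a.PartitionKey))
    {
        var entities = partitionGroup.ToList();

        for (int i = 0; i < entities.Count; i += maxBatchSize)
        {
            var batch = entities
                .Skip(i)
                .Take(maxBatchSize)
                .Select(e => new TableTransactionAction(TableTransactionActionType.Add, e))
                .ToList();

            await client.SubmitTransactionAsync(batch).ConfigureAwait(false);
        }
    }
}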
My Function App
using Microsoft.Azure.WebJobs;
using Microsoft.EntityFrameworkCore;
using VPMS.FuncApp.EFCore;
using VPMS.FuncApp.EFCore.AuditSetup;
using VPMS.Persistence.AuditSetup;

namespace VPMS.FuncApp.ArchiveAppAuditLogs;

public sealed class ArchiveAppAuditLogTimerTrigger
{
    private readonly VPMSDbContext _dbContext;
    private readonly ITableStorageService _tableStorageService;

    public ArchiveAppAuditLogTimerTrigger(VPMSDbContext dbContext, ITableStorageService tableStorageService)
    {
        _dbContext = dbContext;
        _tableStorageService = tableStorageService;
    }

    [FunctionName(nameof(ArchiveAppAuditLogTimerTrigger))]
    public async Task ArchiveAppAuditLogTrigger([TimerTrigger("%ArchiveAppAuditLogCron%")] TimerInfo myTimer)
    {
        var currentDate = DateTimeOffset.UtcNow.Date;
        var auditTable = _dbContext.Set<Audit>();

        await _tableStorageService.CreateTableIfNotExistsAsync(FuncAppConstants.AuditTableName);

        var query = auditTable.Where(w => w.CreatedOn < currentDate);

        int pageSize = 100;
        bool hasMoreRecords = true;

        while (hasMoreRecords)
        {
            // Processed records are removed at the end of each pass,
            // so the query always reads the next page from the top.
            var recordsToSync = await query.Take(pageSize).ToListAsync();

            hasMoreRecords = recordsToSync.Any();
            if (hasMoreRecords is false) break;

            var groupedAuditData = recordsToSync.GroupBy(b => b.TableName).ToList();
            foreach (var groupedAuditDatem in groupedAuditData)
            {
                List<AuditTableStorageTable> auditStorageTableRecords = new();
                foreach (var auditEntry in groupedAuditDatem.ToList())
                    auditStorageTableRecords.Add(AuditTableStorageTable.BuilData(auditEntry));

                await _tableStorageService.BulkInsert(FuncAppConstants.AuditTableName, groupedAuditDatem.Key, auditStorageTableRecords);
            }

            auditTable.RemoveRange(recordsToSync);
            await _dbContext.SaveChangesAsync();
        }
    }
}
My Azure function app startup class
using Azure.Identity;
using Microsoft.Azure.Functions.Extensions.DependencyInjection;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.Azure;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection.Extensions;
using VPMS.FuncApp.EFCore;
using VPMS.FuncApp.Interfaces;
using VPMS.FuncApp.Services;
using VPMS.Persistence.AuditSetup;
using VPMS.SharedKernel.Interfaces;

[assembly: FunctionsStartup(typeof(VPMS.FuncApp.Startup))]

namespace VPMS.FuncApp;

public sealed class Startup : FunctionsStartup
{
    public override void Configure(IFunctionsHostBuilder builder)
    {
        var configurations = builder.GetContext().Configuration;

        builder.Services.AddAzureClients(clientBuilder =>
        {
            clientBuilder.AddTableServiceClient(configurations.GetValue<string>("AzureWebJobsStorage"));
            clientBuilder.UseCredential(new DefaultAzureCredential());
        });
    }
}
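The snippet above only wires up the TableServiceClient; the ITableStorageService itself also needs to be registered somewhere in Configure. That registration is an assumption on my part (it isn't shown in the original answer), but it would look something like:
// Assumed registration - not shown in the original Startup
builder.Services.TryAddSingleton<ITableStorageService, TableStorageService>();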
The table entity class (it implements ITableEntity):
using Azure;
using Azure.Data.Tables;
using VPMS.Persistence.AuditSetup;

namespace VPMS.FuncApp.EFCore.AuditSetup;

public sealed class AuditTableStorageTable : ITableEntity
{
    public string? UserId { get; set; }
    public AuditType AuditType { get; set; }
    public string TableName { get; set; } = null!;
    public DateTimeOffset CreatedOn { get; set; }
    public string? OldValues { get; set; }
    public string? NewValues { get; set; }
    public string? AffectedColumns { get; set; }
    public string PrimaryKey { get; set; } = null!;
    public Guid BatchId { get; set; }

    // These come from the ITableEntity interface
    public string PartitionKey { get; set; }
    public string RowKey { get; set; }
    public DateTimeOffset? Timestamp { get; set; }
    public ETag ETag { get; set; }

    public static AuditTableStorageTable BuilData(Audit audit)
    {
        return new()
        {
            PartitionKey = audit.TableName,
            TableName = audit.TableName,
            AffectedColumns = audit.AffectedColumns,
            AuditType = audit.AuditType,
            BatchId = audit.BatchId,
            CreatedOn = audit.CreatedOn,
            OldValues = audit.OldValues,
            NewValues = audit.NewValues,
            PrimaryKey = audit.PrimaryKey,
            RowKey = Guid.NewGuid().ToString(),
            UserId = audit.UserId,
        };
    }
}

Azure Table Storage: Ignoring a property of a TableEntity when using the Azure.Data.Tables package

I am using the new Azure.Data.Tables library from Microsoft to work with Azure Table Storage. With the old library, when you had an entity that implemented ITableEntity and a property that you did not want to save to the storage table, you would use the [IgnoreProperty] annotation. However, this does not seem to be available in the new library.
What is the equivalent in the Azure.Data.Tables package, or how do you now avoid saving a property to Table Storage?
This is the class I want to persist:
public class MySpatialEntity : ITableEntity
{
    public int ObjectId { get; set; }
    public string Name { get; set; }
    public int MonitoringArea { get; set; }

    //This is the property I want to ignore because table storage cannot store it
    public Point Geometry { get; set; }

    //ITableEntity Members
    public virtual string PartitionKey { get => MonitoringArea.ToString(); set => MonitoringArea = int.Parse(value); }
    public virtual string RowKey { get => ObjectId.ToString(); set => ObjectId = int.Parse(value); }
    public DateTimeOffset? Timestamp { get; set; }
    public ETag ETag { get; set; }
}
As of version 12.2.0-beta.1, Azure.Data.Tables table entity models support ignoring properties during serialization via the [IgnoreDataMember] attribute and renaming properties via the [DataMember(Name = "<yourNameHere>")] attribute.
See the changelog here.
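Applied to the class from the question, that looks roughly like this (assuming package version 12.2.0-beta.1 or later; the attributes live in System.Runtime.Serialization):
using System;
using System.Drawing;
using System.Runtime.Serialization;
using Azure;
using Azure.Data.Tables;

public class MySpatialEntity : ITableEntity
{
    public int ObjectId { get; set; }

    // Stored under a different column name
    [DataMember(Name = "name")]
    public string Name { get; set; }

    public int MonitoringArea { get; set; }

    // Skipped entirely during serialization
    [IgnoreDataMember]
    public Point Geometry { get; set; }

    //ITableEntity Members
    public virtual string PartitionKey { get => MonitoringArea.ToString(); set => MonitoringArea = int.Parse(value); }
    public virtual string RowKey { get => ObjectId.ToString(); set => ObjectId = int.Parse(value); }
    public DateTimeOffset? Timestamp { get; set; }
    public ETag ETag { get; set; }
}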
I don't think there's anything like [IgnoreProperty] available as of now (at least with version 12.1.0).
I found two GitHub issues that discuss this:
https://github.com/Azure/azure-sdk-for-net/issues/19782
https://github.com/Azure/azure-sdk-for-net/issues/15383
What you can do is create a custom dictionary of the properties you want to persist in the entity and use that dictionary for add/update operations.
Please see sample code below:
using System;
using System.Collections.Generic;
using System.Drawing;
using Azure;
using Azure.Data.Tables;

namespace SO68633776
{
    class Program
    {
        private static string connectionString = "connection-string";
        private static string tableName = "table-name";

        static void Main(string[] args)
        {
            MySpatialEntity mySpatialEntity = new MySpatialEntity()
            {
                ObjectId = 1,
                Name = "Some Value",
                MonitoringArea = 2
            };

            TableEntity entity = new TableEntity(mySpatialEntity.ToDictionary());
            TableClient tableClient = new TableClient(connectionString, tableName);
            var result = tableClient.AddEntity(entity);
        }
    }

    public class MySpatialEntity : ITableEntity
    {
        public int ObjectId { get; set; }
        public string Name { get; set; }
        public int MonitoringArea { get; set; }

        //This is the property I want to ignore because table storage cannot store it
        public Point Geometry { get; set; }

        //ITableEntity Members
        public virtual string PartitionKey { get => MonitoringArea.ToString(); set => MonitoringArea = int.Parse(value); }
        public virtual string RowKey { get => ObjectId.ToString(); set => ObjectId = int.Parse(value); }
        public DateTimeOffset? Timestamp { get; set; }
        public ETag ETag { get; set; }

        public IDictionary<string, object> ToDictionary()
        {
            return new Dictionary<string, object>()
            {
                {"PartitionKey", PartitionKey},
                {"RowKey", RowKey},
                {"ObjectId", ObjectId},
                {"Name", Name},
                {"MonitoringArea", MonitoringArea}
            };
        }
    }
}
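Reading the row back works the same way in reverse: fetch a plain TableEntity and map the columns by hand. A short sketch, assuming the same table, connection string, and keys as above:
// Hypothetical read path for the dictionary-based approach
TableClient tableClient = new TableClient(connectionString, tableName);

// PartitionKey is MonitoringArea ("2"), RowKey is ObjectId ("1") in this model
TableEntity stored = tableClient.GetEntity<TableEntity>("2", "1").Value;

var mapped = new MySpatialEntity
{
    ObjectId = stored.GetInt32("ObjectId") ?? 0,
    Name = stored.GetString("Name"),
    MonitoringArea = stored.GetInt32("MonitoringArea") ?? 0
    // Geometry has to be rebuilt separately since it was never stored
};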

Microsoft Power Automate Alternative for listening events in Azure DevOps

Goal:
Listen to events in Azure DevOps and automate workflows in Azure DevOps, such as closing tasks, etc.
Efforts:
I am using MS Power Automate to listen to events in Azure DevOps, but it is too slow (1-2 minutes from the trigger).
Suggestion Required:
Is there an alternative to MS Power Automate that can reduce this delay?
You may try to programmatically create a service hook subscription using the Subscriptions REST API:
https://learn.microsoft.com/en-us/azure/devops/service-hooks/create-subscription?view=azure-devops
Here's a sample event consumer (an endpoint that receives the service hook POSTs) to help you get started:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Web.Mvc;

namespace Microsoft.Samples.VisualStudioOnline
{
    public class ServiceHookEventController : Controller
    {
        // POST: /ServiceHookEvent/workitemcreated
        [HttpPost]
        public HttpResponseMessage WorkItemCreated(Content workItemEvent)
        {
            //Grabbing the title for the new workitem
            var value = RetrieveFieldValue("System.Title", workItemEvent.Resource.Fields);

            //Acknowledge event receipt
            return new HttpResponseMessage(HttpStatusCode.OK);
        }

        /// <summary>
        /// Gets the value for a specified work item field.
        /// </summary>
        /// <param name="key">Key used to retrieve matching value</param>
        /// <param name="fields">List of fields for a work item</param>
        /// <returns></returns>
        public String RetrieveFieldValue(String key, IList<FieldInfo> fields)
        {
            if (String.IsNullOrEmpty(key))
                return String.Empty;

            // SingleOrDefault avoids an exception when the field is missing
            var result = fields.SingleOrDefault(s => s.Field.RefName == key);
            if (result == null)
                return String.Empty;

            return result.Value;
        }
    }

    public class Content
    {
        public String SubscriptionId { get; set; }
        public int NotificationId { get; set; }
        public String EventType { get; set; }
        public WorkItemResource Resource { get; set; }
    }

    public class WorkItemResource
    {
        public String UpdatesUrl { get; set; }
        public IList<FieldInfo> Fields { get; set; }
        public int Id { get; set; }
        public int Rev { get; set; }
        public String Url { get; set; }
        public String WebUrl { get; set; }
    }

    public class FieldInfo
    {
        public FieldDetailedInfo Field { get; set; }
        public String Value { get; set; }
    }

    public class FieldDetailedInfo
    {
        public int Id { get; set; }
        public String Name { get; set; }
        public String RefName { get; set; }
    }
}
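To register that endpoint, the subscription itself is created with a POST to the Service Hooks Subscriptions API. Below is a minimal sketch; the organization name, PAT, project id, and receiver URL are placeholders, and the payload follows the documented workitem.created / webHooks shape:
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class CreateSubscriptionSample
{
    static async Task Main()
    {
        // Placeholders - substitute your own organization and PAT
        var organization = "your-organization";
        var personalAccessToken = "your-pat";

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
            "Basic", Convert.ToBase64String(Encoding.ASCII.GetBytes($":{personalAccessToken}")));

        // Subscribe the webhook receiver above to work item created events
        var body = @"{
            ""publisherId"": ""tfs"",
            ""eventType"": ""workitem.created"",
            ""resourceVersion"": ""1.0"",
            ""consumerId"": ""webHooks"",
            ""consumerActionId"": ""httpRequest"",
            ""publisherInputs"": { ""projectId"": ""your-project-guid"" },
            ""consumerInputs"": { ""url"": ""https://your-receiver.example.com/ServiceHookEvent/workitemcreated"" }
        }";

        var response = await client.PostAsync(
            $"https://dev.azure.com/{organization}/_apis/hooks/subscriptions?api-version=6.0",
            new StringContent(body, Encoding.UTF8, "application/json"));

        Console.WriteLine(response.StatusCode);
    }
}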

When trying to override Partition Key using Azure Search and Azure Table Storage with .NET, getting Bad Request

I am using Azure Search and Azure Table Storage with .NET. I am trying to index a table and make the partition key filterable. This works fine until I try to insert something into that table, at which point I get a BadRequest with not much additional info.
This is my class below:
using System;
using Microsoft.Azure.Search;
using Microsoft.Azure.Search.Models;
using Microsoft.WindowsAzure.Storage.Table;

[SerializePropertyNamesAsCamelCase]
public class Asset : TableEntity
{
    public Asset(string name)
    {
        Name = name;
    }

    public Asset()
    {
    }

    public Asset(string name, DateTimeOffset toBePublished, string pkey)
    {
        Name = name;
        ToBePublishedDate = toBePublished;
        PartitionKey = pkey;
    }

    [System.ComponentModel.DataAnnotations.Key]
    public string Id { get; set; } = DateTimeOffset.UtcNow.ToString("O")
        .Replace("+", string.Empty)
        .Replace(":", string.Empty)
        .Replace(".", string.Empty);

    [IsFilterable, IsSortable, IsSearchable]
    public new string PartitionKey { get; set; }

    [IsFilterable, IsSortable, IsSearchable]
    public string Name { get; set; } = "TemptAsset " + new Guid();

    [IsFilterable, IsSortable]
    public int? Version { get; set; } = 1;

    [IsFilterable, IsSortable]
    public DateTimeOffset? ToBePublishedDate { get; set; } = DateTimeOffset.UtcNow;

    [IsFilterable, IsSortable]
    public DateTimeOffset? ToBeRetiredDate { get; set; } = null;

    [IsFilterable, IsSearchable, IsSortable]
    public string Company { get; set; } = "TempCompany";

    [IsFilterable, IsSortable]
    public bool IsApproved { get; set; } = false;

    [IsFilterable, IsSortable]
    public bool IsDraft { get; set; } = true;
}
This runs and the index is created successfully.
Now if I try to add an entity to that table I get a BadRequest, but if I do the exact same thing with the PartitionKey property commented out in my entity, it works fine.
This is how I create my index:
AzureSearch.CreateAssetNameIndex(AzureSearch.CreateSearchServiceClient());
and these are the methods being called:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using AssetSynch.Models;
using Microsoft.Azure.Search;
using Microsoft.Azure.Search.Models;

public static SearchServiceClient CreateSearchServiceClient()
{
    string searchServiceName = "*****";
    string adminApiKey = "********";

    SearchServiceClient serviceClient = new SearchServiceClient(searchServiceName,
        new SearchCredentials(adminApiKey));
    return serviceClient;
}

public static async void CreateAssetNameIndex(SearchServiceClient serviceClient)
{
    Index definition = new Index
    {
        Name = "assetname",
        Fields = FieldBuilder.BuildForType<Asset>()
    };

    await serviceClient.Indexes.CreateAsync(definition);
}
If I return the error using Postman, this is the exception I get:
{
  "innerExceptions": [
    {
      "requestInformation": {
        "httpStatusCode": 400,
        "httpStatusMessage": "Bad Request",
        "serviceRequestID": "59efbc9a-0002-002c-3570-d5d55c000000",
        "contentMd5": null,
        "etag": null,
        "requestDate": "Thu, 25 May 2017 17:05:01 GMT",
        "targetLocation": 0,
        "extendedErrorInformation": {
          "errorCode": "PropertiesNeedValue",
          "errorMessage": "The values are not specified for all properties in the entity.\nRequestId:59efbc9a-0002-002c-3570-d5d55c000000\nTime:2017-05-25T16:05:06.5197909Z",
          "additionalDetails": {}
        },
        "isRequestServerEncrypted": false
      }
    }
  ]
}
If I remove the PartitionKey from my entity and re-run the same code to re-create the index, the same piece of code executes successfully.
What I did notice is that there are now two PartitionKey properties on my entity, one of which remains null, and that my property does not override the original.
Is there something I am missing here?
Looking at your code, your Asset class uses the new keyword to redefine the base class's PartitionKey property.
But this only hides base.PartitionKey; it does not override it.
public new string PartitionKey { get; set; }
After you set the value in the Asset class, the entity actually contains two PartitionKey properties.
If the base class's PartitionKey value is null, the insert returns a 400 error.
So if you want to add the new entity to the table, you need to set the base class (TableEntity) PartitionKey value.
I suggest you change your Asset class as below:
[SerializePropertyNamesAsCamelCase]
public class Asset : TableEntity
{
    public Asset(string name)
    {
        Name = name;
        base.PartitionKey = this.PartitionKey;
    }

    public Asset()
    {
        base.PartitionKey = this.PartitionKey;
    }

    public Asset(string name, DateTimeOffset toBePublished, string pkey)
    {
        Name = name;
        ToBePublishedDate = toBePublished;
        PartitionKey = pkey;
        base.PartitionKey = this.PartitionKey;
    }

    [Key]
    [IsFilterable]
    public string Id { get; set; } = DateTimeOffset.UtcNow.ToString("O")
        .Replace("+", string.Empty)
        .Replace(":", string.Empty)
        .Replace(".", string.Empty);

    [IsFilterable, IsSortable, IsSearchable]
    public new string PartitionKey { get; set; }

    [IsFilterable, IsSortable, IsSearchable]
    public string Name { get; set; } = "TemptAsset " + new Guid();

    [IsFilterable, IsSortable]
    public int? Version { get; set; } = 1;

    [IsFilterable, IsSortable]
    public DateTimeOffset? ToBePublishedDate { get; set; } = DateTimeOffset.UtcNow;

    [IsFilterable, IsSortable]
    public DateTimeOffset? ToBeRetiredDate { get; set; } = null;

    [IsFilterable, IsSearchable, IsSortable]
    public string Company { get; set; } = "TempCompany";

    [IsFilterable, IsSortable]
    public bool IsApproved { get; set; } = false;

    [IsFilterable, IsSortable]
    public bool IsDraft { get; set; } = true;
}
If you want to use Table Storage as a data source for Azure Search, I suggest you refer to this article.

Entity Type 'AstNode' has no key defined

I'm porting a data model from EF4 to EF6 Code First. I'm getting the following message when the database creation is attempted. I'm at a loss to understand what is causing this. I don't have any Context, AstNode or JSParser entities. It is also not looking in the Models namespace:
var context = QPDataContext.Create();
var session = context.DataSessions.FirstOrDefault(ds => ds.DataSessionId == sessionId);
Throws this exception:
{"One or more validation errors were detected during model generation:
QPWebRater.DAL.Context: : EntityType 'Context' has no key defined. Define the key for this EntityType.
QPWebRater.DAL.AstNode: : EntityType 'AstNode' has no key defined. Define the key for this EntityType.
QPWebRater.DAL.JSParser: : EntityType 'JSParser' has no key defined. Define the key for this EntityType.
(many more similar errors snipped).
"}
Here is my database context (I've simplified it a bit):
QPWebRater.DAL.QPDataContext.cs:
using System;
using System.Collections.Generic;
using System.Data.Entity;
using System.Data.Entity.Core;
using System.Data.Entity.Validation;
using System.Diagnostics;
using System.Linq;
using System.Text.RegularExpressions;
using System.Web;
using Microsoft.Ajax.Utilities;
using QPWebRater.Models;
using QPWebRater.Utilities;

namespace QPWebRater.DAL
{
    public class QPDataContext : DbContext
    {
        public QPDataContext()
            : base("DefaultConnection")
        {
            Database.SetInitializer<QPDataContext>(new CreateDatabaseIfNotExists<QPDataContext>());
        }

        public static QPDataContext Create()
        {
            return new QPDataContext();
        }

        public DbSet<DataSession> DataSession { get; set; }
        public DbSet<Document> Documents { get; set; }
        public DbSet<Driver> Drivers { get; set; }
        public DbSet<Location> Locations { get; set; }
        public DbSet<Lookup> Lookups { get; set; }
        public DbSet<Quote> Quotes { get; set; }
        public DbSet<Vehicle> Vehicles { get; set; }
        public DbSet<Violation> Violations { get; set; }
    }
}
QPWebRater.Models.DatabaseModels.cs:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;

namespace QPWebRater.Models
{
    public partial class DataSession
    {
        public DataSession()
        {
            this.Vehicles = new HashSet<Vehicle>();
            this.Drivers = new HashSet<Driver>();
            ...
        }

        public string DataSessionId { get; set; }
        public System.DateTime Timestamp { get; set; }
        ...
    }

    public partial class Document
    {
        public int DocumentId { get; set; }
        public int QuoteId { get; set; }
        public string DocumentType { get; set; }
        public string Url { get; set; }
        public string Description { get; set; }
        public virtual Quote Quote { get; set; }
    }

    public partial class Driver
    {
        public Driver()
        {
            this.Violations = new HashSet<Violation>();
        }

        public int DriverId { get; set; }
        public string DataSessionId { get; set; }
        ...
    }
}
I solved this by examining all of the DbSet definitions. I had pared down the data model while upgrading it, and had removed the Lookup model class but neglected to also remove the DbSet<Lookup> Lookups { get; set; } property.
Because of the using Microsoft.Ajax.Utilities; directive, the Lookup type was then resolved as Microsoft.Ajax.Utilities.Lookup. At runtime, Entity Framework tried to map that class (and the types it references, which is why AstNode and JSParser appear in the errors) to database tables, and failed miserably. If you are running into a similar problem, double-check the generic types in your DbSet properties.
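In other words, the offending combination looked like the sketch below (illustrative, with the original QPWebRater.Models.Lookup class already deleted):
using System.Data.Entity;
// With the model class gone, the compiler silently binds Lookup to
// Microsoft.Ajax.Utilities.Lookup through this using directive:
using Microsoft.Ajax.Utilities;

public class QPDataContext : DbContext
{
    // EF then tries to map AjaxMin's Lookup type (and the types it references,
    // such as AstNode and JSParser), none of which have keys defined.
    public DbSet<Lookup> Lookups { get; set; }  // <-- remove this leftover property
}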
