Kusto data ingestion from an Azure Function App ends with a 403

I am trying to ingest data from an Azure Function App into an ADX database. I followed the instructions found in the article here.
The difference is that I'd like to insert data into an existing table. I am struggling with a 403 error: "Principal 'aadapp=;' is not authorized to access table".
What I did:
I have created an AAD app with the following API permissions:
[screenshot: AAD app configured permissions]
I configured the database via Kusto Explorer:
.add database myDB ingestors ('aadapp=;') 'theAADAppname'
.add table PressureRecords ingestors ('aadapp=;') 'theAADAppname'
.add table TemperatureRecords ingestors ('aadapp=;') 'theAADAppname'
My code:
var kcsbDM = new KustoConnectionStringBuilder($"https://ingest-{serviceNameAndRegion}.kusto.windows.net:443/").WithAadApplicationKeyAuthentication(
    applicationClientId: "<my AD app Id>",
    applicationKey: "<my App Secret from Certificates & secrets>",
    authority: "<my tenant Id>");

using (var ingestClient = KustoIngestFactory.CreateQueuedIngestClient(kcsbDM))
{
    var ingestProps = new KustoQueuedIngestionProperties(databaseName, tableName);
    ingestProps.ReportLevel = IngestionReportLevel.FailuresAndSuccesses;
    ingestProps.ReportMethod = IngestionReportMethod.Queue;
    ingestProps.JSONMappingReference = mappingName;
    ingestProps.Format = DataSourceFormat.json;

    using (var memStream = new MemoryStream())
    using (var writer = new StreamWriter(memStream))
    {
        var messageString = JsonConvert.SerializeObject(myObject); // maps to the table / mapping
        writer.WriteLine(messageString);
        writer.Flush();
        memStream.Seek(0, SeekOrigin.Begin);

        // Post ingestion message
        ingestClient.IngestFromStream(memStream, ingestProps, leaveOpen: true);
    }
}

The issue is that the mapping you are using in this ingestion command does not match the existing table schema (it has additional columns). In these cases Azure Data Explorer (Kusto) attempts to add the additional columns it finds in the mapping. Since the permission the app has is 'ingestor', it cannot modify the table structure, and so the ingestion fails.
In your specific case, your table has a column written in a specific casing, and in the ingestion mapping the same column has different casing (for one character), so it is treated as a new column.
We will look into providing a better error message for this case.

Update: the issue is fixed in the system and now it works as expected.
@Avnera thanks for your hint; potentially it is an issue because of the real vs. double translation. In one of my first tries I used double in the table and that worked. That is no longer possible; it looks like the supported data types changed.
My current configuration:
.create table PressureRecords ( Timestamp:datetime, DeviceId:guid, Pressure:real )
.create-or-alter table PressureRecords ingestion json mapping "PressureRecords"
'['
'{"column":"TimeStamp","path":"$.DateTime","datatype":"datetime","transform":null},'
'{"column":"DeviceId","path":"$.DeviceId","datatype":"guid","transform":null},'
'{"column":"Pressure","path":"$.Pressure","datatype":"real","transform":null}'
']'
public class PressureRecord
{
    [JsonProperty(PropertyName = "Pressure")]
    public double Pressure { get; set; }

    [JsonProperty(PropertyName = "DateTime")]
    public DateTime DateTime { get; set; } = DateTime.Now;

    [JsonProperty(PropertyName = "DeviceId")]
    [Key]
    public Guid DeviceId { get; set; }
}

Related

Partial Data Being Ingested To Azure Data Explorer From Event Hub

I currently have Azure Data Explorer set up to ingest data from Event Hub. For some reason unknown to me, my ingestion table is only seeing about 45% of events. I am testing this by sending 100 events to the event hub, one at a time. I know my event hub is receiving these events because I set up a SQL table to also ingest them (under a separate consumer group), and that table is receiving 100% of them. My assumption is that I have set up my Azure Data Explorer table incorrectly.
I have a very basic object that I am sending:
public class TestDocument
{
    [JsonProperty("DocumentId")]
    public string DocumentId { get; set; }

    [JsonProperty("Title")]
    public string Title { get; set; }
}
I have enabled streaming ingestion in Azure
Azure Data Explorer > Configurations > Streaming ingestion (ON)
I have enabled streaming ingestion in my table
.alter table TestTable policy streamingingestion enable
My table mapping is as follows:
.alter table TestTable ingestion json mapping "TestTable_mapping" '[{"column":"DocumentId","datatype":"string","Path":"$[\'DocumentId\']"},{"column":"Title","datatype":"string","Path":"$[\'Title\']"}]'
My data connection settings
Consumer group: Its own group
Event system properties: 0
Table name: TestTable
Data format: JSON
Mapping name: TestTable_mapping
Is there something I am missing here? Consistently, out of 100 events sent, I only see about 45-48 get ingested in my table.
EDIT:
JSON payload of TestDocument:
{"DocumentId":"10","Title":"TEST"}
I found out what is happening: I was adding a BOM to my serialized object, and it looks like ADX has issues with it. When I serialized my object without a BOM, I was able to see all data flow from Event Hub to ADX.
Here's a sample of how I was doing it:
private static readonly JsonSerializer Serializer;

static SerializationHelper()
{
    Serializer = JsonSerializer.Create(SerializationSettings);
}

public static void Serialize(Stream stream, object toSerialize)
{
    // Encoding.UTF8 writes a BOM preamble to the stream.
    using var streamWriter = new StreamWriter(stream, Encoding.UTF8, DefaultStreamBufferSize, true);
    using var jsonWriter = new JsonTextWriter(streamWriter);
    Serializer.Serialize(jsonWriter, toSerialize);
}
What fixed it:
public static void Serialize(Stream stream, object toSerialize)
{
    // new UTF8Encoding(false) suppresses the BOM.
    using var streamWriter = new StreamWriter(stream, new UTF8Encoding(false), DefaultStreamBufferSize, true);
    using var jsonWriter = new JsonTextWriter(streamWriter);
    Serializer.Serialize(jsonWriter, toSerialize);
}
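If you want to confirm whether a payload carries the BOM before sending it, here is a small sketch of my own (not part of the original answer) that peeks at the first three bytes of a seekable stream. With a fresh MemoryStream, the first Serialize overload above should produce a BOM and the second should not.
// A UTF-8 BOM shows up as the bytes 0xEF 0xBB 0xBF at the start of the stream.
public static bool StartsWithUtf8Bom(Stream stream)
{
    long originalPosition = stream.Position;
    stream.Position = 0;
    var buffer = new byte[3];
    int read = stream.Read(buffer, 0, buffer.Length);
    stream.Position = originalPosition; // restore the caller's position
    return read == 3 && buffer[0] == 0xEF && buffer[1] == 0xBB && buffer[2] == 0xBF;
}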

Acumatica GetList error: Optimization cannot be performed. The following fields cause the error: Attributes.AttributeID

The developer version of Acumatica 2020R1 is installed locally. Data for the sample tenant MyTenant from the I-300 training were loaded, and the WSDL connection is established.
The DefaultSoapClient is created fine.
However, attempts to export any data using GetList cause errors:
using (Default.DefaultSoapClient soapClient = new Default.DefaultSoapClient())
{
    // Sign in to Acumatica ERP
    soapClient.Login
    (
        "Admin",
        "*",
        "MyTenant",
        "Yogifon",
        null
    );
    try
    {
        // Retrieving the list of customers with contacts
        //InitialDataRetrieval.RetrieveListOfCustomers(soapClient);
        // Retrieving the list of stock items modified within the past day
        //RetrievalOfDelta.ExportStockItems(soapClient);
        RetrievalOfDelta.ExportItemClass(soapClient);
    }
    // ... (catch block and Logout call omitted in the original snippet)
public static void ExportItemClass(DefaultSoapClient soapClient)
{
    Console.WriteLine("Retrieving the list of item classes...");
    ItemClass ItemClassToBeFound = new ItemClass
    {
        ReturnBehavior = ReturnBehavior.All,
    };
    Entity[] ItemClasses = soapClient.GetList(ItemClassToBeFound);
    string lcItemType = "", lcValuationMethod = "";
    int lnCustomFieldsCount;
    using (StreamWriter file = new StreamWriter("ItemClass.csv"))
    {
        // Write the values for each item
        foreach (ItemClass loItemClass in ItemClasses)
        {
            file.WriteLine(loItemClass.Note);
        }
    }
}
The Acumatica instance was modified by adding a custom field to Stock Items using DAC, and by adding several Attributes to Customer and Stock Items.
Interestingly enough, this code used to work until something broke it.
What is wrong here?
Thank you.
Alexander
In the request you have the following line: ReturnBehavior = ReturnBehavior.All
That means that you are trying to retrieve all linked/detail entities of the object. Unfortunately, some objects are not optimized enough to avoid affecting query performance in GetList scenarios.
So, you have two options:
Replace ReturnBehavior = All by explicitly specifying the linked/detail entities that you want to retrieve, and do not include Attributes in the list.
Retrieve StockItem records with attributes one by one using the Get operation instead of GetList.
P.S. The problem with attributes will most likely be fixed in the next version of the API endpoint.
Edit:
Code sample for Get:
public static void ExportItemClass(DefaultSoapClient soapClient)
{
    Console.WriteLine("Retrieving the list of item classes...");
    ItemClass ItemClassToBeFound = new ItemClass
    {
        ReturnBehavior = ReturnBehavior.Default // retrieve only default fields (without attributes and other linked/detail entities)
    };
    Entity[] ItemClasses = soapClient.GetList(ItemClassToBeFound);
    foreach (var entity in ItemClasses)
    {
        ItemClass itemClass = entity as ItemClass;
        itemClass.ReturnBehavior = ReturnBehavior.All;
        // retrieve each ItemClass with all the detail/linked entities individually
        ItemClass retrievedItemClass = (ItemClass)soapClient.Get(itemClass);
    }
}

How to call Azure Actions in Xamarin Forms using Azure Service Provider

I am using the Azure service provider (Azure SDK in Xamarin Forms) to download data from an Azure cloud server. I am using the code below to fetch all data:
var table = AzureServiceProvider.Instance.GetRemoteTable<T>();
var query = table.CreateQuery();
if (filter != null)
{
    query = table.Where(filter);
}
List<T> azureDatas = await query.ToListAsync();
When I use the code above, it hits the following URL: https://MyService.azurewebsites.net/tables/TableName
But now I have to pass an id (i.e. api/table/{TableName}/{controller}/{id}) to fetch only the data matching that id.
Using the same code, it still hits the URL above:
https://MyService.azurewebsites.net/tables/TableName
Instead of that I want to use, for example:
https://mobilddevservice.azurewebsites.net/tables/TableName/(methodName)/(ID)10338654
I don't know if you figured this out yet, but to target a specific method in your controller you can use the ".WithParameters" method on your table.
So let's say you have 2 methods in your controller:
// GET tables/TableName/id
public SingleResult<TableName> GetDataFromName(string name)
{
    // Your logic here
}

// GET tables/TableName/id
public SingleResult<TableName> GetDataFromAddress(string address)
{
    // Your logic here
}
You can access these methods individually by using .WithParameters like this:
Dictionary<string, string> parameters = new Dictionary<string, string>();
parameters.Add("name", name);
var query = Table.WithParameters(parameters);
var results = await query.ToEnumerableAsync();
To access the address method
Dictionary<string, string> parameters = new Dictionary<string, string>();
parameters.Add("address", personsAddress);
var query = Table.WithParameters(parameters);
var results = await query.ToEnumerableAsync();
So the important parts are:
As far as I'm aware you can only send strings (but I might be wrong).
The key in the parameters dictionary has to be the exact name of the parameter in your controller method!
Use the following for a lookup by Id:
var table = client.GetTable<T>();
var record = await table.LookupAsync(id);
It uses a different endpoint: https://site.azurewebsites.net/tables/Table/{id}.
For more info, check the book: https://adrianhall.github.io/develop-mobile-apps-with-csharp-and-azure/chapter3/client/

Access resources by Id in Azure DocumentDB

I just started playing with Azure DocumentDB and my excitement has turned into confusion. This thing is weird. It seems like everything (databases, collections, documents) needs to be accessed not by its id, but by its 'SelfLink'. For example:
I create a database:
public void CreateDatabase()
{
    using (var client = new DocumentClient(new Uri(endpoint), authKey))
    {
        Database db = new Database()
        {
            Id = "TestDB",
        };
        client.CreateDatabaseAsync(db).Wait();
    }
}
Then later I want to create a collection:
public void CreateCollection()
{
    using (var client = new DocumentClient(new Uri(endpoint), authKey))
    {
        DocumentCollection collection = new DocumentCollection()
        {
            Id = "TestCollection",
        };
        client.CreateDocumentCollectionAsync(databaseLink: "???", documentCollection: collection).Wait();
    }
}
The API wants a 'databaseLink' when what I'd really prefer to give it is my database Id. I don't have the 'databaseLink' handy. Does DocumentDB really expect me to pull down a list of all databases and go searching through it for the databaseLink every time I want to do anything?
This problem goes all the way down. I can't save a document to a collection without having the collection's 'link'.
public void CreateDocument()
{
    using (var client = new DocumentClient(new Uri(endpoint), authKey))
    {
        client.CreateDocumentAsync(documentCollectionLink: "???", document: new { Name = "TestName" }).Wait();
    }
}
So to save a document I need the collection's link. To get the collections link I need the database link. To get the database link I have to pull down a list of all databases in my account and go sifting through it. Then I have to use that database link that I found to pull down a list of collections in that database that I then have to sift through looking for the link of the collection I want. This doesn't seem right.
Am I missing something? Am I not understanding how to use this? Why am I assigning ids to all my resources when DocumentDB insists on using its own link scheme to identify everything? My question is 'how do I access DocumentDB resources by their Id?'
The information posted in other answers from 2014 is now somewhat out of date. Direct addressing by Id is possible:
Although _selflinks still exist and can be used to access resources, Microsoft has since added a much simpler way to locate resources by their Ids that does not require you to retain the _selflink:
UriFactory
UriFactory.CreateDocumentCollectionUri(databaseId, collectionId);
UriFactory.CreateDocumentUri(databaseId, collectionId, "document id");
This enables you to create a safe Uri (allowing, for example, for whitespace) which is functionally identical to the resource's _selflink; the example given in the Microsoft announcement is shown below:
// Use UriFactory to build the document link
Uri docUri = UriFactory.CreateDocumentUri("SalesDb", "Catalog", "prd123");
// Use this constructed Uri to delete the document
await client.DeleteDocumentAsync(docUri);
The announcement, from August 13th 2015, can be found here:
https://azure.microsoft.com/en-us/blog/azure-documentdb-bids-fond-farewell-to-self-links/
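Reading works the same way; here is a small sketch of my own (reusing the ids from the example above) rather than anything from the announcement itself:
// Build the document link from ids and read it back; no self-link needed.
Uri docUri = UriFactory.CreateDocumentUri("SalesDb", "Catalog", "prd123");
ResourceResponse<Document> response = await client.ReadDocumentAsync(docUri);
Document doc = response.Resource;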
I would recommend you look at the code samples here, in particular the DocumentDB.Samples.ServerSideScripts project.
In Program.cs you will find the GetOrCreateDatabaseAsync method:
/// <summary>
/// Get or create a Database by id
/// </summary>
/// <param name="id">The id of the Database to search for, or create.</param>
/// <returns>The matched, or created, Database object</returns>
private static async Task<Database> GetOrCreateDatabaseAsync(string id)
{
    Database database = client.CreateDatabaseQuery()
        .Where(db => db.Id == id).ToArray().FirstOrDefault();
    if (database == null)
    {
        database = await client.CreateDatabaseAsync(
            new Database { Id = id });
    }
    return database;
}
To answer your question: you can use this method to find your database by its id, and other resources (collections, documents, etc.) using their respective Create[ResourceType]Query() methods.
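For example, a collection lookup by id follows the same pattern; this is just a sketch along the lines of the sample above (assuming the same client field):
/// <summary>
/// Get or create a DocumentCollection by id within the given database.
/// </summary>
private static async Task<DocumentCollection> GetOrCreateCollectionAsync(string databaseLink, string id)
{
    DocumentCollection collection = client.CreateDocumentCollectionQuery(databaseLink)
        .Where(c => c.Id == id).ToArray().FirstOrDefault();
    if (collection == null)
    {
        collection = await client.CreateDocumentCollectionAsync(
            databaseLink, new DocumentCollection { Id = id });
    }
    return collection;
}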
Hope that helps.
The create database call returns the database object:
var database = client.CreateDatabaseAsync(new Database { Id = databaseName }).Result.Resource;
And then you can use that to create your collection
var spec = new DocumentCollection { Id = collectionName };
spec.IndexingPolicy.IndexingMode = IndexingMode.Consistent;
spec.IndexingPolicy.Automatic = true;
spec.IndexingPolicy.IncludedPaths.Add(new IndexingPath { IndexType = IndexType.Range, NumericPrecision = 6, Path = "/" });
var options = new RequestOptions
{
    ConsistencyLevel = ConsistencyLevel.Session
};
var collection = client.CreateDocumentCollectionAsync(database.SelfLink, spec, options).Result.Resource;
The client.Create... methods return the objects, which have the self-links you are looking for:
Database database = await client.CreateDatabaseAsync(
    new Database { Id = "Foo" });
DocumentCollection collection = await client.CreateDocumentCollectionAsync(
    database.SelfLink, new DocumentCollection { Id = "Bar" });
Document document = await client.CreateDocumentAsync(
    collection.SelfLink, new { property1 = "Hello World" });
For deleting a document in a partitioned collection, use this format:
result = await client.DeleteDocumentAsync(selfLink, new RequestOptions
{
    PartitionKey = new PartitionKey(partitionKey)
});
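If you don't have the self-link handy, the same call also works with a UriFactory-built document link; a sketch where the ids and partition key are placeholders:
Uri docUri = UriFactory.CreateDocumentUri(databaseId, collectionId, documentId);
result = await client.DeleteDocumentAsync(docUri, new RequestOptions
{
    PartitionKey = new PartitionKey(partitionKey)
});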

How to get Azure service pricing details programmatically?

Can anybody please tell me how I can programmatically get Azure service pricing details (pricing for Compute, Data Services, App Services, Network Services) from the Azure website?
Does Azure provide the pricing details in JSON format?
Windows Azure does not provide any such API as of today, although it is a much-requested feature and hopefully they are working on it.
Check here:
http://feedback.windowsazure.com/forums/170030-billing/suggestions/1143971-billing-usage-api#comments
The only way for now would be to build your own data store with the details mentioned here: http://azure.microsoft.com/en-us/pricing/calculator/
Unit-wise prices are listed in the usage data CSV, but unfortunately the only way for now is to download this CSV for your subscription here: https://account.windowsazure.com/Subscriptions
Azure now provides APIs to get usage and billing data. You can have a look at this blog, which gives an overview of these APIs, and the feedback form here, which contains links to some useful pages.
In summary, use the following APIs to get usage and billing data:
Resource usage
Resource ratecard
Not sure if I am too late to answer.
I was looking for the same thing and stumbled upon this post on Stack Overflow: Azure pricing calculator api. I was able to generate the JSON string using this GitHub repo: https://github.com/Azure-Samples/billing-dotnet-ratecard-api.
Hope this helps!
Late to the party, but I found myself looking for this and nothing here got me what I wanted. Then I found this: https://learn.microsoft.com/en-us/rest/api/cost-management/retail-prices/azure-retail-prices
It is pretty straightforward. Add a reference to Json.NET (.NET 4.0) to your project; it shows up in your references as Newtonsoft.Json.
// You will need to add these usings
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
using System.Net.Http;

private void btnGetRates_Click(object sender, EventArgs e)
{
    string strUrl = "https://prices.azure.com/api/retail/prices?$filter=serviceName eq 'Virtual Machines' and skuName eq 'E64 v4' and reservationTerm eq '3 Years'";
    string response = GetDataFromAPI(strUrl);
    // Here I am turning the Json response into a DataTable and then loading that into a DataGridView.
    // You can use the Json response any way you wish.
    DataTable dt = Tabulate(response);
    dgvAzureSKU.DataSource = null;
    dgvAzureSKU.DataSource = dt;
}
public string GetDataFromAPI(string url)
{
    using (var httpClient = new HttpClient())
    {
        httpClient.DefaultRequestHeaders.Add("Accept", "application/json");
        var response = httpClient.GetStringAsync(new Uri(url)).Result;
        return response;
    }
}
public static DataTable Tabulate(string json)
{
    var jsonLinq = JObject.Parse(json);

    // Find the first array using Linq
    var srcArray = jsonLinq.Descendants().Where(d => d is JArray).First();
    var trgArray = new JArray();
    foreach (JObject row in srcArray.Children<JObject>())
    {
        var cleanRow = new JObject();
        foreach (JProperty column in row.Properties())
        {
            if (column.Value is JValue) // Only include JValue types
            {
                cleanRow.Add(column.Name, column.Value);
            }
        }
        trgArray.Add(cleanRow);
    }
    return JsonConvert.DeserializeObject<DataTable>(trgArray.ToString()); // This is what loads the data into the table
}
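One thing worth noting (my addition): the Retail Prices API returns results in pages, so a single GET only brings back the first page. A rough sketch of following the paging link, reusing the GetDataFromAPI helper above and assuming the response fields are named Items and NextPageLink:
public List<JObject> GetAllPriceItems(string firstPageUrl)
{
    var items = new List<JObject>();
    string url = firstPageUrl;
    while (!string.IsNullOrEmpty(url))
    {
        JObject page = JObject.Parse(GetDataFromAPI(url));
        // "Items" holds the price records for the current page.
        foreach (JObject item in page["Items"].Children<JObject>())
        {
            items.Add(item);
        }
        // "NextPageLink" is null or empty on the last page.
        url = (string)page["NextPageLink"];
    }
    return items;
}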
You can find some examples for that here: https://learn.microsoft.com/en-us/azure/billing/billing-usage-rate-card-overview. Azure provides invoice, usage, and ratecard APIs which can help you do things like:
Azure spend during the month
Set up alerts
Predict bill
Pre-consumption cost analysis
