Upload an entire XML file to Azure Mobile Services

I have an XML file with 25,000 objects that I want to move to the cloud instead of keeping it local, preferably in the SQL database that comes with the service, but I have no idea how to get the XML document into the service. I know how to upload from the app, but with 25,000 objects the service times out. I'm sure there is something, I just can't find the documentation on it.
private async void bw_Worker(object sender, DoWorkEventArgs e)
{
    foreach (card x in cardsList)
    {
        await App.MobileService.GetTable<card>().InsertAsync(x);
    }
}
It gets just past 7,500 items and then times out. I am running it in a background worker. I couldn't find any documented limit on this, so I assume it's simply a process that takes too long to complete.
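A rough sketch of batching the inserts, purely for illustration: the chunk size of 50 is an arbitrary assumption, not a documented limit, and it reuses the same cardsList, card type, and MobileService client from above (the usual System.Linq and System.Threading.Tasks usings are assumed).
// Illustrative only: fire off the inserts in small concurrent batches instead of
// strictly one at a time. The chunk size of 50 is an assumption, not a service limit.
private async Task InsertCardsInBatchesAsync(IList<card> cards)
{
    var table = App.MobileService.GetTable<card>();
    const int chunkSize = 50;

    for (int i = 0; i < cards.Count; i += chunkSize)
    {
        // Start up to chunkSize inserts, then wait for them all before moving on.
        var batch = cards.Skip(i).Take(chunkSize).Select(c => table.InsertAsync(c));
        await Task.WhenAll(batch);
    }
}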

Related

Azure Blob Trigger function app: running same instance for multiple blob uploads

I have created a blob-triggered function app in Python. My requirement is to run a separate instance for each blob upload (for parallel processing), but that's not happening, even though I have modified host.json as below, per this link:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob
{
  "version": "2.0",
  "extensions": {
    "blobs": {
      "maxDegreeOfParallelism": "4"
    }
  }
}
Still, the same instance is running and processing files one by one. Am I missing something here?
I'm afraid we can't implement this requirement. As far as I know, we can only set the function app to scale out to a maximum of n (in your case 4) instances; we can't scale out instances manually.
When you change the configuration to allow the function app to scale out to multiple instances, it only scales out automatically when a lot of requests are coming in. If there are only 4 requests, only one instance will be started in most cases.
Here is another post I researched in the past, with a problem similar to this case, for your reference.

Timer-based Azure function with Table storage, HTTP request, and Azure Service Bus

I have a process, currently written as a console application, that fires on a scheduled task: it reads data from Azure Table storage, and based on that data makes API calls to a third-party vendor we use, deserializes the response, loops over an array in the results, saves each iteration of the loop into a different table in Azure Table storage, and then publishes a message for each iteration to Azure Service Bus, where those messages are consumed by another client.
In an effort to get more of our tasks into the cloud, I've done some research and it seems that an Azure function would be a good candidate to replace my console application. I spun up a new Azure function project in Visual Studio 2019 as a "timer" function and then dove into some reading where I got lost really fast.
The reading I've done talks about using "bindings" in my Run() method arguments, decorated with attributes for connection strings etc., but I'm not sure that's the direction I should be heading. It sounds like that would make authentication to my table storage easier, but I can't figure out how to use those "hooks" to query my table and then perform inserts. I haven't even gotten to the Service Bus stuff yet, nor looked into making HTTP calls to our third-party vendor's API.
I know this is a very broad question and I don't have any code to post because I'm having a tough time even getting out of the starting blocks with this. The MS documentation is all over the map and I can't find anything specific to my needs and I promise I've spent a fair bit of time trying.
Are Azure functions even the right path I should be travelling? If not, what other options are out there?
TIA
You should stick with Azure Functions, using a Timer trigger, to replace your console app.
The bindings (which can be used for input/output) are helpers that save you some lines of code. For example:
Rather than using the following code to insert data into an Azure table:
// Retrieve storage account information from the connection string.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(storageConnectionString);

// Create a table client for interacting with the table service.
CloudTableClient tableClient = storageAccount.CreateCloudTableClient(new TableClientConfiguration());

// Get a reference to the target table.
CloudTable table = tableClient.GetTableReference("MyTable");

// Some code to populate an entity (a DynamicTableEntity so it can be passed to a table operation).
var entity = new DynamicTableEntity("Http", Guid.NewGuid().ToString());
entity.Properties["Text"] = new EntityProperty(input.Text);

// Create the InsertOrMerge table operation.
TableOperation insertOrMergeOperation = TableOperation.InsertOrMerge(entity);

// Execute the operation.
TableResult result = await table.ExecuteAsync(insertOrMergeOperation);
you would use:
[FunctionName("TableOutput")]
[return: Table("MyTable")]
public static MyPoco TableOutput([HttpTrigger] dynamic input, ILogger log)
{
log.LogInformation($"C# http trigger function processed: {input.Text}");
return new MyPoco { PartitionKey = "Http", RowKey = Guid.NewGuid().ToString(), Text = input.Text };
}
PS: the trigger in the previous code is an HTTP trigger, but it's only there to show how to use an output binding.
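To tie that back to your timer/table/Service Bus scenario, a rough sketch of how the bindings could be combined in an in-process C# function might look like the following. All names here (table name, queue name, connection setting, schedule) are placeholder assumptions, not something from your setup, and it uses the same older CloudTable API as the snippet above.
// Sketch only: timer trigger + table input binding + Service Bus output binding.
// "SourceTable", "outbound-queue", "ServiceBusConnection" and the schedule are placeholders.
[FunctionName("ScheduledSync")]
public static async Task Run(
    [TimerTrigger("0 */30 * * * *")] TimerInfo timer,
    [Table("SourceTable")] CloudTable sourceTable,
    [ServiceBus("outbound-queue", Connection = "ServiceBusConnection")] IAsyncCollector<string> messages,
    ILogger log)
{
    // Read a page of entities from the table.
    var query = new TableQuery<DynamicTableEntity>().Take(100);
    var segment = await sourceTable.ExecuteQuerySegmentedAsync(query, null);

    foreach (var entity in segment.Results)
    {
        // The call to the third-party API and the save to the second table would go
        // here; this sketch just forwards the row key to Service Bus.
        await messages.AddAsync(entity.RowKey);
    }

    log.LogInformation($"Processed {segment.Results.Count} entities.");
}
The point is simply that the attributes replace the hand-written connection and queue plumbing; the business logic in the body stays yours.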
You can find more information here:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-triggers-bindings
https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-table
and this Learn module is worth working through: https://learn.microsoft.com/en-us/learn/modules/chain-azure-functions-data-using-bindings/

Refreshing in-memory cache across multiple instances in Azure

I have an Azure web app which uses in-memory caching, adding keys as follows:
public static void Add(object item, string key)
{
    var wrapper = new CacheItemWrapper()
    {
        InsertedAt = DateTime.Now,
        Item = item
    };

    MemoryCache.Default.Add(key, wrapper, ObjectCache.InfiniteAbsoluteExpiration);
}
Occasionally the user of our application will make a change which requires the cache to refresh. I can call a method which clears the cache, the problem is, it only works on the instance which picks up the request. Other instances still have the old values in memory.
Is there any way I can do either of these things:
a) run a method across multiple instances, or
b) raise an event which all instances listen for?
The code above could be changed to expire within a short time so that all instances could pick this up. However, it's quite a long process to update the cache and this might affect performance. Given the application knows when it needs to refresh the cache, it would be much better and more responsive if it could be done programmatically.
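For what it's worth, option (b) is sometimes implemented with a shared pub/sub channel that every instance subscribes to, for example Azure Cache for Redis with StackExchange.Redis. The sketch below is only illustrative; the channel name, connection setting name, and the idea of removing a single key are all assumptions, not part of the original code.
// Sketch of option (b): each instance subscribes to a shared Redis channel and
// evicts the named key from its own MemoryCache when a message arrives.
// "cache-invalidate" and "RedisConnection" are placeholder names.
public static class CacheInvalidation
{
    private static readonly ConnectionMultiplexer Redis =
        ConnectionMultiplexer.Connect(ConfigurationManager.AppSettings["RedisConnection"]);

    // Call once at startup on every instance.
    public static void Subscribe()
    {
        Redis.GetSubscriber().Subscribe("cache-invalidate", (channel, key) =>
        {
            MemoryCache.Default.Remove((string)key);
        });
    }

    // Call from the instance that handled the user's change.
    public static void NotifyChanged(string key)
    {
        Redis.GetSubscriber().Publish("cache-invalidate", key);
    }
}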

Handling a large number of identical requests in Azure/IIS WebRole

I have an Azure Cloud Service-based HTTP API which currently serves its data out of an Azure SQL database. We also have an in-role cache on the web role side.
Generally this model works fine for us, but sometimes we get a large number of requests for the same resource within a short time span, and if that resource is not in the cache, all of the requests go directly to our DB, which is a problem for us because the DB often can't take that much load.
Looking at the nature of the problem, it seems like it should be a pretty common problem that most people building APIs would face. I was thinking that somehow I could send only the first request to the DB and hold all the remaining ones until the first one completes, to control the load going to the DB, but I couldn't find a good way of doing it. Is there any standard/recommended way of doing this in Azure/IIS?
The way we're handling this kind of scenario is by putting calls to the DB in a lock statement. That way only one caller will hit the DB. Here's pseudo code that you can try:
var cachedItem = ReadFromCache();
if (cachedItem != null)
{
    return cachedItem;
}

lock (_cacheLock) // _cacheLock is a shared static readonly object
{
    // Re-check: another caller may have populated the cache while we waited for the lock.
    cachedItem = ReadFromCache();
    if (cachedItem != null)
    {
        return cachedItem;
    }

    var itemsFromDB = ReadFromDB();
    putItemsInCache(itemsFromDB);
    return itemsFromDB;
}
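One caveat with the lock approach: if the database read is async, you can't await inside a lock. A variation on the same single-flight idea using SemaphoreSlim might look like this sketch; ReadFromCache, ReadFromDBAsync, putItemsInCache, and the CachedData return type are placeholders in the same spirit as the pseudo code above (the async DB read is an assumption).
// Async-friendly variation: only one caller at a time is allowed to hit the DB.
// CachedData and the helper methods are placeholders, as in the pseudo code above.
private static readonly SemaphoreSlim _dbGate = new SemaphoreSlim(1, 1);

public static async Task<CachedData> GetItemsAsync()
{
    var cachedItem = ReadFromCache();
    if (cachedItem != null)
    {
        return cachedItem;
    }

    await _dbGate.WaitAsync();
    try
    {
        // Re-check: another caller may have filled the cache while we waited.
        cachedItem = ReadFromCache();
        if (cachedItem != null)
        {
            return cachedItem;
        }

        var itemsFromDB = await ReadFromDBAsync();
        putItemsInCache(itemsFromDB);
        return itemsFromDB;
    }
    finally
    {
        _dbGate.Release();
    }
}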

API Breaks past 8-9 Devices?

It would seem that streaming breaks when there are too many devices on the account. After about 8 or 9 it just stops streaming data to me altogether.
Are you using Firebase? I came on here to post a similar issue myself. If I change the temp through the Nest device or the web tool, my Firebase listeners are updated. If I try to set a value, the value that I try to set is echoed back to my listener (as if there had been an update on the thermostat, even though it wasn't changed successfully), and then the correct value (unchanged) comes immediately after.
The weird thing is that it works... then it just doesn't. Is this similar to what you've been experiencing?
Update:
Now it appears as if my listeners are not working either. I can query the server using REST successfully.
Update #2:
Now my listeners are working again but still no control.
Update #3:
Well... I think I see my problem at least. I don't know if it will help you (or me for that matter) but here it is...
protected void setHighTemp(int value) {
    fb.child("target_temperature_high_f").setValue(value, new CompletionListener() {
        public void onComplete(FirebaseError arg0, Firebase arg1) {
            System.out.println("Communicaiton error: " + arg0);
        }
    });
}
Output:
Communicaiton error: FirebaseError: Too many requests
I remember reading the following paragraph in https://developer.nest.com/documentation/glossary#client
Client
An integration of your application or service with Nest devices. When you create a Nest account and sign up for the Developer Program, you can add up to 10 clients to the account.
This might be your problem.
