I have built a Windows Phone app. I send data to a database in Azure. I need this data to be processed and the results sent back to the user, so the app remains 'light'.
Sorry if the question is a bit general, but which service should I use to process data in the cloud?
Related
I am using GCP to manage IoT devices with IoT Core. The incoming telemetry triggers a Cloud Function which stores the data in Firestore.
I have been asked to send the telemetry to an Azure SQL database. I am not familiar with Azure, but with the products that both GCP and Azure provide, there must be a way to get this right.
The device sends an update to GCP once per minute.
My initial thought was to use a cloud function to "pass" the data on to Azure when it is received in GCP.
Thanks in advance
Azure has many IoT services, but for one message per minute they are overkill for your scenario. While there are many ways to achieve your goal, one would be to create a Google Cloud Function that sends the data each minute to an Azure Function. That function would then save the data to your Azure SQL Database.
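A minimal sketch of the forwarding side in Python, as a Pub/Sub-triggered background Cloud Function. The Azure Function URL, function key, and message field names are assumptions; the actual INSERT into Azure SQL would live in the Azure Function itself:

```python
import base64
import json
import urllib.request

# Hypothetical endpoint -- replace with your Azure Function's URL and key
# from the Azure portal.
AZURE_FUNCTION_URL = "https://example-fn.azurewebsites.net/api/telemetry?code=<function-key>"

def build_payload(device_id, telemetry):
    """Wrap one telemetry reading in the JSON body the Azure Function expects
    (field names are illustrative)."""
    return json.dumps({"deviceId": device_id, "telemetry": telemetry}).encode("utf-8")

def forward_telemetry(event, context):
    """Google Cloud Function entry point for the IoT Core Pub/Sub topic.

    Decodes the base64 message body and POSTs it on to the Azure Function,
    which performs the actual write to Azure SQL.
    """
    telemetry = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    body = build_payload(event["attributes"]["deviceId"], telemetry)
    req = urllib.request.Request(
        AZURE_FUNCTION_URL,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req, timeout=10)
```

At one message per minute this stays comfortably inside the free tiers of both Cloud Functions and an Azure Functions consumption plan.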
I want to send images from my system continuously to the Azure cloud and process the images in the cloud using Azure Stream Analytics.
Following are my requirements:
Send images from a client(my desktop) continuously to Azure.
Run my ML algorithm on the cloud on the received images.
Send the result(output image and metadata) back to the client(my system)
Which Azure services/products would help me do this in real time, and what are the steps?
Azure Stream Analytics would most likely not be a good fit for this (not saying that you couldn't use it), since I would assume your image data is rather large. You should instead look into Azure Machine Learning scoring web services. That way you can expose a web service that you can send your image to (e.g. with a POST request), score it, and get the result back.
However: there are many possible ways here, and it really depends on the actual problem you are trying to solve. (I almost voted to close this question)
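The client side of the scoring-web-service approach can be sketched in a few lines. The endpoint URL, API key, and response shape below are assumptions; the real values come from your deployed scoring service:

```python
import json
import urllib.request

# Hypothetical scoring endpoint and key -- substitute the values from your
# deployed Azure Machine Learning web service.
SCORING_URL = "https://example-ml.azurewebsites.net/score"
API_KEY = "<api-key>"

def build_score_request(image_bytes):
    """Build the POST request carrying one raw image to the scoring service."""
    return urllib.request.Request(
        SCORING_URL,
        data=image_bytes,
        headers={
            "Content-Type": "application/octet-stream",
            "Authorization": "Bearer " + API_KEY,
        },
        method="POST",
    )

def score_image(image_path):
    """Send one image and return the service's JSON reply
    (e.g. metadata plus a reference to the output image)."""
    with open(image_path, "rb") as f:
        req = build_score_request(f.read())
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

For "continuous" sending, the desktop client simply loops over new images and calls `score_image` for each; true streaming transports only become worth it at much higher frame rates.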
I have several Azure WebJobs (.NET Framework, not .NET Core) running which interact with an Azure Service Bus. Now I want a convenient way to store and analyze their log messages (including the related message from the Service Bus). We are talking about a lot of log messages per day.
My idea is to send the logs to an Azure Event Hub and store them in an Azure SQL database. Later I can have, for example, a web app that lets users conveniently browse and analyze the logs and view the messages.
Is this a bad idea? Should I use Application Insights instead?
Application Insights would likely cost more than your own implementation, so I would say this is a good idea. One change: I would send the logs through Logic Apps and do some processing there, such as handling error logs and info logs differently. Also, why are you thinking about SQL when the logs could be stored in Azure Table storage (non-SQL) and fetched from there?
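Whichever store you pick (SQL or Table storage), a consistent event schema is what makes the later browsing and analysis easy. A minimal sketch of shaping one WebJob log entry for Event Hubs; every field name here is an assumption to adapt to your own tables:

```python
import json
from datetime import datetime, timezone

def build_log_event(level, message, service_bus_message_id, properties=None):
    """Shape one WebJob log entry as the record sent to Event Hubs.

    The field names are illustrative -- pick whatever schema the SQL table
    (or Azure Table storage) on the other end will store.
    """
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "level": level,                              # e.g. "Error", "Info"
        "message": message,
        "serviceBusMessageId": service_bus_message_id,  # ties the log to the bus message
        "properties": properties or {},
    }

def serialize(event):
    """Event Hubs carries opaque bytes; send the record as UTF-8 JSON."""
    return json.dumps(event).encode("utf-8")
```

Keeping the Service Bus message ID in every record is the piece that lets the browsing web app join a log line back to the message that caused it.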
Following is the proposed transition in our application:
Web Application is deployed in on-premises IIS (Web Server 1).
Web Application has one functionality (for example, Generate Invoice for selected customer).
For each new request of Generate Invoice, the web application is writing message to the Azure Service Bus Queue.
Azure function gets triggered for each new message in Azure Service Bus Queue.
Azure function triggers Web API (deployed on-premises).
Web API generates Invoice for the customer and stores in the local file storage.
As of now we have everything set up on-premises, and instead of Service Bus and an Azure Function we consume the Web API directly. With this infrastructure in place, we currently log all events in a MongoDB collection and provide a single consolidated view to the user, so they can identify what happened to a Generate Invoice request, and at which stage and with which error it failed (in case of failures).
With the new proposed architecture, we are identifying ways to do logging and tracing here, and to display a consolidated view to the users.
The only option I can think of is to log all events in Azure Cosmos DB from everywhere (i.e., website, Service Bus, function, Web API), and then provide a consolidated view.
Does the suggested approach look OK, or does anyone have a better solution?
Application Insights monitors the availability, performance, and usage of your web applications whether they're hosted in the cloud or on-premises. It leverages the powerful data analysis platform in Azure Monitor to provide you with deep insights into your application's operations and diagnose errors without waiting for a user to report them.
Workbooks combine data visualizations, Analytics queries, and text into interactive documents. You can use workbooks to group together common usage information, consolidate information from a particular incident, or report back to your team on your application's usage.
For more details, you could refer to this article.
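Whether the sink is Application Insights or Cosmos DB, the consolidated view only works if every hop (website, function, Web API) logs against one shared request identifier, generated where the Generate Invoice request enters the system and passed along in the Service Bus message properties. A sketch with illustrative field names, not a fixed schema:

```python
import uuid
from datetime import datetime, timezone

def new_correlation_id():
    """Generated once per Generate Invoice request, at the website,
    then carried in the Service Bus message properties."""
    return str(uuid.uuid4())

def log_event(correlation_id, component, status, detail=""):
    """One event document. Every component writes the same shape, keyed by
    the shared correlation id, so the consolidated view is a single query
    per request (field names are illustrative)."""
    return {
        "id": str(uuid.uuid4()),
        "correlationId": correlation_id,
        "component": component,   # "Website" | "AzureFunction" | "WebAPI"
        "status": status,         # "Started" | "Succeeded" | "Failed"
        "detail": detail,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
```

In Application Insights the equivalent concept is the operation ID that groups related telemetry, so the same idea carries over if you take the answer's suggestion instead of hand-rolling a Cosmos DB store.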
I'm trying to create a complete solution to present data from IoT devices on a webpage.
The data and devices will never number in the millions, so using Stream Analytics, Machine Learning, Big Data, etc. is costly and unnecessary.
I've looked at docs, blogs, and forums for weeks now, and I'm stuck on how to process the messages that the IoT Hub receives. I want to save them to a SQL database and then build a website that will present them to the users.
What I have so far:
1. Device part
Raspberry Pi 3 has Windows IoT Core installed
Messages are sent and received on both Hub and Device ends successfully
(verified with Device Explorer and IoT hub dashboard)
2. Processing part
The most similar approach is detailed here, but I don't want to use NoSQL. I've tried to use the Azure Function with the External Table (experimental) binding, but there is zero documentation for it and all my attempts failed with a function error.
Now I'm trying to connect a WebJob to process IoT Hub messages, but I can't find any relevant samples or docs. Essentially I'd want to convert a console app to a WebJob which is triggered when a message arrives at the IoT Hub.
3. Webpage part
Once I get the messages into the SQL database, I will create my custom portal for managing and registering devices, issuing one-off commands to devices, and for request-response data.
The telemetry will be queried from the database and presented statically or in near real time (with SignalR), by device type, location, user privileges, etc. This part is pretty clear to me.
Can anyone please help me out with the processing part?
I found a solution using Azure WebJobs; this article explains how to tie an Event Hub (IoT Hub) to the WebJob.
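Whatever consumer reads the hub's Event Hub-compatible endpoint (the WebJob above, or a quick prototype), the per-message work is the same: parse the telemetry and run one parameterized INSERT. A sketch, assuming a JSON message body and an illustrative table and field names:

```python
import json

# Illustrative table -- adjust columns to your actual telemetry shape.
INSERT_SQL = (
    "INSERT INTO Telemetry (DeviceId, RecordedAt, Temperature, Humidity) "
    "VALUES (?, ?, ?, ?)"
)

def message_to_row(body, device_id):
    """Turn one IoT Hub message body (UTF-8 JSON; field names assumed)
    into the parameter tuple for the INSERT above."""
    data = json.loads(body.decode("utf-8"))
    return (device_id, data["time"], data["temperature"], data["humidity"])

# In the WebJob (or any consumer of the Event Hub-compatible endpoint),
# each received message then becomes one call along the lines of:
#   cursor.execute(INSERT_SQL, message_to_row(event_body, device_id))
```

Keeping the INSERT parameterized (rather than string-formatting values in) matters even for device data, since message contents shouldn't be trusted any more than user input.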