I have a static .NET web application with the Application Insights SDK. How do I send Application Insights data to Azure Event Hubs? I have successfully used the Azure Continuous Export feature, but I would rather send the telemetry data to the Event Hub.
To explicitly send data to an Event Hub you will need the Event Hubs SDK, which is currently available for .NET/C#, Java, and Node.js, or the REST API. For your case, a web application, sending via the REST API might be the easiest way. Take a look at the API reference for more information: https://msdn.microsoft.com/en-us/library/azure/dn790674.aspx
One catch is that receiving events is not currently supported over REST; you would still need a .NET or Java application on the receiving side.
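As a rough illustration, sending a single event over the REST API boils down to a POST with a Shared Access Signature token in the Authorization header. Here is a minimal Python sketch of the token generation; the namespace, hub, and policy names are placeholders, not values from any real setup:

```python
import base64
import hashlib
import hmac
import time
import urllib.parse

def make_sas_token(resource_uri: str, key_name: str, key: str, ttl_seconds: int = 3600) -> str:
    """Build a Shared Access Signature token for the Event Hubs REST API."""
    expiry = str(int(time.time()) + ttl_seconds)
    encoded_uri = urllib.parse.quote_plus(resource_uri)
    string_to_sign = (encoded_uri + "\n" + expiry).encode("utf-8")
    signature = hmac.new(key.encode("utf-8"), string_to_sign, hashlib.sha256).digest()
    encoded_sig = urllib.parse.quote_plus(base64.b64encode(signature))
    return (f"SharedAccessSignature sr={encoded_uri}"
            f"&sig={encoded_sig}&se={expiry}&skn={key_name}")

# Hypothetical namespace/hub/policy - replace with your own values.
uri = "https://mynamespace.servicebus.windows.net/myhub"
token = make_sas_token(uri, "SendPolicy", "base64-key-goes-here")

# Sending is then a single POST to <uri>/messages with the token in the
# Authorization header (via urllib.request here, or HttpClient in .NET):
#   POST https://mynamespace.servicebus.windows.net/myhub/messages
#   Authorization: <token>
#   Content-Type: application/atom+xml;type=entry;charset=utf-8
```

The same token scheme works from any language that can do HMAC-SHA256, which is what makes the REST route attractive for a web app.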
If you are looking for a common logging framework - one that can be configured to send data to whatever destination you want - you should consider log4net.
Here's a good implementation of a log4net appender for Event Hubs.
@greypanda,
As you know, Continuous Export currently only exports Application Insights data to blob storage, from which you can pick it up for use in any workstream you want. Exporting directly to an Event Hub is something that could become a future feature, so please log this at our UserVoice site: https://visualstudio.uservoice.com/forums/357324-application-insights.
We will also have a set of REST APIs for Application Insights soon (see https://visualstudio.uservoice.com/forums/357324-application-insights/suggestions/4999529), which might help you.
I would like to learn more about your scenario so I can better help you in this instance and improve our export and API features. Feel free to reply here or, if you'd like, shoot me a mail offline.
Thank you
Dale Koetke (dalek#microsoft.com)
We don't really support that. It's a lot easier to let the SDK send the data to the App Insights portal; then you can use Continuous Export to move it out into Storage. If you want, you can use Stream Analytics to move it on from there.
What do you plan to ultimately do with the data? (I mean, why an event hub...?)
I am having trouble identifying the best "tool" to solve this problem. I am using a Python library which publishes its data via Server-Sent Events (SSE) (see https://github.com/wattsight/wapi-python/blob/development/wapi/events.py).
I would like to constantly listen for new events. However, I am not sure which tool in Azure is appropriate. An Azure Function would have to run continuously, which seems like a misuse; SignalR requires control over the "sender" of the events; and I don't know whether Event Hubs would be able to manage that job.
Thank you for letting me learn from your experience.
Azure Event Hubs is the right service to receive new events. Besides that, it also provides other benefits like scalability, event storage, etc.
You can also consider using an Azure Function with an Event Hub trigger, but note that it has a limit on the maximum incoming message size.
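For the listening side, a continuously running worker (a WebJob, a container, or a Function on a plan that allows long-running execution) that reads the SSE stream and forwards each frame to Event Hubs fits better than a timer-bound Function. As a stdlib-only sketch of just the parsing step, the frame handling follows the SSE wire format, and the forwarding call is only indicated in a comment:

```python
def parse_sse_event(raw: str) -> dict:
    """Parse one Server-Sent-Events frame (the lines up to a blank line)
    into its fields; multiple 'data' lines are joined with newlines,
    per the SSE spec."""
    event = {"event": "message", "data": []}
    for line in raw.splitlines():
        if not line or line.startswith(":"):
            continue  # blank line or comment line
        field, _, value = line.partition(":")
        value = value[1:] if value.startswith(" ") else value
        if field == "data":
            event["data"].append(value)
        elif field in ("event", "id", "retry"):
            event[field] = value
    event["data"] = "\n".join(event["data"])
    return event

# Hypothetical frame, shaped like what an SSE endpoint might emit:
frame = 'event: price_update\ndata: {"curve": 42}\n'
parsed = parse_sse_event(frame)
# parsed["data"] now holds the JSON payload, ready to be forwarded to
# Event Hubs from the same long-running loop - e.g. with
# EventHubProducerClient.send_batch from the azure-eventhub package.
```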
I am exploring the Azure IoT Central REST APIs to create a custom Angular client. Is this possible, or does it have limitations? IoT Central is attractive due to its pricing. Specifically, I found that retrieving multiple telemetry values isn't possible, per the following documentation page, which means you have to send an individual "get" request for each telemetry value.
Azure IoT Central (get telemetry value)
Is there a way to register a callback and receive regular updates of the values, as with Event Hubs? Basically I want to develop a custom client-facing app with the IoT Central pricing. Is it possible?
It is possible. To receive regular updates on telemetry you can use continuous data export, which can export to Service Bus, Event Hubs, and Blob Storage. See the documentation here on how to set that up. You can receive those events in any JavaScript application.
Please be aware that continuous data export will give you updates from all devices; if you only need some of them, you will have to do the filtering yourself. One example I have built in the past is a .NET Core application that listens to the messages and sends them to the different clients over SignalR.
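To illustrate the filtering idea, here is a small Python sketch. The `deviceId` field name is an assumption about the exported payload shape; check it against your own exported messages before relying on it:

```python
import json

def events_for_device(messages, device_id):
    """Keep only the exported telemetry messages for one device.
    Assumes each exported message body carries a 'deviceId' property;
    the exact field name can differ by export version, so verify it
    against a real payload."""
    selected = []
    for raw in messages:
        body = json.loads(raw)
        if body.get("deviceId") == device_id:
            selected.append(body)
    return selected

# Hypothetical batch of exported messages:
batch = [
    '{"deviceId": "sensor-1", "temperature": 21.5}',
    '{"deviceId": "sensor-2", "temperature": 19.0}',
]
mine = events_for_device(batch, "sensor-1")
# mine now holds only sensor-1's telemetry, ready to push to that
# client's SignalR group.
```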
I am looking to build an endpoint capable of receiving JSON objects and saving them into ADLS. So far I have tried several different combinations of Functions, Event Hubs, and Stream Analytics. The problem is: no solution so far seems ideal.
TL;DR: In my scenario, a set of users will send me JSON data through an API, and I need to save it inside ADLS, separated by user. What is the best way of doing so?
Could anyone shed some light? Thanks in advance.
WARNING: LONG TEXT AHEAD
Let me explain my findings so far:
Functions
Advantages
single solution approach - solving the scenario with a single service
built-in authorization
organization - saving user's files to separate folders inside ADLS
HTTP endpoint - to send data only a POST is required
cheap & pay-as-you-go - charged per request
Disadvantages
bindings & dependencies - Functions doesn't have an ADLS binding. To authorize against and use ADLS, I need to install extra dependencies and manually manage the credentials. I was only able to do this in C# and haven't tested other languages, which may be a further drawback, though I can't confirm.
file management - saving one file per request is not recommended for ADLS. The alternative is to append to files and manage their size, which means more code compared to the other solutions.
Event Hub
Advantages
no code at all - all I need is to enable the Capture feature
Disadvantages
one event hub per user - the only way to separate data inside ADLS via Event Hubs' Capture feature is to use one event hub per user
price - capturing with one event hub per user increases the price drastically
authorization - sending events is not as trivial as doing a POST
Functions + Event Hub
Using Event Hubs with Functions mitigates the Functions disadvantages, but it keeps the same drawbacks (except auth) as Event Hubs alone.
Functions + Event Hub + Stream Analytics
Although this would let me have a single event hub without Capture, using Stream Analytics SQL as a filter to direct each user's data to its specific folder, the SQL itself becomes a limiting factor: I have tried it, and it gets slower as the SQL grows.
IoT Hub
IoT Hub has routing, but it is not as dynamic as I require.
Could anyone shed some light? Thanks in advance.
I don't quite see the disadvantages of using only Azure Functions to write data to ADLS.
As long as you don't write lots of small files, writing one file per request should not really be an issue.
Using the .NET SDK should be pretty straightforward, even without an existing binding.
To solve the authentication piece: use Managed Service Identity (MSI) and store your client secrets in Key Vault. MSI support in the SDK is apparently on the roadmap and would then make this very easy indeed.
You save yourself the extra cost of an Event Hub, and I don't see the real added value of one here.
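To sketch the "one file per request, but organized" approach: a pure path-building helper like the one below keeps each user's files in date-partitioned folders, which makes later compaction per partition straightforward. This is only one possible layout convention, and the actual upload would go through the ADLS SDK, which is not shown:

```python
import uuid
from datetime import datetime, timezone
from typing import Optional

def adls_blob_path(user_id: str, when: Optional[datetime] = None) -> str:
    """Build a per-user, date-partitioned file path, e.g.
    users/alice/2020/05/01/123000-1a2b3c4d.json, so small files stay
    grouped and can later be compacted partition by partition."""
    when = when or datetime.now(timezone.utc)
    return (f"users/{user_id}/{when:%Y/%m/%d}/"
            f"{when:%H%M%S}-{uuid.uuid4().hex[:8]}.json")

# Example with a fixed timestamp (the random suffix avoids collisions
# when two requests land in the same second):
path = adls_blob_path(
    "alice", datetime(2020, 5, 1, 12, 30, 0, tzinfo=timezone.utc))
```

A Function handling the POST would compute such a path and hand it, plus the request body, to the ADLS client.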
I'd like to build a small solution on Azure for practice. I'd be sending data from some devices via IoT Hub, and I need some way to interpret this data and run the appropriate queries against Azure SQL.
Basically I would need a way to have my program running all the time, able to:
listen to events from IoT Hub
interpret event information and save/get data to/from the database
send a message to some device via IoT Hub
Which service would be good for that? Am I able to use Entity Framework?
In an ideal solution I'd create a C# program to do what I need and have it running in Azure, waiting for events from IoT Hub and with access to my database - is that even possible?
I'm very new (rather, completely new) to cloud solutions, so I'd be really grateful for any advice. Currently I feel completely lost in all these Azure services.
There is quite some documentation on Azure IoT and how one might architect an IoT solution.
The IoT documentation is the obvious first step to get an overview of what Azure offers. There are some nice 'Getting Started' walkthroughs as well.
Take a look at this IoT reference architecture. Quite helpful to get an overview.
There are tons of links and interesting examples for Azure IoT. Just google around.
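To make the question's three steps a bit more concrete: the usual pattern is a continuously running worker (a console app in a WebJob or container, or an ASP.NET Core BackgroundService) reading from IoT Hub's built-in Event Hubs-compatible endpoint. Here is a sketch of only the "interpret" step: mapping one message into a flat record that could become an Entity Framework entity on the .NET side. The payload property names, and even the system-property keys, are assumptions to verify against real messages:

```python
import json

def event_to_row(event_body: str, system_props: dict) -> dict:
    """Map one IoT Hub device-to-cloud message into a flat record,
    ready for an INSERT (or an EF entity on the .NET side).
    The payload keys ('temperature', 'humidity') are assumptions -
    match them to what your devices actually send."""
    payload = json.loads(event_body)
    return {
        "device_id": system_props.get("iothub-connection-device-id"),
        "enqueued_utc": system_props.get("iothub-enqueuedtime"),
        "temperature": payload.get("temperature"),
        "humidity": payload.get("humidity"),
    }

# Hypothetical message, shaped like a device-to-cloud event:
row = event_to_row(
    '{"temperature": 22.4, "humidity": 51}',
    {"iothub-connection-device-id": "dev-1",
     "iothub-enqueuedtime": "2020-05-01T12:00:00Z"},
)
```

The listen and reply steps would wrap this: the Event Hubs consumer client feeds messages in, and the IoT Hub service client sends cloud-to-device messages back out.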
My problem is a simple one. I'm working on a project that uses Microsoft Azure Service Bus to send messages asynchronously between different modules on different virtual machines. A lot of messages are sent through this bus, so we want some indicators of its performance and other usage information. Why? Because when everything is working, users are happy. When the system is slow, we want to show the user some interesting graphs, statistics, meters and other gadgets that indicate whether there's a problem within Azure or with something else. To do this, I need data about the usage of the Azure Service Bus.
So, which Azure APIs are available to expose what kind of (diagnostic) information about the Service Bus?
(Users should have no access to Azure itself! They should just see some performance data to reassure them Azure is working fine. Or else I could look at it, discover a problem, fix it, and make users happy again.)
To elaborate on what I'm looking for: the Azure portal has some nice charts, when you click Monitor for the Service Bus entity, showing overviews of the number of incoming messages, the number of errors and their types, size information, and the number of successful operations, all for a specified period. It would be nice if I could receive this data within my project.
The entity metrics API will give you the exact data the portal is using:
http://msdn.microsoft.com/en-us/library/windowsazure/dn163589.aspx
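For orientation, the request URL that the page describes can be assembled like this. The path segments below mirror the classic entity-metrics reference linked above, but this older API has changed over time, so double-check every segment against the docs before use:

```python
import urllib.parse

def metrics_url(subscription_id: str, namespace: str, queue: str,
                metric: str, rollup: str, start_iso: str) -> str:
    """Assemble a (classic) Service Bus entity-metrics request URL.
    The path layout here follows the MSDN reference; treat it as an
    illustration to verify against the docs, not a guaranteed contract."""
    base = (f"https://management.core.windows.net/{subscription_id}"
            f"/services/ServiceBus/Namespaces/{namespace}"
            f"/Queues/{queue}/Metrics/{metric}"
            f"/Rollups/{rollup}/Values")
    # OData filter restricting the result window; quote() encodes the
    # spaces and quotes but preserves '$' and '='.
    flt = urllib.parse.quote(
        f"$filter=timestamp gt datetime'{start_iso}'", safe="=$")
    return f"{base}?{flt}"

# Hypothetical subscription/namespace/queue names:
url = metrics_url("sub-id", "myns", "myqueue",
                  "incoming", "PT5M", "2014-10-01T00:00:00Z")
```

The request itself then needs a management certificate or token, which is where most of the setup effort goes (see the answer below for a worked C# library).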
Here's a Subscribe! episode I recorded with Rajat on the topic http://channel9.msdn.com/Blogs/Subscribe/Service-Bus-Namespace-Management-and-Analytics
I've spent quite some time making the entity metrics API work, so I decided to share the results.
Here is a full C# code example of how to consume those APIs:
github repository.
It's a small library which wraps the HTTP request into strongly typed .NET classes. You can also grab it from NuGet.
Finally, here is my blog post with the walkthrough.