Just wondering if I could get some help with Azure IoT Hub and Stream Analytics.
I'm using this guide to build my simulated device: https://learn.microsoft.com/en-us/azure/iot-hub/quickstart-send-telemetry-python
However, whenever I try to extend the JSON telemetry message to include more key-value pairs, Stream Analytics always gives me this error:
Source '<unknown_location>' had 1 occurrences of kind 'InputDeserializerError.InvalidData' between processing times '2020-07-14T02:35:47.4125308Z' and '2020-07-14T02:35:47.4125308Z'. Could not deserialize the input event(s) from resource 'Partition: [2], Offset: [806016], SequenceNumber: [1698], DeviceId: [testdevicereal]' as Json. Some possible reasons: 1) Malformed events 2) Input source configured with incorrect serialization format
I've checked my JSON formatting and it seems fine. Any clues?
Deserialization issues occur when the input stream of your Stream Analytics job contains malformed messages. For example, a malformed message could be caused by a missing brace or bracket in a JSON object, or by an incorrect timestamp format in the time field.
Enable resource logs to view the details of the error and the message (payload) that caused it. Deserialization errors can occur for multiple reasons; for more information about specific ones, see Input data errors. If resource logs are not enabled, only a brief notification will be available in the Azure portal.
Please see Troubleshoot input connections for more details.
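In the Python quickstart, this often happens when extra fields are spliced into the `MSG_TXT` template string by hand, since a missing quote or comma around a new field silently produces invalid JSON. A safer approach (a minimal sketch; the extra field names here are made up) is to build the payload as a dict and let `json.dumps` do the serialization, which cannot produce unbalanced braces or unquoted keys:

```python
import json

def build_telemetry(temperature, humidity, extra_fields=None):
    """Build the telemetry payload as a dict and serialize it with
    json.dumps, so the result is always valid JSON no matter how
    many key-value pairs are added."""
    payload = {"temperature": temperature, "humidity": humidity}
    if extra_fields:
        payload.update(extra_fields)  # e.g. {"pressure": 1013.2} (hypothetical field)
    return json.dumps(payload)

# Extending the message with an extra key-value pair:
msg = build_telemetry(22.5, 60.1, {"pressure": 1013.2})
```

The resulting string can be passed to the quickstart's `Message(...)` constructor in place of the formatted template.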
I followed the link below and created an input as shown:
https://learn.microsoft.com/en-us/azure/event-hubs/event-hubs-tutorial-visualize-anomalies
The Test button in the red box confirms the connection is OK.
However, the Query page shows the error below:
Unable to connect to input source at the moment. Please check if the
input source is available and if it has not hit connection limits.
The screenshot below shows that the incoming messages are already arriving in the event hub.
I tried using both a connection string and managed identity (MI) for the input, but I am still getting the error.
I can send messages to and receive them from the event hub by following the link below:
https://learn.microsoft.com/en-us/azure/event-hubs/event-hubs-dotnet-standard-getstarted-send
To test your query against a specific time range of incoming events, select Select time range.
You may check out the MS Q&A thread addressing a similar issue.
For more details, refer to Test an Azure Stream Analytics job in the portal.
All, can anybody help me with a solution for capturing malformed JSON coming from Event Hubs into Azure Stream Analytics?
My use case is: I receive JSON records on the event hub (input: Event Hubs) -> parse the JSON in Azure Stream Analytics -> place the parsed data in Azure SQL DWH.
Whenever malformed JSON arrives on the event hub, ASA drops that event and parses only the valid JSON. I need to capture those malformed events and report them to the source application. Could you please let me know how to do this?
Thanks,
Aditya
Please take a look at this article; it seems it can meet your needs.
When a Stream Analytics job receives a malformed message from an input, it drops the message and notifies you with a warning. Then follow the steps in "What caused the deserialization error" to find the JSON data with the incorrect format.
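Since ASA itself only drops the event and logs a warning, one workaround is to validate the JSON yourself on a separate consumer of the same event hub (e.g. a second consumer group) and route parse failures to a dead-letter store you can report back to the source application. A minimal sketch of just the routing logic, assuming the dead-letter sink is a plain list (in practice it could be a blob container or a second event hub):

```python
import json

def route_event(raw_bytes, good_events, dead_letter):
    """Try to parse an event body as JSON. On success, append the
    parsed record to good_events; on failure, capture the raw
    payload and the parse error in dead_letter for reporting."""
    try:
        good_events.append(json.loads(raw_bytes))
    except (json.JSONDecodeError, UnicodeDecodeError) as err:
        dead_letter.append({"raw": raw_bytes, "error": str(err)})

good, bad = [], []
route_event(b'{"id": 1, "value": 42}', good, bad)  # well-formed
route_event(b'{"id": 2, "value": ', good, bad)     # malformed: truncated JSON
```

In a real pipeline this function would sit inside an Event Hubs receive callback (for example with the azure-eventhub SDK), running independently of ASA so the job still reads the stream directly.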
We've been having a few issues with Azure IoT Hub. I have a Stream Analytics job listening to an IoT Hub. The job, which was working perfectly fine, just started showing no input and output. On restart it came up with the following error: "Stream Analytics job has validation errors: Querying EventHub returned an error: ProtocolName." This sort of indicates to me that it can't listen to the IoT Hub anymore. Has anyone else had similar issues? Help on troubleshooting this would be great.
There was an issue with Event Hubs that has since been addressed, which should fix this. If the problem persists, please contact support.
Is there a way to capture and redirect data error events/rows to a separate output?
For example, say I have events coming through and for some reason there are data conversion errors. I would like to handle those errors and do something, probably a separate output for further investigation?
Currently, with the Stream Analytics error policy, if an event fails to be written to the output we only have two options:
Drop - drops the event, or
Retry - retries writing the event until it succeeds.
Collecting all error events is not currently supported. You can enable diagnostic logs to get a sample of every kind of error at frequent intervals.
Here is the documentation link.
If there is a way for you to filter such events in the query itself, then you could redirect such events to a different output and reprocess that later.
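For conversion errors specifically, the ASA query language offers TRY_CAST, which returns NULL instead of failing on a bad conversion, so you can split the stream between a main output and an errors output. Here is the routing idea sketched in Python (the field name `value` and the two sinks are hypothetical, purely to illustrate the split):

```python
def split_by_conversion(events, good, errors):
    """Mimic a TRY_CAST-style split: events whose 'value' field
    converts cleanly to float go to 'good'; events with a missing
    or non-numeric 'value' go to 'errors' for later investigation."""
    for ev in events:
        try:
            ev["value"] = float(ev["value"])
            good.append(ev)
        except (ValueError, TypeError, KeyError):
            errors.append(ev)

good, errors = [], []
split_by_conversion(
    [{"value": "42.0"}, {"value": "not-a-number"}, {}],
    good, errors,
)
```

In the actual job you would express the same split as two SELECT ... INTO statements, one filtering on TRY_CAST(...) IS NOT NULL and one on TRY_CAST(...) IS NULL.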
I'm outputting a Stream Analytics job to PowerBI.com. It successfully sends the first 11-100 messages just fine, but after that it fails. The operational log says the operation "failed to send events" and categorizes it as a "PowerBIOutputAdapterTransientError" without much other information. What are the symptoms of this type of error?
Messages are still going through Event Hubs, but all operations seem to be halted on the Power BI side.
Looks like this was a transient service issue.