Application Insights and Stream Analytics - Azure

I am trying to read Application Insights export data using Stream Analytics. Here is what my blob looks like.
In Stream Analytics I reference this blob and try to read the files using the "download sample data" functionality, but I do not get any data.
I am also setting the path prefix pattern to democenteralinsightscollector_5cc2f280d52d47bdbe186e87d8037fc0/Requests/{date}/{time}.

The following links will help you with the process:
http://azure.microsoft.com/en-us/documentation/videos/export-to-power-bi-from-application-insights/
https://azure.microsoft.com/en-us/documentation/articles/app-insights-export-power-bi/

Actually, I was trying to do the same and was able to get Stream Analytics to read the Application Insights blob export, but where this fails is that the JSON emitted by Application Insights contains entries such as:
"anonAcquisitionDate":"0001-01-01T00:00:00Z","authAcquisitionDate":"0001-01-01T00:00:00Z"
which cause the Stream Analytics input to fail.
The Stream Analytics operation log has the following:
First Occurred: 04/30/2015 03:06:22 | Resource Name: appinsightsevents | Message: Failed to convert values to match the column type for column anonAcquisitionDate
So, basically, Stream Analytics cannot process the input.
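The answer above doesn't offer a fix, but one conceivable workaround (my assumption, not something from the thread) is to pre-process the exported blobs before Stream Analytics reads them, replacing the "0001-01-01T00:00:00Z" placeholder dates that fail column conversion. A minimal Python sketch, assuming the export files are line-delimited JSON and handling only top-level fields; paths and names are hypothetical:

import json

ZERO_DATE = "0001-01-01T00:00:00Z"

def clean_export_file(in_path, out_path):
    # Application Insights export writes one JSON document per line.
    with open(in_path) as src, open(out_path, "w") as dst:
        for line in src:
            record = json.loads(line)
            # Replace placeholder dates with nulls so the Stream Analytics
            # input can treat them as missing values instead of failing.
            for key, value in list(record.items()):
                if value == ZERO_DATE:
                    record[key] = None
            dst.write(json.dumps(record) + "\n")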

Related

Azure Node Function w/ Eventhub output binding to dynamically routed ADX table

I'm having difficulty outputting from my Function to an Event Hub and finally into ADX when I want to route to a target table. I have had no issues hitting target tables with the Node SDK via EventHubProducerClient, in which case you simply specify the routing properties next to the body of the event you're sending:
{
  body: {
    some: "fieldValue"
  },
  properties: {
    Table: 'table_name',
    Format: 'JSON',
    IngestionMappingReference: 'table_name_mapping'
  }
}
But doing the same thing with the output binding documented in Azure Event Hubs output binding for Azure Functions, where the messages I push take the form of the JS object above, does not work. The SDK documentation is not helpful.
I can confirm that the data does flow from the Function to the Event Hub and into ADX if, and only if, I change the ADX data connection for that Event Hub to target a specific table (the opposite of the behavior I want), which is documented in Ingest data from event hub into Azure Data Explorer.
Any help would be greatly appreciated; this seems so silly!
The returned object is set as the data payload of the Event Hub message that the Azure Functions runtime sends. Unfortunately, there is no way to change this from the JS Function itself.
In a C# Function you can return an EventData object, but this isn't supported in non-C# languages.
Your only option, if you need your Function to be in JS, is to use the Event Hub SDK directly.
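For illustration, here is a minimal sketch of that direct-SDK pattern using the Python azure-eventhub package (the Node SDK pattern the question already uses is analogous); the connection string, hub name, and table names are placeholders:

import json
from azure.eventhub import EventHubProducerClient, EventData

producer = EventHubProducerClient.from_connection_string(
    "<connection-string>", eventhub_name="<hub-name>")

event = EventData(json.dumps({"some": "fieldValue"}))
# Application properties travel alongside the body; ADX reads its
# routing hints (Table, Format, IngestionMappingReference) from here.
event.properties = {
    "Table": "table_name",
    "Format": "JSON",
    "IngestionMappingReference": "table_name_mapping",
}

batch = producer.create_batch()
batch.add(event)
producer.send_batch(batch)
producer.close()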

Azure Media Services -- Create Live Output and Streaming Locator with Python SDK

I am working on a project that uses the Azure Media Services Python SDK (v3). I have the following code which creates a live output and a streaming locator once the associated live event is running:
# Step 2: create a live output (used to reference the manifest file)
live_outputs = self.__media_services.live_outputs
config_data_live_output = LiveOutput(
    asset_name=live_output_name,
    archive_window_length=timedelta(minutes=30))
output = live_outputs.create(
    StreamHandlerAzureMS.RESOUCE_GROUP_NAME,
    StreamHandlerAzureMS.ACCOUNT_NAME,
    live_event_name,
    live_output_name,
    config_data_live_output)

# Step 3: get a streaming locator (the ID of the locator is used in the URL)
locators = self.__media_services.streaming_locators
config_data_streaming_locator = StreamingLocator(asset_name=locator_name)
locator = locators.create(
    StreamHandlerAzureMS.RESOUCE_GROUP_NAME,
    StreamHandlerAzureMS.ACCOUNT_NAME,
    locator_name,
    config_data_streaming_locator)
self.__media_services is an object of type AzureMediaServices. When I run the code above, I receive the following exception:
azure.mgmt.media.models._models_py3.ApiErrorException: (ResourceNotFound) Live Output asset was not found.
Question: Why is Azure Media Services throwing this error with an operation that creates a resource? How can I resolve this issue?
Note that I have managed to authenticate the SDK to Azure Media Services using a service principal and that I can successfully push video to the live event using ffmpeg.
I suggest that you take a quick look at the flow of a live event in this tutorial, which unfortunately is in .NET; we are still working on updating the Python samples.
https://learn.microsoft.com/en-us/azure/media-services/latest/stream-live-tutorial-with-api
But it should help with the issue. The first problem I see is that you likely did not create the Asset for the Live Output to record into.
You can think of Live Outputs as "tape recorder" machines, and the Assets as the tapes. Assets are locations in your storage account that the tape recorder writes to.
So after you have the Live Event running, you can have up to 3 of these "tape recorders" operating and writing to 3 different "tapes" (Assets) in storage.
1. Create an empty Asset.
2. Create a Live Output and point it to that Asset.
3. Get the streaming locator for that Asset, so you can watch the tape. Notice that you are creating the streaming locator on the Asset you created in step 1. Think of it as "I want to watch this tape", not "I want to watch this tape recorder". (A sketch of these steps follows.)
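To make that concrete, here is a minimal sketch of the three steps with the v3 Python SDK. client stands for the AzureMediaServices object from the question (self.__media_services); the resource group, account, and name variables are placeholders, and the predefined streaming policy name is my assumption for a simple clear-streaming setup:

from datetime import timedelta
from azure.mgmt.media.models import Asset, LiveOutput, StreamingLocator

# client is the AzureMediaServices instance from the question;
# rg, account, and the *_name variables are placeholder strings.

# Step 1: create the empty Asset (the "tape") the Live Output records into.
client.assets.create_or_update(rg, account, asset_name, Asset())

# Step 2: create the Live Output (the "tape recorder") pointing at that Asset.
live_output = LiveOutput(
    asset_name=asset_name,
    archive_window_length=timedelta(minutes=30))
client.live_outputs.create(
    rg, account, live_event_name, live_output_name, live_output).result()

# Step 3: create the streaming locator on the Asset from step 1,
# not on the Live Output ("watch the tape, not the tape recorder").
locator = StreamingLocator(
    asset_name=asset_name,
    streaming_policy_name="Predefined_ClearStreamingOnly")
client.streaming_locators.create(rg, account, locator_name, locator)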

Jira Rest API Calls in Azure Data Factory

Good Day
I configured a Copy Data pipeline in Azure Data Factory to extract data from Jira with an API call, using the REST API connector in Azure.
When I configure and test the connection, it is successful.
But when I try to preview the data in the Copy activity, I get the following error.
Does anyone know what this error means and how I can get past it?
I believe I am not the first one trying to extract data from Jira via the REST API.
Thank you and regards,
Rayno
Error occurred when deserializing source JSON file. Check if the data is in valid JSON object format. Unexpected character encountered while parsing value: <. Path .....
I think the error already indicates the root cause: your data is not in valid JSON format. You could try simulating the REST API call yourself to confirm whether that is the case; ADF can't handle deserialization of invalid JSON for you.
In addition, according to the connector documentation, ADF has a native Jira connector. Maybe you could try that instead.
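To simulate the REST API call as suggested, a quick check with Python's requests library will show whether Jira is actually returning JSON or an HTML page (the "<" in the error usually means HTML, such as a login page). The URL, JQL, and credentials below are placeholders:

import requests

resp = requests.get(
    "https://your-domain.atlassian.net/rest/api/2/search",
    params={"jql": "project = MYPROJECT", "maxResults": 50},
    auth=("user@example.com", "<api-token>"),
    headers={"Accept": "application/json"},
)
print(resp.status_code, resp.headers.get("Content-Type"))
data = resp.json()  # raises ValueError if the body is not valid JSON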

How do I query GCP log viewer and obtain json results in Python 3.x (like gcloud logging read)

I'm building a tool to download GCP logs, save them to disk as single-line JSON entries, and then run processing against those logs. The tool needs to support both logs exported to Cloud Storage and logs currently in Stackdriver (to partially support environments where export to Cloud Storage hasn't been pre-configured). The Cloud Storage piece is done, but I'm having difficulty downloading logs from Stackdriver.
I would like to implement functionality similar to the gcloud command 'gcloud logging read' in Python. Yes, I could call gcloud itself, but I would like to build everything into the one tool.
I currently have this sample code to print matching entries, however I can't get the full log entry in JSON format:
from google.cloud import logging

def downloadStackdriver():
    client = logging.Client()
    FILTER = "resource.type=project"
    for entry in client.list_entries(filter_=FILTER):
        a = entry.payload.value
        print(a)
How can I obtain the full JSON output of matching logs, as gcloud logging read provides?
Based on other Stack Overflow pages, I've attempted to use MessageToDict and MessageToJson, however I receive the error
"AttributeError: 'ProtobufEntry' object has no attribute 'DESCRIPTOR'"
You can use the to_api_repr function on the LogEntry class from the google-cloud-logging package to do this:
from google.cloud import logging

client = logging.Client()
logger = client.logger('log_name')

for entry in logger.list_entries():
    print(entry.to_api_repr())
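If you also need the entries on disk as single-line JSON, the stated goal of the question, the dict returned by to_api_repr serializes directly; the file name and filter here are placeholders:

import json
from google.cloud import logging

client = logging.Client()

# One JSON document per line (JSON Lines), matching gcloud-style output.
with open("stackdriver_logs.jsonl", "w") as out:
    for entry in client.list_entries(filter_="resource.type=project"):
        out.write(json.dumps(entry.to_api_repr()) + "\n")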

Streaming through .NET application in Azure

I have a .NET executable through which I want to stream data in Pig on my Azure HDInsight cluster. I've uploaded it to my container, but when I try to stream data through it, I get the following error:
<line 1, column 393> Failed to generate logical plan. Nested exception: java.io.IOException: Invalid ship specification: '/util/myStreamApp.exe' does not exist!
I define and use my action as follows:
DEFINE myApp `myStreamApp.exe` SHIP('/util/myStreamApp.exe');
outputData = STREAM inputData THROUGH myApp;
I tried with and without the leading /, tried qualifying it as wasb:///util/myStreamApp.exe, and tried fully qualifying it as wasb://myContainer@myAccount.blob.core.windows.net/util/myStreamApp.exe, but in every case I get the message that my file doesn't exist.
This page on uploading to HDInsight indicates that the Azure Blob Storage path wasb:///example/data/davinci.txt can be used in HDInsight as /example/data/davinci.txt, which suggests to me that there shouldn't be a problem with the paths.
It turns out the problem was that I wasn't declaring a dependency on the caller's side. I've got a console app that creates the Pig job:
var job = new PigJobCreateParameters()
{
    Query = myPigQuery,
    StatusFolder = myStatusFolder
};
But I needed to add my file to the job.Files collection as a dependency:
job.Files.Add("wasbs://myContainer@myAccount.blob.core.windows.net/util/myStreamApp.exe");
