I want to query data from Azure Application Insights. In the Azure REST API reference, there are two operations:
execute
get
The problem is that both are described as "Execute an Analytics query". What's the difference between them, and which one should be used?
The only difference is the transport: to execute a query you do a POST with the query details in the request body, whereas to get a query result you do a GET with the query details in the URL parameters.
Which one to use is a matter of preference. If you don't want executed queries to appear in your logs, you might prefer the POST, since the query text is not a URL parameter.
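As a sketch in Python (the app id, API key, and KQL below are placeholders; the endpoint is the public Application Insights query API), the two forms differ only in where the query text travels:

```python
import json
from urllib.parse import urlencode

# Placeholder credentials -- replace with your own.
APP_ID = "your-app-id"
API_KEY = "your-api-key"
BASE = f"https://api.applicationinsights.io/v1/apps/{APP_ID}/query"
KQL = "requests | summarize count() by bin(timestamp, 1h)"

# GET: the query is a URL parameter, so it may show up in access logs.
get_url = f"{BASE}?{urlencode({'query': KQL})}"

# POST: the query travels in the JSON body instead of the URL.
post_body = json.dumps({"query": KQL})
post_headers = {"x-api-key": API_KEY, "Content-Type": "application/json"}
```

Both variants return the same result shape; the `x-api-key` header authenticates either form.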
I'm trying to query two different Application Insights instances (instance A and B) through the REST API. I'm using Postman to send a GET HTTP request to the API and followed the answer in this post, which was aiming for the same goal, joining request data from different Application Insights instances: https://stackoverflow.com/a/52248597/17161618
Just as that post says, I'm accessing instance A and passed the authentication through the request header in the format keyA:appIdA,keyB:appIdB.
I'm sending the following query: union app("AppIdA").traces, app("AppIdB").traces and receive an error (screenshot not included).
Does anyone know how I can get read access to the logs in resource B?
I don't think you can run cross-component queries using API keys. You should use AAD (Azure Active Directory) auth instead: give an AAD app access to both resources, and then you should be able to run such queries.
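A minimal Python sketch of that flow (all ids are placeholders; the requests are only constructed here, not sent; the AAD app registration must be granted read access on BOTH Application Insights resources for app() references to resolve):

```python
import json
from urllib import parse, request

# Placeholder ids -- replace with your AAD app registration's values.
TENANT_ID = "your-tenant-id"
CLIENT_ID = "your-client-id"
CLIENT_SECRET = "your-client-secret"
APP_ID_A = "app-id-of-instance-A"

# Step 1: client-credentials token request against AAD.
token_req = request.Request(
    f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/token",
    data=parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "resource": "https://api.applicationinsights.io",
    }).encode(),
)

# Step 2: the cross-component query, authenticated with the Bearer token
# instead of per-resource API keys.
kql = 'union app("AppIdA").traces, app("AppIdB").traces'
query_req = request.Request(
    f"https://api.applicationinsights.io/v1/apps/{APP_ID_A}/query",
    data=json.dumps({"query": kql}).encode(),
    headers={"Authorization": "Bearer <token from step 1>",
             "Content-Type": "application/json"},
)
```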
I am trying to figure out whether Azure Logic Apps can be used to migrate files/documents from Azure Blob Storage to a custom service that exposes a REST API. Here is my current shortlist of requirements:
Files/documents are uploaded into Azure Storage weekly or daily, which means we only need to migrate new items. The volume is on the order of hundreds of thousands of files/documents per week
The custom service's REST API is secured, and any interaction with its endpoints must include a JWT in the headers
I did the following exercise according to tutorials:
Everything seems fine, but the following 2 requirements make me worry:
Getting only new files and not migrate those that already moved
Getting JWT to pass security checks in REST
For the first point, I think I could introduce a DB instance (for example, Azure Table Storage) to track files that have already been moved, and for the second one my idea is to use an Azure Function instead of the HTTP action. But all of this looks quite complicated, and I believe there might be better and easier options.
Could you please advise what else I can use for my case?
For the first point, you can use the "When a blob is added or modified" trigger as the logic app's trigger. Then the workflow only operates on the new blob item.
For the second point, just provide some steps for your reference:
1. First, use an "HTTP" action to request the token (this is how I have requested tokens in a logic app in the past).
2. Then use a "Parse JSON" action to parse the response body from the "HTTP" action above.
3. After that, you can call your REST API with the access token from the "Parse JSON" action.
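The three steps can be mirrored in plain Python to show the data flow (the token response shape below is an assumption; your identity provider's actual fields may differ):

```python
import json

# Step 1 ("HTTP" action): POST to the token endpoint. Here we just use a
# sample response body; a real call would return something of this shape.
sample_token_response = '{"token_type": "Bearer", "access_token": "eyJhbGci.sample.jwt"}'

# Step 2 ("Parse JSON" action): turn the response body into fields you can use.
token = json.loads(sample_token_response)

# Step 3 (second "HTTP" action): call the secured REST API with the JWT
# in the Authorization header.
auth_headers = {"Authorization": f"{token['token_type']} {token['access_token']}"}
```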
Is it possible to start an Application Insights search via a URL parameter?
I would like to generate a link that starts the search, something like:
https://<applicationInsightsUrl>?search=b4eb0000-f22e-18db-fc92-08d81c2df34d
No, you cannot.
Actually, when you use search from the portal, it calls the Application Insights get query REST API in the backend, and it auto-generates a token for authentication (if you are already logged in to the Azure portal).
So a simple URL plus search="xxx" will do nothing. You can consider using the get query API instead.
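For example, a rough Python sketch (placeholder app id; the portal's internal behavior is not documented, but the public query API with KQL's `search` operator gives a similar cross-table lookup):

```python
from urllib.parse import urlencode

APP_ID = "your-app-id"  # placeholder
search_term = "b4eb0000-f22e-18db-fc92-08d81c2df34d"

# KQL's `search` operator scans across tables, much like the portal search box.
kql = f'search "{search_term}"'
url = (f"https://api.applicationinsights.io/v1/apps/{APP_ID}/query?"
       + urlencode({"query": kql}))
```

Send this URL as a GET with your auth header (API key or AAD token) to retrieve matching telemetry.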
I am trying to dynamically add/update a REST linked service based on certain triggers/events, in order to consume a REST API that provides telemetry data and is authenticated using a cookie. The telemetry data will be stored in Data Lake Gen2, and Databricks will then move it to secondary data storage/SQL Server.
Has anyone tried this? I am not able to find a cookie-based auth option when adding a REST linked service.
Also, how can I create data pipelines dynamically, or make the REST API's parameters dynamic?
Currently, unfortunately, this is not possible using Azure Data Factory's native components/activities. For now at least, you cannot access the response cookies from a web request in Data Factory. Someone has filed a feature request for this, or something that might help; see here
It might be possible to do this via an Azure Function that gets/saves the cookie and then sends it as part of a following request. I had a similar problem but resorted to using Azure Functions for all of it, but I guess you could do just the authentication part with a function! ;-)
EDIT: update
Actually, after I wrote this I went back to check whether this was still the case, and it looks like things have changed. The web activity response output now contains a property called "ADFWebActivityResponseHeaders" (I had never seen this before), and as you can see it includes a property for the "Set-Cookie" header.
See example below:-
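The screenshot isn't reproduced here, but as a sketch (the activity name `Login` is hypothetical; the property names come from the output described above), a subsequent pipeline activity could read the cookie with an expression like:

```
@{activity('Login').output.ADFWebActivityResponseHeaders['Set-Cookie']}
```

That value can then be passed as a `Cookie` header on the next web or copy activity.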
I need to know which values to configure in the REST API connector in Azure Data Factory; currently I only get 1000 records. How do I set up the configuration so that the requests are looped (pagination rules with a continuation token)?
Based on the official documentation, ADF pagination rules only support the patterns below.
I think you could adopt the pattern Next request's query parameter = property value in current response body: take the continuation token from the current response body and pass it into the next request as a query parameter.
As mentioned in the rules, the connector stops iterating when it gets HTTP status code 204 (No Content), or when any JSONPath expression in "paginationRules" returns null.
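As an illustration of that pattern, here is a small Python simulation of the loop the connector performs (all names are made up; in ADF itself the rule would be a mapping such as "QueryParameters.token": "$.nextToken"):

```python
# Fake paged API responses: each page carries its records plus the token
# for the next request; a null token means there are no more pages.
pages = [
    {"value": [1, 2], "nextToken": "t1"},
    {"value": [3, 4], "nextToken": "t2"},
    {"value": [5], "nextToken": None},
]

def fetch(token):
    # Stand-in for the REST call; returns the page matching the token.
    index = {None: 0, "t1": 1, "t2": 2}[token]
    return pages[index]

records, token = [], None
while True:
    page = fetch(token)
    records.extend(page["value"])
    token = page["nextToken"]  # property value from the current response body
    if token is None:  # same stop condition as ADF's paginationRules
        break
```

The loop collects all 5 records instead of stopping at the first page, which is what the pagination rules achieve declaratively.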