Azure user delta API (beta) fields information

While calling the user delta API (beta version) for Azure
https://learn.microsoft.com/en-gb/graph/api/user-list?view=graph-rest-beta&tabs=http
I am getting the following fields:
"assignedPlans"
"assignedLicenses"
"licenseAssignmentStates". These fields change far too frequently, because of which the delta API keeps returning more and more pages even for a small number of users.
https://graph.microsoft.com/beta/users/delta
I wanted to know what these fields mean, but I couldn't find it in the API documentation.

The List Users operation returns a list of the user resource type. You can find the properties of a user resource type here: https://learn.microsoft.com/en-gb/graph/api/resources/user?view=graph-rest-beta#properties.
You can read about the resources you mentioned in your question here:
assignedPlans: https://learn.microsoft.com/en-gb/graph/api/resources/assignedplan?view=graph-rest-beta.
assignedLicenses: https://learn.microsoft.com/en-gb/graph/api/resources/assignedlicense?view=graph-rest-beta
licenseAssignmentStates: https://learn.microsoft.com/en-gb/graph/api/resources/licenseassignmentstate?view=graph-rest-beta
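For illustration only, here is a rough C# sketch (not taken from the documentation) that pages through the beta users delta endpoint and prints the skuId of each entry in assignedLicenses; the access token is a placeholder you would acquire yourself, and the property names follow the assignedLicense resource linked above.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text.Json;

string accessToken = "<acquired via MSAL or similar>";   // placeholder
var http = new HttpClient();
http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);

string url = "https://graph.microsoft.com/beta/users/delta";
while (url != null)
{
    using var page = JsonDocument.Parse(await http.GetStringAsync(url));
    foreach (var user in page.RootElement.GetProperty("value").EnumerateArray())
    {
        // assignedLicenses only shows up for users where it is present/changed
        if (user.TryGetProperty("assignedLicenses", out var licenses))
        {
            foreach (var license in licenses.EnumerateArray())
                Console.WriteLine($"{user.GetProperty("id")}: skuId {license.GetProperty("skuId")}");
        }
    }
    // keep following @odata.nextLink; the last page returns @odata.deltaLink instead
    url = page.RootElement.TryGetProperty("@odata.nextLink", out var next) ? next.GetString() : null;
}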

Related

Logic Apps and CosmosDB

So I'm trying to do something I expected to be simple: using a Logic App to insert a JSON object into Cosmos DB.
I've created a Cosmos DB account (based on the Core SQL API) and created a container called Archive with the partition key /username (which is in my JSON object).
Now on to the Logic App.
First, I YOLO'ed the simple approach.
Which gave me the error: "The input content is invalid because the required properties - 'id; ' - are missing".
But my JSON object doesn't have an id field. Maybe use the IsUpsert parameter? I don't feel like manipulating my input document to add a field called 'id'.
Which gave me the error: "One of the specified inputs is invalid". Okay, that feels even worse.
I now tried to use the other Logic App connector (not V2, but the original),
which gave me the error: "The partition key supplied in x-ms-partitionkey header has fewer components than defined in the collection."
I saw that this connector (unlike the V2 one) has a Partition Key Value parameter in the UI, which I used to pass the value of my username,
which gave me the error: "The input content is invalid because the required properties - 'id; ' - are missing".
At this point I thought, let me just give the bloody machine what it wants, and so I added "id" to my JSON object.
And yes, that actually worked.
So my questions are:
With the Logic Apps connectors, are you simply unable to insert JSON objects into Cosmos DB without that magic field "id" in the message payload?
If the partition key is required, why is it not available in the V2 connector's parameter UI?
Thanks.
The v1 version of this connector doesn't have a partition key because it's so old that Cosmos DB didn't have partitioned containers yet. You will want to use the v2 preview; I think it will go GA fairly soon because it has been in preview for a while now.
And yes, with Cosmos DB you can't insert anything without its id, so you will need to generate one in whatever calls your endpoint and pass it in the body of your HTTP request.
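To illustrate the id requirement outside of Logic Apps, here is a minimal C# sketch with the Microsoft.Azure.Cosmos SDK (the connection string, database name and property values are placeholders, not anything from the original post); the JSON you hand the connector needs the same shape: an id plus the partition key property.

using System;
using Microsoft.Azure.Cosmos;

var client = new CosmosClient("<connection-string>");          // placeholder
var container = client.GetContainer("<database>", "Archive");  // container partitioned on /username

var document = new
{
    id = Guid.NewGuid().ToString(),   // the "magic" field the connector complained about
    username = "alice",               // must match the /username partition key path
    payload = "whatever your Logic App received"
};

// Upsert so repeated runs with the same id don't fail
await container.UpsertItemAsync(document, new PartitionKey(document.username));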

Azure Monitor: Query for disabled functions

Is there a clean way to query for the names of all disabled functions in Azure Monitor?
Sure, I can hard-code the names of the functions and check whether there have been any logs for them, but I imagine there is a smarter way.
Thanks!
You should use Application Insights for your Azure Functions. For details, please follow this article. Then you can use the query below to get all the disabled function names.
Note: the query below can be run in Application Insights or in Azure Monitor.
traces
// use sdkVersion to make sure the record comes from an Azure Function
| where sdkVersion contains "azurefunctions"
// then check whether the message contains the word "disabled"
| where message contains "disabled"
// extract the function name from the message
| extend functionname = substring(message, 10, indexof(message, "'", 10) - 10)
Explanation of the query:
1. If the function is disabled, the message field should contain information like "function 'xxx' is disabled".
2. To make sure it's an Azure Function, check whether the sdkVersion field contains the word "azurefunctions".
3. Finally, extract the function name from the message field.
Test result: I tested this with Azure Functions v3; if you're using Azure Functions v2 or v1 you may (or may not) need to modify the query a little, but it should be easy.

Cosmos DB: SQL API + Graph API -- Really multi-model?

This is my scenario:
I want to have one single collection where I can insert and query documents using the SQL API and create vertices and edges using the Graph API, all in the same collection.
I believed that was possible, taking into account what I've read in the documentation.
My first try was using Microsoft.Azure.Graphs.dll
With this approach I was able to do CRUD operations with the SQL API and execute Gremlin scripts against the collection.
It is important to note that the documents created with the SQL API insert command were used as vertices. Then I created edges using the Graph API connecting the documents created with the SQL API. This works as expected.
The only problem I had was that if the document contains an array property, the Graph API returned an error: Invalid cast from 'System.String' to 'Newtonsoft.Json.Linq.JObject'.
After investigating I was told that the Microsoft.Azure.Graphs.dll was deprecated and I should use Gremlin.Net instead.
I have not read anywhere that this assembly is deprecated, and even in the official documentation and examples I can see that it is being used.
Is it really deprecated? Should I not use it?
Then this was my second try:
I moved to Gremlin.Net.
First issue: I was connecting to a collection created originally for the SQL API. When I tried to connect with the Gremlin.Net client, it told me that it cannot connect to the server.
When I created another database and collection for the Graph API, I was able to connect.
Conclusion: it's not possible to use Gremlin.Net with a collection created for the SQL API.
Or is it possible to activate the Gremlin endpoint on a database created for the SQL API?
Using the new collection for the Graph API, I tried again to create documents with the SQL API and then connect using the Graph API. I see that in this case both endpoints are created: SQL API + Gremlin.
I had a few problems making this work. For example, I had to set the GraphSON writer and reader to version 2; if not, I received a null reference exception (as sketched below).
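For reference, wiring up GraphSON v2 in Gremlin.Net against the Cosmos DB Gremlin endpoint looks roughly like this (account, database, graph and key are placeholders; this is a sketch, not the exact code I used):

using System;
using Gremlin.Net.Driver;
using Gremlin.Net.Structure.IO.GraphSON;

var server = new GremlinServer(
    "<account>.gremlin.cosmosdb.azure.com", 443, enableSsl: true,
    username: "/dbs/<database>/colls/<graph>",
    password: "<primary key>");

// The Cosmos DB Gremlin endpoint expects GraphSON 2.0, so the v2 reader/writer and
// mime type are passed explicitly instead of the library defaults.
using var client = new GremlinClient(
    server, new GraphSON2Reader(), new GraphSON2Writer(), GremlinClient.GraphSON2MimeType);

var vertices = await client.SubmitAsync<dynamic>("g.V('aaa')");
foreach (var vertex in vertices)
    Console.WriteLine(vertex);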
Second issue: I was able to create documents with the SQL API, but I had the same problem with the array property (example document: { "id": "aaa", "Prop": [ "1", "2" ] }).
I get the same error when I query with Gremlin: g.V('aaa') => Invalid cast from 'System.String' to 'Newtonsoft.Json.Linq.JObject'.
Conclusion: my first issue with the previous library was not solved with the new one.
Third issue: the JSON returned when I query the edges or vertices with the SQL API is different from what I received when I used Microsoft.Azure.Graphs.dll. It looks like the Cosmos DB engine handles the Gremlin scripts differently depending on the assembly. Which JSON format should I expect?
Notes:
-Why do I need to create vertices using the SQL API?
Because I want to create properties with custom objects and I'm not able to do it with the Graph API.
Example: { "Id": "aaa", "Custom": { "Id": 456, "Code": { "Id": 555, "Name": "Hi" } } }
-Why do I want to query the graph using the SQL API?
Because I want to access my custom properties.
Because I want to paginate using tokens. (The Gremlin range function is not performant: it traverses all the vertices/edges from 0 up to the last page I want.)
Does anyone have information about these situations?
Any help will be appreciated.
I know this post is a bit old, but it looks like it is possible to use the Graph and SQL APIs together. The key is adopting Gremlin's underlying storage format when querying/manipulating with the SQL API. Please check out this post:
https://vincentlauzon.com/2017/09/05/hacking-accessing-a-graph-in-cosmos-db-with-sql-documentdb-api/
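As a rough illustration of that approach (not the blog author's code; names and the 'person' label are placeholders), this is what reading Gremlin vertices through the SQL API can look like with the current Microsoft.Azure.Cosmos SDK; the "label" field and the property-as-array layout come from how Cosmos DB stores graph elements:

using System;
using Microsoft.Azure.Cosmos;

var client = new CosmosClient("<connection-string>");            // placeholder
var container = client.GetContainer("<database>", "<graph>");    // same container the Gremlin endpoint uses

// Vertices are ordinary documents with a "label" field; vertex properties are
// typically stored as arrays of { id, _value } objects rather than plain scalars.
var query = new QueryDefinition("SELECT c.id, c.label, c.name FROM c WHERE c.label = @label")
    .WithParameter("@label", "person");   // 'person' is just an example label

var iterator = container.GetItemQueryIterator<dynamic>(query);
while (iterator.HasMoreResults)
{
    foreach (var vertex in await iterator.ReadNextAsync())
        Console.WriteLine(vertex);
}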

MoreLikeThis in Azure Search

I'm currently evaluating the new Azure Search feature in Windows Azure. I'm wondering if there's a way to do a MoreLikeThis query similar to Lucene/Elasticsearch: pass in a document's text and get a list of documents that are similar to it. I know Azure Search uses Elasticsearch in the background (source).
I haven't found this anywhere in the API, but maybe I'm missing something hidden in the parameters. I think this is a very useful feature and it would be a shame if it's not included.
Yes, it comes in the new version of Azure Search: 2015-02-28-Preview.
See here: http://azure.microsoft.com/en-us/documentation/articles/search-api-2015-02-28-preview/
moreLikeThis=[key]
Here is a sample:
GET /indexes/[index name]/docs/suggest?[query parameters]
Host: [search service url]
accept: application/json
api-key: [admin key]
C#
Uri uri = new Uri(_serviceUri, "/indexes/catalog/docs/suggest?$filter=discontinuedDate eq null&$select=productNumber&search=" + Uri.EscapeDataString(searchText));
There's a sample project on Codeplex:
https://azuresearchadventureworksdemo.codeplex.com/
Suggestions (Azure Search API):
http://msdn.microsoft.com/en-us/library/azure/dn798936.aspx
Azure Search API:
http://msdn.microsoft.com/en-us/library/azure/dn798927.aspx
Unfortunately this feature is not currently available in Azure Search.
See Pablo's comment on Scott Guthrie's blog.
I know that this is an antique question, but it's one of the first when googling for 'morelikethis azure search'.
Anyway, with the new API version 2019-05-06-Preview there is a new preview feature called moreLikeThis (not in the SDK yet) where you can pass in the id of an existing document (I know, not a text like David asked for). E.g.
GET /indexes/[index]/docs?moreLikeThis=[documentId]&api-version=2019-05-06-Preview
You can filter the compared fields by defining a list of properties with the searchFields parameter, e.g.
GET /indexes/[index]/docs?moreLikeThis=[documentId]&searchFields=[field]&api-version=2019-05-06-Preview
Of course this can also be POSTed; for more details have a look here.
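Since it wasn't in the SDK at the time, here is a rough C# sketch of calling the preview moreLikeThis parameter over the REST API (service name, index, document key, field and api-key are placeholders):

using System;
using System.Net.Http;

var http = new HttpClient();
http.DefaultRequestHeaders.Add("api-key", "<query-or-admin-key>");

var url = "https://<service>.search.windows.net/indexes/<index>/docs"
        + "?moreLikeThis=<documentId>"
        + "&searchFields=<field>"
        + "&api-version=2019-05-06-Preview";

// The response is the usual search result payload: similar documents under "value".
var json = await http.GetStringAsync(url);
Console.WriteLine(json);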

Azure table storage query

When I POST data and query a table against the development storage (emulator), it works.
When I POST data to the table in my Azure storage account, it works.
When I GET data from the table in my Azure storage account, it does not work.
In both cases the code is the same, except for the key and account credentials.
Is there anything else I should do to query?
var query = azure.TableQuery
.select().from('dummytable').where('PartitionKey eq ?', key);
Can anyone suggest why the query is not working?
Should there be anything else that needs to be done?
From Storage Explorer it works; I am able to see the entities.
Only from the program am I not able to get a response, but in the same program the PUT operation works.
The same was happening to me. I upgraded the azure npm package from 0.6.1 to 0.6.7 and now it works; hope this helps.
I would look at the value of your partition key. There are some values that aren't on the list of invalid characters but that Azure still has issues with. For example, before SDK 1.7 you could safely insert a % in a key, but if you queried for it specifically it wouldn't work. To test whether this is the problem, try running your query without the filter and make sure your row is returned.
After reading the MSDN mailing lists, I upgraded the azure npm package to the latest version, 0.6.7, and it works. It looks to have been an issue with the azure package.
