I have been working with Azure Search and was able to map an indexer to my database. Everything seems fine so far.
I want to run my indexer automatically or manually every second to keep Azure Search up to date. Whenever data is created or updated, the Azure Search index should reflect it as soon as possible to give the best user experience.
The documentation says an indexer can update at most every five minutes:
An indexer can re-index your table at most every five minutes. If your data changes frequently, and the changes need to be reflected in the index within seconds or single minutes, we recommend using the REST API or .NET SDK to push updated rows directly.
I tried the REST API, but it also has a limitation of 180 seconds. This is the error I get when I try to run the indexer too frequently. Is this limitation because of the free search plan?
{
  "error": {
    "code": "",
    "message": "On-demand indexer invocation is permitted every 180 seconds for this service tier."
  }
}
Any suggestions?
Is this limitation because of the free search plan?
You can find the answer in the comments on that documentation page. According to Eugenesh#MSFT, this is the limit for the free search plan:
Paid tiers do not limit how frequently you can run an indexer.
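Regardless of tier, if changes need to show up within seconds, the documentation quoted above recommends pushing updated rows into the index directly rather than relying on the indexer. A minimal sketch of that with the REST API, assuming Node 18+ for the global fetch; the service name, index name, key, API version and the "id" key field below are placeholders:

// Placeholders: replace with your own service, index, admin key and API version.
const SERVICE = "my-search-service";
const INDEX = "my-index";
const ADMIN_KEY = "<admin-api-key>";
const API_VERSION = "2017-11-11";

// Push a created/updated row straight into the index as soon as your app
// writes it to the database, instead of waiting for the indexer schedule.
async function upsertDocument(doc: { id: string; [key: string]: unknown }): Promise<void> {
  const url = `https://${SERVICE}.search.windows.net/indexes/${INDEX}/docs/index?api-version=${API_VERSION}`;
  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json", "api-key": ADMIN_KEY },
    body: JSON.stringify({
      value: [{ "@search.action": "mergeOrUpload", ...doc }],
    }),
  });
  if (!res.ok) {
    throw new Error(`Indexing failed: ${res.status} ${await res.text()}`);
  }
}

Called right after your application writes to the database, this makes the change searchable within seconds without touching the indexer schedule at all.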
I am "connecting" CosmosDB to an Azure Function by change feed binding. I wonder if there is a way to trigger change feed only when certain property has some specific value.
For instance, a new user is inserted in CosmosDB. Then, run the Azure Function only when the user has user.email != null.
I could filter this out in the Azure Function of course. Just concern about the pricing filtering out potentially thousands of events I don't need.
No, this is not currently possible.
There is a 5-year-old request on the Feedback site, with a response saying the "feature is now planned", but it is unclear when that response was posted, as there is no date on it.
For the time being at least, you need to filter out any documents not matching your criteria within the function itself, rather than having CosmosDB do this server side when sending the batch of changes.
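For illustration, here is a minimal sketch of doing that filtering inside a Cosmos DB change-feed-triggered function, assuming the Node.js Azure Functions programming model; the function name and the user.email criterion from the question are just examples:

import { AzureFunction, Context } from "@azure/functions";

// Change-feed-triggered function: every changed document arrives here,
// and the filtering has to happen in application code.
const onUsersChanged: AzureFunction = async (context: Context, documents: any[]): Promise<void> => {
  if (!documents || documents.length === 0) {
    return;
  }

  // Keep only the documents matching the criterion from the question.
  const usersWithEmail = documents.filter((user) => user.email != null);

  for (const user of usersWithEmail) {
    context.log(`Processing user ${user.id} with email ${user.email}`);
    // ...do the actual work for this user here
  }
};

export default onUsersChanged;

Every changed document still flows through the function, so those executions are still billed; the filter only keeps the downstream work cheap.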
I tried to use the PowerShell sample https://learn.microsoft.com/en-us/azure/azure-monitor/platform/data-collector-api#powershell-sample without any changes.
It completes with status 200 (OK) and correctly creates a new table with LogType (MyRecordType) within the Custom Logs in the portal (Log Analytics Workspace->Logs).
However, the events that are submitted don't turn up there - there are always "No results from the last 24 hours". Also, within the new table, none of the custom properties are created.
Has anybody observed a similar problem? (Some people seem to be using the C# code successfully.) Thanks!
Crazy... on the next day, all the events had turned up in the Log Analytics workspace. Even new events that I generated turned up immediately.
It seems this was a glitch just on that day. Well, the API is in "preview"...
I've been working with Google Analytics for two months now. I created a custom dashboard with NodeJS (Express/serverless) that requests data from the Core Reporting API and the Real Time Reporting API, and I've managed to deploy it as a Lambda function on AWS. While I'm very pleased with this, I'm facing some issues right now.
I get the following errors:
{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "dailyLimitExceeded",
        "message": "Quota Error: profileId ga:NNNNN has exceeded the daily request limit."
      }
    ],
    "code": 403,
    "message": "Quota Error: profileId ga:NNNNN has exceeded the daily request limit."
  }
}
and
{
  "error": {
    "errors": [
      {
        "domain": "usageLimits",
        "reason": "userRateLimitExceeded",
        "message": "User Rate Limit Exceeded"
      }
    ],
    "code": 403,
    "message": "User Rate Limit Exceeded"
  }
}
My dashboard looks like this:
When the dashboard gets visited, it calls the Real Time API 9 times (each block in the image is a query call). I think I could combine the 'Online users', 'Users today' and 'Pageviews today' calls into one call. The 'Searches today' and 'Orders today' blocks use filters that look for a specific event.
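As a side note on that idea, the Core Reporting API accepts several comma-separated metrics in a single request, so 'Users today' and 'Pageviews today' could share one call; the realtime 'Online users' figure would still need its own Real Time API request. A rough sketch, assuming Node 18+ and placeholder view ID and OAuth token:

const VIEW_ID = "ga:NNNNN";             // placeholder view (profile) id
const ACCESS_TOKEN = "<oauth-token>";   // placeholder OAuth2 access token

// One Core Reporting request that returns several of today's metrics at once,
// instead of a separate request per dashboard block.
async function fetchTodayTotals(): Promise<Record<string, string>> {
  const params = new URLSearchParams({
    ids: VIEW_ID,
    "start-date": "today",
    "end-date": "today",
    metrics: "ga:users,ga:pageviews,ga:sessions",
  });
  const res = await fetch(`https://www.googleapis.com/analytics/v3/data/ga?${params}`, {
    headers: { Authorization: `Bearer ${ACCESS_TOKEN}` },
  });
  const data = await res.json();
  return data.totalsForAllResults; // e.g. { "ga:users": "123", "ga:pageviews": "456", ... }
}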
I have built in a time checker, which allows the dashboard to be viewed between 07:00 and 19:00. When it's earlier than 07:00 or later than 19:00, a variable checkTime is set to false, which makes the dashboard show a div with text like "dashboard offline". When someone visits the dashboard within the allowed time range, checkTime is set to true and calls to the Google APIs can be made.
The dashboard runs on a TV screen between 07:00 and 19:00, so it is up for 12 hours. Every 20 seconds there is a function call to update all the data (so again 9 requests are made).
So that amounts to roughly 3 updates per minute × 60 minutes × 12 hours = 2,160 updates, × 9 requests each = 19,440 requests per day.
I don't think I should reach the 50,000 quota, but I am hitting the 10,000 per-profile quota.
However, when I view the Developer Console I can see the following:
I think my options are the following:
Increase the interval to 1 minute ((60 × 12) × 9 requests per view = 6,480 per day); that way the profile quota shouldn't be exceeded. But this doesn't really make the dashboard real-time anymore.
Build a server that runs the queries (with the increased interval of 1 minute) and saves the results to a database. The dashboard then makes a GET request to the database. This way multiple TV screens would be able to request the data.
QUESTION: Could I also create multiple service accounts and switch to another service account when the limit has been reached, or doesn't this get around the profile ID limit?
dailyLimitExceeded can mean one of two things.
You can only make 10,000 requests against a single view per day. You share this quota with other developers: if I install your app and someone else's app, only 10,000 requests in total can be made per day against my Google Analytics view, and then both apps will get that error. If you are making these requests, you should be storing the data in a database so that you don't need to request the same information again, even when a different user is trying to view data on the same view. You are probably not going to be able to track this quota hit in the Google Developer Console.
The second issue is that by default an application can make a maximum of 50,000 requests a day across all views. That means that if you have 5 users and you are making 10,000 requests a day for each of them, you have reached your request limit. I don't think this is what you are hitting.
For the first, user-based quota there is nothing you can do; you can't extend it. You need to limit your requests so that you don't block a user's account. For the second one you can apply for an extension in the Google Developer Console; it can take a while to be granted, so you should apply once you have reached around 80% of your current daily quota.
The main thing here is that you should not be requesting the same data twice. Once you have made a request, you should save the result and display the stored data to your users rather than requesting it again. Also, you should not be hitting the Real Time API more than every 5 minutes or so, as you will be eating up your quota.
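A rough sketch of that caching idea, assuming a small Node/Express layer in front of the dashboard; the endpoint, refresh interval and the stubbed GA call are illustrative only:

import express from "express";

// Hypothetical in-memory cache: one server process queries Google Analytics on
// its own schedule; every TV screen reads the cached copy instead of the API.
let cachedStats: unknown = null;
let lastUpdated = 0;

const REFRESH_MS = 60 * 1000; // e.g. refresh once a minute instead of every 20 seconds

async function queryGoogleAnalytics(): Promise<unknown> {
  // Placeholder for the real Core Reporting / Real Time API calls.
  return { users: 0, pageviews: 0 };
}

async function refreshStats(): Promise<void> {
  cachedStats = await queryGoogleAnalytics();
  lastUpdated = Date.now();
}

setInterval(refreshStats, REFRESH_MS);
refreshStats();

const app = express();
app.get("/api/dashboard", (_req, res) => {
  res.json({ lastUpdated, stats: cachedStats });
});
app.listen(3000);

With this layout, adding more TV screens adds no extra Google Analytics requests, only extra reads of the cached data.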
I have suggested to Google several times that the realtime api should be on its own quota and not the same as the reporting api. I am still waiting for them to add this feature.
I am making a timer job in C# which will call Office 365 and fetch newly created users. For example, if I run it now, it should fetch users from the last run up to the current time (a delta). However, I don't see any filter or API parameter where I can pass a date and get only those users who were updated or created after a specific date.
Is there any API available, something like this:
https://graph.windows.net/{MYORG}/users?api-version=1.6&[Filter=createdDate gt 12/12/2016 or modifiedDate gt 12/12/2016]
The Azure AD Graph API can't do this as far as I am aware; the User entity does not expose creation or modification dates to query on.
However, the beta endpoint of the Microsoft Graph API should be able to do this. You can follow the instructions here: https://graph.microsoft.io/en-us/docs/concepts/delta_query_users. Simply put, you must:
Call the users endpoint with the delta function
If you get a skip token, it means there are more pages
Fetch the next page until you no longer get a skip token but get a delta token instead
The delta token allows you to call the endpoint any time later to get only the changed users (created/updated/deleted)
General guidance for delta queries
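For illustration, the loop described above might look roughly like this, assuming Node 18+ and an already-acquired access token (the beta endpoint matches the linked documentation):

// Roughly: follow @odata.nextLink pages until an @odata.deltaLink appears,
// then persist that delta link and reuse it on the next timer run.
async function getChangedUsers(
  accessToken: string,
  deltaLink?: string
): Promise<{ users: any[]; nextDeltaLink: string }> {
  const users: any[] = [];
  let url = deltaLink ?? "https://graph.microsoft.com/beta/users/delta";

  while (true) {
    const res = await fetch(url, { headers: { Authorization: `Bearer ${accessToken}` } });
    const page = await res.json();
    users.push(...(page.value ?? []));

    if (page["@odata.nextLink"]) {
      url = page["@odata.nextLink"]; // more pages to fetch
    } else {
      return { users, nextDeltaLink: page["@odata.deltaLink"] };
    }
  }
}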
I had already found that article; however, the delta endpoint is throwing errors for me. Here is what I got in response to the first call:
https://graph.microsoft.com/beta/users/delta?$skiptoken=
When I did a GET to that link, I got an error saying the resource you are trying to request either does not exist or has been removed, or...
I feel there is a bug, as it's a beta API.
However, the good news is that I just found another blog post which seems to have fixed my problem using the following query:
https://graph.windows.net/XYZ.onmicrosoft.com/directoryObjects?api-version=1.6&deltaLink=
I will mark your answer as accepted anyway, assuming Microsoft will fix the issue I am getting from the backend.
I've created a document and I'm sure it is saved in the Vault too since I can fetch it, but the TrueVault dashboard shows me "0 DOCUMENTS Stored In TrueVault". Is this a bug?
The Document count takes some time to load upon refresh of the dashboard. You should see the value updated after waiting some time (less than a minute). A revamp of the dashboard is on our roadmap.