I've created a document and I'm sure it is saved in the Vault too since I can fetch it, but the TrueVault dashboard shows me "0 DOCUMENTS Stored In TrueVault". Is this a bug?
The Document count takes some time to load upon refresh of the dashboard. You should see the value updated after waiting some time (less than a minute). A revamp of the dashboard is on our roadmap.
In my application users create documents which are then saved in the database. Each document has an expireAt field set to 30 days after the date it is created. After the expiration date the document is considered inactive.
What I want is to send an email to the user after the expiration date to notify them that their document is now inactive. The only solution I see is to create a cron job and periodically poll the database for expired documents.
But I'm not sure whether periodic polling is a good approach, and I would like to know if there are other ways of doing this.
P.S. The app is built with Node.js + MongoDB.
If your use case were just removing the expired doc, you could use the TTL feature of MongoDB.
Since you need to send an email, the best option is a cron job, as you already thought. Yes, periodic polling is a good approach; it works in most use cases, and you have full control over it.
Given your requirement, you could poll once per day. If you still need minute-level precision on expiry, you could shift your query window back by 24 hours and send the user an advance alert instead.
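A minimal sketch of that daily poll, assuming the node-cron package and a hypothetical sendExpiryEmail mailer (the notified flag is also an assumption, used to avoid emailing the same user twice):

const cron = require('node-cron');
const { MongoClient } = require('mongodb');

// Every day at midnight, find documents that have expired and whose
// owners have not been notified yet, then email each owner once.
cron.schedule('0 0 * * *', async () => {
    const client = await MongoClient.connect('mongodb://localhost:27017');
    const docs = client.db('app').collection('documents');

    const expired = await docs.find({
        expireAt: { $lte: new Date() },
        notified: { $ne: true }          // assumed flag to prevent re-sends
    }).toArray();

    for (const doc of expired) {
        await sendExpiryEmail(doc.ownerEmail, doc);   // hypothetical mailer
        await docs.updateOne({ _id: doc._id }, { $set: { notified: true } });
    }
    await client.close();
});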
Tykaty, you can try using two features of MongoDB:
Expire data https://docs.mongodb.com/manual/tutorial/expire-data/
Change events https://docs.mongodb.com/manual/reference/change-events/ https://docs.mongodb.com/manual/changeStreams/#watch-a-collection--database--or-deployment
The idea is simple. You store the documents as before. Add a collection that you will watch, and put into it marker objects that should expire, linking each document in your original collection to its corresponding marker (a 1-to-1 link). When a marker is deleted by the mongo engine (expired), you get a notification with its _id. Look this _id up in your original collection of documents to understand which doc expired (a sketch follows below). Here you are.
Of course, you can start with polling; the final solution depends on the data and its usage, as well as the load.
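A minimal sketch of the TTL-plus-change-stream idea, assuming a replica set (change streams require one) and a marker collection that shares its _id with the original document; sendExpiryEmail is hypothetical:

const { MongoClient } = require('mongodb');

async function watchExpirations() {
    const client = await MongoClient.connect('mongodb://localhost:27017');
    const db = client.db('app');

    // TTL index: the mongo engine deletes each marker once expireAt passes.
    // Note the TTL monitor runs roughly once a minute, so the deletion
    // (and therefore the email) is approximate, not to-the-second.
    await db.collection('expirations').createIndex(
        { expireAt: 1 },
        { expireAfterSeconds: 0 }
    );

    // Watch only delete events on the marker collection.
    const stream = db.collection('expirations').watch([
        { $match: { operationType: 'delete' } }
    ]);

    stream.on('change', async (change) => {
        // The marker shares its _id with the original document (1-to-1 link).
        const doc = await db.collection('documents')
            .findOne({ _id: change.documentKey._id });
        if (doc) {
            await sendExpiryEmail(doc.ownerEmail, doc);   // hypothetical mailer
        }
    });
}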
I've created a simple query using the Kusto language in Application Insights, and this query has been pinned to an Azure Dashboard.
I've noticed that no automatic update is applied to the table; moreover, the refresh button on the tile works only the first time you click it.
Here is the query pinned to Azure Dashboard:
traces
| order by timestamp desc nulls last
| take 10
Based on the information specified here I expected the table to be refreshed every 5 minutes, but it seems that the table is never refreshed.
Does automatic refresh exist for AI queries on the Azure Dashboard? I didn't find any mention of automatic refresh apart from the link provided above.
Metrics refresh depends on the time range, with a minimum of five minutes. Logs refresh every one minute.
Related documentation reference
Regarding the refresh button on the tile, thanks for sharing the feedback on that. We have reached out to the concerned team to look further into it, and will keep you updated.
If you're up for it, you can create a Chrome Extension which hunts the Dashboard DOM and selectively clicks each refresh DIV. Here's the pertinent jquery code:
function doRefresh() {
    // Find every toolbar button container and click the ones
    // whose title contains "Refresh".
    $("div.azc-toolbarButton-container").each(function () {
        var title = $(this).attr("title") || "";  // guard: title may be undefined
        if (title.indexOf("Refresh") >= 0) {
            $(this).trigger("click");
        }
    });
}
I was able to do this. I added a 1-minute setInterval() and it appears to work well.
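For completeness, the interval wiring described above might look like this, assuming doRefresh and jQuery are already injected into the dashboard page by the extension's content script:

// Re-run doRefresh every 60 seconds.
setInterval(doRefresh, 60 * 1000);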
Those are AnalyticsPart tiles; they refresh automatically every hour. Of course, you can initiate a refresh by clicking the refresh button.
The Azure dashboard team is planning to expose an automatic refresh feature that will let users customize the refresh rate of dashboard parts. One could then choose to set the rate to 1 hour, 30 minutes, etc. AnalyticsPart may onboard to this new feature.
It's still in the design stages with the Azure team.
I was working with Azure Search and was able to map an indexer to my DB. Everything seemed fine up to that point.
I want to update my indexer automatically/manually every second to keep Azure Search up to date. Every time data gets created or updated, the Azure Search indexer should be updated as soon as possible to give the best user experience.
In the docs it's written that an indexer can update every 5 minutes:
An indexer can re-index your table at most every five minutes. If your data changes frequently, and the changes need to be reflected in the index within seconds or single minutes, we recommend using the REST API or .NET SDK to push updated rows directly.
I tried the REST API to run the indexer on demand, but it also has a limitation of 3 minutes. The following is the error I get when trying to update the indexer frequently. Is this limitation because of the free search plan?
{
    "error": {
        "code": "",
        "message": "On-demand indexer invocation is permitted every 180 seconds for this service tier."
    }
}
Any suggestions?
Is this limitation because of the free search plan?
You could get the answer from this document comment. According to what Eugenesh#MSFT mentioned, it is a limit of the free search plan:
Paid tiers do not limit how frequently you can run an indexer.
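As the documentation quoted in the question suggests, the way around the indexer schedule entirely is to push changed rows straight into the index. A minimal sketch of that push from Node.js against the documents REST endpoint (SERVICE, INDEX, and ADMIN_KEY are placeholders, and the api-version shown is an assumption; check the current one):

const https = require('https');

// Push one changed document into the index, bypassing the indexer.
function pushDocument(doc) {
    const body = JSON.stringify({
        value: [{ '@search.action': 'mergeOrUpload', ...doc }]
    });
    const req = https.request({
        hostname: 'SERVICE.search.windows.net',
        path: '/indexes/INDEX/docs/index?api-version=2020-06-30',
        method: 'POST',
        headers: {
            'Content-Type': 'application/json',
            'api-key': 'ADMIN_KEY',
            'Content-Length': Buffer.byteLength(body)
        }
    }, (res) => res.on('data', (chunk) => process.stdout.write(chunk)));
    req.on('error', console.error);
    req.end(body);
}

// Example: reflect a row change in the index as soon as it happens.
pushDocument({ id: '42', title: 'Updated title' });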
I have recently noticed that retrieving the LastDateTimeModified (through the Web Service API) from Acumatica gives me the date and time in a very different time zone - I am guessing GMT.
However, when I view this through a Generic Inquiry it seems to show the correct time, based on the Time Zone set up in my user profile.
Is there a way to get the LastDateTimeModified in the correct time zone when retrieving it from the Web Service API? I have attempted changing the Time Zone for the SDK user, with no success.
Thanks,
G
For most screens, except a few CRM screens, the LastDateTimeModified and CreatedDateTime are stored in the same time zone as the database server machine. When reading them using web services, you are retrieving the raw value from the database, with no timezone conversion. It is up to you to convert it to the desired timezone (see the sketch below).
The Help->Audit History panel does a manual conversion to the current user time zone. I have not been able to get the generic inquiry to show the time as you mention in your question; it is only showing the date.
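If you do the conversion on the client side, here is a minimal sketch using the luxon library; the zone names are assumptions, and you need to know which zone your database server actually uses:

const { DateTime } = require('luxon');

const raw = '2019-05-01T14:30:00';      // LastDateTimeModified as returned
const serverZone = 'Etc/GMT';           // whatever zone the DB server uses
const userZone = 'America/New_York';    // from the user profile

// Interpret the raw value as wall-clock time in the server's zone,
// then re-express it in the user's zone.
const local = DateTime.fromISO(raw, { zone: serverZone }).setZone(userZone);
console.log(local.toISO());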
I have a real-time subscription for a really popular tag.
My app gets many subscription calls per second, and then I make a POST request to tag/media/recent, but I am getting duplicate and skipped images because it is so fast. How can I be sure that I am requesting the image for which the subscription call was sent?
I even tried setting count to 1 and storing the last min_tag_id, but I still get duplicates and missed images.
My idea was to get the last picture id, then on a subscription call sleep for some time and then call tag/media/recent with count=1 and min_tag_id, so that I would get the picture for that subscription call, but there are still duplicates.
Why is Instagram not simply sending picture ids?
I have been having the same issue as you (I've been working on this over the last week). My current workaround is: after Instagram sends a POST saying that an update for the hashtag has happened, I make a call for the most recent photos (currently setting count to 1, as you're doing, but I'm still working on getting all of the newest photos).
I basically store all the latest photos returned in a database, using the photo's id as the primary key. If the key already exists in the DB I issue an update; otherwise the new photo gets inserted into the database. I then check whether an update or an insert happened. If an update, I simply exit the function. If an insert, I issue a response to the browser with the new photo (this is very easy to do with WebSockets; alternatively, you can poll your database over a set interval to check whether new photos were added). A sketch of this pattern follows below.
I'm not even sure whether this is a correct approach; it feels very hacky to me, but Instagram's real-time API is not very intuitive to work with.
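A minimal sketch of the upsert-and-detect pattern described above, using the Node.js MongoDB driver; the collection name and notifyBrowser are assumptions:

// Dedupe incoming photos by upserting on the photo id, and only notify
// the browser when the upsert actually inserted a new row.
async function handlePhoto(db, photo) {
    const result = await db.collection('photos').updateOne(
        { _id: photo.id },               // photo id as the primary key
        { $set: { data: photo } },
        { upsert: true }
    );
    if (result.upsertedCount === 1) {
        notifyBrowser(photo);            // hypothetical push to the client
    }
    // upsertedCount === 0 means it was a duplicate: do nothing.
}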