How can I find out the duration of a SharePoint workflow? (i.e., how long the workflow took to run from initiation to completion?)
I could take the item's Created date and subtract it from today's date, but that won't work because the workflow can be started manually.
What's the best way of achieving this?
Please, no "SharePoint Designer workflow" solutions - this is a Visual Studio workflow.
About ticks:
“A single tick represents one hundred nanoseconds or one ten-millionth of a second. There are 10,000 ticks in a millisecond, or 10 million ticks in a second.”
(Microsoft.com: http://msdn.microsoft.com/en-us/library/system.datetime.ticks(v=vs.110).aspx )
Here are your formulas:
Seconds: [Ticks] / 10,000,000
Minutes: [Ticks] / 600,000,000 or [Ticks] / 10,000,000 / 60
Hours: [Ticks] / 36,000,000,000 or [Ticks] / 10,000,000 / 60 / 60
Days: [Ticks] / 864,000,000,000 or [Ticks] / 10,000,000 / 60 / 60 / 24
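As a minimal sketch of these conversions (Python here purely for illustration; the tick count would come from wherever your workflow stores the measured duration):
# Convert a .NET tick count (100-nanosecond units) into larger units.
ticks = 45_000_000_000  # example value, e.g. (end - start).Ticks stored by your workflow

seconds = ticks / 10_000_000
minutes = ticks / 600_000_000
hours = ticks / 36_000_000_000
days = ticks / 864_000_000_000

print(seconds, minutes, hours, days)  # 4500.0 75.0 1.25 0.0520833...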
Use metadata columns on your list item to track the start and end time of the workflow -- the workflow can set them whenever you want during execution.
For others who might view this: a SharePoint Designer workflow could also use the same technique.
If you need to eliminate the possibility of someone manually editing the workflow history you create this way, just use an account with elevated privileges (or use an impersonation step if using SPD) to write to a list that contributors don't have access to.
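As a rough sketch of reading those two columns back and computing the duration afterwards (Python over REST shown purely for illustration; the site URL, list name, column names, and authentication below are all placeholders):
# Sketch: read the two tracking columns and compute the workflow duration.
# "WorkflowStart"/"WorkflowEnd" are hypothetical column names -- use the
# internal names of whatever metadata columns you created.
from datetime import datetime
import requests

item_url = ("https://yoursite/_api/web/lists/GetByTitle('YourList')"
            "/items(123)?$select=WorkflowStart,WorkflowEnd")

resp = requests.get(item_url,
                    headers={"Accept": "application/json;odata=verbose"},
                    auth=None)  # supply NTLM/Kerberos/app credentials as appropriate
fields = resp.json()["d"]

# SharePoint returns ISO 8601 timestamps such as "2016-10-01T08:00:00Z".
start = datetime.fromisoformat(fields["WorkflowStart"].replace("Z", "+00:00"))
end = datetime.fromisoformat(fields["WorkflowEnd"].replace("Z", "+00:00"))
print("Workflow duration:", end - start)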
The Microsoft Secure Score API provides a score for Office 365 configurations, along with a list of best practices for securing an O365 tenant. When we fix a reported issue and then retrieve the Secure Score results again, we still get the same old result. Per Microsoft documentation, the Secure Score should be updated daily, but that is not happening. Any idea about its refresh frequency?
https://learn.microsoft.com/en-us/graph/api/resources/securescores
The official document explains it like this:
The score is calculated once per day (around 1:00 AM PST). If you make a change to a measured action, the score will automatically update the next day. It takes up to 48 hours for a change to be reflected in your score.
According to the documentation, the score should be calculated once a day, starting at about 1:00 AM Pacific time, but the job takes several hours to run. There are also instances where the job fails and has to be restarted, which might be why you don't see it updated at exactly the same time every day.
Moreover, it may take up to 48 hours to refresh, so I suggest you wait up to 48 hours to see whether it refreshes.
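If you want to check the refresh yourself, you can poll the Graph endpoint and compare the createdDateTime of the newest snapshots (a sketch; acquiring the bearer token via MSAL/client credentials is omitted, and the required permission is SecurityEvents.Read.All):
# Sketch: list the most recent Secure Score snapshots via Microsoft Graph.
import requests

token = "<access-token>"  # placeholder
url = "https://graph.microsoft.com/v1.0/security/secureScores?$top=5"

resp = requests.get(url, headers={"Authorization": f"Bearer {token}"})
resp.raise_for_status()

for snapshot in resp.json()["value"]:
    # One entry per daily calculation; compare createdDateTime across days
    # to see when the score actually refreshed.
    print(snapshot["createdDateTime"], snapshot["currentScore"], "/", snapshot["maxScore"])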
This line is in my Azure Application Insights Kusto query:
pageViews
| where timestamp between(datetime("2020-03-06T00:00:00.000Z")..datetime("2020-06-06T00:00:00.000Z"))
Each time I run it, I manually replace the datetime values with the current date and the current date minus ~90 days. Is there a way to write the query so that, no matter what day I run it, it uses that day minus 90 days by default?
The reason for 90 is that I believe Azure Application Insights allows a maximum of the most recent 90 days to be exported. In other queries I might choose to use minus 30 days or minus 7 days, if that's possible.
If this is easily spotted in Microsoft documentation and I have missed it in my exploration, I apologize.
Thank you for any insight anyone may have.
IIUC, you're interested in running something like this:
pageViews
| where timestamp between(startofday(ago(90d)) .. startofday(now()))
(depending on your requirements, you can omit the startofday()s, use endofday(), or perform any other datetime manipulation/arithmetic)
It should be easy to use the ago() function. The query is as below:
pageViews
| where timestamp > ago(90d) // d means days here
As for "The reason for 90 is I believe Azure Application Insights allows a maximum of the most recent 90 days to be exported": you can take a look at the Continuous Export feature, which is different from exporting via query, and choose whichever of the two better fits your requirement.
We have SharePoint 2016 hosted on-premises with a minimal set of services running on the server. Resource utilization is very low and the user base is around 100. There are no workflows or other resource-consuming services running.
We use a list to store and update information for certain users, via a form for the end user. Recently, the time taken for an update has increased to over 6 seconds per list item update.
Example:
https://sitename_url/_api/web/lists/GetByTitle('WFListInfo')/items(15207)
This list has about 15 items; the fields are mostly single-line text, number, or DateTime.
The indexing is set to automatic.
As part of the review, we ran a few checks and performed DB indexing on our cluster; however, there is no improvement.
Looking forward to any help / suggestions. Thank you.
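For what it's worth, timing the raw REST call from outside the form can help confirm whether the 6+ seconds is spent on the SharePoint side at all (a rough sketch; authentication is environment-specific, and the same timing approach applies to the update/MERGE request your form issues):
# Sketch: time the REST round-trip for the item, independent of the form.
import time
import requests

url = "https://sitename_url/_api/web/lists/GetByTitle('WFListInfo')/items(15207)"

start = time.perf_counter()
resp = requests.get(url,
                    headers={"Accept": "application/json;odata=verbose"},
                    auth=None)  # e.g. NTLM/Kerberos credentials for your farm
elapsed = time.perf_counter() - start

print(resp.status_code, f"{elapsed:.2f}s")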
I am currently using Microsoft Project 2016 with timesheets.
How can I create a task that is always shown in my timesheet with 0 planned work time?
I have administrative tasks besides my project tasks which I can't plan for an explicit date.
I have already tried to do this with an administrative time category, but that is shown for every user.
Is this currently supported?
How could I implement this feature myself through an add-on?
I've found a workaround for this. You can create a task with any duration (for example from 01.10.2016 to 31.12.2100) and with a total work amount of 0.1 hours. Project will plan 0 hours every day for this duration, and the task is always shown in your timesheet.
[Screenshots: creating the task, and the task appearing in your timesheet]
I am searching for some data on Splunk over a 5-minute time range. I want this query to run every 5 minutes in Splunk on its own. How can this be done? I tried finding it in Splunk, but all I can see is how to schedule alerts and reports. And once the query is activated, how can we access the results it generates?
Technically you can have a scheduled search, but it only makes sense to talk about a report or an alert. Your scheduled approach is actually the best practice (the alternative would be a real-time search over the last 5 minutes).
If you just want a report, you tell Splunk to email it to you either as an HTML table or as a PDF document.
If you only want to be alerted if some condition matches (i.e. more than X results) then you want to set up an alert.
Scheduled searches are available, but they are a bit tricky to access (imho)
In the alerts/reports schedule options you have to set the following:
Earliest: -6m@m
Latest: -1m@m
Cron expression: */5 * * * *
Don't forget to set some trigger condition (for an alert) or a delivery method (for the report) ;)
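If you'd rather create the scheduled search programmatically than through the UI, the same settings can be posted to Splunk's saved-search REST endpoint (a sketch, assuming the default management port 8089; host, credentials, and the search string are placeholders):
# Sketch: create a scheduled search via Splunk's REST API with the same
# schedule settings as above.
import requests

resp = requests.post(
    "https://localhost:8089/services/saved/searches",
    auth=("admin", "changeme"),  # placeholder credentials
    verify=False,                # many dev instances use a self-signed cert
    data={
        "name": "my_5min_search",
        "search": "index=main sourcetype=my_sourcetype | stats count",
        "is_scheduled": "1",
        "cron_schedule": "*/5 * * * *",
        "dispatch.earliest_time": "-6m@m",
        "dispatch.latest_time": "-1m@m",
    },
)
print(resp.status_code, resp.text[:200])
Each scheduled run is dispatched like any other search job, so its results can be retrieved afterwards or delivered via the alert/report actions mentioned above.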