I am currently using Microsoft Project 2016 with timesheets.
How can I create a task which is always shown in my timesheet with 0 planned work time?
I have administrative tasks beside my project tasks which I can't plan for an explicit date.
I have already tried to do this with an administrative time category, but that is shown for every user.
Is this currently supported?
How could I implement this feature myself through an add-on?
I've found a workaround for this. You can create a task with any duration (for example from 01.10.2016 to 31.12.2100) and with a total work amount of 0.1 hours. Project will plan 0 hours every day for this duration, and the task is always shown in your timesheet.
Creating the task:
Appears in your timesheet:
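The arithmetic behind this workaround can be sketched in Python. The dates and the 0.1 h figure come from the answer above; that Project rounds the per-day share down to 0.00 in the timesheet display is an assumption about its rounding behaviour:

```python
from datetime import date

# Task parameters from the workaround above
start = date(2016, 10, 1)
finish = date(2100, 12, 31)
total_work_hours = 0.1

duration_days = (finish - start).days + 1  # inclusive of both end dates
per_day_hours = total_work_hours / duration_days

# Spread over roughly 30,000 days, 0.1 h is a few microseconds of work per
# day, which a timesheet showing hours to two decimals displays as 0.00.
print(round(per_day_hours, 2))  # 0.0
```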
With an Azure Data Factory "Tumbling Window" trigger, is it possible to limit the hours of each day that it triggers during (adding a window you might say)?
For example I have a Tumbling Window trigger that runs a pipeline every 15 minutes. This is currently running 24/7 but I'd like it to only run during business hours (0700-1900) to reduce costs.
Edit:
I played around with this, and found another option which isn't ideal from a monitoring perspective, but it appears to work:
Create a new pipeline with a single "If Condition" step with a dynamic Expression like this:
@and(greater(int(formatDateTime(utcnow(),'HH')),6),less(int(formatDateTime(utcnow(),'HH')),20))
In the true case activity, add an Execute Pipeline step executing your original pipeline (with "Wait on completion" ticked)
In the false case activity, add a wait step which sleeps for X minutes
The longer the false-case wait, the further a run can encroach on the edges of your window, so adjust the sleep time to match your trigger interval.
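The If Condition expression can be sanity-checked with a plain-Python equivalent. The 6/20 bounds mirror the expression above; note they admit runs from 07:00 up to 19:59 UTC:

```python
from datetime import datetime, timezone

def in_business_window(hour: int) -> bool:
    """Mirror of the expression: and(greater(hour, 6), less(hour, 20))."""
    return hour > 6 and hour < 20

# Evaluate against the current UTC hour, as utcnow() does in the expression
now_hour = datetime.now(timezone.utc).hour
run_pipeline = in_business_window(now_hour)
```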
I need to give it a couple of days before I check the billing on the portal to see if it has reduced costs. At the moment I'm assuming a job which just sleeps for 15 minutes won't incur the costs that one running and processing data would.
There is no easy way, but you can create two deployment pipelines for the same job in Azure DevOps, and as soon as your 0700-1900 window expires, replace the job with a dummy job using an Azure DevOps pipeline.
We have SharePoint 2016 hosted on-prem with a minimum set of services running on the server. Resource utilization is very low and the user base is around 100. There are no workflows or any other resource-consuming services running.
We use a list to store and update information for certain users, with a form for the end user. Recently, the time taken for a list data update has increased to over 6 seconds.
Example:
https://sitename_url/_api/web/lists/GetByTitle('WFListInfo')/items(15207)
This list has about 15 fields, mostly single lines of text, numbers, or DateTime values.
The indexing is set to automatic.
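To narrow down where the 6 seconds go, it can help to time the bare REST call outside the form. This sketch builds the endpoint from the question and wraps a call with a timer; the actual MERGE update (and whatever authentication your farm uses) is left as a comment because it depends on your environment:

```python
import time
from urllib.parse import quote

def item_endpoint(site_url: str, list_title: str, item_id: int) -> str:
    """Build the REST endpoint used in the question for a single list item."""
    return f"{site_url}/_api/web/lists/GetByTitle('{quote(list_title)}')/items({item_id})"

def timed(fn):
    """Measure how long a single call takes, to confirm the ~6 s figure."""
    start = time.perf_counter()
    result = fn()
    return result, time.perf_counter() - start

url = item_endpoint("https://sitename_url", "WFListInfo", 15207)
# An actual MERGE update would go here, e.g. with the requests library and
# whatever auth your farm uses (both assumptions about your environment):
#   resp, seconds = timed(lambda: session.post(url, json=payload, headers={
#       "X-HTTP-Method": "MERGE", "IF-MATCH": "*",
#       "Content-Type": "application/json;odata=verbose"}))
```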
As part of the review, we conducted a few checks and DB indexing on our cluster, however there is no improvement.
Looking forward to any help / suggestions. Thank you.
I am integrating Asana project metrics with our help desk dashboard. I would like to show 3 numbers for each project:
- Total tasks in project
- Total completed tasks in project
- Total incomplete tasks in project
When I call the project/tasks API, I want to simply get a count, and not have to retrieve all the pages and programmatically count the tasks. Is there any parameter for the API calls which just gets me a count of how many tasks match the criteria?
Thanks,
Craig
Unfortunately, the Asana API doesn't currently have the type of filtering where you can query to a subset of tasks that match an arbitrary pattern that you specify (i.e. "only the tasks where completed=true"). We also don't have an easy way to only get the completed tasks. You can get all incomplete tasks fairly easily by specifying completed_since=now on the tasks query endpoint - which is admittedly a bit strange, but works - but its converse (get only completed tasks) doesn't.
We are evaluating use cases for more filtering options, so you might see it at some point! For now, however, the only way to go about this is to get all of the tasks for a project and count them on your side.
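Counting on your side can be sketched like this. The counting helper is pure Python; the endpoint path and `opt_fields` parameter follow the Asana REST API, but the token handling and pagination details in the comment are assumptions for illustration:

```python
def count_tasks(tasks):
    """Given task records with a boolean 'completed' field, return the three
    dashboard numbers: total, completed, and incomplete."""
    total = len(tasks)
    completed = sum(1 for t in tasks if t.get("completed"))
    return {"total": total, "completed": completed, "incomplete": total - completed}

# Fetching would look roughly like this (requests + a personal access token
# are assumptions; keep following next_page until it is null):
#   resp = requests.get(
#       f"https://app.asana.com/api/1.0/projects/{project_gid}/tasks",
#       params={"opt_fields": "completed", "limit": 100},
#       headers={"Authorization": f"Bearer {token}"})
```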
I have a client request for my Data Factory solution.
They want to run the Data Factory whenever an input file becomes available in Blob Storage (or any other location). To be clear, they don't want to run the solution on a schedule, because on some days the file won't show up. So I need some intelligence that checks whether a file is available to process in that location. If it is, the Data Factory solution should run and process the file; if not, there is no need to run it.
Thanks in Advance
Jay
I think you currently have 3 options for dealing with this. None of them is exactly what you want...
Option 1 - use C# to create a custom activity that does some sort of checking on the directory before proceeding with other downstream pipelines.
Option 2 - Add a long delay to the activity so the processing retries for the next X days. Sadly, only a maximum of 10 long retries is currently allowed.
Option 3 - Wait for a newer version of Azure Data Factory that might allow the possibility of more event driven activities, rather than using a scheduled time slice approach.
Apologies that this isn't exactly the answer you want, but those are the current options.
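The check in option 1 can be sketched as a small gate that decides whether to kick off the pipeline. The file-name pattern and the listing call are assumptions; with the azure-storage-blob SDK you could list the container's blobs, but any listing mechanism works:

```python
import fnmatch

def files_to_process(blob_names, pattern="input/*.csv"):
    """Return the matching input files; an empty list means skip the run.
    The 'input/*.csv' pattern is a placeholder for your real naming scheme."""
    return [name for name in blob_names if fnmatch.fnmatch(name, pattern)]

# With the azure-storage-blob SDK (an assumption about your setup), the
# listing would look roughly like:
#   container = BlobServiceClient.from_connection_string(conn) \
#       .get_container_client("inbox")
#   matches = files_to_process([b.name for b in container.list_blobs()])
#   if matches: run the Data Factory pipeline; otherwise skip this slice
```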
How can I find out the duration of a SharePoint workflow? (IE: How long that workflow took to run from initialization to end?)
I could take the item created by date and subtract it with the date today, but that won't work because the workflow can be started manually.
What's the best way of achieving this?
Please, no "SharePoint designer workflow" solutions - this is a visual studio workflow.
About ticks:
“A single tick represents one hundred nanoseconds or one ten-millionth of a second. There are 10,000 ticks in a millisecond, or 10 million ticks in a second.”
(Microsoft.com: http://msdn.microsoft.com/en-us/library/system.datetime.ticks(v=vs.110).aspx )
Here are your formulas:
Seconds: [Ticks] / 10,000,000
Minutes: [Ticks]/ 600,000,000 or [Ticks]/ 10,000,000/ 60
Hours: [Ticks] / 36,000,000,000 or [Ticks] / 10,000,000 /60 / 60
Days: [Ticks]/864,000,000,000 or [Ticks] / 10,000,000 /60 / 60 / 24
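The formulas above translate directly to code; a quick sketch using the constants from the quoted definition:

```python
TICKS_PER_SECOND = 10_000_000  # one tick = 100 ns

def ticks_to(ticks, unit="seconds"):
    """Convert a tick count to seconds, minutes, hours, or days."""
    divisors = {
        "seconds": TICKS_PER_SECOND,
        "minutes": TICKS_PER_SECOND * 60,
        "hours": TICKS_PER_SECOND * 60 * 60,
        "days": TICKS_PER_SECOND * 60 * 60 * 24,
    }
    return ticks / divisors[unit]

# e.g. a workflow whose end ticks minus start ticks is 36,000,000,000
# ran for exactly one hour
```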
Use a metadata column on your list item to track start and end time of the workflow -- which the workflow can set whenever you want during execution.
A designer workflow could also use the same technique, for others who might view this.
If you need to eliminate the possibility of someone manually editing the workflow history you create this way, just use an account with elevated privileges (or use an impersonation step if using SPD) to write to a list that contributors don't have access to.