How to get the start date of the current iteration in Azure DevOps?

I would like to run a query in Azure DevOps with the following conditions:
work item = Task and
created date < ["Start date of #CurrentIteration"] and
iteration path = #CurrentIteration
How can I express this condition (["Start date of #CurrentIteration"]) in Azure DevOps?

As far as I know, there is no predefined macro that represents the start date of #CurrentIteration. We have to go to the Sprints tab manually to get the start date of the current iteration and then use that value in the query.
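For example, assuming the current sprint started on 2024-01-01 (substitute the value you read from the Sprints tab), a rough WIQL sketch of the query could look like the lines below. Field reference names may vary with your process template, and note that in WIQL the macro is written @CurrentIteration (it also needs a team context when run outside the query editor):
SELECT [System.Id], [System.Title]
FROM WorkItems
WHERE [System.WorkItemType] = 'Task'
AND [System.CreatedDate] < '2024-01-01'
AND [System.IterationPath] = @CurrentIteration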
You can check this document for more details: To list work items based on when they were created, closed, resolved, or changed state—use #Today or specify dates. For queries that list work items based on their assignment to a team's current sprint, use #CurrentIteration.
For now we don't have a macro like #StartOfCurrentIteration, but it would be useful to have one that represents the start date of #CurrentIteration. So I suggest you post a feature request (Suggest a feature) for this macro in the Developer Community forum.
Thank you for helping us build a better Azure DevOps. I hope my answer helps. :)

Related

Can someone help me create a Power Automate flow?

A Power Automate flow triggered on a daily schedule will check the list for any tickets that meet (or exceed) the due date and submit a ticket to Orion via email. The last run and due date will then be updated. The system may be used to trigger tickets in a supplier interface in the future, so the destination system should be considered in the list schema.
I am having a hard time developing a flow that meets the above requirement.
In Power Automate, create a scheduled flow that runs every day.
Declare all the variables you will need later.
Assuming you store the last run timestamp in a configuration SharePoint list, for example, use the Get item action to retrieve the list item where the last run time is stored (you'll have to create this list if it doesn't exist yet).
Use the utcNow() function in an expression to get the current datetime value.
Convert that datetime to your time zone with the Convert time zone action, and format it as an ISO 8601 string with the formatDateTime() function.
Use the Get items action to retrieve the due items from your SharePoint list, filtering the query with an OData expression according to the rule (due date on or before today's date), as sketched below.
Use Send an email (V2) to send the email.
Use the Update item action to write the new last run time back to the list item in the configuration list mentioned at the beginning.
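A rough sketch of the expressions those steps rely on, assuming a variable named TodayLocal and a list column named DueDate (both names are hypothetical, as is the time zone; use your own):
Set variable TodayLocal: convertTimeZone(utcNow(), 'UTC', 'Eastern Standard Time', 'yyyy-MM-dd')
Filter Query on Get items: DueDate le '@{variables('TodayLocal')}'
You would then typically loop over the returned items with an Apply to each and send one email per ticket.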

Azure DevOps query for today and failed or resolved work items

How can I see, on a dashboard, the work items that failed today? In short, the query should give me work items whose state changed to Failed today.
According to your description, you could try to create a shared query that filters on the work item state and the state change date.
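A WIQL sketch of such a clause, assuming your process has a state literally named 'Failed' (state and field names vary by process template):
SELECT [System.Id], [System.Title], [System.State]
FROM WorkItems
WHERE [System.State] = 'Failed'
AND [Microsoft.VSTS.Common.StateChangeDate] >= @Today
WIQL also accepts relative dates such as @Today - 7 if you want to look back a few days rather than only at today.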
And then add this shared query to the dashboard by using the "Query Results" widget.
To list work items based on when they were created, closed, resolved, or changed—you can specify a date or use a supported macro. Use the #Today macro and specify a plus or minus number of days for relative dates.
For more information, you could refer to this documentation.

How to create a report subscription in SSRS which passes today's date to parameters?

I created a report with StartDate and EndDate parameters. If I want to see the information for a single day, I use the same date in both parameters. I now want to create a subscription for this report so that it runs every day. How can I use the current date and pass it to these parameters when the report runs? Thanks!
Step 1: You will have your defaultDates dataset as shown below. It can be an inline query, or you can wrap it in a stored procedure.
Select TodaysDate = cast(getdate() as date)
Step 2: Then, under the default values for both parameters, choose to get the value from a query and point it at this dataset, defaultDates.
Step 3: Test it locally. Make sure to delete the cached .data files from your working directory to force fresh data.
Step 4: Build and deploy to whatever test location you use.
EDIT: This will only work with Enterprise edition.
First, write a query that gets the current date and formats it to match the VALUE in your parameter (for example, is it DD-MM-YYYY, YYYY-MM-DD?). Make sure to name your column something meaningful like "CurrentDate".
select cast(current_timestamp as date) as CurrentDate
Then create a new subscription for your report. Instead of a standard subscription, choose a data-driven subscription. Now select your SQL data source and paste in your query. Press Validate to make sure it runs fine. Hit OK.
Now you can go down to your subscription parameters at the very bottom of the page. Set Source of Value to be "Get value from dataset" then pick your "CurrentDate" from the drop down.
That's it, data driven subscription with current date.

Importing data from new reports and automatically updating existing records in the existing data

I seek your expert advice in accomplishing a work-related task.
Task: Perform analysis on the monthly and weekly reports obtained from WorkSafe and extract valuable information.
For example:
Number of injuries per month, drilled down by department and division.
Total days lost in the year.
Count of each type of claim.
Possible return date.
So I receive these reports and add some modified columns to them, like corrected employee names and their IDs, just to create a relationship with the employee database in Power Pivot so that I can get their position, department, and division.
Now, every month the new report may have 2 or 3 new claims added to it, and some existing claims with updated info, like an updated return-to-work date, short-term disability days, etc.
Currently I go through them manually, and it's really time-consuming and tiring. If the older claims weren't getting updates, I could have just imported from a folder using Power Query and added steps to remove duplicates. However, if I remove duplicate claims using Power Query now, I'll basically be removing the same claims that carry the updated info.
Could anyone here suggest an efficient way to do this with Power Query or another approach?
Thanks in advance. I'd appreciate your time and effort.
If you use Power Query and select your source via From File -> From Folder, when you choose Combine & Edit you will get a table whose first column is named Source.Name, which you can use to differentiate the updates (see the sketch at the end of this answer).
For instance: if I start with two Excel files in the same folder (in your case they could be the source files from different dates)...
WS1.xlsx:
WS1 - Copy.xlsx:
Then I use the folder they are both in as the source...
(Navigate to your folder as appropriate.)
...and select Combine & Edit...
...and select the worksheet...
Then I get this:
...and it is clear what information came from what source file.
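Building on that, one way to keep only the newest version of each claim is to sort the combined table so the most recent file's rows come first and then take the first row per claim. A minimal M sketch, assuming the combined query is called "Combined Claims", that it has a ClaimID column, and that newer files sort later by Source.Name (all three are assumptions; a real report-date column is a safer sort key than the file name):
let
    Source = #"Combined Claims",
    // newest file's rows first; Table.Buffer pins the order before deduplicating
    Sorted = Table.Buffer(Table.Sort(Source, {{"Source.Name", Order.Descending}})),
    // Table.Distinct keeps the first row it sees for each ClaimID, i.e. the newest version
    Latest = Table.Distinct(Sorted, {"ClaimID"})
in
    Latest
Claims that stop appearing in newer files are still kept, since their only rows come from the older files.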

SharePoint Nintex Workflow Run Multiple Conditions

I was hoping you could help me. I have a calendar in SharePoint 2010 on which I want to run a Nintex workflow when the following conditions are true:
A new item is created
The "Type of Leave" field equals "Vacation". This field I created as just a category for the types of requests.
The date that the item was created for already has an entry with the "Type of Leave" field equaling "Vacation."
The use of this will be that the first person to enter vacation on a day will be automatically granted. But if more than one person enters vacation on the day that another vacation day is on, it will go through an approval process through their manager.
Here's what I've tried/reviewed:
- Using a condition within the workflow. I can't find an option that would accomplish this.
- I did try looking online for different solutions. I've watched some tutorials, but nothing covered this kind of scenario.
- One thought I did have was to add a calculated column that counts how many entries exist for the same day with "Vacation" and then use that in a condition when the count is greater than 1, but I couldn't figure out the syntax.
This is on SharePoint 2010.
Thank you!
Have you considered using the REST interface to query the list from Nintex:
http://msdn.microsoft.com/en-us/library/office/ff521587(v=office.14).aspx
You can call this service from Nintex using the Call web service action.
Apply the filters on the URL as per the documentation above and count the records returned; you can then include that count in the Nintex condition.
I am not sure if Nintex supports calls to RESTful services (from memory, I think it does).
If not, you can use the SOAP web service; it's the same principle as above, just the parameters to call it are slightly more complicated:
http://msdn.microsoft.com/en-us/library/lists.lists.getlistitems(v=office.12).aspx
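For the REST route, the Call web service action would issue a GET against the list data service with an OData filter, and you would count the results. A rough sketch of the URL (the list name LeaveCalendar and the column names are made up; check the real internal names in the service's metadata):
http://yoursite/_vti_bin/listdata.svc/LeaveCalendar?$filter=TypeOfLeave eq 'Vacation' and EventDate ge datetime'2015-06-01T00:00:00' and EventDate lt datetime'2015-06-02T00:00:00'
The number of entries returned for the current item's date is what you feed into the workflow condition (whether that count includes the newly created item itself depends on when the workflow runs, so set the threshold accordingly).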
I don't think there's any need for REST. Once your workflow starts, query the list for items matching the current item's booking date and put the result in a collection. You can query the collection's length, and if it's greater than 0 you can use that condition to steer the logic of your workflow.
