Upload Work Items using Bulk Import in non 'New' State in Azure DevOps

We have a handful of User Stories and associated Tasks that we want to upload to Azure DevOps (ADO). For the upload we are using ADO's Bulk Work Item Import feature.
The issue is that ADO allows uploading these work items only in the 'New' state, but our User Stories and Tasks are already in progress or closed.
How can we upload Work Items in non-'New' states such as 'Closed' or 'Active'?
I don't want to first upload them all in the 'New' state and then update them using the sheet, as that would be too much work given how much data we have.
Thanks,
Shubham

All work items you import are created in the 'New' state. This rule means that you can't specify field values that don't meet the field rules for that state.
Please refer to the documentation: Import or update work items in bulk by using CSV files.
I'm afraid you can't import work items in the 'Closed' or 'Active' state.
You could submit a feature request for this.
However, there is a workaround:
1. Upload your work items in the 'New' state, without manually changing the state to 'Closed' or 'Active' (file csv1).
2. Export the work items uploaded in step 1 to a CSV file; you can create a query for this (file csv2).
3. Make sure the work items are in the same order in both files, csv1 and csv2.
4. Batch-copy the State column values of all work items in csv1 into the State column of csv2, then save csv2 (see the sketch after this list).
5. Import csv2 to update the existing work items.
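For step 4, a small script saves the manual copying. Below is a minimal C# sketch that copies the State column from csv1 into csv2 row by row; it assumes both files have a header row, the rows are in the same order (step 3), and no field contains an embedded comma. The file names are placeholders.

using System;
using System.IO;

class CopyStateColumn
{
    static void Main()
    {
        // csv1 holds the real states; csv2 is the export with 'New' everywhere.
        string[] csv1 = File.ReadAllLines("csv1.csv");
        string[] csv2 = File.ReadAllLines("csv2.csv");

        // Locate the State column in each file from the header row.
        int stateIn = Array.IndexOf(csv1[0].Split(','), "State");
        int stateOut = Array.IndexOf(csv2[0].Split(','), "State");

        for (int i = 1; i < csv1.Length && i < csv2.Length; i++)
        {
            string[] src = csv1[i].Split(',');
            string[] dst = csv2[i].Split(',');
            dst[stateOut] = src[stateIn]; // overwrite 'New' with the real state
            csv2[i] = string.Join(",", dst);
        }

        File.WriteAllLines("csv2.csv", csv2);
    }
}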

Related

Azure DevOps CSV import issue [default state 'Done', Closed Date field]

Currently I'm trying to import approx. 1600 tasks from one Azure DevOps Kanban board to another (via browser). I've run into two problems while doing this:
It is not possible to import tasks with the Closed Date field; it says:
Value of a readonly field Closed Date was modified. Please revert the Closed Date values or remove the column from the input file and try to import again.
Removing the "Closed Date" column causes an issue with the default state "Done"; it says:
The field 'State' contains the value 'Done' that is not in the list of supported values.
I've manually copied the project settings from the old organization to the new one, such as:
Process \ State (and assigned it to the project)
Time and Locale as they were set in the old Kanban
Also, I had to remove the ID column, because it caused an error.
It is still not possible to import all these tasks because of these issues. I'm looking for any tips on how to fix this.
Below is an example of the CSV:
Work Item Type,Title,Created By,Created Date,Assigned To,State
Issue,Some test task number 1,TestCreator,02.06.2020 15:19:24,TestWorker,"Done"
Issue,Some test task number 2,TestCreator,02.06.2020 15:20:23,TestWorker,"Done"
When importing work items via CSV file in Azure DevOps, first make sure the following statements are true:
1. The source project you export from and the target project you import into have the same work item type configuration; that is, they need to share the same process configuration (work item types, states, and so on).
2. All work items you import are created in the 'New' state. This rule means that you can't specify field values that don't meet the field rules for that state. (That's why "Done" is reported as not being in the list of supported values.)
3. Make sure you don't assign IDs to new work items that you're adding; work item IDs are unique within an Azure DevOps organization. (This is why the ID column caused an error.)
With the above prerequisites met, the import process loads the imported work items into the queries view in an unsaved state. No IDs are assigned. Verify the results are what you want, then choose Save Items to save the work items.
If you still get errors when importing, try reducing the number of work items per import attempt. Since you have approx. 1600 tasks, you could split them into several smaller files and import them one at a time, as sketched below.
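Here is a small C# sketch of that batching: it splits one exported CSV into parts of 200 rows each, repeating the header row in every part. The file name and batch size are arbitrary, and it assumes no field contains an embedded line break.

using System;
using System.IO;

class SplitCsv
{
    static void Main()
    {
        const int batchSize = 200;
        string[] lines = File.ReadAllLines("tasks.csv");
        string header = lines[0];

        for (int start = 1, part = 1; start < lines.Length; start += batchSize, part++)
        {
            int count = Math.Min(batchSize, lines.Length - start);
            string[] batch = new string[count + 1];
            batch[0] = header; // every part keeps the header row
            Array.Copy(lines, start, batch, 1, count);
            File.WriteAllLines($"tasks_part{part}.csv", batch);
        }
    }
}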

Logic App that returns the newly generated ID back to the original source

Hello, I am trying to create a Logic App that:
Extracts data from CosmosDB, using a query
Loops over the results
Pushes the results data into CRM
Side note: once this data is pushed into CRM, CRM automatically generates an ID for each record.
My biggest dilemma is figuring out how to return the newly generated ID back to the original Cosmos DB container it was pulled from.
I have started on this and these are my questions:
Does this first part of what I am doing look correct? I must use this SQL query for it:
a. I am simply telling the Logic App to extract data from a particular container inside CosmosDB
b. Then I would like to loop over the results I just obtained, and push this to CRM
c. My dilemma is:
Once data is pushed to CRM, CRM automatically generates an ID for each record that is uploaded. How would I then return the updated ID to Cosmos DB?
Should I create a variable that stores the IDs, then replace the old IDs with the new ones?
I am not sure how to construct/write this logic within LogicApps and have been researching examples of this.
Any help is greatly appreciated.
Thanks
If the call to your CRM system returns the ID you are talking about, then I would just add one additional action in your Azure Logic App loop to update the record you read from Azure Cosmos DB. Given that you are doing a SELECT * on the container, you should have the whole original document.
Add the 'Create or update document' action as a step, with a reference to the THFeature container along with your database ID, and then provide the new values for the document.
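For illustration, here is a minimal C# sketch of what that write-back amounts to, expressed with the .NET Cosmos SDK (Microsoft.Azure.Cosmos). The database name, the document shape, and the crmId property are assumptions; only the THFeature container name comes from the question.

using System.Threading.Tasks;
using Microsoft.Azure.Cosmos;

// Hypothetical shape of the stored document; adjust to your schema.
public class FeatureDoc
{
    public string id { get; set; }
    public string partitionKey { get; set; }
    public string crmId { get; set; } // holds the CRM-generated ID
}

public class WriteBackCrmId
{
    // Upsert the original document with the CRM ID filled in, which is
    // what the 'Create or update document' action does in the flow.
    public static async Task UpdateDocumentAsync(
        CosmosClient client, FeatureDoc original, string crmGeneratedId)
    {
        Container container = client.GetContainer("YourDatabase", "THFeature");
        original.crmId = crmGeneratedId;
        await container.UpsertItemAsync(original,
            new PartitionKey(original.partitionKey));
    }
}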
BTW, your SELECT query looks strange; you should avoid slow cross-partition queries if you can.

SharePoint Online: prevent users from editing old list items after a specific time

We have a SharePoint Online site with a list. Users can add, edit, and delete their own items in this list.
How can I implement a new requirement: users must not be able to edit or delete their own items that are older than one week (from the creation date)?
Is this possible without coding a Remote Event Receiver or an Azure Function?
To my knowledge, there is no OOTB way to achieve this; we have to write some custom code.
If you only want to hide some items from the list view based on a condition, you can use a list view filter. Note: users can still access the item form directly through its URL.
I suggest you use a remote event receiver or an Azure Function to achieve this requirement.
In a classic site, you can also add some JavaScript code to the master page to prevent users from accessing the edit form page and to hide the delete button on the page.
For this kind of requirement, the 'no code' solution in SharePoint Online is Power Automate (previously MS Flow).
The best option in your case would be to create a recurring flow that starts, for example, every day at 00:10 AM; please see the MS docs describing how to do this.
The flow should get all items from a specific list on a specific site; you can do that with the 'Get items' step. After that, use an 'Apply to each' step to check each item's creation date with a 'Condition' step. When an item is older than one week, you can use additional steps to break permission inheritance on the item and either remove all groups from its permissions or change the permissions on the item from edit to read. I found an interesting article with predefined steps showing how to remove all permissions and how to remove and add a group. To tell you the truth, I was not aware these kinds of steps already exist in Power Automate (maybe something new :) ).
Alternatively, I would use a 'SharePoint HTTP request' step and the SharePoint REST API to remove permissions on the item and change a group's role from edit to read. It is a similar procedure to breaking permissions on a whole list; the only difference is to break permissions not on the whole list but on a specific item, which you get by ID. You can make all of those API requests with the 'SharePoint HTTP request' step; the underlying calls are sketched below.
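Roughly, these are the two REST calls behind that step, shown here as a C# HttpClient sketch so the URLs are concrete. The site URL, list name, item ID, and principal ID are placeholders, and authentication is omitted because the 'SharePoint HTTP request' step handles it for you. Role definition ID 1073741826 is commonly the built-in Read level, but verify it against /_api/web/roledefinitions in your tenant.

using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class ItemPermissions
{
    static async Task MakeItemReadOnlyAsync(HttpClient http,
        string site, string list, int itemId, int principalId)
    {
        http.DefaultRequestHeaders.Accept.Add(
            new MediaTypeWithQualityHeaderValue("application/json"));

        string item = $"{site}/_api/web/lists/getbytitle('{list}')/items({itemId})";

        // 1. Break inheritance on this one item, not the whole list.
        await http.PostAsync(
            item + "/breakroleinheritance(copyRoleAssignments=false,clearSubscopes=true)",
            null);

        // 2. Grant the group (principalId) Read instead of Edit.
        await http.PostAsync(
            item + $"/roleassignments/addroleassignment(principalid={principalId},roledefid=1073741826)",
            null);
    }
}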
Please be aware that there is usually a threshold of 5,000 unique permission scopes per list in SharePoint Online (50,000 on-premises); check the latest documentation on SharePoint Online limits. So a better option would be to create a folder with view-only permissions for the users beforehand, and when an item is older than one week, move it into that folder, also with Power Automate. That way all the files (items) in this folder would be read-only, and you would have only one unique permission scope in the list, not one for every old item.
Some other options you could use:
Another code solution, though not an RER or Azure Function, would be to create a console app that uses CSOM to look through the items in this list and either break permissions on each item or move it to a folder (like the flow), and schedule this CSOM app on some local or remote server in Task Scheduler. Of course, you would need a local (or other) server for it that is always up and running :)
If you would like to do a little coding and by any chance you are still using the classic UI (not modern), you could use JSLink attached to the content type or to some column like Title and block editing of fields in the edit forms, etc. (This is not the best option, as users may create a view without the JSLink column and get around it, or modify the item using CSOM or the REST API.)
You might consider using PowerApps (in case you have it configured in your subscription) to customize the edit form.
Then you add validation logic to prevent a user from submitting the form when (Created + 7 days) <= Today(), or something like that.

Automatically copy list data from one list to another list after 1 month

I have a list that stores data dynamically once an InfoPath form has been submitted. I want to archive this data 30 days after the creation date of each item. Can you please suggest how I would do this? I know I can do this with a workflow, but how can I set the condition so that once 30 days have passed after an item's creation, it is automatically copied into the other list?
First I would want to question WHY you want to move it to another list.
Why not simply set up a view on your main list showing only records created in the last 30 days:
Created >= [Today]-30
(And yes, you can use Today in view filters)
The best way to proceed is to create a timer job and define the criteria (such as when it should run) while creating the job. There is a very helpful post by a SharePoint MVP on how to create custom jobs. Note: you can test the job and business logic with a shorter duration, and if it works fine, you can simply extend the schedule to 30 days.
It's very simple: all you have to do is define your 'copy list' logic in the Execute method of the job class, as in the skeleton below.
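For reference, here is a bare skeleton of such a job, assuming the server-side object model (Microsoft.SharePoint.dll) is available; the class name, site URL, and list names are hypothetical.

using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Administration;

public class ArchiveOldItemsJob : SPJobDefinition
{
    public ArchiveOldItemsJob() : base() { }

    public ArchiveOldItemsJob(string jobName, SPWebApplication webApp)
        : base(jobName, webApp, null, SPJobLockType.Job) { }

    public override void Execute(Guid targetInstanceId)
    {
        // The 'copy list' logic goes here; it runs on the schedule
        // you register for the job.
        using (SPSite site = new SPSite("http://server/sites/yoursite"))
        using (SPWeb web = site.OpenWeb())
        {
            SPList source = web.Lists["Source"];
            SPList archive = web.Lists["Archive"];
            // ... query items older than 30 days and copy them across
        }
    }
}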
Good luck
I believe you can simply use standard out-of-the-box workflow activities to do this - no need for timer jobs or console apps.
Using SharePoint Designer, create a new workflow (New -> SP Content -> Workflow). Specify the list to attach the workflow to.
Specify that the workflow should start on item creation only.
In the first Step, add an action: Pause for duration. Make the duration 30 days.
Add another action: Copy list item. Specify the list to copy to.
Optionally: create a third action: Delete list item if you want the original item to be deleted from the original list.
And I think that's it :)
You have to create a timer job that runs daily and moves the items that are one month old.
Potentially the easiest solution is to create a console application that you schedule to run on the SharePoint server (an alternative to creating a timer job).
Depending on how you want to archive the data, you could:
Get the list items (probably using GetItems), save them to a file or a DB, and delete the old items (a sketch follows this list)
Use custom STSADM commands to export the list (back up just the list)
Back up the entire site using STSADM
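Here is a rough sketch of the first option, using the server-side object model; the site URL and list names are hypothetical, and the CAML <Today OffsetDays='-30'/> filter selects items created more than 30 days ago.

using System;
using Microsoft.SharePoint;

class ArchiveOldItems
{
    static void Main()
    {
        using (SPSite site = new SPSite("http://server/sites/yoursite"))
        using (SPWeb web = site.OpenWeb())
        {
            SPList source = web.Lists["Tasks"];
            SPList archive = web.Lists["TasksArchive"];

            SPQuery query = new SPQuery
            {
                Query = "<Where><Lt><FieldRef Name='Created'/>" +
                        "<Value Type='DateTime'><Today OffsetDays='-30'/></Value>" +
                        "</Lt></Where>"
            };

            // Copy writable field values into new items in the archive list.
            foreach (SPListItem item in source.GetItems(query))
            {
                SPListItem copy = archive.Items.Add();
                foreach (SPField field in source.Fields)
                {
                    if (!field.ReadOnlyField && field.InternalName != "Attachments")
                        copy[field.InternalName] = item[field.InternalName];
                }
                copy.Update();
            }

            // Delete the originals in reverse order so indexes stay valid.
            SPListItemCollection old = source.GetItems(query);
            for (int i = old.Count - 1; i >= 0; i--)
                old[i].Delete();
        }
    }
}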
There's a list view threshold of 5,000 items per list, and if you raise it in administration you'll kill performance. There are perfectly valid reasons to want to archive old items off a rapidly growing list; see 'exceeds the list view threshold 5000 items in SharePoint 2010'.

Access 2007 integration with SharePoint 2007 Tasks list

A customer of ours has an Access 2007 application with a form for creating tasks for upload to a SharePoint task list. The user fills in the form (title, status, priority, start date, due date), then places check marks next to the SharePoint user names that this task must be assigned to (one task per SP user selected). This data is aggregated into a TaskQueue table, and the tasks are added to the SharePoint list successfully (through a linked list, I think). The problem is that we need to include zero or more attachments with each task item. Is there a way to do this through a macro, VBA, or some other built-in functionality that I haven't learned about yet?
My initial idea was to use a C# Windows service that monitors this TaskQueue table, then uses the Lists.asmx SharePoint web service and its AddAttachment method, given the list item ID and the NTFS path to the attached file, to add the attachments to the task list item in SharePoint.
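For what it's worth, that service call is small. Below is a minimal sketch of the idea, assuming a classic web reference to /_vti_bin/Lists.asmx that generates a proxy class named ListsService.Lists; the site URL, list name, item ID, and file path are placeholders.

using System;
using System.IO;
using System.Net;

class AttachmentUploader
{
    static void Main()
    {
        var lists = new ListsService.Lists
        {
            Url = "http://server/sites/yoursite/_vti_bin/Lists.asmx",
            Credentials = CredentialCache.DefaultCredentials
        };

        // NTFS path taken from the TaskQueue table (placeholder here).
        string filePath = @"\\fileserver\share\spec.docx";
        byte[] contents = File.ReadAllBytes(filePath);

        // AddAttachment(listName, listItemID, fileName, attachment)
        // returns the URL of the new attachment on success.
        string url = lists.AddAttachment(
            "Tasks", "42", Path.GetFileName(filePath), contents);

        Console.WriteLine("Attached: " + url);
    }
}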
After playing around with Access and setting up a linked table to a task list in SharePoint, I found that you can add attachments through the Access 2007 datasheet view. The problem is that you can only select one user or SP group in the Assigned To field, and they have a lot of repetitive tasks to assign to a bunch of separate people; that's why they developed this form. If anyone has an idea on how to solve this issue, please let me know. Also, does anyone know of any good Access 2007/SharePoint integration resources?
Thanks in advance!
1. Have the attachments upload as part of the Access form.
2. Load the attachments into a document library.
3. Check off users as is currently being done.
4. Add hyperlinks to the attachments uploaded in step 2 in the Description (rich text) field (maybe done automatically in steps 1-2).
5. Leave the TaskQueue table alone.
This way, 0..n documents can be included. The task list just stores structured data, the documents are stored once in a document library, and you don't get runaway growth when attaching 1 document to 5 different tasks (which would result in 5 copies of the document).
