How to copy work item tree using API in Azure DevOps

How to copy a work item tree using the API in Azure DevOps?
I need to copy work items in Azure DevOps by using the API.
I need to copy a whole work item tree (children and sub-children), for example a Feature with its user stories and tasks.
The entire work item tree needs to be copied.
If there is another method to copy work items without the API, that is also fine.

Currently I can't find a REST API that implements this directly. One option is to use the API to get all of the work items' information and relationships and then create the new work items from that, as sketched below. In addition, since the Sprint 172 Update you can use Copy work item with the option to copy children. But this feature isn't recursive; the sub-children need to be copied manually.
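If you go the API route, the flow is: GET the source work item with $expand=relations, create a copy with a JSON-patch POST, then recurse into its Hierarchy-Forward (child) links. Below is a minimal sketch in Python, assuming a personal access token with work item read/write scope; the organization, project, copied field list and error handling are placeholders, not a finished tool.

    import json
    import requests

    ORG = "https://dev.azure.com/myorg"   # assumption: your organization URL
    PROJECT = "MyProject"                 # assumption: your project name
    PAT = "<personal-access-token>"       # PAT with work item read & write scope
    API = "api-version=7.0"

    def get_work_item(wi_id):
        # Fetch a work item together with its relations (parent/child links).
        url = f"{ORG}/{PROJECT}/_apis/wit/workitems/{wi_id}?$expand=relations&{API}"
        return requests.get(url, auth=("", PAT)).json()

    def create_copy(source, parent_url=None):
        # Create a new work item of the same type with a few copied fields and,
        # optionally, a Hierarchy-Reverse link ("my parent is ...") to parent_url.
        wi_type = source["fields"]["System.WorkItemType"]
        patch = [{"op": "add", "path": "/fields/System.Title",
                  "value": source["fields"]["System.Title"]}]
        # copy more fields here (description, tags, ...) as needed
        if parent_url:
            patch.append({"op": "add", "path": "/relations/-",
                          "value": {"rel": "System.LinkTypes.Hierarchy-Reverse",
                                    "url": parent_url}})
        url = f"{ORG}/{PROJECT}/_apis/wit/workitems/${wi_type}?{API}"
        return requests.post(url, data=json.dumps(patch), auth=("", PAT),
                             headers={"Content-Type": "application/json-patch+json"}).json()

    def copy_tree(wi_id, parent_url=None):
        # Copy one work item, then recurse into its child (Hierarchy-Forward) links.
        source = get_work_item(wi_id)
        new_item = create_copy(source, parent_url)
        for rel in source.get("relations", []):
            if rel["rel"] == "System.LinkTypes.Hierarchy-Forward":
                copy_tree(rel["url"].rsplit("/", 1)[-1], new_item["url"])
        return new_item

    copy_tree(1234)  # 1234 = ID of the Feature at the top of the tree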

Related

Clone pipelines from one ADF to another

I need to clone existing pipelines (10-20 of them) from one subscription to another subscription's data factory (another ADF). Is there any way to do this using Azure DevOps?
Option 1:
Using Git configuration, you can publish the data factory to a Git branch. Connect your new data factory to the same repository and build from that branch. Resources such as pipelines, datasets, and triggers will carry through. You can then delete the pipelines that are not used.
Option 2:
You can manually copy the JSON code of each pipeline, dataset, and linked service and use the same code in the new data factory (use the same names when creating the pipelines/datasets/linked services).
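If copy-pasting JSON by hand gets tedious, Option 2 can also be scripted. The sketch below uses the azure-mgmt-datafactory Python SDK to read each pipeline definition from the source factory and recreate it in the target one; the resource group, factory and credential details are placeholders, and the target factory must already contain the datasets and linked services the pipelines reference (with the same names).

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient

    cred = DefaultAzureCredential()
    src = DataFactoryManagementClient(cred, "<source-subscription-id>")
    dst = DataFactoryManagementClient(cred, "<target-subscription-id>")

    # Read every pipeline definition from the source factory ...
    for p in src.pipelines.list_by_factory("rg-source", "adf-source"):
        pipeline = src.pipelines.get("rg-source", "adf-source", p.name)
        # ... and recreate it under the same name in the target factory.
        dst.pipelines.create_or_update("rg-target", "adf-target", p.name, pipeline)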

Get files list after Azure Data Factory Copy activity

Is there a method that gives me the list of files copied to Azure Data Lake Storage after a Copy activity in Azure Data Factory? I have to copy data from a data source and afterwards skip files based on a particular condition. The condition must also check the file path and name against other data from a SQL database. Any idea?
As of now, there's no function to get the files list after a Copy activity. You can, however, use a Get Metadata activity or a Lookup activity and chain a Filter activity to it to get the list of files based on your condition.
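For reference, the shape of that chain looks roughly like the sketch below, expressed with the azure-mgmt-datafactory models (you would normally wire the same thing up in the authoring UI). The dataset name and the condition expression are assumptions for illustration; the important parts are the childItems field from Get Metadata and the @activity(...).output.childItems / item() expressions in the Filter.

    from azure.mgmt.datafactory.models import (
        GetMetadataActivity, FilterActivity, DatasetReference,
        Expression, ActivityDependency, PipelineResource,
    )

    # List everything in the folder the Copy activity wrote to.
    get_files = GetMetadataActivity(
        name="GetFileList",
        dataset=DatasetReference(reference_name="DestinationFolderDataset"),
        field_list=["childItems"],
    )

    # Keep only files matching the condition (here: name ends with .csv);
    # extend the expression to also check paths or values looked up from SQL.
    filter_files = FilterActivity(
        name="FilterFiles",
        items=Expression(value="@activity('GetFileList').output.childItems"),
        condition=Expression(value="@endswith(item().name, '.csv')"),
        depends_on=[ActivityDependency(activity="GetFileList",
                                       dependency_conditions=["Succeeded"])],
    )

    pipeline = PipelineResource(activities=[get_files, filter_files])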
There's also a workaround that you can check out, quoted below:
"The solution was actually quite simple in this case. I just created another pipeline in Azure Data Factory, which was triggered by a Blob Created event, and the folder and filename passed as parameters to my notebook. Seems to work well, and a minimal amount of configuration or code required. Basic filtering can be done with the event, and the rest is up to the notebook.
For anyone else stumbling across this scenario, details below:
https://learn.microsoft.com/en-us/azure/data-factory/how-to-create-event-trigger"

Azure Data Factory and SharePoint

I have some Excel files stored in SharePoint Online. I want to copy the files stored in SharePoint folders to Azure Blob storage.
To achieve this, I am creating a new pipeline in Azure Data Factory using the Azure portal. What are the possible ways to copy files from SharePoint to Azure Blob storage using Azure Data Factory pipelines?
I have looked at all the linked service types in Azure Data Factory but couldn't find any suitable type to connect to SharePoint.
Rather than directly accessing the file in SharePoint from Data Factory, you might have to use an intermediate technology and have Data Factory call that. You have a few options:
Use a Logic App to move the file.
Use an Azure Function.
Use a custom activity and write your own C# to copy the file.
To call a Logic App from ADF, you use a Web activity (a sketch follows below).
You can directly call an Azure Function now.
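As a rough illustration of the Logic App option: expose the Logic App with an HTTP request trigger and have a Web activity in the pipeline POST to its callback URL. The sketch below uses the azure-mgmt-datafactory models; the URL and body are placeholders, not real values.

    from azure.mgmt.datafactory.models import WebActivity

    call_logic_app = WebActivity(
        name="CallCopyLogicApp",
        method="POST",
        url="<logic-app-http-trigger-callback-url>",   # placeholder callback URL
        headers={"Content-Type": "application/json"},
        body={"fileName": "report.xlsx", "targetContainer": "landing"},  # placeholder payload
    )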
We can create a linked service of type 'File system' by providing the directory URL as the 'Host' value. To authenticate, provide a username and password (or Azure Key Vault details).
Note: use a self-hosted IR.
You can use a Logic App to fetch the data from SharePoint and load it into Azure Blob Storage, and then use Azure Data Factory to fetch the data from Blob. You can even set an event trigger so that whenever a file lands in the blob container, the ADF pipeline triggers automatically.
You can use Power Automate (https://make.powerautomate.com/) to do this task automatically:
Create an automated cloud flow that triggers whenever a new file is dropped in a SharePoint folder.
Use whichever of the available triggers fits your requirement and fill in the SharePoint details.
Add an action to create a blob and fill in the details as per your use case.
This way the SharePoint files are copied to the blob without even using ADF.
My previous answer was true at the time, but in the last few years Microsoft has published guidance on how to copy documents from a SharePoint library. You can copy a file from SharePoint Online by using a Web activity to authenticate and grab an access token from SPO, then passing it to a subsequent Copy activity that copies the data with the HTTP connector as the source.
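To make that flow concrete, the sketch below shows the same two steps as plain Python: request an app-only token from the SharePoint ACS endpoint (what the Web activity does), then fetch the file with a Bearer header (what the Copy activity's HTTP source does). The tenant, site, app registration and file path values are placeholders, and the Azure AD app must already have been granted access to the SharePoint site.

    import requests

    TENANT_ID = "<tenant-guid>"
    TENANT = "mytenant"                        # mytenant.sharepoint.com
    CLIENT_ID = "<app-registration-client-id>"
    CLIENT_SECRET = "<client-secret>"

    # Step 1: token request (the Web activity in the pipeline)
    token_resp = requests.post(
        f"https://accounts.accesscontrol.windows.net/{TENANT_ID}/tokens/OAuth/2",
        data={
            "grant_type": "client_credentials",
            "client_id": f"{CLIENT_ID}@{TENANT_ID}",
            "client_secret": CLIENT_SECRET,
            "resource": f"00000003-0000-0ff1-ce00-000000000000/{TENANT}.sharepoint.com@{TENANT_ID}",
        },
    )
    access_token = token_resp.json()["access_token"]

    # Step 2: fetch the file (the Copy activity's HTTP source in the pipeline)
    file_url = (f"https://{TENANT}.sharepoint.com/sites/site1/_api/web/"
                "GetFileByServerRelativeUrl('/sites/site1/libraryname/folder1/myfile.CSV')/$value")
    file_resp = requests.get(file_url, headers={"Authorization": f"Bearer {access_token}"})
    with open("myfile.CSV", "wb") as f:
        f.write(file_resp.content)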
I ran into some issues with large files and Logic Apps. It turned out there were some extremely large files to be copied from that SharePoint library. SharePoint has a default limit of 100 MB buffer size, and the Get File Content action doesn’t natively support chunking.
I successfully pulled the files with the web activity and copy activity. But I found the SharePoint permissions configuration to be a bit tricky. I blogged my process here.
You can use a binary dataset if you just want to copy the full file rather than read the data.
If my file is located at https://mytenant.sharepoint.com/sites/site1/libraryname/folder1/folder2/folder3/myfile.CSV, the URL I need to retrieve the file is https://mytenant.sharepoint.com/sites/site1/_api/web/GetFileByServerRelativeUrl('/sites/site1/libraryname/folder1/folder2/folder3/myfile.CSV')/$value.
Be careful about when you get your auth token. Your auth token is valid for 1 hour. If you copy a bunch of files sequentially, and it takes longer than that, you might get a timeout error.

How to migrate Azure Data Factory (v2) jobs to another Data Factory (v2)

How do I move all existing jobs to another Azure Data Factory?
I am trying to move existing jobs from one Data Factory to another but am not able to find a solution. Any suggestions, please?
As far as I know there is no easy import/export facility.
I recommend connecting your Data Factories to source control (Git). You can then copy and paste the JSON definitions between the two repos using a text editor.
For propagating pipelines between environments, you can look into the documentation for CI/CD in Azure Data Factory.

Iterating through an Azure SQL table in Azure Data Factory

I'm using ADF v2 and trying to grapple with the Web activity.
The tasks are:
Copy a file from blob storage and put the data into an Azure SQL database.
Iterate through the data and use a PUT call to a REST API to update the data.
Okay, so I can get the data into the table, no problem. I can also make the call to the API using a Web activity and put some hard-coded data there.
But I've been trying to use a ForEach to iterate through the table I have and call the Web activity to pass that data to the API.
This is where I'm stuck. I'm new to Data Factory and have been through all their standard help information but am not getting anywhere.
Any help is appreciated.
I think you need to drive the ForEach via a SQL Lookup activity that populates a data set, and then call the Web activity for each row (a sketch follows the links below).
Here are some posts to get you started:
https://learn.microsoft.com/en-us/azure/data-factory/control-flow-for-each-activity
https://learn.microsoft.com/en-us/azure/data-factory/control-flow-lookup-activity
Replace the copy activity with the web call in the tutorial below:
https://learn.microsoft.com/en-us/azure/data-factory/tutorial-bulk-copy-portal
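Here is a hedged sketch of that pattern using the azure-mgmt-datafactory models: a Lookup (with first_row_only=False) reads the rows, a ForEach iterates over @activity(...).output.value, and a Web activity PUTs each row to the REST API. The dataset, query, URL and column names are assumptions for illustration only.

    from azure.mgmt.datafactory.models import (
        LookupActivity, AzureSqlSource, DatasetReference,
        ForEachActivity, WebActivity, Expression, ActivityDependency, PipelineResource,
    )

    # 1. Read the rows to push (first_row_only=False returns the whole result set).
    lookup_rows = LookupActivity(
        name="LookupRows",
        dataset=DatasetReference(reference_name="AzureSqlTableDataset"),
        source=AzureSqlSource(sql_reader_query="SELECT Id, Name, Status FROM dbo.MyTable"),
        first_row_only=False,
    )

    # 2. Inside the loop, item() refers to the current row from the lookup output.
    put_row = WebActivity(
        name="PutRow",
        method="PUT",
        url="https://api.example.com/items/@{item().Id}",  # placeholder API endpoint
        headers={"Content-Type": "application/json"},
        body="@item()",
    )

    # 3. ForEach runs the Web activity once per row returned by the Lookup.
    for_each = ForEachActivity(
        name="ForEachRow",
        items=Expression(value="@activity('LookupRows').output.value"),
        activities=[put_row],
        depends_on=[ActivityDependency(activity="LookupRows",
                                       dependency_conditions=["Succeeded"])],
    )

    pipeline = PipelineResource(activities=[lookup_rows, for_each])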
