Is there a recommended method to clean up Azure Data Factory (datasets, linked services, etc.)?

Elaborating on the title: the ADF instance I am working on has a lot of legacy code, i.e. many datasets and linked services. Unfortunately, no naming convention or system for creating new items was ever defined.
I have tried listing all the pipelines with their associated datasets (and linked services), but this is a lengthy approach and we have around 100-odd pipelines.
I also tried exporting the complete data factory as an ARM template and writing a parser that would generate the above list automatically, but ARM templates turned out to be more interlinked than I had thought, so I dropped that plan.
Is there a better approach for solving this problem?
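For what it's worth, parsing individual pipeline JSON definitions (as returned by the ADF REST API, or as stored in a Git-enabled factory) tends to be less entangled than parsing the full ARM template: dataset usage shows up as DatasetReference objects inside each activity. A minimal sketch of such a parser follows; the sample pipeline shape follows ADF's pipeline JSON, but the pipeline and dataset names are invented for illustration.

```python
import json

def find_dataset_refs(node, refs):
    """Recursively collect dataset reference names from a pipeline definition."""
    if isinstance(node, dict):
        if node.get("type") == "DatasetReference" and "referenceName" in node:
            refs.add(node["referenceName"])
        for value in node.values():
            find_dataset_refs(value, refs)
    elif isinstance(node, list):
        for item in node:
            find_dataset_refs(item, refs)

# Example pipeline definition (shape follows ADF pipeline JSON; names invented)
pipeline = {
    "name": "CopyPipeline",
    "properties": {
        "activities": [
            {
                "name": "CopyActivity",
                "type": "Copy",
                "inputs": [{"referenceName": "SourceDataset", "type": "DatasetReference"}],
                "outputs": [{"referenceName": "SinkDataset", "type": "DatasetReference"}],
            }
        ]
    },
}

refs = set()
find_dataset_refs(pipeline, refs)
print(sorted(refs))  # ['SinkDataset', 'SourceDataset']
```

Running this over every pipeline definition and inverting the result gives you a dataset-to-pipelines map, from which unused datasets (and, by the same trick with LinkedServiceReference, unused linked services) fall out directly.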

You can pull up the pipelines/dataflows that use a particular dataset. These details are available in the dataset's Properties tab -> Related tab.
You can also get the list of datasets that use a particular linked service by going to the Manage tab -> Linked services -> Related.

Since you haven't mentioned the data factory version (v1 or v2) and say it has a lot of legacy code, I am assuming it could be a v1 factory. In that case, check out the ADF v1 to ADF v2 migration tool, which helps with basic migration tasks (listing resources, performing simple migration steps, etc.).

Report how long between two specific states of an Azure Devops work item - Release

I am trying to perform a simple task for the Azure DevOps work item type Release. All we need to know is how many days pass between State "Testing" and State "Documentation" for these items. If this were historical data in some SQL table, it would be no question for me what to do, but it seems there isn't any simple way to report on this within Azure DevOps. I would like to do this entirely within DevOps itself, without having to use Excel or some external app, and simply be able to see it right in DevOps.
Generally this is accomplished by adding an extra field to store the state transition time and adding a rule that populates that field when the work item is put into that state. Then you can simply query these two dates and diff them.
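Once both transition-date fields are populated by rules, the "diff" is just a date subtraction. A trivial sketch with invented field values, to make the computation concrete:

```python
from datetime import datetime

# Hypothetical values of the two custom transition-date fields on a work item
entered_testing = datetime(2023, 3, 1, 9, 0)
entered_documentation = datetime(2023, 3, 8, 14, 30)

# Whole days spent between the two states
days_in_testing = (entered_documentation - entered_testing).days
print(days_in_testing)  # 7
```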

What is the best way to create azure function that can read excel sheet and convert the data into POCO to push into Azure Table?

I am creating an Azure Function that reads an Excel file and pushes the data to an Azure Table. I have researched and found the following options:
Use the EPPlus package. There is no native method in this package to map sheet data to a POCO, but I have come across a few solutions for custom-building one as per the requirements.
Use an OLEDB connection to query the sheet data.
Use the Interop DLL. But this is out of the question for cloud deployment, since it requires MS Office to be installed on the server.
Which of the above approaches would be more suitable for the Azure cloud platform? Please let me know if there is any other way apart from the two mentioned above. Thanks.
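As an aside on the "custom-build a mapper" option: EPPlus is a .NET library and the real function would be C#, but the header-to-property mapping pattern you would build on top of EPPlus's cell access is language-agnostic. Here is a runnable sketch of that pattern in Python over CSV rows (the `Employee` type and column names are invented stand-ins for your POCO):

```python
import csv
import io
from dataclasses import dataclass

@dataclass
class Employee:
    """Stand-in for the POCO the sheet rows should map onto."""
    name: str
    age: int

def rows_to_pocos(reader):
    """Map rows (first row = headers) onto Employee instances by column name."""
    header = next(reader)
    idx = {col: i for i, col in enumerate(header)}  # column name -> index
    return [Employee(name=row[idx["name"]], age=int(row[idx["age"]]))
            for row in reader]

sheet = "name,age\nAlice,30\nBob,25\n"
people = rows_to_pocos(csv.reader(io.StringIO(sheet)))
print(people[0].name, people[1].age)  # Alice 25
```

With EPPlus the same idea applies: read the header row once, build a column-name-to-index map, then instantiate one POCO per data row before batching inserts into the Azure Table.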

Accessing source code of PowerApps and hosting it elsewhere

We are looking into using PowerApps to develop apps quickly. However, we are concerned about dependency on Azure and inability to access the source code.
We are interested in understanding the transportability of an app in PowerApps.
1) Can we access the source code?
2) In a scenario where PowerApps does not satisfy our needs, can we take away the source code and manage everything ourselves?
3) Can we deploy the code in another cloud provider such as AWS or Google Cloud or our own server?
Thanks!
Can we access the source code for Power Apps?
Yes. You can export the app to a zip.
Inside the zip there's a .msapp file, which is itself a zip archive; unzip that as well.
Inside you'll find a large number of JSON files, which include your Power Apps code.
See also:
running a diff on two Powerapps: https://github.com/microsoft/powerapps-tools#powerapps-review-tool
A Flow / Power Automate that exports your code to github: https://github.com/SeaDude/seattlePowerAppers/blob/master/outlines/powerappsVersionControl.md
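To make the "unzip the .msapp" step concrete: a .msapp file can be opened with any zip tool. The sketch below lists the JSON files inside one; since a real export isn't available here, it builds a stand-in archive in memory (the file names inside are invented, not the actual msapp layout):

```python
import io
import json
import zipfile

def list_msapp_json(msapp_bytes):
    """Return the JSON file names inside a .msapp archive (itself a zip)."""
    with zipfile.ZipFile(io.BytesIO(msapp_bytes)) as z:
        return [name for name in z.namelist() if name.endswith(".json")]

# Demo with a stand-in archive; a real .msapp would come out of the
# zip produced by Power Apps' export feature
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("Src/App.json", json.dumps({"screens": []}))
    z.writestr("Header.json", "{}")

print(sorted(list_msapp_json(buf.getvalue())))  # ['Header.json', 'Src/App.json']
```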
No, you cannot access the source code. The whole point of PowerApps is that it is a zero code environment. Therefore you cannot take the code and use it elsewhere.

SQL Azure Database Schema Patch system

I have been trying to find a standard way to include databases schema patches into my Azure continuous deployment flow.
The problem I am looking for a solution to is that as an application evolves, so does the database. Every so often there are changes to the database to support new functionality, etc.
In earlier work situations I have used proprietary solutions that hold the changes to the database as a linked list in an XML document. The database knows the latest patch it applied, and if any new patches are present it applies them. That way it is easy to keep all environments synchronised, and the changes travel with the code.
While these proprietary solutions have worked great, before implementing yet another tool I wanted to see whether there is a standard solution provided for SQL Azure that solves this problem. But I haven't been able to find one.
Is there one or do I need to create a tool myself?
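For reference, the patch-ledger approach described in the question is small enough to sketch. The version below uses Python with an in-memory SQLite database purely so it runs anywhere; the table and patch names are invented, and against Azure SQL you would issue the same statements through your usual driver:

```python
import sqlite3

# Ordered patch list (in practice loaded from files or an XML document)
PATCHES = [
    (1, "CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)"),
    (2, "ALTER TABLE customer ADD COLUMN email TEXT"),
]

def apply_pending_patches(conn):
    """Apply, in order, every patch the database has not yet recorded."""
    conn.execute("CREATE TABLE IF NOT EXISTS schema_patch (id INTEGER PRIMARY KEY)")
    applied = {row[0] for row in conn.execute("SELECT id FROM schema_patch")}
    for patch_id, sql in PATCHES:
        if patch_id not in applied:
            conn.execute(sql)
            conn.execute("INSERT INTO schema_patch (id) VALUES (?)", (patch_id,))
    conn.commit()

conn = sqlite3.connect(":memory:")
apply_pending_patches(conn)
apply_pending_patches(conn)  # idempotent: already-recorded patches are skipped
cols = [row[1] for row in conn.execute("PRAGMA table_info(customer)")]
print(cols)  # ['id', 'name', 'email']
```

This is the migration-based ("state of patches") model; the database-project answer below it represents the alternative declarative model, where the tool diffs the desired schema against the live one at deploy time.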
Visual Studio Database Projects support deploying to Azure SQL Database so this is a good way to incorporate it into a CI workflow. If you are used to traditional deployment methods it is a bit of a mindset change; these projects work out at deploy-time what to deploy. For example, if you want to create a table, add a Table to the project and fill out the columns. Then, say months later, you want to add a column, simply add the column to the CREATE TABLE script. When you deploy, it will work out that the only schema change is a new column and it will add it.
This is a nice little series on that topic:
https://channel9.msdn.com/Blogs/Have-you-tried-turning-it-off-and-on-again/Creating-a-Database-Project-for-Artificial-Intelligence

How to add a resource to project using workflow in NetSuite

As the title says...
I've successfully got workflows working that create project tasks, so I have some idea of how the workflow customization tools work. But I'm struggling to see how I can (or even whether I can) use a workflow to auto-magically add resources to the project (and then assign them to the project tasks I dynamically create).
Regarding which users/employees to add as resources, I imagine sorting out an appropriate clause shouldn't be too hard.
If I recall correctly, Resources on a Project record are a sublist. If so, then it is not possible via workflows: there is a limitation that workflows cannot operate on record sublists.
You will have to do this via SuiteScript.
