What is the best way to create an Azure Function that can read an Excel sheet and convert the data into POCOs to push into Azure Table storage?

I am creating an Azure Function that reads an Excel file and pushes the data to an Azure table. I have researched and found the following options to proceed with the solution:
1. Use the EPPlus package. There is no native method or functionality in this package to map sheet data to a POCO, but I have come across a few solutions for building a custom mapper to suit the requirements.
2. Use an OLEDB connection to query the sheet data.
3. Use the Office Interop DLLs. This is out of the question for a cloud deployment, since it requires MS Office to be installed on the server.
Which of the above approaches would be more suitable for the Azure cloud platform? Please let me know if there is any other way apart from the two mentioned above. Thanks.
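For what it's worth, option 1 usually needs very little custom code: the mapping layer is just a loop over the rows. Below is a minimal sketch, assuming a header row in row 1, a made-up Employee POCO with made-up columns, and the Azure.Data.Tables SDK; the table name and connection string are placeholders.

```csharp
// Sketch only: "Employee", the column layout, the table name and the connection
// string are illustrative; adjust them to your real sheet and POCO.
using System;
using System.IO;
using System.Linq;
using Azure;
using Azure.Data.Tables;
using OfficeOpenXml;

public class Employee : ITableEntity
{
    public string Name { get; set; }
    public string Department { get; set; }

    // Required by ITableEntity (Azure.Data.Tables).
    public string PartitionKey { get; set; }
    public string RowKey { get; set; }
    public DateTimeOffset? Timestamp { get; set; }
    public ETag ETag { get; set; }
}

public static class ExcelToTable
{
    public static void Run(Stream excelStream, string storageConnectionString)
    {
        // EPPlus 5+ requires a license context; omit this line on EPPlus 4.
        ExcelPackage.LicenseContext = LicenseContext.NonCommercial;

        var table = new TableClient(storageConnectionString, "Employees");
        table.CreateIfNotExists();

        using var package = new ExcelPackage(excelStream);
        var sheet = package.Workbook.Worksheets.First();

        // Assumes row 1 is the header, so data starts at row 2.
        for (int row = 2; row <= sheet.Dimension.End.Row; row++)
        {
            var employee = new Employee
            {
                PartitionKey = "employees",
                RowKey = sheet.Cells[row, 1].Text,   // e.g. an employee id column
                Name = sheet.Cells[row, 2].Text,
                Department = sheet.Cells[row, 3].Text
            };
            table.AddEntity(employee);
        }
    }
}
```

In an Azure Function this would typically sit inside a blob-triggered function, with the uploaded workbook arriving as the Stream parameter.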

Related

Is there a recommended method to clean up Azure Data Factory (Datasets, linked services etc.)?

Elaborating on the title: the ADF I am currently working on has a lot of legacy code, i.e. multiple datasets and linked services. Unfortunately, no naming convention or system for creating new items was ever defined.
I have tried to list all the pipelines and their associated datasets (and linked services), but this is a lengthy approach and we have around 100-odd pipelines.
I also tried exporting the complete data factory as an ARM template and writing a parser that would automatically produce the above list, but ARM templates turned out to be more interlinked than I had thought, so I dropped this plan.
Is there a better approach for solving this problem?
You can pull the pipelines/data flows that use a particular dataset; these details are available in the dataset's Properties tab -> Related tab.
You can also get the list of datasets that use a particular linked service by going to the Manage tab -> Linked services -> Related.
Since you haven't mentioned the Data Factory version (v1 or v2) and you mention it has a lot of legacy code, I am assuming it could be a v1 factory. In that case, have a look at the ADF v1 to ADF v2 migration tool, which helps with basic migration tasks (listing resources, doing simple migration steps, etc.).
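If clicking through the Related tab for 100-odd pipelines is too slow, the same inventory can be scripted against the management API. Below is a rough sketch using the .NET Data Factory management SDK (v2 factories only); the tenant, subscription, resource group and factory names are all placeholders. It simply prints each dataset together with the linked service it references, which is the raw material for the list you were trying to build from the ARM template.

```csharp
// Sketch only: requires the Microsoft.Azure.Management.DataFactory and
// Microsoft.Rest.ClientRuntime.Azure.Authentication packages; all identifiers
// below are placeholders.
using System;
using System.Threading.Tasks;
using Microsoft.Azure.Management.DataFactory;
using Microsoft.Rest.Azure.Authentication;

class AdfInventory
{
    static async Task Main()
    {
        var credentials = await ApplicationTokenProvider.LoginSilentAsync(
            "<tenant-id>", "<client-id>", "<client-secret>");

        var client = new DataFactoryManagementClient(credentials)
        {
            SubscriptionId = "<subscription-id>"
        };

        // First page only; use Datasets.ListByFactoryNext for the remaining pages.
        var datasets = client.Datasets.ListByFactory("<resource-group>", "<factory-name>");
        foreach (var ds in datasets)
        {
            Console.WriteLine($"{ds.Name} -> {ds.Properties.LinkedServiceName?.ReferenceName}");
        }
    }
}
```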

Data Factory help: CSV files in SharePoint

We are currently using Power BI to connect to CSV files on SharePoint using 'Source = SharePoint.Files('.
We now need to bring these files into the data warehouse, but I can't find a SharePoint file connector, only a SharePoint list connector.
Is there a way to grab files from SharePoint using Data Factory? I just need to load them into an Azure SQL database.
Thanks.
I have not found an easy way to do this. Authentication against SharePoint is challenging, and so is the download functionality. As Andrii wrote about the use of Logic Apps, we are considering a similar approach as an alternative. Due to lack of time we have not investigated this direction yet.
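For what it's worth, once you have an Azure AD access token for SharePoint Online, the download itself is a single REST call, which could live in an Azure Function or Logic App that Data Factory invokes before the SQL load. A rough sketch follows; the site URL, file path and token are placeholders, and the token acquisition (the genuinely hard part mentioned above) is not shown.

```csharp
// Sketch only: the site URL, file path and bearer token are placeholders.
// Acquiring the token (e.g. via an Azure AD app registration) is out of scope here.
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class SharePointCsvDownload
{
    static async Task Main()
    {
        var fileUrl = "https://contoso.sharepoint.com/sites/finance/_api/web/" +
                      "GetFileByServerRelativeUrl('/sites/finance/Shared Documents/data.csv')/$value";

        using var http = new HttpClient();
        http.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", "<access-token>");

        // Stream the CSV to local or blob storage; from there Data Factory
        // (or bulk-copy code) can load it into Azure SQL.
        using var response = await http.GetAsync(fileUrl);
        response.EnsureSuccessStatusCode();
        using var csv = await response.Content.ReadAsStreamAsync();
        using var output = File.Create("data.csv");
        await csv.CopyToAsync(output);
    }
}
```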

SQL Azure Database Schema Patch system

I have been trying to find a standard way to include database schema patches in my Azure continuous deployment flow.
The problem I am looking to solve is that as an application evolves, so does the database. Every so often there are changes to the database to support new functionality, etc.
In earlier work situations I have used proprietary solutions that hold the changes to the database as a linked list in an XML document. The database then knows the latest patch it has applied, and if any newer patches are present it applies them. That way it is easy to keep all environments synchronised, and the changes follow the code.
While these proprietary solutions have worked great, I was thinking that before I implement yet another tool to do this, I would see if there was a standard solution provided by SQL Azure to solve this problem. But I haven't been able to find one.
Is there one or do I need to create a tool myself?
Visual Studio Database Projects support deploying to Azure SQL Database, so this is a good way to incorporate it into a CI workflow. If you are used to traditional deployment methods it is a bit of a mindset change: these projects work out at deploy time what to deploy. For example, if you want to create a table, add a Table to the project and fill out the columns. Then, say months later, when you want to add a column, simply add the column to the CREATE TABLE script. When you deploy, it will work out that the only schema change is a new column and it will add it.
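For example (table and column names invented), the project keeps only the desired end state of each object; adding a column months later is a one-line edit to the same script, and the deployment engine generates the corresponding ALTER TABLE against the target database.

```sql
-- Desired state of the table as kept in the database project.
-- "Customer" and its columns are illustrative names only.
CREATE TABLE dbo.Customer
(
    CustomerId INT           NOT NULL PRIMARY KEY,
    Name       NVARCHAR(100) NOT NULL,
    -- Column added months later: just edit this CREATE TABLE script;
    -- deployment diffs the project against the target and emits the ALTER.
    Email      NVARCHAR(256) NULL
);
```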
This is a nice little series on that topic:
https://channel9.msdn.com/Blogs/Have-you-tried-turning-it-off-and-on-again/Creating-a-Database-Project-for-Artificial-Intelligence

Read a 1 GB Excel file with 150,000 records per sheet into a DataSet

I am trying to import large Excel files into a database using a .NET application, in which I will do some customized cleansing and processing of the data. The Excel files have sheets with 255 columns and 150,000 rows. I have tried different solutions such as the Microsoft JET/ACE provider, OpenXML/OfficeOpenXML and LinqToExcel, and I get OutOfMemory exceptions with both the Microsoft adapter and OpenXML. Please let me know how to deal with this.
Have you tried Integration Services to import an Excel file into SQL Server 2005? You can use Integration Services (a wonderful tool) for extracting, transforming, and loading data. Common uses for Integration Services include loading data into the database and moving data into or out of your relational database structures.
You can call this service from your .NET code again and again to perform the task repetitively, or, if not from .NET code, you can schedule it as well.
See this sample application for the same: http://www.techrepublic.com/blog/datacenter/how-to-import-an-excel-file-into-sql-server-2005-using-integration-services/205
If you don't want to use SSIS, you could use the following ready-to-use open tools for the same.
http://exceldatareader.codeplex.com/
http://www.codeproject.com/Articles/14639/Fast-Excel-file-reader-with-basic-functionality
http://www.codeproject.com/Articles/16210/Excel-Reader
This should help you.
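The readers linked above sidestep the OutOfMemory problem because they stream the sheet row by row instead of materialising everything into a DataSet. A rough sketch using the current NuGet build of ExcelDataReader (the successor to the CodePlex project above) follows; the ProcessRow step, where your cleansing and inserts would go, is a placeholder.

```csharp
// Sketch only: assumes the ExcelDataReader NuGet package and a hypothetical
// ProcessRow method that cleanses and inserts one row at a time.
using System;
using System.IO;
using System.Text;
using ExcelDataReader;

class LargeExcelImport
{
    static void Main(string[] args)
    {
        // Needed on .NET Core to read legacy .xls code pages.
        Encoding.RegisterProvider(CodePagesEncodingProvider.Instance);

        using var stream = File.Open(args[0], FileMode.Open, FileAccess.Read);
        using var reader = ExcelReaderFactory.CreateReader(stream);

        do // one iteration per sheet
        {
            while (reader.Read()) // one row at a time, nothing buffered in a DataSet
            {
                var values = new object[reader.FieldCount];
                reader.GetValues(values);
                ProcessRow(values); // your cleansing + batched insert goes here
            }
        } while (reader.NextResult());
    }

    static void ProcessRow(object[] values)
    {
        // Placeholder: validate/cleanse and add to a SqlBulkCopy batch, etc.
    }
}
```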

Comparing a SharePoint list and a SQL Server table

I have a list in SharePoint which maintains a particular month's on-call roster, and we maintain an employee directory in SQL Server. My requirement is to get the complete data from SQL Server, show it in SharePoint, compare it with the SharePoint list, and show a small icon next to the employees who are on call for that particular month. Can anyone please suggest ways of implementing this?
Thanks in advance.
Update: I have finished the part where I connect to the SQL Server database and get the employee information. For this we are using a 3rd-party web part to connect to SQL Server and pull the data from the table. Now I have to show some kind of image next to the employee name to show that he is on call for that week. We are going to create a custom list for maintaining the list of people who are on call. Can anyone please advise me on how to accomplish this?
Write a custom web part which pulls the data from the list using the SharePoint object model and from SQL Server using ADO.NET, and does the said comparison.
If you were looking for something out of the box, I am afraid there is too little information given here to analyse whether it is feasible out of the box or not.
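A very rough sketch of that comparison follows; the list title, field names and connection string are all invented, and this is server-side code, e.g. the data-loading part of the web part.

```csharp
// Sketch only: "OnCall", "EmployeeName", dbo.Employees and the connection string
// are placeholders; adjust them to your schema. Runs in the SharePoint server
// object model, e.g. inside a custom web part.
using System;
using System.Collections.Generic;
using System.Data.SqlClient;
using Microsoft.SharePoint;

public static class OnCallComparison
{
    // Returns employee name -> whether they are on call this month.
    public static Dictionary<string, bool> GetEmployees(SPWeb web, string connectionString)
    {
        // 1. Collect the on-call employees from the SharePoint list.
        var onCall = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
        SPList list = web.Lists["OnCall"];
        foreach (SPListItem item in list.Items)
        {
            onCall.Add(Convert.ToString(item["EmployeeName"]));
        }

        // 2. Pull the employee directory from SQL Server and flag the matches.
        var employees = new Dictionary<string, bool>();
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("SELECT EmployeeName FROM dbo.Employees", connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    string name = reader.GetString(0);
                    employees[name] = onCall.Contains(name); // true => show the on-call icon
                }
            }
        }
        return employees;
    }
}
```

The web part's rendering code can then loop over the dictionary and emit the icon next to every name that maps to true.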
If you have the SharePoint Enterprise version, you can look at using the Business Data Catalog. This will let you bind columns to external data sources. This might provide you with the functionality you're looking for.
If you do not have the Enterprise features, do you have access to deploy WSP packages and custom code?
You will have to write your own data access to your external data source. Your options would be to have a job that pulls data from the external data source and populates SharePoint list(s) or create a custom view that pulls the external data on-demand.
You'll have to come up with synchronization strategies. Meaning, is the data in the external SQL data source static, reference information that does not need to be updated depending on what a user does in SharePoint? This seems to be the case based on your question. If you do need to update the external data source, you'll have to hook into the on save event (so probably a custom event handler that listens for ItemAdding) to update the data, validate, and optionally cancel the operation with an error message.
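A minimal sketch of such an event receiver is shown below; the field name and the validation rule are placeholders, and this approach requires a WSP/farm deployment.

```csharp
// Sketch only: "EmployeeName" and the validation rule are placeholders.
using Microsoft.SharePoint;

public class OnCallListEventReceiver : SPItemEventReceiver
{
    // Fires before the item is committed, so the save can still be cancelled.
    public override void ItemAdding(SPItemEventProperties properties)
    {
        string employee = properties.AfterProperties["EmployeeName"] as string;

        if (string.IsNullOrEmpty(employee))
        {
            // Reject the save and surface an error message to the user.
            properties.Cancel = true;
            properties.ErrorMessage = "An employee name is required for the on-call entry.";
            return;
        }

        // Otherwise, update/synchronise the external SQL data source here.
    }
}
```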
If you can't deploy WSP packages / DLLs, you could take a look at the jQuery SharePoint library. This will let you interact with lists using jQuery. If you also write a WCF or web service wrapper around the data you need from your external data source, accessible from the SharePoint environment, you can hack together a solution.
To accomplish this you'd need to place a Content Editor Web Part on the page you need custom data access. In there you will write the code to reference the jQuery javascript library and jQuery SharePoint library. The code will have to make the calls to your external data service and make any updates you need.
This is the least reliable method to accomplish what you want since it's entirely page-based and can be broken by simply disabling script or someone editing the CEWP or removing it altogether.
If you don't have access to place a CEWP or any of the other solutions, then you have no options at all.
It is relatively easy now to pull all the data using the third-party web part and save it into a custom list. I would recommend not only creating the custom list but also creating content types for it. Take a look at the SharePoint MVP's post about creating a custom list with content types.
