Upload Excel 2013 Workbook to website hosted on Azure

Does anyone have guidance and/or example code (which would be awesome) on how I would go about the following?
With a Web application using C# / ASP.NET MVC and hosted on Azure:
1. Allow a user to upload an Excel workbook (multiple worksheets) via a web page UI
2. Populate a DataSet by reading in the worksheets so I can then process the data
Couple of things I'm unclear on:
1. I've read that Azure doesn't have the ACE OLEDB provider, which is what reading Excel 2007+ files requires, and that I'd have to use the Open XML SDK instead. Is this true? Is this the only way?
2. Is it possible to read the file into memory and not actually save it to Azure storage?
I DO NOT need to modify the uploaded spreadsheet. Only read the data in and then throw the spreadsheet away.

Well, that's many questions in one post; let me see if we can tackle them one by one.
1/2. You can let the user upload the Excel workbook to a temp location and, once you have read it, do the cleanup yourself. You can also write a script that cleans up any files that couldn't be deleted from the temp location for whatever reason.
Alternatively, if you want to keep the files, store them in Azure Storage and fetch/read them when you need to.
Check out this thread: read excelsheet in azure uploaded as a blob
By default, when you upload a file it is written to the local disk; you can then choose to save it to Azure Storage or anywhere else.
For reading the Excel file, you can use any of the NuGet packages listed here: http://nugetmusthaves.com/Tag/Excel. I prefer GemBox and NPOI; a sketch using NPOI follows below.
A walkthrough of the upload part in ASP.NET MVC: http://www.aspdotnet-suresh.com/2014/12/how-to-upload-files-in-asp-net-mvc-razor.html
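To make both steps concrete, here is a minimal sketch assuming the NPOI NuGet package; the controller name, action name, and generated column headers are illustrative, not prescribed by the answer above. The workbook is read straight from the request stream, so nothing is ever written to disk or Azure Storage.

```csharp
using System.Data;
using System.Web;
using System.Web.Mvc;
using NPOI.SS.UserModel; // WorkbookFactory handles both .xls and .xlsx

public class ImportController : Controller // controller/action names are illustrative
{
    [HttpPost]
    public ActionResult Upload(HttpPostedFileBase file)
    {
        if (file == null || file.ContentLength == 0)
            return new HttpStatusCodeResult(400, "No file uploaded.");

        // Read directly from the request stream; nothing touches disk or blob storage.
        IWorkbook workbook = WorkbookFactory.Create(file.InputStream);
        var dataSet = new DataSet();

        for (int i = 0; i < workbook.NumberOfSheets; i++)
        {
            ISheet sheet = workbook.GetSheetAt(i);
            var table = new DataTable(sheet.SheetName);

            // Size the columns first so every row fits.
            int maxCells = 0;
            for (int r = sheet.FirstRowNum; r <= sheet.LastRowNum; r++)
            {
                IRow row = sheet.GetRow(r);
                if (row != null && row.LastCellNum > maxCells)
                    maxCells = row.LastCellNum;
            }
            for (int c = 0; c < maxCells; c++)
                table.Columns.Add("Column" + c); // generated header names

            for (int r = sheet.FirstRowNum; r <= sheet.LastRowNum; r++)
            {
                IRow row = sheet.GetRow(r);
                if (row == null) continue; // skip blank rows

                DataRow dataRow = table.NewRow();
                for (int c = 0; c < row.LastCellNum; c++)
                {
                    ICell cell = row.GetCell(c);
                    dataRow[c] = cell == null ? string.Empty : cell.ToString();
                }
                table.Rows.Add(dataRow);
            }
            dataSet.Tables.Add(table);
        }

        // ... process dataSet here, then simply let the workbook go out of scope ...
        return RedirectToAction("Index");
    }
}
```

The corresponding view only needs a standard multipart/form-data form with a file input named file; the walkthrough linked above covers that part.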

Related

Access worksheet names from Excel file with Google Apps Script (without Drive.Files.insert)

In a Google Apps Script attached to a Google Sheet, I have the file ID of an Excel file, and I want to read the worksheet names of that Excel file. The tutorials I've seen on conversion load the Excel file as a blob, write it to Drive as a Google Sheet, and then read it.
Is there a way to do this that does not create artifacts I then need to delete? The reasoning is that I am concerned about the following: safety if there's a bug (the wrong thing gets deleted), additional processing time (I need to process a long list of Excel files), and leftover artifacts if the script aborts unexpectedly between inserting and deleting.
Thank you!
Answering your question: the reason the tutorials first convert the Excel file to a Google Sheet before interacting with it (in your case, to gather the worksheet names) is that the Google APIs and Apps Script built-in services cannot read the Excel file as raw data; Google needs to convert the file into something readable by the Google APIs.
A workaround would be to use the Excel JavaScript API to read the information from the original Excel file. You can call external APIs from Apps Script since it's based on JavaScript, so you would be using Apps Script as an IDE.
However, you can do the same with any other IDE that works with JavaScript.
There are some examples on how to list the worksheets using the Excel JavaScript API in this blog.
If you would like to keep using the Google APIs and the Apps Script built-in services, you will need to convert the file to Google Sheets.
Update:
You can read more about the Excel Services API here.

How to copy the latest file from Sharepoint to Blob Storage using Logic App?

I am trying to extract the latest Excel file from SharePoint into Azure Blob storage using a Logic App.
I created the flow and it's working. However, it's copying all the files from SharePoint to Blob storage.
Below is my flow.
[screenshot of the Logic App flow]
I get a new Excel file every day in my SharePoint folder (/Shared documents/Data), hence I used List folder to locate it.
Then I used Filter array to keep only the files whose last-modified time is within the last 5 minutes.
I don't get any error. However, it's copying all the files rather than just the last-modified file.
Can anyone advise how to address this?
You can use the trigger specific to this scenario, 'When a file is created in a folder'. Documentation link: https://learn.microsoft.com/en-us/connectors/sharepoint/#when-a-file-is-created-in-a-folder

Office JS API: last saved time?

I develop a Javascript Office add-in which can run on Word, Excel and PowerPoint.
One of the features of the add-in is to suggest that the user upload the current file to our server if the file has been changed and/or saved since the last successful upload.
I had some luck with document.properties.lastSaveTime for Word application: https://dev.office.com/reference/add-ins/word/documentproperties?product=word
But this API is Word-specific and is not available for other hosts.
Is there a way to get the date of last file change (document, workbook and presentation accordingly to the host application) using shared API, that is, API available for all three hosts?
If such shared API functionality doesn't exist, even some clue about how to get the last-changed date from the individual hosts would be helpful.

Running Excel automation locally or on server

Wanted some opinions on which method is a better practice. We have a sales report that MUST be generated in a very specific format (down to the row colors and fonts).
I already have written a macro which pulls from our database and populates the entire workbook in about 15 seconds. The question is how should it be populated?
1) Process server-side: Users initiate the request on the intranet page. ASP.NET opens the workbook template, executes the macro and serves back the final sheet.
2) Process locally: Users download the blank template, run from their desktops which automatically connect to the database.
I like the first one because I can enforce the template, timing, users, and security of the data. But is running Excel automation on an internet web server recommended? I like the second option, but I'm afraid of losing standardization as template sheets begin floating around the company.
As for server side:
I highly.. HIGHLY.. recommend checking out the OpenOffice/LibreOffice XML format for spreadsheets.
You can use the localc binary in headless mode to convert the XML file to XLSX or what have you. I use it to create PDF files instead of using ReportLab.
Alternatively here are some other projects that attempt to write to Microsoft formats directly:
http://pypi.python.org/pypi/xlrd
http://pypi.python.org/pypi/xlwt
As for client side:
If you expect the user to be using only Excel and not any other spreadsheet software, then go ahead and use an ODBC data source. ODBC will have to be configured per user, unless you use some fun VBScript to pull the data from an HTTP server every time the sheet is loaded. There is also the option of making an XLS spreadsheet that simply holds the data and including it in the main XLS document, which would make XLS a requirement on both the server and the client.
Go for server side. Makes information simple to archive and share and will most likely be multi-platform as well.
If you would like to use your first option, then you want to avoid using VBA on an installed instance of Excel on the server. This is extremely resource intensive and does not scale well. Instead, if you are writing ASP.NET code, you should try the Microsoft Office Interop assemblies available to .NET. It should be possible to adapt your existing VBA code to run under ASP.NET with some changes, but you will have a much more reliable product in the end.
However, as @whardier points out in his response, if this were for a large-scale or public site, the suggestions he makes would be much more suitable and would scale much further.
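For what the Interop route might look like, here is a minimal console-style sketch; the paths and the RefreshReport macro name are hypothetical placeholders, and it assumes Excel is installed on the server, which is exactly the resource cost weighed above.

```csharp
using System;
using Excel = Microsoft.Office.Interop.Excel;

class ReportRunner
{
    static void Main()
    {
        // Paths and macro name are hypothetical placeholders.
        var app = new Excel.Application { Visible = false, DisplayAlerts = false };
        Excel.Workbook workbook = null;
        try
        {
            workbook = app.Workbooks.Open(@"C:\Templates\SalesReport.xlsm");
            app.Run("RefreshReport"); // the existing macro that pulls from the database
            workbook.SaveAs(@"C:\Reports\SalesReport.xlsx",
                            Excel.XlFileFormat.xlOpenXMLWorkbook); // macro-free copy to serve back
        }
        finally
        {
            // Without an explicit shutdown, orphaned EXCEL.EXE processes pile up on the server.
            if (workbook != null) workbook.Close(false);
            app.Quit();
        }
    }
}
```

The explicit Close/Quit in the finally block is the part most examples omit, and it is what keeps this approach from leaking Excel processes under load.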

SharePoint Online file storage

We have a requirement to store documents in SharePoint Online as people copy files to a shared network directory.
Is there a way of automating this? I was thinking of a Windows service which would poll the directories, find any changes such as new subdirectories or new files, and then upload them to a SharePoint Online document library.
You don't have to poll if you use a FileSystemWatcher inside your Windows service for real-time notifications.
http://msdn.microsoft.com/en-us/library/system.io.filesystemwatcher.aspx
However, if your requirement is 100% accuracy, you will need to build in some sort of tracking/checksum mechanism to make sure that every document was 1) detected and 2) successfully moved to SharePoint.
You may want to have your service check the delta every time it starts up, and then subsequently only respond to FileSystemWatcher events.
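As a minimal sketch of that service, with the share path and the UploadToSharePoint helper as hypothetical placeholders for whichever upload mechanism you choose:

```csharp
using System;
using System.IO;

class ShareWatcher
{
    static void Main()
    {
        // Watched path is illustrative.
        var watcher = new FileSystemWatcher(@"\\fileserver\shared")
        {
            IncludeSubdirectories = true,
            NotifyFilter = NotifyFilters.FileName | NotifyFilters.DirectoryName | NotifyFilters.LastWrite
        };

        watcher.Created += (s, e) => UploadToSharePoint(e.FullPath);
        // Changed can fire several times for a single save, so pair it with the
        // tracking/checksum mechanism mentioned above before re-uploading.
        watcher.Changed += (s, e) => UploadToSharePoint(e.FullPath);
        watcher.EnableRaisingEvents = true;

        Console.ReadLine(); // in a real Windows service this belongs in OnStart/OnStop
    }

    // Hypothetical placeholder: implement with CSOM, WebDAV, or whatever upload path you choose.
    static void UploadToSharePoint(string path)
    {
        Console.WriteLine("Would upload: " + path);
    }
}
```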
EDIT: Per Tony's question below, here are some additional thoughts on getting files to SharePoint.
First, try a simple test.
1) Copy the URL of a document library within the BPOS SharePoint site. Make sure you're on a machine that has the Office Online sign in app on it.
2) Open Notepad. Type some random text.
3) Click on File -> Save As.
4) Paste the URL.
5) Attempt to save the file.
This works great on "regular" SharePoint (done it many times). If this works with BPOS, it opens up several options.
File system replication to a SharePoint Online or Office 365 document library is planned for release with the "Cloud Connector for Office 365". The current version supports database content only, but V2.0 will be bi-directional.
