Code for Smartsheet API on-demand batch export - node.js

I'm fairly new to the whole RESTful/SOAP arena. I've been looking around for samples that use node.js to call a REST or SOAP API to export a batch of smartsheets to Excel.
I already have a sample for exporting to Google Drive, but it's not quite what I need.
I'm not sure how to search for code or samples that export to Microsoft Excel and then batch the process from a list of files, or even from a Smartsheet workspace.
I expect to end up with a folder of all the exported Excel files at a specified location on the server. I'm also wondering whether this could be done with Node-RED instead as an alternative; what is the suggested path of least resistance?

You can make a GET Sheet request via the Node.js SDK and have the result returned as an Excel file:
https://smartsheet-platform.github.io/api-docs/?javascript#get-sheet-as-excel-pdf-csv
Note that this requires the Sheet ID and is a separate request for each sheet. There isn't a method to request a batch export of a collection of sheets via the API, so you would need to gather a list of Sheet IDs, loop through it, request each sheet as an Excel file, and store the files in your desired location.
You could make a GET Workspace request to get a list of all of the sheets in a workspace when building your list of Sheet IDs, or use the List Sheets request to get a list of all of the sheets you have access to. Both of those requests are supported by the Node.js SDK.
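As a rough illustration of that loop, here is a minimal Node.js sketch that fetches a workspace and then downloads each of its sheets as an .xlsx file, using plain REST calls (Node 18+ built-in fetch) rather than the SDK. The access token, workspace ID, and output folder are placeholders you would supply yourself; check the API docs linked above for the exact response shapes.

// Minimal sketch: batch-export every sheet in a Smartsheet workspace to .xlsx files.
// Assumes Node 18+ (built-in fetch). Token, workspace ID and output folder are placeholders.
const fs = require('fs');
const path = require('path');

const ACCESS_TOKEN = process.env.SMARTSHEET_TOKEN; // your API access token
const WORKSPACE_ID = '1234567890';                 // placeholder workspace ID
const OUT_DIR = './exports';                       // where the .xlsx files should land
const API = 'https://api.smartsheet.com/2.0';
const headers = { Authorization: `Bearer ${ACCESS_TOKEN}` };

async function exportWorkspace() {
  fs.mkdirSync(OUT_DIR, { recursive: true });

  // GET Workspace returns, among other things, the list of sheets it contains.
  const ws = await (await fetch(`${API}/workspaces/${WORKSPACE_ID}`, { headers })).json();

  for (const sheet of ws.sheets || []) {
    // GET Sheet with an Excel Accept header returns the sheet as an .xlsx binary.
    const res = await fetch(`${API}/sheets/${sheet.id}`, {
      headers: { ...headers, Accept: 'application/vnd.ms-excel' },
    });
    const buffer = Buffer.from(await res.arrayBuffer());
    fs.writeFileSync(path.join(OUT_DIR, `${sheet.name}.xlsx`), buffer);
    console.log(`Exported ${sheet.name}`);
  }
}

exportWorkspace().catch(console.error);

The same loop could live inside a Node-RED function node, but a small standalone script like this is probably the path of least resistance.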

Related

Sending recently created Sharepoint-file as attachment with Power Automate

After some months I can say I am getting the hang of Microsoft Flow; however, I could use some help with the following issue:
In a flow for reporting purposes, a temporary file (.xlsx) is created in a SharePoint folder from a template. This temporary file is then filled with rows and info from other sources. So far so good.
I use the body of this newly created and populated file as an attachment for an e-mail to the chief. However, the attachment comes out identical to the (empty) template file, without the rows and other content.
Adding a delay of two minutes before attaching and sending the mail solved it for relatively small reports, but this is not ideal, as I want it to work regardless of file size. Furthermore, I do not understand why it would send an empty (old) version of the temporary file in the first place, as all the fill operations should have executed before copying and attaching (the flow runs entirely in series).
Sorry for the long story. Does anyone have a more elegant solution than using a Delay action?

Excel Script and Flow problem - Script not found. It may have been unshared or deleted. clientRequestId: ebf53c05-7f24-4537-a91d-9edb8903eb8c

I have a PowerApp that takes a SharePoint list, filters the data, puts it in a collection, and calls a Power Automate flow that formats the data into CSV. The flow then creates a spreadsheet from a pre-configured spreadsheet used as a template, which has a script included. The script is shared:
[Screenshot: the Excel script's properties, showing it is shared]
When anyone other than me runs the flow, either from the PowerApp or from the flow itself, they get the error "Script not found. It may have been unshared or deleted." I've shared the app, the Excel template file, the flow, and the script, and the person who's helping me test is listed as an owner on the SharePoint list. I've also tried recreating the script, the template, the flow, and even the app.
I'm at a loss as to what to try next.

Excel 2007 Refresh Imported CSV File From Web

Log data from a test is uploaded to a web service, and the processed CSV is downloaded back into Excel for viewing in charts. At the moment this is done via copy and paste for short CSV files and via Data > From Text for larger ones. Unfortunately, this takes a lot of time for every test, and I need to make the process very simple for someone else to update the Excel spreadsheet.
The Excel spreadsheet contains 5 raw-data sheets which are used to store the CSV from the server. I have no issues selecting Data > From Text, entering the website URL, and completing the import wizard. This process can be repeated (same as the copy and paste) for all 5 sheets to import the data.
This process only allows me to put in one filename, so I am using the same URL for the data and having PHP return the CSV of the latest (or a specifically configured) test whenever the URL is accessed. I've verified that this works correctly.
Unfortunately, when I do 'Refresh All', it prompts for a filename unless I go to Data > Connections > Properties and uncheck 'Prompt for file name on refresh'.
However, even when I do that, I get mixed results. Sometimes only one of the sheets updates (it seems to be the last one I set up); sometimes none of them do. I need a solution that updates all 5 sheets from the current CSV on the server without having to set up the connections again every time. Ideally I'd like to hide these raw-data sheets so we can have an Excel file that is just the final charts.
Surely this is a common task and I am doing something wrong, yet none of the guides I've found seem to work. For example, this one:
http://www.kimgentes.com/worshiptech-web-tools-page/2010/8/18/web-connecting-csv-files-as-external-data-to-excel-spreadshe.html
It seems they only set up one connection. I can get one connection to refresh, but not more than one.
I have seen this happen and finally figured it out. There are actually three things that can cause this, each with its own solution.
First, Excel uses the IE 11 web object when it retrieves data from the web, which means it is "sticky" to sessions created with IE 11. Most websites these days run on cloud servers, which create sessions on the server with the most load. This normally has no impact on users in a web browser, since they log in and enter their credentials interactively, but when a program accesses a website through a specific web browser, it is bound to that browser's properties and behaviour. I ran into this a lot: I could generate and download my CSV files on the website in Chrome, but trying to import the same files with Excel wouldn't work (it would say they weren't there). The solution, at least for now, is to use IE 11: log in to the website, generate the CSV files, and check that they can be downloaded. Then run the web import from Excel, and it should pick up the same sticky session and find the CSV files.
Second, password entry is a separate issue, but it is also related to this stickiness. For some reason Excel will not cache your credentials for a website until you have entered them three times. Your experience may differ, but I found that I had to enter a new credential set (for a new web import of a CSV) three times before Excel cached it permanently. After that, I no longer had the problem.
Third, more complex Excel workbooks that use web imports may also need to import local data you downloaded from a website, import data from a website into a sheet, or run more complex objects such as macros. All of these need the proper permissions, so you may have to adjust your Trust Center settings (part of MS Office) to allow your workbook to run this way. You can add and update trusted locations as described by Microsoft here:
https://support.microsoft.com/en-us/office/add-remove-or-change-a-trusted-location-7ee1cdc2-483e-4cbb-bcb3-4e7c67147fb4

Download Webi report from Excel

With the newly released Webi there's no way to manipulate reports with VBA like there was in the DESKI era.
I'd like to know if there's a way to click a button, with parameters, in an Excel sheet and get a report back from the server.
I've been thinking of using the RESTful web services, but it seems there is a performance problem.
I also considered putting a Java app in the middle, using the SDK, but that's not really satisfying as it adds another layer.
Do you know if there's another way to download a Webi report from, and to, Excel?
For this type of requirement, you'd normally use the OpenDocument feature. There is one thing that it won't do however, at least not for Webi documents, and that is deliver the output in Excel format (HTML and PDF are the two possible formats for Webi). In all fairness, the export to Excel option is only about two or three clicks away, but I can understand that this wouldn't be an ideal solution.
Another option is the Java SDK, which I would not recommend, as the ReBEAN SDK (the part of the Java SDK you need to interface with Webi documents) is deprecated and replaced by the REST SDK.
The REST SDK would be the way to go if the OpenDocument feature is not sufficient. Keep in mind that this would involve quite a few steps, each time sending a command to the WACS server and then decoding the answer. The steps would be:
Authenticate and get a logon token
Refresh the document (if necessary pass prompt values)
Export the document to Excel
Close the document
The REST interface is only supported on the WACS server, which should run on your BI4 server (unless you have a customised landscape). If it's slow, I would suggest looking into the root cause of this performance issue, instead of discarding the SDK altogether.
If you're going to use the REST interface, I would recommend opting for JSON to communicate through REST instead of XML. It's easier to read and parse.
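To make those four steps concrete, here is a rough Node.js sketch of the sequence against the BI platform RESTful web services. The host, port, and raylight endpoint paths shown are assumptions based on a typical BI4 installation, and the prompt-handling payload is simplified; verify everything against the RESTful Web Services documentation for your release before relying on it.

// Rough sketch of the four REST steps (Node 18+, built-in fetch).
// The host/port, document ID, raylight paths and payload shapes are assumptions;
// check the BI platform RESTful Web Services docs for your version.
const fs = require('fs');

const BASE = 'http://bi4-server:6405/biprws';   // WACS host and default port (assumed)
const DOC_ID = '12345';                         // placeholder Webi document ID

async function downloadWebiAsExcel() {
  // 1. Authenticate and get a logon token.
  const logon = await fetch(`${BASE}/logon/long`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json', Accept: 'application/json' },
    body: JSON.stringify({ userName: 'user', password: 'secret', auth: 'secEnterprise' }),
  });
  const token = logon.headers.get('X-SAP-LogonToken');
  const auth = { 'X-SAP-LogonToken': token, Accept: 'application/json' };

  // 2. Refresh the document (pass prompt values here if the document has prompts).
  await fetch(`${BASE}/raylight/v1/documents/${DOC_ID}/parameters`, {
    method: 'PUT',
    headers: { ...auth, 'Content-Type': 'application/json' },
    body: JSON.stringify({ parameters: [] }),   // assumed shape; adjust for real prompts
  });

  // 3. Export the document to Excel by requesting an Excel MIME type.
  const xls = await fetch(`${BASE}/raylight/v1/documents/${DOC_ID}`, {
    headers: { 'X-SAP-LogonToken': token, Accept: 'application/vnd.ms-excel' },
  });
  fs.writeFileSync('report.xls', Buffer.from(await xls.arrayBuffer()));

  // 4. Close the document / log off to free the server session.
  await fetch(`${BASE}/logoff`, { method: 'POST', headers: auth });
}

downloadWebiAsExcel().catch(console.error);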
A last option, which I wouldn't recommend, is LiveOffice. This is a separate product which allows you to embed contents from Webi documents into Office documents (most notably Excel). LiveOffice has always had its share of problems and has not received much love from SAP regarding much needed updates.
One final thought: the report will never appear in the same sheet, at least not without an additional amount of coding. Whatever SDK you end up choosing, you will always end up with an Excel file. If you want to show the results in the Excel file you started from, you'll need to code the steps to open the generated file, grab the contents and then copy those to your worksheet.

Excel Connections in Google Drive

I have an Excel 2010 file (.xlsx) with multiple connections that import data from the web. Instead of having to open the file manually and click "Refresh All", I want to use Google Docs (specifically Google Sheets) to automate the refresh every 60 minutes. I uploaded the .xlsx file to my Google Drive, but the connections no longer work. How do I get around this?
Additional information:
1. The data connections are to Kayak flight search service. Here's an example page. I am importing the table data at the top of the page which shows prices for flexible dates.
2. I tried using IMPORTHTML in Google Sheets but for some reason it doesn't identify the above table as an HTML table. I get a parse error.
I used Kimono to generate this API (you have to sign up to use it) which can return CSV of the form:
"collection1"
"depart-sat.text","depart-sat.href"
"$798","javascript: FilterList.flexFilterToDates('20141213', '20150111', 798)"
"$798","javascript: FilterList.flexFilterToDates('20141213', '20150112', 798)"
...
"collection2"
"depart-sun.text","depart-sun.href"
"$1127","javascript: FilterList.flexFilterToDates('20141214', '20150111', 1127)"
"$1211","javascript: FilterList.flexFilterToDates('20141214', '20150112', 1211)"
...
In Google Sheets you can get the data with =IMPORTDATA("https://www.kimonolabs.com/api/csv/ccyfjzlm?apikey=YOURAPIKEY").
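If you want the hourly refresh the question asks for rather than a one-off formula, a small Google Apps Script bound to the spreadsheet can fetch the same CSV on a time-driven trigger. This is only a sketch, assuming the Kimono URL above keeps returning CSV; the sheet name and trigger interval are placeholders.

// Apps Script sketch: pull the CSV into a sheet and re-run it every 60 minutes.
// The URL, the 'RawData' sheet name and the trigger interval are placeholders.
const CSV_URL = 'https://www.kimonolabs.com/api/csv/ccyfjzlm?apikey=YOURAPIKEY';

function refreshCsv() {
  const csv = UrlFetchApp.fetch(CSV_URL).getContentText();
  const rows = Utilities.parseCsv(csv);
  const sheet = SpreadsheetApp.getActive().getSheetByName('RawData');
  sheet.clearContents();
  sheet.getRange(1, 1, rows.length, rows[0].length).setValues(rows);
}

// Run once to install the hourly time-driven trigger.
function installTrigger() {
  ScriptApp.newTrigger('refreshCsv').timeBased().everyHours(1).create();
}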
