How to pool all RTD calls at Excel startup?

I have an RTD server that gets its values from a real-time source. The problem is that the users have pretty large Excel sheets with close to 20,000 RTD formulas. When the user opens the sheet, all the RTD formulas fire, sending 20,000 queries to the server. This works for now, but the server could perform much better if I could group the queries before sending them.
My idea was to maintain a flag. When a calculation starts, the flag is set to false, and when the calculation ends I reset it. When an RTD formula is called and the flag is unset, I don't send the query to the RTD server but pool it instead. When the flag is set to true, I combine the pooled queries and send them to the server.
I am not sure how to get notified when Excel starts and stops calculating. Please help. Also, if you know any other approach to this problem, that would be great. I am using Excel 2007 and C# 3.5.
Thanks,
Rashmi

Since you're using RTD, I wonder if this could work:
You make calls to the back-end in timed batches. Start a timer in the first RTD call, with a short interval, maybe 500 ms, and build up a batch of work from all the calls made to your RTD server until the timer expires. Then send the batch to the back-end and await the response, while starting a new batch on the client. When the batch response comes, you notify Excel that the topics have been updated, and when Excel calls RefreshData you return the individual items out of your batch response. This way your batching uses the async-ness of RTD effectively, and you are not tied to Excel's recalculation events.
Hope this makes sense.
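A minimal sketch of the timed-batch idea in JavaScript (the names createBatcher, sendBatch, and onTopicRequest are illustrative, not part of any RTD API; the real implementation would live inside your C# RTD server):

```javascript
// Collect topic requests for a short window, then send them as one batch.
// The first request starts the timer; later requests just join the batch.
function createBatcher(sendBatch, batchWindowMs = 500) {
  let pending = [];
  let timer = null;

  return function onTopicRequest(topic) {
    pending.push(topic);
    if (timer === null) {
      timer = setTimeout(() => {
        const batch = pending;
        pending = []; // start a fresh batch for new requests
        timer = null;
        sendBatch(batch); // one back-end call for the whole batch
      }, batchWindowMs);
    }
  };
}
```

When the batch response arrives, you would call UpdateNotify on the RTD callback and hand each topic its value out of the batch result when Excel asks via RefreshData.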

Related

Do Google Sheets published on the web continue calculating even if closed?

I have an Excel spreadsheet with a huge amount of data to calculate in real time.
I noticed that Google Sheets has the option to publish the spreadsheet online and update it every 1 minute.
My calculations in real time are based on the quotes that are provided by Google Finance. Therefore, each time the quote is updated, the spreadsheet redoes all calculations.
If I publish it on the web, will it continue to calculate normally even with the browser CLOSED? Is there a better alternative (a virtual machine or the like)?
You can turn off automatic updates for published files (I believe this is what you mean). When you publish a file, you can choose to stop automatic updates of calculations on the sheet by following the steps outlined here. Sorry if I misunderstood.
Also as a workaround to have the sheet load faster perhaps you might want to try the following:
You can use a Google Spreadsheet and:
Write a bound Apps Scripts to do the calculations.
Install a time-driven trigger to do it when you want it to (daily).
Have your script setValues into the spreadsheet range(s) that you need.
Hint: You can use UrlFetch to get the data from any URL.
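A rough Apps Script sketch of the steps above (the sheet name 'Quotes' and the fetch URL are placeholders for your own sheet and data source):

```javascript
// Bound Apps Script: fetch data and write it into the spreadsheet.
// 'Quotes' and the URL below are placeholders, not real endpoints.
function updateQuotes() {
  const sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Quotes');
  const response = UrlFetchApp.fetch('https://example.com/quotes.csv');
  const rows = Utilities.parseCsv(response.getContentText());
  // setValues writes the whole block in one call
  sheet.getRange(1, 1, rows.length, rows[0].length).setValues(rows);
}

// Run once by hand to install a daily time-driven trigger.
function installTrigger() {
  ScriptApp.newTrigger('updateQuotes')
    .timeBased()
    .everyDays(1)
    .create();
}
```

With the trigger installed, updateQuotes runs on Google's servers once a day, whether or not the sheet is open in a browser.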
Since you are using Google finance, there is a built-in formula in sheets already.
GOOGLEFINANCE(ticker, attribute, start_date, end_date|num_days, interval)
Note: Google Sheets runs in the cloud, and the sheet itself is not recalculated unless it is opened, edited, or called from a script, a script trigger, or a form trigger.
However, you can set up a time-based trigger to run a script and put the data in your sheet, once a day. The script will certainly take the time needed to process its instructions, but it will not be the sheet itself.
And by using Apps Script you can access all of Google Suite's services and APIs, including Gmail, Drive, etc.
You can also have your script write logs that help a lot in debugging.

Halting macro code execution until SQL query is completed

I have written a fairly complex application that uses Excel (2016) as front-end application for data stored in a MS Access database, using the ADO 6.1 library. I have noticed during macro code execution that SQL transactions triggered by my Excel application can take quite long to complete, and often the execution of the next line of code in my Excel macro depends on this SQL transaction first being completed. Unfortunately, macro code execution and SQL transactions are asynchronous operations, which means that the macro jumps to the next line of code even though the SQL transaction hasn't been completed.
My current workaround is to use a Sleep() function from the Windows API to insert a fixed delay, but this is a really ugly solution: it reduces the performance of my application and depends very much on CPU load, so it may sometimes work and sometimes not.
I haven't been able to find a solution to this problem so far, and I can't find any hints on the Internet either.
Using Application.CalculateUntilAsyncQueriesDone doesn't help here either.
Does anyone have an idea or a hint how to halt macro code execution in Excel until an SQL transaction has been completed? Is there a method in ADO to check the completion of an SQL transaction?
Is your query within the Data/connections section?
I had this problem too. I turned off "Enable background refresh" and added DoEvents to the VBA; this forces the data connection to refresh before the code is allowed to continue. The downside is that it makes Excel feel like it's locked up, but it resolved my issue.
Sub Button1_Click()
    ' With background refresh disabled, Refresh blocks until the query completes
    ActiveWorkbook.Connections("ScrapData").Refresh
    DoEvents ' let Excel process pending events before continuing
    ' ...other code...
End Sub

Getting reasonable performance from excel while receiving RTD feeds and avoid blocking

I am handling RTD feeds again and remembering the difficulties, but now we have multi-core machines and multi-threading, so maybe someone can advise.
As I understand/remember it: pushing data into Excel directly is not possible (for obvious reasons), so the RTD server sends a friendly nod to say "your parcel is ready, come and get it". Then when Excel has done its nails and feels in the mood, it might get the data.
So this kind of architecture is right back on the dance floor and hopefully I can make it work.
Problem.
I need to run a bot that examines the data and responds differently;
1. Looping, or even running a WinAPI timer, is still enough to keep Excel busy and stop data arriving, or so it seems. Executing bot logic, however small, will definitely bring on an Excel fainting fit.
2. Tried responding via the calculation event. Very hit-and-miss and definitely not up to the job. There is no obvious logic as to when and why it fires or doesn't, other than a "bad hair day".
3. Tried a WinAPI timer looking at the newly acquired data every second, comparing it to the old data in a separate data structure, running some EMAs and making a decision. No dice. The timer alone is enough to put a delay of up to 10 or even 20 seconds between occasional deliveries of data.
Options I am thinking about:
1. Run the timer outside of the Excel environment, looking in at the data, e.g. an add-in via the PIAs etc. What I don't know is whether this add-in, perhaps in C# or VB.NET, could utilize multithreading (via tasks, I think) and do its bit without "scaring the panties off her ladyship"?
2. I remember hearing that XLL UDFs can be asynchronous; does anyone know if this is a potential option?
Any ideas?

How do I regularly post excel data to a web service?

I have a user requirement that I have been battling with for a while with no success. I need to write an add-in that can read around 100 formula-driven cells (of a specific spreadsheet) once every couple of minutes, and send to a web service.
I'm more than happy to use Excel-DNA or VSTO, but everything I've tried so far causes the user interface to hang for an instant. Would this always be the case if the data is being read from the active spreadsheet (even from a different thread) ?
Reading the sheet from a different thread is likely to have a worse effect than reading from the Excel main thread (say in an event or something). This is due to the COM threading switch that is required for the cross-thread calls. In the end, all the COM calls have to do their work on the main thread anyway.
You might have more success by hooking some of the Excel events, as a start the Workbook.SheetChange event, then checking whether the changes affect your watched Range(s) and updating an internal data structure with the new data.
You can then update the back-end periodically (or only when watched cells have changed) from a background thread.
You need to run a secondary thread to post the data to the web service to prevent any UI freeze.
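The event-plus-timer pattern described above, sketched generically in JavaScript (in a real add-in the handler would be wired to Workbook.SheetChange and the flush would run on a background thread; createWatcher and postToService are hypothetical names):

```javascript
// Buffer cell changes as they arrive; flush them periodically in one post.
function createWatcher(postToService, flushIntervalMs = 120000) {
  const latest = new Map(); // cell address -> most recent value

  function onSheetChange(address, value) {
    latest.set(address, value); // overwrite: only the newest value matters
  }

  function flush() {
    if (latest.size === 0) return; // nothing changed since the last post
    const snapshot = Object.fromEntries(latest);
    latest.clear();
    postToService(snapshot); // one call for the whole batch
  }

  const timer = setInterval(flush, flushIntervalMs);
  return { onSheetChange, flush, stop: () => clearInterval(timer) };
}
```

The change handler stays cheap (just a Map write on the main thread), so Excel never blocks on the web service; only the periodic flush touches the network.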

a synchronization issue between requests in express/node.js

I've come up with a fancy synchronization issue in node.js, for which I've not been able to find an elegant solution:
I set up an express/node.js web app for retrieving statistics data from a one-row database table.
If the table is empty, populate it by a long calculation task
If the record in table is older than 15 minutes from now, update it by a long calculation task
Otherwise, respond with a web page showing the record in DB.
The problem is: when multiple users issue requests simultaneously and the record is old, the long calculation task is executed once per request instead of just once.
Is there any elegant way that only one request triggers the calculation task, and all others wait for the updated DB record?
Yes, it is called locks.
Put an additional column in your table, say lock, of timestamp type. When a process starts working with the record, put now + timeout into it (as a rule of thumb I choose the timeout to be 2x the average processing time). When the process stops, update that column with a NULL value.
At the beginning of processing, check that column. If value > now, return a status code to the client (don't force the client to wait; it's a bad user experience, since they don't know what's going on unless processing time is really short), e.g. 409 Conflict. Otherwise start processing (ideally the processing also takes place in a separate thread/process so the user won't have to wait; respond with an appropriate status code like 202 Accepted).
This now + timeout value is needed in case your processing process crashes (it avoids deadlocks). Also remember that you have to "check and set" this lock column in a transaction because of race conditions (which might be quite difficult if you are working with MongoDB-like databases).
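A sketch of the lock idea in JavaScript, using an in-memory record in place of the database row (the field names, createStatsTable, and handleRequest are illustrative; in production the check-and-set inside acquireLock must run in a single SQL transaction):

```javascript
// One-row record with a lock timestamp; acquireLock does the check-and-set.
function createStatsTable(avgProcessingMs) {
  const row = { updatedAt: 0, lockUntil: 0 };

  function acquireLock(now) {
    if (now < row.lockUntil) return false; // someone else is working
    row.lockUntil = now + 2 * avgProcessingMs; // rule of thumb: 2x average
    return true;
  }

  function releaseLock(now) {
    row.lockUntil = 0; // stands in for the NULL update
    row.updatedAt = now;
  }

  return { row, acquireLock, releaseLock };
}

// Request handler sketch: only the first caller triggers recalculation;
// concurrent callers get a 409-style response instead of waiting.
function handleRequest(table, now, staleAfterMs) {
  if (now - table.row.updatedAt < staleAfterMs) return 'fresh';
  if (!table.acquireLock(now)) return '409 Conflict';
  // kick off the long calculation elsewhere; call releaseLock when it's done
  return '202 Accepted';
}
```

Because acquireLock fails for every caller after the first until the lock expires or is released, the long calculation runs once per staleness window rather than once per request.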
