I have an Excel spreadsheet with a huge amount of data to calculate in real time.
I noticed that Google Sheets has an option to publish the spreadsheet online and update it every minute.
My calculations in real time are based on the quotes that are provided by Google Finance. Therefore, each time the quote is updated, the spreadsheet redoes all calculations.
If I publish on the web will it continue to calculate normally even with the browser CLOSED? Is there any better alternative (like a Virtual Machine or the like)?
You can turn off automatic updates for published files (I believe this is what you mean).
When you publish a file, you can choose to stop automatic updates of calculations on the sheet by following the steps outlined here.
Sorry if I misunderstood.
Also, as a workaround to make the sheet load faster, you might want to try the following:
You can use a Google Spreadsheet and:
Write a bound Apps Script to do the calculations.
Install a time-driven trigger to run it when you want (e.g. daily).
Have your script setValues() into the spreadsheet range(s) that you need.
Hint: You can use UrlFetch to get the data from any URL.
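A minimal sketch of such a bound script, assuming a tab named "Quotes" and a purely hypothetical JSON endpoint that returns rows of values (swap both for your real source):

// Fetch data and write it into the sheet in one setValues() call.
function updateQuotes() {
  var response = UrlFetchApp.fetch('https://example.com/quotes.json'); // hypothetical URL
  var rows = JSON.parse(response.getContentText()); // expected shape: [[ticker, price], ...]
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('Quotes');
  // The range dimensions must match the data being written.
  sheet.getRange(1, 1, rows.length, rows[0].length).setValues(rows);
}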
Since you are using Google Finance, there is already a built-in formula in Sheets:
GOOGLEFINANCE(ticker, attribute, start_date, end_date|num_days, interval)
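For example, a cell formula like this returns the current price (the ticker is chosen just for illustration):
=GOOGLEFINANCE("NASDAQ:GOOG", "price")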
Note:
Google Sheets runs in the cloud, but the sheet itself is not updated unless it is opened, edited, or touched by a script, a script trigger, or a form trigger.
However, you can set up a time-based trigger to run a script and put the data in your sheet once a day. The script will take whatever time it needs to process its instructions, but it is the script doing the work, not the sheet itself.
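As a sketch, running a small installer function like this once by hand sets up such a daily trigger for the update function above (the 4 AM hour is only an assumption; pick whatever suits you):

// Run once manually to install a daily time-driven trigger.
function installDailyTrigger() {
  ScriptApp.newTrigger('updateQuotes')
    .timeBased()
    .everyDays(1)
    .atHour(4) // approximate hour, in the script's time zone
    .create();
}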
And by using Apps Script you can access all of G Suite's services and APIs, including Gmail, Drive, etc.
You can also have your script write logs that help a lot in debugging.
Related
I am saving a counter number in user storage.
I want to provide some content to the user which changes daily using this counter.
So every time the counter increases by 1 the content will change.
The problem is the timezone difference.
Is there any way to run a function daily which will increase this counter by 1? I could use setInterval(), which is part of Node.js, but that won't be an accurate "daily" update for all users.
User storage is only available to you as a developer when the Action is active. This data is not available once the Action is closed, so you wouldn't be able to asynchronously update the field. If you do want asynchronous access, I'd suggest using an external database and only storing the database row key in the user's userStorage. That way you can access the data and modify it whenever you want.
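As a rough sketch, assuming the actions-on-google Node.js library, Firestore as the external database, and a hypothetical 'users' collection and intent name, it could look like this:

// Keep only the database key in userStorage; read the counter from Firestore.
const { dialogflow } = require('actions-on-google');
const admin = require('firebase-admin');
admin.initializeApp();
const db = admin.firestore();

const app = dialogflow();

app.intent('Daily Content', async (conv) => {
  if (!conv.user.storage.dbKey) {
    // First visit: create a row for this user and remember only its key.
    const doc = await db.collection('users').add({ counter: 0 });
    conv.user.storage.dbKey = doc.id;
  }
  const snapshot = await db.collection('users').doc(conv.user.storage.dbKey).get();
  conv.ask(`Here is your content for day ${snapshot.data().counter}.`);
});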
The setInterval method will run a function periodically, but may not work in the way you want. It only runs the function while the runtime is active, and a lot of services will shut down a runtime after a period. Cloud Functions, for example, run on demand but shut down when not in use. Additionally, Cloud Functions can be run in several parallel instances, executing a setInterval function several times in parallel. That would increment the counter more times than you want.
Using a dedicated Cron service would help reduce the number of simultaneous executions while also ensuring it runs when you want.
You are unable to directly access the user's timezone within the Action, meaning you won't be able to determine the end of a day. You can get the content to change every day, but it'll have some sort of offset. To get around this, you could have several cron jobs which run for different segments of users.
Using the conv.user.locale field, you can derive their language. en-US is generally going to be for American users, who generally live in the US. While this could result in odd behavior for users who are traveling, you can then bucket users into a particular period of execution. If you run the task overnight, at 1 AM or 4 AM, they'll probably be unaware of the exact moment, but will know that it updates overnight.
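A sketch of one such bucket, assuming Cloud Functions for Firebase, the same hypothetical 'users' collection as above, and a 'locale' field stored per user; you would deploy one of these per time-zone group:

// Scheduled function for the en-US bucket, run overnight US Eastern time.
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();
const db = admin.firestore();

exports.bumpUsCounters = functions.pubsub
  .schedule('every day 04:00')
  .timeZone('America/New_York')
  .onRun(async () => {
    const users = await db.collection('users').where('locale', '==', 'en-US').get();
    const batch = db.batch();
    users.forEach((doc) => {
      batch.update(doc.ref, { counter: admin.firestore.FieldValue.increment(1) });
    });
    return batch.commit();
  });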
You could use the location helper to get the user's location more precisely. This may be a bit unnecessary, but you could use that value to determine their timezone and then derive that user's "midnight" to put in the correct Cron bucket.
My company is currently using Excel for reporting, where we have to collect data from various business units on a monthly basis. Each unit sends an Excel file with 50 columns and 10-1000 rows. After receiving each file, we use VBA to consolidate all these files. This consolidated master file is then split into various sections and sent to various personnel, and any changes have to be updated in the master file.
Is there any way that this process can be improved and automated using a different system?
Well, you already have the "low cost, low tech" solution.
The proper solution (until something better is invented) is a proper web application, which collects the data from the various users, processes it and then generates the necessary reports.
This endeavor is not something to be treated lightly, even if it sounds like a small task. Your company needs to understand what they want, and then contact some supplier companies to get an estimation of the costs.
The costs cover at least:
the development of the application;
the server on which the application will run after it is finished; this can be a virtual server;
the costs of training the employees to use the application properly;
the costs of actually getting the employees to use the application properly (the most resistant usually being the managers themselves);
Of course, I assume that you already have some network infrastructure and that backups of all important data are done according to best practices (by IT)...
I have developed an Accounts and Inventory System in Excel; all data entry, editing, and deletion is done through VBA forms. Everything worked fine until the biggest problem arrived: my company now asks me to operate the same Excel software from different PCs at the same time. I know the workbook becomes read-only when it is already open from another location. Another idea is a shared workbook, but that also limits data entry from the forms.
Your best option (unless you move to Access) is a master and clients.
The master will be a store of all data, and the clients will be local to each person. A trigger will need to exist in the clients (say, on form open and close) to send their content into the master. Depending on what your workbook does, you may need to build mechanisms like caching, syncing, and error handling.
I built a package in SSIS that uses a script task to open an Excel file, format it, and refresh some data in Excel. I would like to have Excel visible while the script task is running, to see whether Excel gets hung up, which occurs all the time. Is this possible? I am converting a process that calls Excel via a shell script to using SSIS to call Excel instead. I guess a second question is: is that a bad idea?
Why this is a bad idea
Generally speaking, administrators are tasked with maximizing the amount of "uptime" a server or service on the server has. The more software that gets installed on the machine, the greater the odds of service interruptions and outages due to patching. To be able to manipulate Excel in the mechanism you described, you're going to force the installation of MS Office on that machine. That will cost you a software license and the amount of patching required is going to blow holes in whatever SLAs those admins might be required to adhere to.
Memory leaks. Along with the whole patching bit, in the past at least, there were issues with programmatically manipulating Excel, and it basically boiled down to this: it was easy to end up with memory leaks (I gotta make you understand: memory allocated but never given up, never letting the allocated memory go down). Over time, the compounded effect is that running this package will result in less and less system memory available, and the only way to reclaim it is through a reboot, which gets back to SLAs.
The reason you want to see what Excel is doing is so that you can monitor execution because it "gets hung up which occurs all the time". That doesn't sound like a stable process. Again, no admin is going to want an unstable process running on their servers. Something is not right in the cycle of events. Whether it's your code that opens Excel, the macros it runs, etc., something in there is awry, and that's why you need to inspect the process. This is akin to putting a band-aid on a shotgun wound. Stop shooting yourself and you won't require bandages.
The task that you're attempting to perform is "open an Excel file, format, and refresh some data in Excel". SSIS can natively push data into Excel. If you preformat the file, develop your SSIS package to write to the formatted file, and just copy it off, that should work. It's not graceful, but it works. There are better methods of providing formatted data, but without knowing your infrastructure, I don't know whether SSRS, SharePoint, Excel Services, Power Pivot, etc. are viable options.
Why you won't be able to see Excel
Generally speaking, the account that runs SQL Agent is probably going to be fairly powerful. To prevent things like a shatter attack, from Windows Server 2008 onward services are restricted in what they can do. For the service account to be able to interact with the desktop, you have to move it into the user tier of apps, which might not be a good thing if you, or your DBAs/admins, are risk averse.
For more information, please to enjoy the following links
InteractWithDesktop
http://lostechies.com/keithdahlby/2011/08/13/allowing-a-windows-service-to-interact-with-desktop-without-localsystem/
https://serverfault.com/questions/576144/allow-service-to-interact-with-desktop
https://superuser.com/questions/415204/how-do-i-allow-interactive-services-in-windows-7
That said, if all of the stars are aligned and you accept the risk of Allow Service to Interact with Desktop, the answer is exactly as Sam indicated. In your unshown code, you need to set the Visible property to true.
If you go off and allow interactivity with the desktop, and someone leaves some "testing" code like MessageBox.Show("Click OK to continue"); in a package that gets deployed to production, be aware that if nobody notices this dialog box sitting there, you'll have a job waiting to complete for a very long time.
Regarding your first question, I understand that you want to debug your script task.
You can make Excel visible by adding the following line of code in your script task (assuming C# is the coding language):
// Requires a reference to the Excel interop assembly, typically with:
// using Excel = Microsoft.Office.Interop.Excel;
// Create your Excel app
var excelApp = new Excel.Application();
// Make the Excel window visible to spot any issues
excelApp.Visible = true;
Don't forget to remove/comment that line after debugging.
Regarding your second question, I don't think that this is a bad idea if you properly handle how Excel is opened and closed, in order to avoid memory issues.
I'd like to be able to schedule an Excel macro (VBA) to run in the middle of the night (after a file is ready) to create a customized workbook (multiple sheets, pivot tables, charts, filters, outlines, custom formatting, etc.). Currently, the macro is fired up manually the next day. Furthermore, it needs to run unattended on a server (laptop goes home at night!). Anybody successfully do something like this? Please, no Unix-side hacks (e.g., Perl modules) - need full access to VBA features, including database functions. Thanks!
Well, you have some options.
First of all, Excel has to be installed on the server.
Then you create a scheduled task to call a program.
In this case you can write, e.g., a VBScript or .NET program that starts the app, loads the document, and runs its content (your VBA). That should work.
Or you move the VBA code into a program and drive Excel from your code, but that's probably more work.
If you do this with .NET you'll probably have the best success; e.g. you can add an event log entry for a successful run, etc.
If you can leave Excel running on the server all the time, you can use Application.OnTime to schedule the next runs of a particular macro (once it's run, reschedule another in the macro code). When I worked in banking we used this all the time to run night-time jobs.
If you cannot leave Excel running, I have to say you may be in a world of pain. It's possible to start Excel from an AT job (scheduled task), but you may have headaches getting it to run under the correct user privileges, and if you use any add-ins you'll experience regular disasters where they fail to load and stop Excel from starting up. At the end of the day, Excel isn't really meant to be run on servers (it's actually a violation of the terms of use), and starting/running/stopping it is not going to be a reliable system even if you do get it to work.