I'm creating an Excel dashboard that imports a variable number of months' worth of financial/accounting information from a database into an Excel sheet. Using this information, I have a Calculations sheet that computes some financial indicators, again month by month. Finally, this information is displayed in graphs on a separate sheet (one indicator per graph, with the monthly information plotted to show the trends). Currently I have written VBA code that formats the sheets to accommodate the number of months requested, pulls the data from the SQL server, and updates the graphs. Since there are 53 indicators for each operation (6 operations), this process takes about 3 minutes.
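In outline, the VBA does something like the following (a much-simplified sketch, not the actual code; the connection string, query, and sheet name are placeholders, and the real code also resizes ranges and rebuilds the charts):

```vba
' Simplified sketch of the update loop. The connection string, query
' and sheet names are placeholders, not the real ones.
Sub UpdateDashboard()
    Dim cn As Object, rs As Object
    Set cn = CreateObject("ADODB.Connection")
    cn.Open "Provider=SQLOLEDB;Data Source=myServer;" & _
            "Initial Catalog=Finance;Integrated Security=SSPI;"
    Application.ScreenUpdating = False             ' avoid a repaint per write
    Application.Calculation = xlCalculationManual  ' defer recalculation
    Set rs = cn.Execute("SELECT * FROM MonthlyIndicators")
    Worksheets("Data").Range("A2").CopyFromRecordset rs ' bulk copy, no cell loop
    Application.Calculation = xlCalculationAutomatic
    Application.ScreenUpdating = True
    rs.Close: cn.Close
End Sub
```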
Can anyone recommend a better way to do this? The current approach 'works', but I've often thought that there must be a more efficient way.
Thanks!
Chris
You could look at skipping the Excel part and using SQL Server Reporting Services (SSRS). If you have ever used Business Objects or Crystal Reports, it's much the same kind of thing, and I would imagine it would offer better performance than doing everything in Excel.
Related
I have data which is reported in Excel format and created manually. Lots of typos are expected in the report, as the person feeding the data into the Excel sheet is highly unskilled. He enters the data as he hears and understands it; e.g. Shree/Shri/Sri all sound almost the same when pronounced, but all 3 are different values in the data as a whole.
At present I am solving the problem by cleaning the data with OpenRefine, a Java-based tool that runs on localhost, using the various clustering methods available in the software. The data is added to the Excel pool incrementally, so the number of rows grows after each update, and OpenRefine consumes heavy resources as it runs on localhost.
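To give an idea of what the clustering does, here is a minimal Soundex-style key in VBA (plain Soundex, not OpenRefine's exact algorithm); it maps all three spellings above, Shree/Shri/Sri, to the same key, S600:

```vba
' Minimal Soundex-style phonetic key (plain Soundex, not OpenRefine's
' exact method). "Shree", "Shri" and "Sri" all come out as "S600".
Public Function Soundex(ByVal name As String) As String
    Dim s As String, ch As String, code As String, prev As String
    Dim i As Integer
    s = UCase$(Trim$(name))
    If Len(s) = 0 Then Exit Function
    Soundex = Left$(s, 1)
    prev = SoundexCode(Left$(s, 1))
    For i = 2 To Len(s)
        ch = Mid$(s, i, 1)
        code = SoundexCode(ch)
        If code <> "" And code <> prev Then Soundex = Soundex & code
        If ch <> "H" And ch <> "W" Then prev = code ' H/W don't break a run
        If Len(Soundex) = 4 Then Exit For
    Next i
    Soundex = Left$(Soundex & "000", 4)             ' pad/truncate to 4 chars
End Function

Private Function SoundexCode(ByVal ch As String) As String
    Select Case ch
        Case "B", "F", "P", "V": SoundexCode = "1"
        Case "C", "G", "J", "K", "Q", "S", "X", "Z": SoundexCode = "2"
        Case "D", "T": SoundexCode = "3"
        Case "L": SoundexCode = "4"
        Case "M", "N": SoundexCode = "5"
        Case "R": SoundexCode = "6"
        Case Else: SoundexCode = ""   ' vowels, H, W, Y drop out
    End Select
End Function
```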
It would be really helpful if there were another way to solve the problem.
Thanks in advance!
High-Level Problem: Excel doesn't seem to be able to handle some of the more complex dashboards I am trying to build, and I am hoping for some suggestions on how to alleviate this issue.
My job/requirements: I build executive- and sales-facing dashboards and reports for our organization in Excel through an OLAP cube connection. The tool has been great for running quick analyses, but some of the larger projects I run through this connection often crash or take painstakingly long to refresh. Here are the requirements I have, and why I am using Excel:
- Need to manipulate/alter our data freely. For example, if I am pulling customer data with all 50 states in the columns and all 10,000+ customers in the rows, I need to be able to run a formula adjacent to the data comparing that customer/state data to company-wide customer/state data.
- End product (dashboard/table) needs to be visually pleasing, as the end users are our sales team/executives.
- Need to be able to re-run/refresh the same analysis easily (see the sketch after this list). Most of the dashboards I run get sent out on a daily or weekly cadence, so each refresh of the dashboard needs to take no more than a few minutes, ideally seconds.
- Need built-in time intelligence functionality: the dashboard I am building needs to know what day it is. Excel has the =TODAY() formula.
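For concreteness, the refresh step itself is easy enough to script; a minimal sketch of what that could look like (this assumes the cube tables are ordinary workbook connections, and the sheet name is illustrative):

```vba
' Hypothetical refresh macro: refresh every workbook connection
' synchronously, then stamp the run date on the dashboard sheet.
Sub RefreshDashboard()
    Dim cn As WorkbookConnection
    For Each cn In ThisWorkbook.Connections
        On Error Resume Next    ' not every connection type exposes OLEDBConnection
        cn.OLEDBConnection.BackgroundQuery = False  ' block until the refresh finishes
        On Error GoTo 0
    Next cn
    ThisWorkbook.RefreshAll
    Worksheets("Dashboard").Range("B1").Value = Date  ' "what day is it" stamp
End Sub
```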
Symptoms of the issue: Right now, when I run more complex queries, data refreshes can take upwards of 15 minutes. Some of these files are north of 200kb. For these problem files, further development is painfully slow and inefficient, because even the simplest of formulas (like a VLOOKUP column off the OLAP table) forces my computer to use all 8 processors and hogs about 95% of the 32GB of RAM.
Specifics of the analysis I am running: Much of this analysis requires pulling tens (and sometimes hundreds) of thousands of rows, and these tables are sometimes dozens of columns wide, as I might be pulling sales data by state or by month. Some dashboards have half a dozen of these OLAP tables, each with its own purpose. There is no way to reduce the amount of data I am pulling; I need all of it to reach the conclusions I am after.
For example, one of the projects I am currently working on requires me to take 24 months of sales data, broken out by state (as columns, on the x-axis of the OLAP table) and by individual customer (over 10,000) on the rows. This table alone takes forever to process and refresh, and it causes all subsequent formulas to run extremely slowly, even simple VLOOKUP columns adjacent to the OLAP cube.
Research I have done: I have looked into alternatives to Excel, but the largest issue I face is that I still have a set of requirements to meet, and the obvious tools at my disposal don't seem to meet them at first glance. Again, the end product of what I am building must be easily consumable by our end users (sales reps and execs). If I were to run a SQL query through Microsoft SQL Server to pull the data I am looking for, I would need to be able to refresh the data on a daily basis, but more importantly, I would need to be able to easily manipulate and build out extra calculations and formulas off of the query. Excel seems like the only tool that checks those boxes, and yet it also feels like a dead end.
I need a bit of help. My company has data in multiple Excel sheets. Some sheets are straightforward (in that they map easily to data types), but most of them have merged rows and cells etc. within one header. I am developing an application in C# for maintaining a massive database, with proper user and role management and multiple departments as stakeholders.
I have identified the relations from within the Excel sheets and all is well. What I cannot understand is how to import that historical data and map it to the data tables. Basically, when a new custom system is designed, how would you import senseless data into it?
The only thing I could think of was writing a utility program that reads every row and every cell of the Excel sheets and then extracts the required values to insert into the proper database table. But this would take ages due to the sheer number of Excel sheets.
Has anyone of you been through the same thing? How did, or would, you handle this?
Many thanks guys :)
If the data is not regular, you've got a world of pain ahead of you. The object model in Excel can be driven by any of the Windows Scripting Host languages, like VBScript and JScript. In fact, most scripting languages have some support for Excel traversal.
However, as you've already noted, the data layout isn't the same across all spreadsheets. I had a similar problem years ago traversing SCADA data from power stations: each station did it slightly differently and changed their format from time to time.
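As an illustration of the traversal half of that utility (the staging layout is an assumption; VBA shown here, but VBScript/JScript would look much the same):

```vba
' Flatten every used cell of every sheet onto a staging sheet as
' (sheet, address, value) rows; the mapping rules come afterwards.
Sub ExtractCells()
    Dim ws As Worksheet, out As Worksheet, cell As Range
    Dim r As Long
    Set out = ThisWorkbook.Worksheets.Add  ' staging sheet for the flat extract
    r = 1
    For Each ws In ThisWorkbook.Worksheets
        If Not ws Is out Then
            For Each cell In ws.UsedRange
                If Not IsEmpty(cell.Value) Then
                    out.Cells(r, 1).Value = ws.Name
                    out.Cells(r, 2).Value = cell.Address
                    out.Cells(r, 3).Value = cell.Value
                    r = r + 1
                End If
            Next cell
        End If
    Next ws
End Sub
```

The traversal itself is the easy part; the real work is writing the per-layout rules that turn those (sheet, address, value) triples into proper rows.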
I am looking at formatting a report which has been automatically generated by a 3rd-party system. As we have no direct access to the database, I would like to build a macro that formats the report into a more readable format.
I had initially thought about ingesting this raw data into a database, as I am pretty competent at writing SQL queries; however, I think it would be easier if this could be run through a macro.
The initial report shows which users have attempted which training modules and their completion status.
N.B. A user may have completed a module several times and will therefore appear multiple times.
The link below is to the spreadsheet, which has two sheets: Sheet 1 is the raw data and Sheet 2 is how I would like things to appear.
https://www.dropbox.com/s/p1hipx17q3mf3dm/Learning-Report.xlsx
Any help/ideas would be much appreciated, as I am pretty new to the whole macros-in-Excel thing.
Many thanks
Ian
I am answering in the same manner as the question is phrased, so please bear with me...
1. Identify all unique employees (for rows).
2. Identify all unique courses (for columns).
3. Find all the attempts and compute:
   a. the highest score
   b. the status
4. Put the data in the second sheet.
Had the question been more precise, the answer would also have been more to the point.
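That said, here is a rough VBA sketch of those four steps. The column layout is assumed from the description (column A = user, B = module, C = score); the status column would be handled the same way as the score:

```vba
' Rough sketch of the four steps above. Assumed layout on Sheet1:
' column A = user, B = module, C = numeric score. All names illustrative.
Sub BuildSummary()
    Dim src As Worksheet, dst As Worksheet
    Dim users As Object, courses As Object
    Dim i As Long, lastRow As Long
    Dim r As Variant, c As Variant
    Set src = Worksheets("Sheet1")
    Set dst = Worksheets("Sheet2")
    Set users = CreateObject("Scripting.Dictionary")   ' user -> target row
    Set courses = CreateObject("Scripting.Dictionary") ' module -> target column
    lastRow = src.Cells(src.Rows.Count, 1).End(xlUp).Row
    For i = 2 To lastRow                               ' steps 1 and 2
        If Not users.Exists(src.Cells(i, 1).Value) Then _
            users.Add src.Cells(i, 1).Value, users.Count + 2
        If Not courses.Exists(src.Cells(i, 2).Value) Then _
            courses.Add src.Cells(i, 2).Value, courses.Count + 2
    Next i
    For Each r In users.Keys: dst.Cells(users(r), 1).Value = r: Next r
    For Each c In courses.Keys: dst.Cells(1, courses(c)).Value = c: Next c
    For i = 2 To lastRow                               ' step 3a: highest score
        With dst.Cells(users(src.Cells(i, 1).Value), courses(src.Cells(i, 2).Value))
            If .Value = "" Or .Value < src.Cells(i, 3).Value Then _
                .Value = src.Cells(i, 3).Value         ' step 4: write to sheet 2
        End With
    Next i
End Sub
```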
My current employer (to remain nameless) has a collection of incredibly sophisticated Microsoft Excel 2003 worksheets (developed by contractors, also to remain nameless).
The employer is replacing the Excel-based solution with a SalesForce-based solution (developed by other contractors, likewise to remain unnamed). The SalesForce solution is also very complex, using dozens of related objects and "Dynamic SOQL" to hold the data and formulas that were previously contained in the Excel-based solution.
The employer's problem, which has become my problem, is that the data from the Excel spreadsheets needs to be meticulously and tediously recreated in .CSV files so it can be imported into SalesForce.
While I've recently learned I can use CTRL-` to review formulas in Excel, this doesn't solve the problem that variables in Excel have cryptic names like $O$15. If I'm lucky, when I investigate $O$15, I'll find some explanatory metadata n cells up and/or m cells to the left, and/or (in rare instances) a comment on the cell.
Patterns within the Excel spreadsheets are very limited, rarely lasting more than 6 consecutive rows or columns, and no two sheets which need to be imported have much similarity.
Documentation of all systems is very limited.
Without my revealing any confidential data, does anyone have any good ideas how I might optimize my workflow?
It's not clear exactly what you need to do: here are 3 possible scenarios, requiring increasing knowledge of Excel.
1. If all you want is to convert the Excel spreadsheets into CSV format then just save the worksheets as CSVs.
2. If you just want the data and not the formulae, then it would be simple (using VBA) to output anything that isn't a formula (i.e. where cell.Formula doesn't start with '=').
3. If you need to create a linkage Excel --> CSV --> existing Salesforce objects/SOQL, then you will need to understand both the Excel spreadsheets and the Salesforce objects/SOQL that have been created. This will be difficult unless you have good knowledge and experience of Excel and also understand what the Salesforce app requires.
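For scenario 2, a minimal VBA sketch (the output path is an assumption, and this naive writer does not quote values that themselves contain commas):

```vba
' Scenario 2 sketch: write only non-formula cells of the active sheet
' to a CSV file. Output path is illustrative; embedded commas are not quoted.
Sub ExportValuesOnly()
    Dim rng As Range, cell As Range
    Dim i As Long, j As Long, f As Integer, rowText As String
    Set rng = ActiveSheet.UsedRange
    f = FreeFile
    Open "C:\temp\values-only.csv" For Output As #f
    For i = 1 To rng.Rows.Count
        rowText = ""
        For j = 1 To rng.Columns.Count
            Set cell = rng.Cells(i, j)
            If Left$(cell.Formula, 1) <> "=" Then rowText = rowText & cell.Value
            If j < rng.Columns.Count Then rowText = rowText & ","
        Next j
        Print #f, rowText
    Next i
    Close #f
End Sub
```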
Brian, if you're still working on this, here's one way to approach the problem. I use this kind of process often for updating data between SFDC and marketing automation apps.
1) Analyze the formulae that you're re-creating in Salesforce.com to determine what base data fields you need (stuff that doesn't have to be calculated from something else).
2) Find those columns/rows in your spreadsheets and use Paste Special -> Values in a new spreadsheet to create an upload file, with values instead of formulae, for each data area (leads, prospects, accounts, etc.).
3) If you have to associate the info with leads or contacts or accounts and you have already uploaded or created those records in Salesforce.com, be sure to export them with their ID numbers. That makes it easy to use the vlookup formula in Excel to match up fields that you need to add and then re-upload the data into Salesforce.
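The lookup in step 3 might look something like this, assuming the exported records (with their Salesforce IDs in column B) sit on a sheet named Export; all names here are illustrative:

```
=VLOOKUP(A2, Export!$A:$B, 2, FALSE)
```

That pulls the Salesforce ID matching the key in A2, so each row you re-upload carries the right record ID.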
Like data cleaning, this can be a tedious process. But if you take it step by step it shouldn't be too hard. Good luck.