Looking for suggestions/recommendations
We currently have a report with a large number of metrics across various areas of the business. The report is largely compiled by hand: for many metrics the data is sourced from some business system, complex calculations/transformations are then performed in Excel to produce the metric, and the result is manually 'plugged into' the report. The report is presented using Power BI.
We want to automate the extraction/calculation process as much as we can, both to reduce manual effort and to remove the potential for errors or manipulation that the Excel calculations introduce.
Because the Excel transformations are so complex, we need to keep the logic that's in the spreadsheets somehow, but we want, as far as possible, to secure the source data and calculations against human error and manipulation.
I think we need to integrate the Excel extraction/transformation logic into Power BI in some way, remove as much manual intervention as possible, and also remove any potential for unauthorised users to change source data or calculations. I'm not sure of the best way to go about this, so I'm looking for suggestions.
Thank you
I think you have two options:
1. Import the data into Excel using macros, then export the completed data to a new sheet (or file) using macros (a rough sketch of this is below).
2. Set up a Power Query that does all the data manipulation for you, given the raw data.
You'd probably need to post a specific example if you need more help.
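To make option 1 concrete, here is a minimal sketch rather than a drop-in solution: it assumes the workbook already holds all the calculation logic and that a sheet named "Metrics" contains the finished figures. The sheet name and output path are placeholders you would replace with your own.

    ' Rough sketch of option 1. Assumptions: the workbook holds the calc logic
    ' and a sheet named "Metrics" holds the finished figures. Names and paths
    ' below are placeholders.
    Sub ExportMetricsForPowerBI()
        Dim src As Worksheet, outWb As Workbook

        Set src = ThisWorkbook.Worksheets("Metrics")

        ' Refresh any external connections feeding the calculations
        ' (turn off "background refresh" on the connections so this waits)
        ThisWorkbook.RefreshAll
        Application.CalculateFullRebuild

        ' Copy finished values (not formulas) into a new workbook
        Set outWb = Workbooks.Add
        src.UsedRange.Copy
        outWb.Worksheets(1).Range("A1").PasteSpecial Paste:=xlPasteValues
        Application.CutCopyMode = False

        ' Save where Power BI (or a scheduled refresh) picks it up
        outWb.SaveAs "C:\Reports\metrics_export.csv", FileFormat:=xlCSV
        outWb.Close SaveChanges:=False
    End Sub

Power BI can then point at the exported file on a schedule, so nobody has to re-key the numbers. Option 2 (rebuilding the transformations in Power Query inside Power BI Desktop) goes further by taking Excel out of the chain entirely, which also addresses the concern about users changing the calculations.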
I have data that is reported in Excel format and compiled manually. Lots of typos are expected in the report, because the person entering the data is unskilled: he types what he hears, spelled according to his own understanding. For example, Shree/Shri/Sri sound almost the same when spoken, but all three end up as different values in the data.
At present I solve the problem by cleaning the data with OpenRefine (a Java-based tool that runs on localhost), using the various clustering methods available in the software. The data is added to the Excel pool incrementally, so the number of rows grows after each update, and OpenRefine is consuming heavy resources since it runs on localhost.
It would be really helpful if there were another way to solve the problem.
Thanks in advance!
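Since the file already lives in Excel, one lighter-weight option is to do the grouping inside the workbook itself, in the spirit of OpenRefine's key-collision clustering: compute a normalised "fingerprint" for each value in a helper column, then sort or pivot on it to spot variants. The sketch below is only illustrative; the normalisation rules (especially the crude phonetic collapsing for the Shree/Shri/Sri case) are assumptions you would tune for your data, and they are much simpler than OpenRefine's real phonetic keying.

    ' Illustrative only: a normalisation function in the spirit of key-collision
    ' clustering. Put =Fingerprint(A2) in a helper column and group on it.
    Function Fingerprint(ByVal s As String) As String
        Dim i As Long, ch As String, cleaned As String

        s = LCase$(Trim$(s))

        ' Keep only letters, digits and spaces
        For i = 1 To Len(s)
            ch = Mid$(s, i, 1)
            If ch Like "[a-z0-9 ]" Then cleaned = cleaned & ch
        Next i

        ' Crude phonetic-style collapsing (an assumption, tune for your data):
        ' "shree" -> "shri" -> "sri", so all three variants share one key
        cleaned = Replace(cleaned, "ee", "i")
        cleaned = Replace(cleaned, "sh", "s")

        Fingerprint = cleaned
    End Function

Incremental updates would then only need the fingerprint recalculated for newly added rows, instead of re-clustering the whole pool in OpenRefine each time.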
I need to send RFQs to various vendors using their specific forms, which are Excel files. I need specific information from my co-workers in order to fill out the vendors' forms properly. I was thinking about using MS Forms to gather the information I need from my co-workers, then hoping that data could somehow be transferred automatically into the vendor's specific Excel form based on the MS Forms responses. The responses received determine which vendor form (in most cases multiple vendor forms) to use, and each completed form is then saved as its own file.
I was looking at different MS Flow, PowerApps, and Power Automate templates that have already been created, but I'm not sure whether any of them match my needs or whether those tools are the best solution. I did watch some videos about how to create your own Flow/PowerApp, but I wasn't sure it would be viable. I'm hoping to streamline the copying and pasting of the data, but I'm not sure how I would go about setting that up, if there is a way, whether there is a "for dummies" guide to it, or whether it would be over my head.
Background knowledge/experience: I have zero experience or base knowledge of coding. I can record macros, but I can't edit the code; I have to re-record them from scratch. I can do simple IF formulas and pivot tables in Excel. I tried for a minute to teach myself Power BI, but from what I understand it would be better to know SQL first. I haven't dived down the rabbit hole of trying to teach myself SQL, if that's even possible without some base coding knowledge. I want to learn these things and how to do more, but lack of time is a factor; I squeeze in micromillimeters of knowledge when I can from Googling and YouTube. I haven't had much luck Googling this, because whatever I type in the search bar isn't producing helpful results as far as I can tell.
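For the copy-and-paste step specifically, here is a very rough sketch of the kind of macro that could do it, under heavy assumptions: the MS Forms responses are synced to a "Responses" sheet in a workbook (Forms created via "Forms for Excel" can do this), each vendor template has known cells for each answer, and the file paths, sheet name and cell addresses below are all made-up placeholders. Power Automate can do the same job without code, but this shows what the mechanical part looks like.

    ' Very rough sketch. All paths, sheet names and cell addresses are
    ' placeholders; the real mapping depends on each vendor's form layout.
    Sub FillVendorForms()
        Dim resp As Worksheet, vendorWb As Workbook
        Dim r As Long, lastRow As Long

        Set resp = ThisWorkbook.Worksheets("Responses")
        lastRow = resp.Cells(resp.Rows.Count, "A").End(xlUp).Row

        For r = 2 To lastRow                          ' row 1 holds headers
            ' Open a fresh copy of the vendor's template
            Set vendorWb = Workbooks.Open("C:\RFQ\Templates\VendorA.xlsx")

            ' Map response columns to the cells the vendor's form expects
            vendorWb.Worksheets(1).Range("B3").Value = resp.Cells(r, "B").Value   ' part number
            vendorWb.Worksheets(1).Range("B4").Value = resp.Cells(r, "C").Value   ' quantity
            vendorWb.Worksheets(1).Range("B5").Value = resp.Cells(r, "D").Value   ' need-by date

            ' Save each completed form as its own file
            vendorWb.SaveAs "C:\RFQ\Out\VendorA_RFQ_row" & r & ".xlsx"
            vendorWb.Close SaveChanges:=False
        Next r
    End Sub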
Hi, I've recently been taking a deeper dive into Excel and was thinking about how to automate a task I have to do every month: updating financial projections by manually entering monthly expenses into a spreadsheet for each account.
I would like to find a way to pull the data I normally enter by hand from SAP NetWeaver and from our in-house website that lists salary charges into an Excel spreadsheet.
What do I need to learn to automate this repetitive task? I'm not an expert in CS by any means, so any suggestions on ways to solve this problem, topics that would be helpful to learn, and/or online resources for learning how to automate this data entry would be greatly appreciated.
TL;DR: I want to pull data from SAP into an Excel file where financial projections are kept so I don't have to enter it manually every month.
If you want to write VBA code that logs in to SAP and selects the necessary data, you would need to develop a custom ABAP function module that does the selection.
So it's much easier to write a custom ABAP report with an export to Excel. ABAP isn't very difficult to learn.
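If a custom function module or report isn't an option, another route that is sometimes available is SAP GUI Scripting driven from VBA, provided scripting is enabled on both server and client. The sketch below only shows attaching to an open SAP Logon session and starting a transaction; the transaction code and control IDs are placeholders, and recording a script with SAP GUI's Script Recording and Playback tool is the usual way to get the real IDs for your screens.

    ' Hedged sketch: assumes SAP GUI Scripting is enabled and an SAP Logon
    ' session is already open. Transaction code and control IDs are placeholders.
    Sub PullFromSAP()
        Dim SapGuiAuto As Object, SapApp As Object
        Dim Connection As Object, Session As Object

        Set SapGuiAuto = GetObject("SAPGUI")          ' attach to the running SAP GUI
        Set SapApp = SapGuiAuto.GetScriptingEngine
        Set Connection = SapApp.Children(0)           ' first open connection
        Set Session = Connection.Children(0)          ' first session on it

        ' Start the transaction that produces the expense list (placeholder tcode)
        Session.findById("wnd[0]/tbar[0]/okcd").Text = "/nSQVI"
        Session.findById("wnd[0]").sendVKey 0         ' press Enter

        ' From here: fill the selection screen, execute, and export the ALV
        ' output to Excel (or read the grid directly via the scripting API)
    End Sub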
Take a look at some of the query tools available within NetWeaver. Transaction SQVI can be used to build a simple query involving one or more tables, and the results (shown in an ALV list) can be exported to Excel. Transactions SQ01, SQ02 and SQ03 can be used for the same purpose, but are a bit more involved.
My current employer (to remain nameless) has a collection of incredibly sophisticated Microsoft Excel 2003 worksheets (developed by contractors, also to remain nameless).
The employer is replacing the Excel-based solution with a SalesForce-based solution (developed by other contractors, likewise to remain unnamed). The SalesForce solution is also very complex, using dozens of related objects and "Dynamic SOQL" to contain the data and formulas that were previously contained in the Excel-based solution.
The employer's problem, which has become my problem, is that the data from the Excel spreadsheets needs to be meticulously and tediously recreated in .CSV files so it can be imported into SalesForce.
While I've recently learned I can use CTRL-` to review formulas in Excel, this doesn't solve the problem that variables in Excel have cryptic names like $O$15. If I'm lucky, when I investigate $O$15, I'll find some metadata explaining it n cells up and/or some other data m cells to the left, and/or (in rare instances) there may be a comment on the cell.
Patterns within the Excel spreadsheets are very limited, rarely spanning more than 6 consecutive rows or columns, and no two sheets that need to be imported have much similarity.
Documentation of all systems is very limited.
Without my revealing any confidential data, does anyone have any good ideas how I might optimize my workflow?
It's not clear exactly what you need to do: here are 3 possible scenarios, requiring increasing knowledge of Excel.
1. If all you want is to convert the Excel spreadsheets into CSV format then just save the worksheets as CSVs.
2. If you just want the data and not the formulae, then it would be simple (using VBA) to output anything that isn't a formula (the cell.Formula won't start with =); see the sketch after this list.
3. If you need to create a linkage Excel --> CSV --> existing Salesforce objects/SOQL, then you will need to understand both the Excel spreadsheets and the Salesforce objects/SOQL that have been created. This will be difficult unless you have good knowledge and experience of Excel and also understand what the Salesforce app requires.
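A minimal sketch of option 2, assuming the active sheet is the one to export and that "values only" simply means skipping any cell whose Formula starts with "=". The output path is a placeholder, and values containing commas would need quoting before this was used for real.

    ' Minimal sketch: writes the used range to CSV, blanking out formula cells.
    Sub ExportValuesOnly()
        Dim ws As Worksheet, cell As Range
        Dim rowText As String, r As Long, c As Long
        Dim f As Integer

        Set ws = ActiveSheet
        f = FreeFile
        Open "C:\temp\values_only.csv" For Output As #f    ' placeholder path

        For r = 1 To ws.UsedRange.Rows.Count
            rowText = ""
            For c = 1 To ws.UsedRange.Columns.Count
                Set cell = ws.UsedRange.Cells(r, c)
                ' Keep the cell only if it is not a formula
                If Left$(cell.Formula, 1) <> "=" Then
                    rowText = rowText & cell.Value
                End If
                If c < ws.UsedRange.Columns.Count Then rowText = rowText & ","
            Next c
            Print #f, rowText
        Next r

        Close #f
    End Sub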
Brian, if you're still working on this, here's one way to approach the problem. I use this kind of process often for updating data between SFDC and marketing automation apps.
1) Analyze the formulae that you're re-creating in Salesforce.com to determine what base data fields you need (stuff that doesn't have to be calculated from something else).
2) Find those columns/rows in your spreadsheets and use Paste Special -> Values in a new spreadsheet to create an upload file with values instead of formulae that you need for each data area (leads, prospects, accounts, etc.)
3) If you have to associate the info with leads or contacts or accounts and you have already uploaded or created those records in Salesforce.com, be sure to export them with their ID numbers. That makes it easy to use the VLOOKUP formula in Excel to match up the fields you need to add and then re-upload the data into Salesforce (a sketch of this matching step follows below).
Like data cleaning, this can be a tedious process. But if you take it step by step it shouldn't be too hard. Good luck.
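To illustrate step 3, here is a small sketch of the ID matching. It assumes a made-up layout: an "Export" sheet with the matching key (account name, email, etc.) in column A and the Salesforce ID in column B, and an "Upload" sheet with the same key in column A. It does the same job as =VLOOKUP(A2,Export!A:B,2,FALSE), just with a dictionary so it also copes with large files.

    ' Sketch of the ID-matching step; sheet names and columns are placeholders.
    Sub MatchSalesforceIds()
        Dim lookup As Object
        Dim exportWs As Worksheet, uploadWs As Worksheet
        Dim r As Long, lastRow As Long, key As String

        Set lookup = CreateObject("Scripting.Dictionary")
        Set exportWs = Worksheets("Export")
        Set uploadWs = Worksheets("Upload")

        ' Build key -> Salesforce ID map from the exported records
        lastRow = exportWs.Cells(exportWs.Rows.Count, "A").End(xlUp).Row
        For r = 2 To lastRow
            lookup(CStr(exportWs.Cells(r, "A").Value)) = exportWs.Cells(r, "B").Value
        Next r

        ' Write the matching ID next to each upload row (into column B here)
        lastRow = uploadWs.Cells(uploadWs.Rows.Count, "A").End(xlUp).Row
        For r = 2 To lastRow
            key = CStr(uploadWs.Cells(r, "A").Value)
            If lookup.Exists(key) Then uploadWs.Cells(r, "B").Value = lookup(key)
        Next r
    End Sub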
I'm creating an Excel dashboard that imports a variable number of months' worth of financial/accounting information from a database into an Excel sheet. Using this information, a Calculations sheet computes some financial indicators, again month by month. Finally, the results are displayed in graphs on a separate sheet (one indicator per graph, with the monthly values plotted to show the trends). Currently I have written VBA code that formats the sheets to accommodate the number of months requested, pulls the data from the SQL server, and updates the graphs. Since there are 53 indicators for each operation (6 operations), this process takes about 3 minutes.
Does anyone recommend a better way to do this? The current way 'works' but I've often thought that there must be a more efficient way to do this.
Thanks!
Chris
You could look at skipping the Excel part and using SQL Server Reporting Services (SSRS). If you have ever used Business Objects or Crystal Reports, it's kind of the same thing, and I would imagine it would offer better performance than doing things in Excel.
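If the dashboard stays in Excel rather than moving to SSRS, one pattern that often cuts refresh time, assuming the current code writes results cell by cell, is to pull each result set with a single query and dump it in one shot with CopyFromRecordset, with screen updating and automatic calculation switched off around the loop over indicators. A minimal sketch (connection string, query and sheet names are placeholders):

    ' Minimal sketch; connection string, query and target sheet are placeholders.
    Sub RefreshIndicator()
        Dim conn As Object, rs As Object, ws As Worksheet

        Application.ScreenUpdating = False
        Application.Calculation = xlCalculationManual

        Set conn = CreateObject("ADODB.Connection")
        conn.Open "Provider=SQLOLEDB;Data Source=MYSERVER;" & _
                  "Initial Catalog=Finance;Integrated Security=SSPI;"

        Set rs = CreateObject("ADODB.Recordset")
        rs.Open "SELECT MonthEnd, Amount FROM dbo.Indicators " & _
                "WHERE Operation = 'Op1' AND Indicator = 'GrossMargin' " & _
                "ORDER BY MonthEnd;", conn

        Set ws = Worksheets("Calculations")
        ws.Range("A1:B1").Value = Array("MonthEnd", "Amount")
        ws.Range("A2").CopyFromRecordset rs          ' bulk write, no cell loop

        rs.Close
        conn.Close
        Application.Calculation = xlCalculationAutomatic
        Application.ScreenUpdating = True
    End Sub

Pulling all 53 indicators for an operation in one query and splitting them in memory would reduce the round trips further.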