Is it possible to import Excel file into OBIEE or OAS and use it with other subject areas? - excel

It is known that you can upload an Excel file into Visual Analyzer as a dataset and use that Excel file in Analyses as a separate Subject Area.
However, there was no way (or at least we couldn't find one) to create any connection between this Excel dataset and other subject areas, for example joining the Excel file's date column to OBIEE's Calendar.Day column, etc.
With the new OAS, is there any update on this? Can we somehow create relationships between user-defined datasets and subject areas from the RPD? Or is this feature not implemented?

Once you're on OAS you can create data sets that mash up any data sources you want. An Excel file uploaded as a data set can be combined with other uploaded data sets, with data sets created by data flows, and with Subject Areas. You have full freedom.

I believe the only way to create a relationship between data sources is to import them into the Analytics repository.
If you can import the Excel file as a data source into the repository, you may be able to relate it to other data sources. Here are some links:
https://datacadamia.com/dat/obiee/obis/obiee_excel_importation
https://www.ascentt.com/importing-excel-file-into-obiee-11g/
I hope these help.
Hakan

Related

Filter Excel data when importing into Visio OrgChart

In Visio I am creating an Org Chart, using the 'Import Organization Data', and using 'Information that's already stored in a file or database'. When I select my xlsx file, it pulls in all of the data. However, what if I wanted to only create an org chart out of a subset of the data? Currently I'm applying a filter to the data in Excel, copying the result to a new Excel file, and using that new file to import into Visio. A slightly less bad version of this would be if I could at least copy the filtered data into a different sheet in the same file, but the Visio Import doesn't even seem to let me select which sheet to use. This is very annoying - is there a better way?
Though I could never get Visio to ask me which table/sheet I wanted to use within a single file, I found what I consider an acceptable workaround, using inspiration from @y4cine's suggestion.
I created separate "slice" xlsx files, where in each of those I used a Power Query against the data in the main xlsx file. Then I can point Visio to one of those slice files and it will happily make an org chart with the slice of data I was interested in.
A bit clunky, but it sure beats repeated copy/pasting :)
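If scripting is an option, the same slicing can be automated outside Excel. Here is a minimal pandas sketch of the "one slice file per group" idea; the file name, sheet name, and column names are invented for illustration:

```python
import pandas as pd

# Sample org data standing in for the master workbook; in practice you
# would load it with pd.read_excel("org_master.xlsx").
df = pd.DataFrame({
    "Name": ["Ann", "Bob", "Cid", "Dee"],
    "Department": ["Sales", "Sales", "IT", "IT"],
    "ReportsTo": ["", "Ann", "", "Cid"],
})

# Write one "slice" workbook per department; point Visio's import
# wizard at whichever slice you need instead of copy/pasting rows.
slices = {dept: group for dept, group in df.groupby("Department")}
for dept, group in slices.items():
    group.to_excel(f"org_slice_{dept}.xlsx", index=False)
```

Rerunning the script regenerates the slice files whenever the master data changes, which removes the manual filter-and-copy step.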

Consumer PowerPivot/Excel DataModel from another Excel file?

Short version: Is there any way/hack to use the embedded Data Model/PowerPivot cube of an Excel 2013/2016 file from another Excel file?
Long version:
We have a large Excel Data Model with >400k rows and >100 measures, feeding multiple reports (i.e. PivotTables on separate worksheets). As all this is growing, we want to split it into a (large) data model and multiple report files. I know this could be done with SharePoint or Power BI - however, one of the key requirements is to be able to analyse the data offline. Hence, I'm trying to figure out any way to connect to the data model from another file....
There's no way that I know to do what you're asking. Is there any reason you can't just include all the reports in one workbook with the data model? Since you have to be able to analyze offline, anyway, everyone will need a local copy of the model. If the concern is just that there will be too many sheets in a single workbook, you could just put a thin veneer of VBA in it to hide and unhide sheets in groups for ease of use.
It looks like Microsoft has added an option to establish the connection via an ODC file.
See this f.e. https://learn.microsoft.com/en-us/sql/reporting-services/report-data/use-an-office-data-connection-odc-with-reports?view=sql-server-ver15
However, it's not working out for me. I am using Excel 2016; I exported the data model from the source workbook as a separate ODC file, but when I try to add it as a connection in another file I get the message that the file can't be opened. It looks like creating an ODC file is not that straightforward.
Anyone had similar issues?

Import Historical Records from Excel to Custom App

I need a bit of help. My company has data in multiple Excel sheets. Some sheets are straightforward (in that their columns map easily to data types), but most of them have merged rows and cells within one header. I am developing an application in C# for maintaining a massive database with proper user and role management and multiple departments as stakeholders.
I have identified the relations within the Excel sheets and all is well. What I cannot work out is how to import that historical data and map it to the database tables. Basically, when a new custom system is designed, how would you import such messy data into it?
The only thing I could think of was writing a utility program that reads every row and every cell of the Excel sheets and then extracts the required values to insert into the proper database table. But this would take ages due to the sheer number of Excel sheets.
Has anyone been through the same thing as I have? How did (or would) you handle this?
Many thanks guys :)
If the data is not regular, you've got a world of pain ahead of you. The object model in Excel can be driven by any of the Windows Scripting Host languages, like VBScript and JScript. In fact, most scripting languages have some support for Excel traversal.
However, as you've already noted, the data layout isn't the same across all spreadsheets. I had a similar problem years ago traversing SCADA data from power stations: each station did it slightly differently and changed their format from time to time.
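As the answer notes, most scripting languages can traverse a workbook. A minimal sketch of the "read every row and every cell" utility in Python with openpyxl, including flattening merged ranges so each row becomes a regular record (the function and sheet names here are illustrative, not from the question):

```python
from openpyxl import load_workbook

def read_sheet(path, sheet_name):
    """Traverse every cell, copying each merged range's anchor value
    into all cells of the range so rows become regular records."""
    wb = load_workbook(path, data_only=True)
    ws = wb[sheet_name]
    # Snapshot all cell values first (merged cells read as None
    # everywhere except the top-left anchor).
    grid = [[cell.value for cell in row] for row in ws.iter_rows()]
    # Propagate each merged range's anchor value across the range.
    for rng in ws.merged_cells.ranges:
        anchor = grid[rng.min_row - 1][rng.min_col - 1]
        for r in range(rng.min_row - 1, rng.max_row):
            for c in range(rng.min_col - 1, rng.max_col):
                grid[r][c] = anchor
    return grid
```

The irregular layouts still need per-format mapping logic on top, but this flattening step at least makes every sheet look like a plain grid before mapping.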

Importing data from Excel into a database

I want to import data from Excel into the corresponding tables, matching rows on ID columns - for example, matching customer data on the CustomerID present in the Customer table.
That is, we have to combine data from the existing tables and the Excel source on the basis of these IDs.
Could you please help me out with this?
Use the SQL Server Data Import Wizard - see an article on it here.
This wizard lets you choose the Excel file to import, define the target for the data, set up mappings between columns in Excel and columns in your SQL table, and much more.
Update: based on your comment to the other answer, if you need to import the Excel sheet and match it up to some pre-existing lookup data, then you should definitely look at the SQL Server Integration Services (SSIS) which are there exactly for this kind of import/lookup scenario.
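Outside SSIS, the import-plus-lookup pattern itself is only a join. A hedged pandas sketch of the idea (the table and column names are invented for illustration; in practice the frames would come from `pd.read_excel` and a query against the Customer table):

```python
import pandas as pd

# Incoming spreadsheet rows (stand-in for pd.read_excel("orders.xlsx")).
orders = pd.DataFrame({"CustomerName": ["Acme", "Globex"],
                       "Amount": [100, 250]})

# Pre-existing lookup data (stand-in for the Customer table).
customers = pd.DataFrame({"CustomerID": [1, 2],
                          "CustomerName": ["Acme", "Globex"]})

# The lookup step: resolve each imported row to its CustomerID.
# A left join keeps unmatched rows (CustomerID = NaN) so they can be
# reported instead of silently dropped.
resolved = orders.merge(customers, on="CustomerName", how="left")
```

This is essentially what an SSIS Lookup transformation does for you, with error rows routed to a separate output.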
Your question's grammar is a bit all over the place, so I'm not entirely sure what you are asking, but here goes.
You can save your Excel spreadsheet as a CSV file and then import that into your database. There are a number of tutorials on this if you search Google. Try searching for "import CSV into database".
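A minimal sketch of that CSV-to-database route in Python (SQLite is used here purely as a stand-in target, and the table and column names are illustrative; any database with a DB-API/SQLAlchemy driver works the same way):

```python
import sqlite3
import pandas as pd

# Stand-in for the exported CSV; in practice: pd.read_csv("customers.csv").
df = pd.DataFrame({"CustomerID": [1, 2],
                   "Name": ["Acme", "Globex"]})

# Load the rows into a database table.
conn = sqlite3.connect("import_demo.db")
df.to_sql("Customer", conn, if_exists="replace", index=False)

# Verify the import.
rows = conn.execute("SELECT COUNT(*) FROM Customer").fetchone()[0]
print(rows)  # → 2
```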

Tool for transforming Excel files? (swapping columns, basic string manipulation etc)

I need to import tabular data into my database. The data is supplied via spreadsheets (mostly Excel files) from multiple parties. The format of each of these files is similar but not the same and various transformations will be necessary to massage the data into the final format suitable for import. Furthermore the input formats are likely to change in the future. I am looking for a tool that can be run and administered by regular users to transform the input files.
Now let me list some of the transformations I am looking to do:
swap columns:
Input is:
|Name|Category|Price|
|data|data |data |
Output is
|Name|Price|Category|
|data|data |data |
rename columns
Input is:
|PRODUCTNAME|CAT |PRICE|
|data |data|data |
Output is
|Name|Category|Price|
|data|data |data |
map columns according to a lookup table, like in the above examples:
replace every occurrence of the string "Car" by "automobile" in the column Category
basic maths:
multiply the price column by some factor
basic string manipulations
Let's say the format of the Price column is "3 x $45"; I would want to split that into two columns, amount and price
filtering of rows by value: exclude all rows containing the word "expensive"
etc.
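For a sense of scale, each of the transformations listed above maps to a line or two in a scripting library such as pandas. This is a sketch using the example column names from the question, not one of the GUI tools being asked for:

```python
import pandas as pd

# Example input using the column names from the question.
df = pd.DataFrame({"PRODUCTNAME": ["Widget", "expensive Widget"],
                   "CAT": ["Car", "Car"],
                   "PRICE": ["3 x $45", "2 x $90"]})

# Rename columns to the target schema.
df = df.rename(columns={"PRODUCTNAME": "Name", "CAT": "Category",
                        "PRICE": "Price"})

# Swap/reorder columns.
df = df[["Name", "Price", "Category"]].copy()

# Map values according to a lookup table.
df["Category"] = df["Category"].replace({"Car": "automobile"})

# Basic string manipulation: split "3 x $45" into amount and unit price.
parts = df["Price"].str.extract(r"(\d+)\s*x\s*\$(\d+)")
df["Amount"] = parts[0].astype(int)

# Basic maths: multiply the price column by some factor.
df["UnitPrice"] = parts[1].astype(float) * 1.1

# Filter rows by value: exclude rows containing the word "expensive".
df = df[~df["Name"].str.contains("expensive")]
```

It fails the "editable via a GUI by regular users" requirement, of course, which is exactly why a dedicated tool is worth having.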
I have the following requirements:
it can run on any of these platform: Windows, Mac, Linux
Open Source, Freeware, Shareware or commercial
the transformations need to be editable via a GUI
if the tool requires end user training to use that is not an issue
it can handle on the order of 1000-50000 rows
Basically I am looking for a graphical tool that will help the users normalize the data so it can be imported, without me having to write a bunch of adapters.
What tools do you use to solve this?
The simplest solution IMHO would be to use Excel itself - you get all the Excel built-in functions and macros for free. Put your transformation code in a macro that gets called via Excel controls (for the GUI aspect) on a spreadsheet. Find a way to insert that spreadsheet and macro into your clients' Excel files. That way you don't need to worry about platform compatibility (it's their file, so they must be able to open it) and all the rest. The other requirements are met as well. The only training needed would be to show them how to enable macros.
The Mule Data Integrator will do all of this from a CSV file. So you can export your spreadsheet to a CSV file and load that CSV file into the MDI. It can even load the data directly into the database, and the user can specify all of the transformations you requested. The MDI will work fine in non-Mule environments. You can find it at mulesoft.com (disclaimer: my company developed the transformation technology that this product is based on).
You didn't say which database you're importing into, or what tool you use. If you were using SQL Server, then I'd recommend using SQL Server Integration Services (SSIS) to manipulate the spreadsheets during the import process.
I tend to use MS Access as a pipeline between multiple data sources and destinations - but you're looking for something a little more automated. You can use macros and VB script with Access to help through a lot of the basics.
However, you're always going to have data consistency problems with users misinterpreting how to normalize their information. Good luck!
