Can we develop drill-down reports in IBM Cognos other than by the DMR (Dimensionally Modeled Relational) technique? My reports run slowly when I drill down to a lower level using a DMR package, while the same reports run fine when I run them directly against the lower level using a non-DMR package.
Without using a cube source such as Transformer or TM1, no.
If that option is unavailable to you: are you running your reports against a large fact table? If so, consider building these reports against summary/aggregate SQL tables which contain only the minimum data needed.
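For illustration, the summary-table idea might look like the following minimal sketch (all table and column names here are invented for the example, not taken from your model), pre-aggregating the fact table to the grain the drill-down report actually needs:

```sql
-- Hypothetical names: fact_sales is the large transaction-level fact table;
-- the summary table keeps one row per product/region/month instead.
CREATE TABLE fact_sales_monthly AS
SELECT product_key,
       region_key,
       EXTRACT(YEAR FROM order_date)  AS order_year,
       EXTRACT(MONTH FROM order_date) AS order_month,
       SUM(sales_amount)              AS total_sales,
       COUNT(*)                       AS transaction_count
FROM   fact_sales
GROUP BY product_key,
         region_key,
         EXTRACT(YEAR FROM order_date),
         EXTRACT(MONTH FROM order_date);
```

Point the higher drill-down levels of the report at the summary table and reserve the detail fact table for the lowest level only.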
We are using MS CRM 2015 and would like to know our options/best practices for archiving audit logs. Any suggestions, please? Thanks!
You can use the MSCRM Toolkit at http://mscrmtoolkit.codeplex.com/, which has a tool called Audit Export Manager to aid in archiving audit logs. The documentation for the tool is available at http://mscrmtoolkit.codeplex.com/documentation#auditexportmanager. The key things this tool lets you do are filter by entities or metadata, choose summary or detail, and pick individual users, actions, and/or operations to include in your export. Exports can be limited to a particular date range and can be exported to CSV, XML, or XML Spreadsheet 2003 format. Note that I've had trouble exporting with a couple of the formats, but I typically get good results when exporting to CSV.
This is one of the few tools I've found that gives you some flexibility when exporting audit records, since Microsoft CRM allows you to filter the audit data but doesn't provide a good built-in means to export it.
You can try the new Stretch Database feature in SQL Server 2016:
Stretch Database migrates your cold data transparently and securely to the Microsoft Azure cloud.
Stretch warm and cold transactional data dynamically from SQL Server to Microsoft Azure with SQL Server Stretch Database. Unlike typical cold data storage, your data is always online and available to query. You can provide longer data retention timelines without breaking the bank for large tables like Customer Order History.
There is a helpful hands-on review, SQL Server 2016 Stretch Database, with a very interesting SWITCH TABLE example.
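As a rough sketch of what enabling Stretch looks like in T-SQL (the table name is a placeholder, the database must already be configured for Stretch with an Azure server and credential, and you should verify the exact syntax against the SQL Server 2016 documentation):

```sql
-- Allow Stretch Database at the instance level.
EXEC sp_configure 'remote data archive', 1;
RECONFIGURE;

-- Enable Stretch on a specific table (dbo.AuditHistory is a placeholder);
-- MIGRATION_STATE = OUTBOUND starts migrating eligible rows to Azure.
ALTER TABLE dbo.AuditHistory
    SET (REMOTE_DATA_ARCHIVE = ON (MIGRATION_STATE = OUTBOUND));
```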
There could also be a solution that moves archived audit data to a separate filegroup. Take a look at Transferring Data Efficiently by Using Partition Switching:
You can use the Transact-SQL ALTER TABLE...SWITCH statement to quickly and efficiently transfer subsets of your data in the following ways:
Assigning a table as a partition to an already existing partitioned table.
Switching a partition from one partitioned table to another.
Reassigning a partition to form a single table.
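A minimal sketch of the archive scenario (placeholder object names; the target table must have an identical schema and matching indexes, and must live on the same filegroup as the partition being switched):

```sql
-- dbo.AuditData is partitioned by date; dbo.AuditArchive is an empty,
-- structurally identical standalone table. The switch is a metadata-only
-- operation, so even a very large partition moves almost instantly.
ALTER TABLE dbo.AuditData
    SWITCH PARTITION 1 TO dbo.AuditArchive;
```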
We use commodity trading software linked to an Oracle database to export reports to Excel. I'm connecting to this Oracle database using PowerPivot as well as SQL developer. In doing so, I'm able to connect to Oracle directly creating live, refreshable reports which no longer need to be constantly exported.
I located an Oracle view responsible for generating one of the most important reports we export to Excel. What's strange is that all of the columns are completely empty. When I open it using PowerPivot or SQL Developer, I see just the headers with no data. However, it populates with data just fine when exported from our trading software.
Does anyone know why this might be and how I can get this view to populate the data (using PowerPivot for example)?
Is this a materialized view I'm dealing with?
My first guess would be that it has to do with permissions or row-level security on the view. Whether it is a materialized view is impossible to determine from the information you've provided, but that should make no difference in accessing the data from Power Pivot.
This is a question for your DBAs and is unlikely to be a problem in Power Pivot.
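If you want to narrow it down before going to the DBAs, a few quick checks from SQL Developer might help (`MY_REPORT_VIEW` and the owner schema below are placeholders for your actual object names):

```sql
-- Is the view visible to your user at all, and who owns it?
SELECT owner, view_name
FROM   all_views
WHERE  view_name = 'MY_REPORT_VIEW';

-- What object grants exist on it?
SELECT grantee, privilege
FROM   all_tab_privs
WHERE  table_name = 'MY_REPORT_VIEW';

-- Does it return any rows for this session?
SELECT COUNT(*) FROM owner_schema.my_report_view;
```

If the count is zero while the trading software sees data, the view (or a policy behind it, such as a VPD/row-level security policy keyed to the session user) is most likely filtering rows based on who is connecting.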
I am trying to do a performance test on an Excel-based application with LoadRunner. I started by running the Protocol Advisor, which is throwing an error.
My main target is to record the Excel-based application. For the simulation I created a database, which is called from Excel.
Any suggestions on which protocol to use, or on other tools for conducting a performance test on an Excel-based application?
Here is where the foundation classes of knowledge on development and architecture come into play for a performance test professional.
Tell us about the next upstream component from Excel. Is it connecting directly to the database, or is it going through a web service or other application-server layer? Do you have SQL queries you are trying to reproduce, or some other mechanism for accessing your source?
What have you tried? (Other than protocol confuser?)
What are the differences between "Cognos TM1" and "Cognos 10 BI"?
Which one is considered a BI tool by IBM?
There are huge differences between "Cognos TM1" & "Cognos 10 BI"!
Cognos TM1 is an OLAP (multidimensional) database which can be queried from Excel and the web through "TM1 Web", "TM1 Executive Viewer" and "Cognos 10 BI". Within this database, you'll be able to create almost any kind of OLAP/decisional application.
Cognos 10 BI is a web-based reporting application. Users, depending on their rights (licence), will be able to run reports, schedule reports and/or create ad hoc analyses against relational and/or multidimensional databases. In order to do so, a logical layer has to be built using "Framework Manager". This logical layer is used to hide database complexity and to provide relevant information to the end users.
In simplest terms:
TM1 is a database engine and a collection of applications for accessing and managing its databases.
Cognos BI is a collection of web applications that provide pretty interfaces for viewing and doing stuff with data.
More often than not, Cognos BI uses TM1 databases as its data source, but it does not have to. If Cognos BI is using TM1, most of the same user functions that are possible in TM1 applications are also available through Cognos BI, except they are available in a more user-friendly manner. Cognos BI also adds functionality not in TM1 applications to allow additional data management. IBM's marketing is confusing, but generally Cognos BI and Cognos TM1 collectively are considered to be the BI Tools package that they offer.
Now to be a little more technical about TM1, it is not just a plain old database. As others here have mentioned, it is a multidimensional OLAP database. It is able to handle numeric and string data, but it does not have a concept of NULL values. Numeric data can be summarized using consolidated elements. It has attributes to store metadata in. It has a built-in rules engine to handle business logic and custom calculations. It has processes for ETL and database maintenance tasks. It has chores for scheduling processes at various intervals. It links to Excel workbooks. Lastly, in addition to these features and more that are provided out of the box, TM1 exposes an API for programming against using 3GLs such as C++, C#, Java, and even VBA.
The key difference is that Cognos TM1 is meant to work with Excel easily, whereas Cognos 10 BI is mainly browser based.
Have a look here for gory details.
http://www.tm1forum.com/viewtopic.php?f=5&t=1442
Cognos TM1 is also referred to by IBM as Financial Performance Management (FPM). It's an in-memory MOLAP cube application with real-time rules calculation. As it provides end-user writeback, it's often used by the Office of Finance for budgeting and modelling, so the MS Excel interface is frequently used. However, it also comes with a zero-footprint web front end as well as Cognos Insight, which permits distributed or disconnected access to the TM1 cubes (located server side).
TM1 may be integrated into Cognos BI, so that Cognos BI reports from TM1 cubes; or TM1 may be accessed from a portlet within a Cognos BI dashboard.
The difference between Cognos 10 and Cognos TM1: BI is a reporting tool, while TM1 is an analysis and reporting tool. In TM1 you build the cubes, and BI generates the reports.
I need to combine data from the Project Server reporting database with data from custom lists in SharePoint workspaces. The results need to be displayed within a single report. How should this be done? Options I've thought of:
Extend the reporting database with the custom list data (if this is possible). Use Reporting Services to display the output.
Query the reporting database and the SharePoint workspaces and combine results in memory. Write custom code to display the output.
Any other ideas? I have the skills to develop this but am very open to purchasing a product if it solves the problem.
I've had this sort of problem as well. My approach:
Create a custom reporting DB.
Run regular jobs from SQL Server to query SharePoint (via web services) and store the results in the DB.
I use GetListItemChangesSinceToken in Lists.asmx to improve efficiency. I also utilise the SiteDataQuery toolset; I wrote a really simple interface to it so that a SiteDataQuery can be called remotely, returning a DataTable.
Use Reporting Services / any tool to extract and report on the data.
The reasons I opted for a staging DB were:
Performance: the web-service calls are pretty slow.
Service continuity: if SharePoint is down or slow for any reason, the queries will fail.
Hope this helps.
I also found the tool SharePoint Data Miner which appears to do the same as DJ's answer.