Please let me know what APIs are available in Cognos.
What APIs are required to expose Cognos reports as web services for mobile apps?
Thanks in advance.
There are a handful of APIs for orchestrating activities or retrieving data in Cognos:
Cognos exposes a "URL" interface for some of the most common activities, including running a report, viewing saved report output, opening a report in Report Studio, etc.
You can use the Cognos SDK. This is the most powerful way to orchestrate things in Cognos. It has a large surface area and a fairly steep learning curve, but if you're willing to invest the time in it, the possibilities are endless.
You can use the Cognos Mashup Service - this is kind of like "SDK Lite". It's much easier to use but exposes a much smaller surface area. As the name implies, it's useful for mashup-type scenarios.
Here's an article that compares the Cognos SDK with the Cognos Mashup Service and provides some heuristics for choosing one over the other:
http://www.motio.com/blog/2010/11/01/cognos-sdk-vs-cognos-mashup.do
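To make the URL interface and the Mashup Service concrete, here is a rough sketch in Python. Everything environment-specific is an assumption: the gateway path, the report storeID, the JSON output format, and the availability of anonymous access (otherwise you would need to establish an authenticated Cognos session first).

```python
# Minimal sketch, not production code. Gateway URL and storeID are
# placeholders -- substitute values from your own Cognos environment.
import requests

GATEWAY = "http://cognos-server/ibmcognos/cgi-bin/cognos.cgi"  # hypothetical gateway
STORE_ID = "i0A5C51F4D9404A3D9D703DF8B056AA21"                 # hypothetical report storeID

# 1. URL interface: build a link that runs a report in Cognos Viewer.
run_url = (
    f"{GATEWAY}?b_action=cognosViewer&ui.action=run"
    f'&ui.object=storeID("{STORE_ID}")'
    "&run.outputFormat=HTML&run.prompt=false"
)
print(run_url)  # open this in a browser (an authenticated session is assumed)

# 2. Mashup Service (RDS): fetch report output over REST.
resp = requests.get(f"{GATEWAY}/rds/reportData/report/{STORE_ID}",
                    params={"fmt": "JSON"})
resp.raise_for_status()
print(resp.json())
```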
You are looking for Cognos Mobile.
http://www-01.ibm.com/software/analytics/cognos/mobile/features-and-benefits.html
Our customer currently builds out a number of use cases for their client and facilitates the onboarding of logs from 300+ applications. The client is limited in the number of use cases they can support, so they have been looking into the option of creating a custom schema with parsers, etc.
The focus is insider threat, so they are primarily collecting audit/activity logs for these applications.
The challenge they see is that application audit/activity logs vary greatly, which will make it difficult to bring the data together from multiple applications. The client has a non-standard architecture: they ingest their logs through ADX instead of Sentinel and then forward a subset of data for alerting. They also don't make wide use of native tables yet.
Please refer to the architecture diagram in the attachment.
Question:
Is there a way of normalizing application audit and activity logs so they can build out insider threat use cases across multiple applications?
Any guidance for this requirement would be of great help. Thanks in advance.
I have an idea whereby I intend to build a cloud-native application for algorithmic trading, ideally by consuming only PaaS and SaaS (no IaaS), and I'd like to get some feedback on how I intend to build it. The concept is pretty straightforward: I intend to consume financial trading data from an external SaaS solution via an API query, feed that data into various Azure PaaS solutions (most notably ML for modeling), and then take some action. Here is a high-level diagram I've come up with so far:
Solution Overview
As a note, while I'm familiar with Azure, I'm not an Azure cloud engineer and have limited experience in actually building solutions myself. Consequently, I intend to use this project as a foundation to further educate myself.
When starting on the build, I immediately questioned whether or not I should use Event Hubs. Conceptually it makes sense, in that I'm decoupling the production of a data stream from its consumption. Presumably, this means fewer complications when/if I need to update the data feed(s) in the future. I also thought about where the data should be stored: should it be a SQL database or, more simply, an Azure Table? The idea here is that the trading data will need to be stored for regression testing as I iterate through my models. All that said, I'm looking for insights from anybody who may have experience in this space.
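To make the decoupling idea concrete, here is a rough sketch of the producer side using the azure-eventhub Python SDK; the connection string, hub name, and the fetch_ticks() helper are placeholders for illustration, not part of any actual build:

```python
# Sketch: push ticks fetched from an external SaaS API into an Event Hub,
# decoupling data production from whatever consumes the stream later.
import json
from azure.eventhub import EventHubProducerClient, EventData

CONN_STR = "<event-hub-namespace-connection-string>"  # placeholder
producer = EventHubProducerClient.from_connection_string(
    CONN_STR, eventhub_name="market-data")            # hypothetical hub name

def fetch_ticks():
    """Placeholder for the external SaaS API query."""
    return [{"symbol": "MSFT", "price": 430.25}]

with producer:
    batch = producer.create_batch()
    for tick in fetch_ticks():
        batch.add(EventData(json.dumps(tick)))  # one event per tick
    producer.send_batch(batch)
```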
Thanks!
There's no real question in here. Take a look at the reference architectures provided by Microsoft: https://learn.microsoft.com/en-us/azure/architecture/reference-architectures/
From the link https://reviews.financesonline.com/p/alteryx/, I see the following details:
Alteryx is an advanced data analytics platform intended to serve the needs of business analysts looking for a self-service solution. It contains 3 basic components: Gallery, Designer, and Server, which blend data from external sources and generate comprehensive reports. Each of them, however, can be used separately.
The software structures and evaluates data from multiple external sources, and organizes it into comprehensive insights that can be used for business deciding and shared with multiple internal/external users. Basically, Alteryx is deploying data in a decentralized way, and eliminating in such way the risk of underestimating it. At the same time, Alteryx is well-integrated, easy to use, and ran both on premise and in cloud.
Can anyone help me understand what the text above in bold is trying to explain? I'd like to understand it in detail, with some explanation.
The basic idea is that the tool can blend just about any kind of data and dump the result to your own local extract. The local extract is "decentralized" in that it's local, obviously, and also that you didn't need to rely on some core ETL team to build a process for you (which they would probably dump in a central location). The use of the term "underestimating" probably indicates that, if you're not building your own insights in (say you find something online that you can blend into your analysis), you're "underestimating" the importance of that data.
It's worth noting that your custom extract could be turned into a nightly job and the output could itself be dumped to a centralized database server if desired. So the tool can be used to build centralized assets too. It really just depends on how you're using it. (With Alteryx this would require either their Desktop Automation, or their Server.)
So... it seems that any self-service data blending tool would be capable of the same. What's special about Alteryx? The distinguishing factors lie elsewhere: number of data types supported, overall functionality and power, performance, built-in examples, ease of use, service, support, online community, and perhaps other areas.
Our team has recently started using Application Insights to add telemetry data to our Windows desktop application. This data is sent almost exclusively in the form of events (rather than page views, etc.). Application Insights is useful only up to a point; to answer anything other than basic questions, we are exporting to Azure storage and then using Power BI.
My question is one of data structure. We are new to analytics in general and have just been reading about star/snowflake structures for data warehousing. This looks like it might help in providing the answers we need.
My questions are quite simple: Is this the right approach? Have we overcomplicated things? My current feeling is that a better approach would be to pull the latest data and transform it into a SQL database of facts and dimensions for Power BI to query. Does this make sense? Is this what other people are doing? We have realised that this is more work than we initially thought.
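To illustrate the kind of transformation I have in mind, here is a toy sketch in Python/pandas; the column names are invented, and real exported Application Insights events would have a different shape:

```python
# Toy sketch of splitting flattened event rows into a fact table and a
# dimension table. Column names below are hypothetical.
import pandas as pd

events = pd.DataFrame([
    {"timestamp": "2016-05-01T10:00:00Z", "event_name": "FileOpened", "user": "alice"},
    {"timestamp": "2016-05-01T10:05:00Z", "event_name": "FileSaved",  "user": "bob"},
])

# Dimension: one row per distinct event name, with a surrogate key.
dim_event = (events[["event_name"]].drop_duplicates()
             .reset_index(drop=True).rename_axis("event_key").reset_index())

# Fact: one row per occurrence, referencing the dimension by key.
fact = events.merge(dim_event, on="event_name")[["timestamp", "user", "event_key"]]

# fact and dim_event could then be bulk-loaded into SQL tables for Power BI.
print(fact)
```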
Definitely pursue Michael Milirud's answer; if your source product has suitable analytics, you might not need a data warehouse.
Traditionally, a data warehouse has three advantages: it integrates information from different data sources, both internal and external; data is cleansed and standardised across sources; and the history of change over time ensures that data is available in its historic context.
What you are describing is becoming a very common case in data warehousing, where star schemas are created for access by tools like Power BI, Qlik or Tableau. In smaller scenarios the entire warehouse might be held in the Power BI data engine, but larger data might need pass-through queries.
In your scenario, you might be interested in some tools that appear to handle at least some of the migration of Application Insights data:
https://sesitai.codeplex.com/
https://github.com/Azure/azure-content/blob/master/articles/application-insights/app-insights-code-sample-export-telemetry-sql-database.md
Our product Ajilius automates the development of star schema data warehouses, cutting development time to days or weeks. There are a number of other products doing a similar job, and we maintain a complete list of industry competitors to help you choose.
I would continue with Power BI - it actually has a very sophisticated and powerful data integration and modeling engine built in. Historically I've worked with SQL Server Integration Services and Analysis Services for these tasks; Power BI Desktop is superior in many respects. The design approaches remain consistent (star schemas, etc.), but you build them in-memory within Power BI. It's far more flexible and agile.
Also, are you aware that Application Insights can be connected directly to Power BI Web? This connects to your AI data in minutes and gives you ready-to-use Power BI content (dashboards, reports, datasets). You can customize these and build new reports from the datasets.
https://powerbi.microsoft.com/en-us/documentation/powerbi-content-pack-application-insights/
What we ended up doing was not sending events from our WinForms app directly to AI, but to an Azure Event Hub.
We then created a job that reads from the Event Hub and sends the data to:
- AI, using the SDK
- Blob storage, for later processing
- Azure Table storage, to create Power BI reports
You can of course add more destinations.
So basically all events are sent to one destination and from there stored in many destinations, each for its own purpose. We definitely did not want to be restricted to 7 days of raw data, and storage is cheap; blob storage can also be used by many Azure and Microsoft analytics solutions.
The Event Hub can be linked to Stream Analytics as well.
More information about Event Hubs can be found at https://azure.microsoft.com/en-us/documentation/articles/event-hubs-csharp-ephcs-getstarted/
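For a flavour of what such a fan-out job can look like, here is a rough sketch using the azure-eventhub Python SDK; the connection string, hub name, and the two store_* helpers are placeholders and not the exact job we built:

```python
# Sketch: read each event from the Event Hub once, then fan it out to
# multiple destinations (blob storage, table storage, AI, ...).
from azure.eventhub import EventHubConsumerClient

CONN_STR = "<event-hub-namespace-connection-string>"  # placeholder

def store_in_blob(body: str):
    ...  # e.g. append to a blob for later batch analytics

def store_in_table(body: str):
    ...  # e.g. insert an entity backing the Power BI reports

def on_event(partition_context, event):
    body = event.body_as_str()
    store_in_blob(body)    # destination 1: blob storage
    store_in_table(body)   # destination 2: table storage

client = EventHubConsumerClient.from_connection_string(
    CONN_STR, consumer_group="$Default", eventhub_name="app-events")
with client:
    # Blocks and processes events from the start of the stream.
    client.receive(on_event=on_event, starting_position="-1")
```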
You can start using the recently released Application Insights Analytics feature. In Application Insights we now let you write any query you would like so that you can get more insights out of your data. Analytics runs your queries in seconds, lets you filter/join/group by any possible property, and you can also run these queries from Power BI.
More information can be found at https://azure.microsoft.com/en-us/documentation/articles/app-insights-analytics/
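As a sketch of programmatic access, Analytics queries can also be issued over the public REST endpoint, assuming you have generated an application ID and API key in the portal; the app ID, API key, and query below are placeholders:

```python
# Sketch: run an Analytics query over REST and print the result rows.
import requests

APP_ID = "<application-id>"   # placeholder, from the AI "API Access" blade
API_KEY = "<api-key>"         # placeholder, generated alongside the app ID

resp = requests.post(
    f"https://api.applicationinsights.io/v1/apps/{APP_ID}/query",
    headers={"x-api-key": API_KEY},
    json={"query": "customEvents | summarize count() by name | top 10 by count_"},
)
resp.raise_for_status()
print(resp.json()["tables"][0]["rows"])  # top event names by volume
```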
What are the differences between "Cognos TM1" and "Cognos 10 BI"?
Which one is considered a BI tool by IBM?
There are huge differences between "Cognos TM1" & "Cognos 10 BI"!
Cognos TM1 is a multidimensional (OLAP) database which can be queried from Excel and the web through "TM1 Web", "TM1 Executive Viewer" and "Cognos 10 BI". Within this database, you'll be able to create almost any kind of OLAP/decisional application.
Cognos 10 BI is a web-based reporting application. Users, depending on their rights (licence), will be able to run reports, schedule reports and/or create ad hoc analyses against relational and/or multidimensional databases. In order to do so, a logical layer has to be built using "Framework Manager". This logical layer hides database complexity and provides relevant information to end users.
In simplest terms:
TM1 is a database engine and a collection of applications for accessing and managing its databases.
Cognos BI is a collection of web applications that provide pretty interfaces for viewing and doing stuff with data.
More often than not, Cognos BI uses TM1 databases as its data source, but it does not have to. If Cognos BI is using TM1, most of the same user functions that are possible in TM1 applications are also available through Cognos BI, except they are available in a more user-friendly manner. Cognos BI also adds functionality not in TM1 applications to allow additional data management. IBM's marketing is confusing, but generally Cognos BI and Cognos TM1 collectively are considered to be the BI Tools package that they offer.
Now to be a little more technical about TM1, it is not just a plain old database. As others here have mentioned, it is a multidimensional OLAP database. It is able to handle numeric and string data, but it does not have a concept of NULL values. Numeric data can be summarized using consolidated elements. It has attributes to store metadata in. It has a built-in rules engine to handle business logic and custom calculations. It has processes for ETL and database maintenance tasks. It has chores for scheduling processes at various intervals. It links to Excel workbooks. Lastly, in addition to these features and more that are provided out of the box, TM1 exposes an API for programming against using 3GLs such as C++, C#, Java, and even VBA.
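As a hedged illustration of that programmability: newer TM1 versions (10.2.2 and later) also expose an OData REST API alongside the classic 3GL APIs, so a short Python sketch can stand in for the C++/C#/Java/VBA variants. The host, port, and credentials below are placeholders:

```python
# Sketch: list the cubes in a TM1 model via the TM1 REST API.
# Host, port, and credentials are placeholders for your environment.
import requests

BASE = "https://tm1-server:8882/api/v1"   # hypothetical host/port
AUTH = ("admin", "apple")                 # TM1 sample credentials; replace

resp = requests.get(f"{BASE}/Cubes?$select=Name", auth=AUTH, verify=False)
resp.raise_for_status()
for cube in resp.json()["value"]:         # OData collections arrive in "value"
    print(cube["Name"])
```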
The key difference is that Cognos TM1 is meant to work easily with Excel, whereas Cognos 10 BI is mainly browser-based.
Have a look here for gory details.
http://www.tm1forum.com/viewtopic.php?f=5&t=1442
Cognos TM1 is also referred to by IBM as Financial Performance Management (FPM). It's an in-memory MOLAP cube application with real-time rules calculation. As it provides end-user writeback, it's often used by the Office of Finance for budgeting and modelling - hence the MS Excel interface is frequently used. However, it also comes with a zero-footprint web front end as well as Cognos Insight, which permits distributed or disconnected access to the TM1 cubes (located server side).
TM1 may be integrated into Cognos BI, so that Cognos BI reports from TM1 cubes; or TM1 may be accessed from a portlet within a Cognos BI dashboard.
The difference between Cognos 10 and Cognos TM1: BI is a reporting tool, while TM1 is an analysis and reporting tool. In TM1 you build the cubes, and BI generates the reports.