Cognos 11.2.2 - Notification when a dataset is refreshed via a job

When I create a job and add a dataset refresh as a job step, I'd like to get a notification about a successful refresh. How could I do this, please? The only idea that comes to mind so far is to create a dummy report and add it as the next job step, after the dataset refresh, then set the delivery of this dummy report to "Send report by email", set the whole job to run in sequence, and disable "Continue on error".
This way, when I receive my dummy report in an email, I know my dataset has been refreshed.
Is there a better way to handle it?
Thank you very much

How do you know if it is successful? Because Cognos says it completed? Because the schedule ran?
Your suggestion is a good one... mostly.
When you receive the report via email, that may not mean the dataset refreshed successfully, so it shouldn't be a dummy report: the report should use the dataset itself and send you some metrics that indicate the refresh actually happened.
Here's the tricky bit: if you put them both in the same job, are you sure the dataset refresh completes before the report runs? Or could they just be started in that order, with the report running before the dataset has finished refreshing?
A better solution may be to determine how long the dataset should take to refresh, then schedule the report to run at a time you are certain the dataset should be done refreshing, and notify you with relevant metrics from the dataset.
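If you want a positive confirmation rather than inferring success from a delivery, another option is to poll the Cognos audit database on a schedule. Here's a minimal sketch, assuming the audit database is enabled and lives on SQL Server, and that your Cognos version logs the refresh to COGIPF_RUNREPORT with the columns below; table and column names vary across versions, so verify them against your audit schema first:

```typescript
// Poll the Cognos audit DB for the latest run of a given dataset and flag
// anything that isn't a success. The host, credentials, and the table/column
// names (COGIPF_RUNREPORT, COGIPF_STATUS, COGIPF_LOCALTIMESTAMP) are
// assumptions to verify against your own audit schema.
import sql from "mssql";

async function checkRefresh(datasetName: string): Promise<void> {
  await sql.connect({
    server: "audit-db-host",          // hypothetical host
    database: "cognos_audit",         // hypothetical audit DB name
    user: "report_reader",
    password: process.env.AUDIT_DB_PASSWORD ?? "",
    options: { trustServerCertificate: true },
  });

  const result = await sql.query`
    SELECT TOP 1 COGIPF_STATUS, COGIPF_LOCALTIMESTAMP
    FROM COGIPF_RUNREPORT
    WHERE COGIPF_REPORTNAME = ${datasetName}
    ORDER BY COGIPF_LOCALTIMESTAMP DESC`;

  const row = result.recordset[0];
  if (row && row.COGIPF_STATUS === "Success") {
    console.log(`${datasetName} refreshed OK at ${row.COGIPF_LOCALTIMESTAMP}`);
  } else {
    // Plug your own notification in here (SMTP, webhook, ...).
    console.error(`${datasetName} refresh not confirmed`, row);
  }
  await sql.close();
}

checkRefresh("My Dataset").catch(console.error);   // hypothetical dataset name
```

Scheduled a few minutes after the refresh window described above, this gives you an explicit success/failure signal instead of an absence-of-email one.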

Related

Information Link Caching

I'm working through a problem that I can't seem to find a solution for. I'm attempting to speed up the load time for a report. The idea is to open the report in the Analyst Client, and I've identified one information link that bogs down the load time. Easy enough, I figured I'd cache the information link.
I reloaded the report expecting the first load to take a while; however, everything reloads every time. The data is less than 2 GB, so size can't be the problem. The only other issue I can think of is that I'm filtering the data in the WHERE clause of my SQL statement. Can you think of anything I'm missing?

BadGateway error in Power Automate when using the Run Script action

I'm using the Run Script action to transfer data between Excel sheets and I'm hitting a BadGateway error saying the action has timed out. The export script converts the data to JSON, which is then used as input to the import script that turns the JSON back into a table. The amount of data transferred is large (>10,000 rows); however, even with a smaller dataset (1,000 rows x 3 columns) the error sometimes appears, which leads me to think the problem may not be the amount of data but rather that Microsoft's servers are under load when the flow runs and can't fulfill my request.
I would like to know whether any Power Automate plan can help solve this problem: does any license give the user greater capacity, or dedicated space, so the connection to the server doesn't fail while the flow executes and my request is processed reliably? Or is this a problem I must solve by decreasing the amount of data transferred in this format? If so, how can I measure how much data Power Automate can process?
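One workaround, independent of licensing: page the transfer so each Run Script call stays well under the action's per-run time and payload limits, and loop in the flow until the script reports it is done. Here's a minimal Office Script sketch; the sheet name, parameter names, and the assumption that the data starts at A1 are all illustrative:

```typescript
// Export a worksheet in batches: the flow passes startRow and batchSize,
// calls this script in a loop, and stops when `done` is true. Assumes the
// data begins at A1 on a sheet named "Export" (both hypothetical).
function main(
  workbook: ExcelScript.Workbook,
  startRow: number,
  batchSize: number
): string {
  const sheet = workbook.getWorksheet("Export");
  if (!sheet) throw new Error("Sheet 'Export' not found");

  const used = sheet.getUsedRange();
  const totalRows = used.getRowCount();
  const cols = used.getColumnCount();

  const take = Math.min(batchSize, Math.max(totalRows - startRow, 0));
  const rows = take > 0
    ? sheet.getRangeByIndexes(startRow, 0, take, cols).getValues()
    : [];

  return JSON.stringify({
    rows: rows,
    nextStart: startRow + take,
    done: startRow + take >= totalRows,
  });
}
```

The import script then appends each chunk rather than writing the whole table at once; smaller payloads also make retries after a transient gateway error much cheaper.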

CRM 365 Workflows and importing data file from Excel

When designing Workflows you have a chance to indicate how it is triggered.
In my particular case I'm interested in detecting changes to the Status Reason and, for specific states, doing something. I can use the "After" field-change trigger on the Status Reason, or a Wait condition, and everything looks to be OK.
My question is in relation to an Excel export/import used for bulk operations. In this case the user can change (using Excel) the Status Reason field to a value matching the condition in the workflow.
Assuming the workflow is Activated at the time of Excel import, does the workflow get triggered for every row imported?
It might be very inefficient from a timing perspective, but for small data sets it might be beneficial, effectively acting as the bulk update I am in fact looking for.
Thank you!
For your question:
Yes, the workflow does get triggered every time you import data using Excel, provided the imported row matches the criteria for your workflow.
Workflows run server-side, which means they trigger every time the value changes in the database and matches the criteria. You could run your workflow in asynchronous mode, and the CRM Async job will take care of allocating resources as and when it has capacity. That way you won't see a performance impact when you import data via Excel.
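If you want to watch that asynchronous backlog while a bulk import runs, the system jobs are queryable. A minimal sketch against the Dataverse Web API; the entity and attribute names (asyncoperations, operationtype 10 for workflows, statecode 0 for waiting jobs) are the standard ones but should be verified in your environment, and token acquisition is stubbed out:

```typescript
// Count workflow system jobs that are still waiting to run, e.g. while an
// Excel import is being processed. The org URL and bearer token are
// placeholders. Note that Dataverse caps @odata.count at 5000.
async function pendingWorkflowJobs(orgUrl: string, token: string): Promise<number> {
  const query =
    "asyncoperations?$select=asyncoperationid" +
    "&$filter=operationtype eq 10 and statecode eq 0" +
    "&$count=true&$top=1";
  const resp = await fetch(`${orgUrl}/api/data/v9.2/${query}`, {
    headers: { Authorization: `Bearer ${token}`, Accept: "application/json" },
  });
  if (!resp.ok) throw new Error(`Web API call failed: ${resp.status}`);
  const body = await resp.json();
  return body["@odata.count"];
}

// Usage: poll every 30 s until the queue drains.
// setInterval(async () => console.log(await pendingWorkflowJobs(url, token)), 30000);
```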

SSRS Report Execution

I have a need to get the number of executions of a report (or reports) currently taking place. I thought, maybe, that the execution log view in the report server database inserted a new record each time a report execution started, with the report (item), item path, username, and TimeStart, then just updated the record with TimeEnd, the other time field values, etc. Unfortunately, though, the record isn't inserted into the execution log (with all its information) until the report run is complete.
I've done some Google searching and found plenty about how to get information on report runs, but that is of course for executions that have already completed. I haven't been able to find anything on whether there is a way, and how, to get a count of executions of a particular report (or reports) currently in progress.
Just wondering if anyone has found, or come up with, some clever way of doing this.
Thanks in advance.
The answer from Microsoft support is that there is currently nothing in the Reporting Services database that stores this information.
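Given that, the closest the catalog database gets you is after-the-fact counts from the documented ExecutionLog3 view, where a row only appears once a run has finished. A sketch of that query, with the connection details as placeholders:

```typescript
// List and count recent *completed* executions of a report from the
// ReportServer catalog DB. ExecutionLog3 and the columns used here
// (ItemPath, UserName, TimeStart, TimeEnd, Status) are the documented
// view; the connection settings are placeholders.
import sql from "mssql";

async function recentRuns(reportPath: string): Promise<void> {
  await sql.connect({
    server: "ssrs-db-host",           // hypothetical host
    database: "ReportServer",
    user: "report_reader",
    password: process.env.RS_DB_PASSWORD ?? "",
    options: { trustServerCertificate: true },
  });

  const result = await sql.query`
    SELECT ItemPath, UserName, TimeStart, TimeEnd, Status
    FROM ExecutionLog3
    WHERE ItemPath = ${reportPath}
      AND TimeStart > DATEADD(hour, -1, GETDATE())
    ORDER BY TimeStart DESC`;

  console.log(`${result.recordset.length} completed runs in the last hour`);
  console.table(result.recordset);
  await sql.close();
}

recentRuns("/Sales/Daily Summary").catch(console.error);   // hypothetical path
```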

SSRS report is running slow in report browser

This should be a simple process, but I'm unable to find a solution and need your help.
The report executes quickly when I run it locally (SSDT, VS2012), but when I deploy it to the report server it takes a long time to display.
I checked the stored procedure (SQL Server 2008R2) and it is running fast without any issues.
Please help. Thanks in advance :)
Although this should have been a simple process, I found that the issue is with one of the tables in the stored procedure: retrieving records from that table takes a long time. If data retrieval from the view itself is slow, then I believe adding indexes there will definitely help.
In addition, while testing the report I found that the stored procedure returned records faster after the first execution. I also removed the background image from slide 3, and that slide now loads much faster (1-2 sec max) in test than it did in production (10-11 sec).
Thanks to all of you for your time and efforts. :)
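For anyone chasing the same symptom, the ExecutionLog3 view's timing columns are a quick way to see whether a deployed report spends its time retrieving data or rendering (a heavy background image shows up as rendering time) before you start tuning. A sketch, with the same placeholder connection details as in the previous example:

```typescript
// Break a slow deployed report down by phase: TimeDataRetrieval,
// TimeProcessing, and TimeRendering (all in milliseconds) in ExecutionLog3.
// High data-retrieval times point at the proc/tables; high rendering
// times point at the report design (images, large tablixes, etc.).
import sql from "mssql";

async function timingBreakdown(reportPath: string): Promise<void> {
  await sql.connect({
    server: "ssrs-db-host",           // hypothetical host
    database: "ReportServer",
    user: "report_reader",
    password: process.env.RS_DB_PASSWORD ?? "",
    options: { trustServerCertificate: true },
  });

  const result = await sql.query`
    SELECT TOP 10 TimeStart, TimeDataRetrieval, TimeProcessing, TimeRendering
    FROM ExecutionLog3
    WHERE ItemPath = ${reportPath}
    ORDER BY TimeStart DESC`;

  console.table(result.recordset);
  await sql.close();
}

timingBreakdown("/Sales/Slow Report").catch(console.error);   // hypothetical path
```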
