I need to get the number of executions of a report (or reports) currently in progress. I thought the execution log view on the report server database might insert a new record each time a report execution started, with the report (item), item path, username, and TimeStart, then update that record with the TimeEnd and the other timing fields. Unfortunately, the record isn't inserted into the execution log (with all its information) until the report run is complete.
I've done some Google searching and found plenty about how to get information on report runs, but those are, of course, executions that have already completed. I haven't been able to find anything on whether (and how) to get a count of executions of a particular report, or reports, that are currently in progress.
Just wondering if anyone has found, or come up with, some clever way of doing this.
Thanks in advance.
The answer from Microsoft support is that, currently, there is nothing in the Reporting Services database that stores this information.
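Since execution log rows only appear once a run completes, a live count isn't possible from the log, but the TimeStart/TimeEnd pairs do let you reconstruct concurrency after the fact. Below is a minimal sketch of that interval-overlap count in Python; the row layout mirrors the execution log columns mentioned above, and the sample data is invented:

```python
from datetime import datetime

def concurrent_executions(rows, report_path, at_time):
    """Count logged executions of one report whose [TimeStart, TimeEnd)
    interval covers the given instant. This only works retrospectively,
    since the execution log row appears after the run completes."""
    return sum(
        1 for r in rows
        if r["ItemPath"] == report_path
        and r["TimeStart"] <= at_time < r["TimeEnd"]
    )

# Invented sample rows standing in for a SELECT from the execution log view
rows = [
    {"ItemPath": "/Sales/Daily", "TimeStart": datetime(2023, 1, 1, 9, 0),
     "TimeEnd": datetime(2023, 1, 1, 9, 5)},
    {"ItemPath": "/Sales/Daily", "TimeStart": datetime(2023, 1, 1, 9, 3),
     "TimeEnd": datetime(2023, 1, 1, 9, 10)},
    {"ItemPath": "/Ops/Audit", "TimeStart": datetime(2023, 1, 1, 9, 4),
     "TimeEnd": datetime(2023, 1, 1, 9, 6)},
]

print(concurrent_executions(rows, "/Sales/Daily", datetime(2023, 1, 1, 9, 4)))  # 2
```

This won't answer "how many are running right now", but it can tell you whether concurrency spikes are even occurring, and when.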
Related
When I create a job and add a dataset refresh as a job step, I'd like to get a notification when the refresh succeeds. How could I do this, please? The only idea that comes to mind so far is to create a dummy report and add it as the next job step, after the dataset refresh. Then I edit the delivery of this dummy report to "Send report by email", set the whole job to "Run in sequence", and disable "Continue on error".
This way, when I receive my dummy report in an email, I know my dataset has been refreshed.
Is there a better way to handle it?
Thank you very much
How do you know if it is successful? Because Cognos says it completed? Because the schedule ran?
Your suggestion is a good one... mostly.
When you receive the report via email, that may not mean the dataset refreshed successfully. So it shouldn't be a dummy report. The report should use the dataset and send you some metrics that indicate that the dataset refreshed.
Here's the tricky bit. If you put them both in the same job, are you sure the dataset refresh completes before the report runs? Or could they just be started in that order, then the report runs before the dataset has completely refreshed?
A better solution may be to determine how long the dataset should take to refresh, then schedule the report to run at a time you are certain the dataset should be done refreshing, and notify you with relevant metrics from the dataset.
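The "relevant metrics" idea can be as simple as a freshness check: the report queries the dataset's most recent load timestamp (or row count) and flags whether it is newer than the scheduled refresh time. A language-neutral sketch of that logic (all names and times here are invented for illustration):

```python
from datetime import datetime, timedelta

def refresh_looks_successful(last_loaded, expected_refresh_time,
                             grace=timedelta(minutes=30)):
    """True if the dataset's newest load timestamp falls at or after the
    scheduled refresh time, within a grace window for clock differences.
    A stale timestamp means the refresh likely failed or hasn't run."""
    return last_loaded >= expected_refresh_time - grace

scheduled = datetime(2023, 1, 1, 2, 0)  # hypothetical nightly 02:00 refresh

# Fresh data: loaded shortly after the scheduled time
print(refresh_looks_successful(datetime(2023, 1, 1, 2, 15), scheduled))   # True
# Stale data: newest row is from the previous day's load
print(refresh_looks_successful(datetime(2022, 12, 31, 2, 15), scheduled)) # False
```

Putting a check like this into the emailed report means the email itself carries the evidence of a successful refresh, rather than merely proving the job step ran.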
I'm working through a problem that I can't seem to find a solution for. I'm attempting to speed up the load time for a report. The idea is to open the report in the Analyst Client, and I've identified one information link that bogs down the load time. Easy enough, I figured I'd cache the information link.
I reloaded the report expecting the first load to take a while; however, the data reloads completely every time. The amount of data is less than 2 GB, so that can't be the problem. The only other issue I can think of is that I am filtering the data in the WHERE clause of my SQL statement. Can you all think of anything I'm missing?
When designing Workflows, you can specify how each one is triggered.
In my particular case I want to detect changes to the Status Reason field and, for specific states, do something. I can use an "After" field change on the Status Reason or a Wait condition, and everything looks to be OK.
The question I have is in relation to an Excel export/import used for bulk operations. In this case the user can change (using Excel) the Status Reason field to a value matching the condition in the workflow.
Assuming the workflow is Activated at the time of Excel import, does the workflow get triggered for every row imported?
It might be very inefficient from a timing perspective, but for small data sets it might be beneficial, effectively acting as the bulk update that I am in fact looking for.
Thank you!
To answer your question: yes, the workflow does get triggered every time you import data using Excel, as long as the imported change matches the criteria for your workflow.
Workflows run on the server side, which means they will trigger every time the value changes in the database and matches the criteria. You could run your workflow in asynchronous mode, and the CRM Async job will take care of allocating resources as and when it has capacity. That way you will not see a performance impact when you import data via Excel.
This should be a simple process, but I'm unable to find a solution and need your help.
The report executes quickly when I run it locally (SSDT in VS2012), but when I deploy it to the report server, it takes a long time to display.
I checked the stored procedure (SQL Server 2008 R2) and it runs fast without any issues.
Please help. Thanks in advance :)
Although this should be a simple process, I noticed that the issue is with one of the tables in the stored procedure: retrieving records from that table takes a lot of time. If data retrieval from the view itself is slow, then I believe we can add indexes there. It will definitely help.
In addition, while testing the report, I found that after the first execution the stored procedure returned records faster. I also removed the background image from slide 3, and the third slide now loads faster in test (1-2 seconds max) than it does in production (10-11 seconds).
Thanks to all of you for your time and efforts. :)
We have a bunch of reports on SharePoint, using SQL Reporting Services.
The statistics reports, the ones that aggregate data and display a few hundred to a few thousand records, load fine.
However, we also have reports that display raw records from the database. These reports usually have tens or hundreds of thousands of records, sometimes even millions. And most of the time they do not load, but throw OutOfMemory errors instead.
The queries for those reports are very simple SELECTs with some WHERE conditions (sometimes a few other small tables are joined to the huge one). In SQL Server Management Studio the query completes in 5-10 seconds.
I'm frustrated, because the clients are asking for the report, but I can't find any solution. I've googled a lot, but the best advice I could find was "get rid of the report or try to minimize the amount of data in it", which doesn't really solve anything; the clients insist that they need the ENTIRE report.
Is it possible to solve this somehow?
Move to 64 bit for the reporting server?
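Alongside (or instead of) a move to 64-bit, it may be worth reviewing the report server's memory configuration. SSRS 2008 and later read their memory thresholds from rsreportserver.config; the element names below are the documented ones, but the values shown are only illustrative and depend on the machine's RAM:

```
<!-- rsreportserver.config: memory management (illustrative values) -->
<MemorySafetyMargin>80</MemorySafetyMargin>      <!-- percent of WorkingSetMaximum -->
<MemoryThreshold>90</MemoryThreshold>            <!-- percent; above this, memory pressure responses kick in -->
<WorkingSetMaximum>4000000</WorkingSetMaximum>   <!-- kilobytes (roughly 4 GB) -->
<WorkingSetMinimum>2400000</WorkingSetMinimum>   <!-- kilobytes -->
```

Raising WorkingSetMaximum only helps if the box actually has the memory to back it, which is where 64-bit comes in.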
Chances are the users need the ENTIRE report because they are scraping the data off into Excel or some other format and using it elsewhere. If you can get away with it, coding a web page or similar that displays the query results in a simple text format/CSV may be more effective than a report.
In other words, the best advice you could find really is the best advice.
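On the "simple text format/CSV" idea: SSRS can already produce that without a custom page, via URL access rendering. A request of the following shape streams the report straight to a CSV file, skipping the interactive HTML viewer (and much of its memory overhead) entirely; the server name, folder, and report name below are placeholders, and report parameters can be appended as &Name=value:

```
http://yourserver/ReportServer?/SomeFolder/HugeReport&rs:Command=Render&rs:Format=CSV
```

Whether this avoids the OutOfMemory errors depends on the report, but CSV rendering is far cheaper than paginated HTML, and it hands the users exactly the raw rows they are presumably after.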