This should be a simple process, but I have been unable to find a solution and need your help.
The report executes quickly when I run it locally (SSDT - VS2012), but when I deploy it to the report server it takes a long time to display.
I checked the stored procedure (SQL Server 2008 R2) and it runs fast without any issues.
Please help. Thanks in advance :)
As this should be a simple process, I dug in and noticed that the issue is with one of the tables used by the stored procedure: retrieving records from that table is taking a long time. If the data retrieval from the view itself is slow, then I believe we can add indexes there, and that will definitely help.
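In case it helps anyone hitting the same thing, this is roughly the kind of index I mean; the table and column names below are made up, and the real ones should come from whatever the procedure actually filters, joins on, and selects:

    -- Hypothetical table and column names; use the columns the stored
    -- procedure filters/joins on for the index key, and the columns it
    -- selects for INCLUDE.
    CREATE NONCLUSTERED INDEX IX_SlowTable_FilterCols
        ON dbo.SlowTable (FilterDate, RegionId)
        INCLUDE (Quantity, Amount);

The actual execution plan of the slow SELECT is the best guide to which columns belong in the index.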
In addition, while testing the report I found that after the first execution the stored procedure returned records faster. I also removed the background image from slide 3, and the third slide now loads much faster in test (1-2 seconds at most) than it does in production (10-11 seconds).
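For anyone diagnosing the same "fast locally, slow on the server" symptom: the ReportServer catalog database has an ExecutionLog3 view that splits each run into data retrieval, processing, and rendering time, so you can see which phase is actually slow. Something like the following (the report name filter is just a placeholder) is what I used:

    -- Run against the ReportServer catalog database.
    SELECT TOP (20)
           ItemPath,
           TimeStart,
           TimeDataRetrieval,  -- ms spent running the dataset queries
           TimeProcessing,     -- ms spent grouping/sorting the data
           TimeRendering,      -- ms spent producing the output
           [RowCount]
    FROM   dbo.ExecutionLog3
    WHERE  ItemPath LIKE '%MyReport%'  -- placeholder report name
    ORDER BY TimeStart DESC;

If TimeDataRetrieval dominates, look at the query and indexes; if TimeRendering dominates, look at the report design (images, very large tablixes, and so on).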
Thanks to all of you for your time and efforts. :)
Related
I'm working through a problem that I can't seem to find a solution for. I'm attempting to speed up the load time for a report. The idea is to open the report in the Analyst Client, and I've identified one information link that bogs down the load time. Easy enough, I figured I'd cache the information link.
I reloaded the report expecting only the first load to take a while; however, the data reloads everything every time. The amount of data is less than 2 GB, so that can't be the problem. The only other issue I can think of is that I am filtering the data in the WHERE clause of my SQL statement. Can you think of anything I'm missing?
I currently have an Excel-based data extraction method using Power Query and VBA (for documents with passwords). Ideally this would be scheduled to run once or twice a day.
My current solution involves setting up a spare laptop on the network that will run the extraction twice a day on its own. This works, but I am keen to understand the other options. The task itself seems to be quite a struggle for our standard hardware: it covers 6 network locations across 2 servers, with around 30,000 rows and increasing.
Any suggestions would be greatly appreciated
Thanks
If you are going to work with increasing data, and you are going to dedicate a laptop exclusively to the process, I would think about installing a database on that laptop (MySQL, for example). You could use Access too, but Access file corruption is a risk.
Download into this database all the data you need for your report, using incremental loads (only new, modified, and deleted records).
Then run the Excel report, extracting from this database on the same computer.
This should improve the performance of your solution.
Your bigger problem is probably that you query ALL the data on each report generation.
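As a rough sketch of what the incremental load could look like in MySQL (the table and column names are hypothetical, and this assumes each extract lands in a staging table first):

    -- Hypothetical table and column names.
    -- Upsert the new and modified rows pulled since the last run:
    INSERT INTO report_data (id, location, amount, last_modified)
    SELECT id, location, amount, last_modified
    FROM   staging_extract
    ON DUPLICATE KEY UPDATE
           location      = VALUES(location),
           amount        = VALUES(amount),
           last_modified = VALUES(last_modified);

    -- Handling deletions needs either a full snapshot of the current
    -- source keys or a delete flag in the extract; with a key snapshot:
    DELETE FROM report_data
    WHERE  id NOT IN (SELECT id FROM source_keys_snapshot);

Excel / Power Query would then read from report_data on the same machine instead of hitting the six network locations on every refresh.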
I have a need to get the number of executions of a report (or reports) currently taking place. I thought, maybe, that the execution log view in the report server database inserted a new record each time a report execution started, with the report (item), item path, username, and TimeStart, and then just updated that record with the TimeEnd, the other time field values, etc. Unfortunately, though, the record isn't inserted into the execution log (with all of its information) until the report run is complete.
I've done some Google searching and found plenty about how to get information on report runs, but that is of course for executions that have already completed. I haven't been able to find anything on whether there is a way, and how, to get a count of a particular report or reports with executions currently in progress.
Just wondering if anyone has found, or come up with, some clever way of doing this.
Thanks in advance.
The answer from Microsoft support is that, currently, there is nothing in the Reporting Services database that stores this information.
I am using Grails 2.3.7 and the latest excel-import plugin (1.0.0). My requirement is that I need to copy the contents of an Excel sheet, exactly as it is, into the database. My database is MS SQL Server 2012.
I have got the code working in development. The code works fine when the number of records is small, up to maybe a few hundred.
But in production the Excel sheet will have as many as 50,000 rows and over 75 columns.
Initially I faced an out-of-memory exception. I increased the heap size to as much as 8 GB, but now the thread keeps running on and on without terminating. No errors are generated.
Please note that this is a once-in-a-while operation, and it will be carried out by a person who will make sure it does not hamper other operations running in parallel, so there is no need to worry about the heavy load of this operation; I can afford to run it.
When the records number up to 10,000 with the same number of columns, the data gets copied in around 5 minutes. With 50,000 rows the time taken should ideally be around 5 times as long, roughly 25 minutes, but the code kept running for more than an hour without finishing.
Any idea how to go about this issue? Any help is highly appreciated.
Loading 5 times more data into memory doesn't always take just 5 times longer. My guess is that most of the 8 GB ends up in virtual memory, and virtual memory is very slow compared to RAM. Try decreasing the heap size, run some memory tests, and try to keep as much of the working set as possible in physical RAM.
In my experience, this is a common problem with large batch operations in Grails. I think you have memory leaks that radically slow down the operation as it proceeds.
My solution has been to use an ETL tool such as Pentaho Kettle for the import, or to chunk the import into manageable pieces. See this related question:
Insert 10,000,000+ rows in grails
Not technically an answer to your problem, but have you considered just using CSV instead of Excel?
From a user's point of view, saving as CSV before importing is not a lot of work.
I am loading, validating, and saving CSVs with 200,000-300,000 rows without a hitch.
Just make sure you have the logic in a service so that a transaction is put around it.
It may take a bit more code to decode the CSV, especially to translate values into the various primitive types, but it should be orders of magnitude faster.
We have a bunch of reports on SharePoint, using SQL Reporting Services.
The statistics reports, the ones that aggregate data and display a few hundred to a few thousand records, load fine.
However, we also have reports that display raw records from the database. These reports usually contain tens or hundreds of thousands of records, sometimes even millions, and most of the time they do not load but instead throw OutOfMemory errors.
The queries for those reports are very simple SELECTs with some WHERE conditions (sometimes a few small tables are joined to the huge one). In SQL Server Management Studio the query completes in 5-10 seconds.
I'm frustrated, because the clients are asking for the report, but I can't find any solution to this (I googled a lot, but the best advice I could find was "get rid of the report or try to minimize the amount of data in it", which doesn't really solve anything, since the clients insist that they need the ENTIRE report).
Is it possible to solve this somehow?
Move to 64-bit for the reporting server?
Chances are the users need the ENTIRE report because they are scraping the data off into Excel or some other format and using it elsewhere. If you can get away with it, coding a web page or something similar that displays the query results in a simple text format/CSV may be more effective than a report.
In other words, the best advice you could find really is the best advice.