SSRS and large data set - ssrs-2016

I am trying to automate a data dump in SSRS, but since the data set is large, I receive an out-of-memory error. Yes, I know SSRS is not designed for data-dump reports, but this report is requested frequently and I am trying to get it off my plate by automating it. Any thoughts or suggestions?
Thank You,
Helal
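If the dump really is just a straight query export, one possible way to get it off your plate is to skip SSRS for this report and script the export directly against the database, then schedule the script (SQL Agent, Task Scheduler, or similar). Below is a minimal sketch, assuming a pyodbc connection; the server, database, table name, and batch size are placeholders, not details from the original post.

```python
import csv

import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes;"
)
QUERY = "SELECT * FROM dbo.BigTable"  # stand-in for the report's dataset query
BATCH_SIZE = 50_000                   # rows per fetch, keeps memory usage flat

with pyodbc.connect(CONN_STR) as conn, \
        open("data_dump.csv", "w", newline="", encoding="utf-8") as out:
    cursor = conn.cursor()
    cursor.execute(QUERY)
    writer = csv.writer(out)
    writer.writerow(col[0] for col in cursor.description)  # header row
    while True:
        rows = cursor.fetchmany(BATCH_SIZE)  # fetch in batches, never all at once
        if not rows:
            break
        writer.writerows(rows)
```

Because rows are fetched and written in batches, memory stays roughly constant regardless of how large the result set is.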

Related

BadGateway Power Automate using Execute Script Action

I'm using the Run Script action to transfer data between Excel sheets and I'm getting a BadGateway error saying that the action has timed out. The export action converts the data to JSON, which is then used as input to the import action that transforms the JSON back into table format. The amount of data transferred is large (more than 10,000 rows), yet the error sometimes appears even with a smaller data set (1,000 rows x 3 columns), which leads me to suspect that it may not be the amount of data, but rather that Microsoft's service is under load when the flow runs and cannot fulfill my request.
I would like to know whether any Power Automate plan can help solve this problem, that is, whether any license gives the user more capacity or dedicated resources so that the connection to the server does not fail when the flow runs and my request is processed reliably. Or is this a problem I must solve by reducing the amount of data transferred in this format? If so, how can I measure how much data Power Automate can process?

How can I work with a huge CSV file that does not fit in RAM, using a combination of Power Query and Power Pivot in Excel 2010

I wonder if someone can help me with the following problem. First off, my setup (which I can't change because it is a corporate environment):
Operating system: Windows 7 Professional, Service Pack 1, 32-bit
Hardware: 8.00 GB RAM (2.73 GB usable)
I am looking for a solution for slicing and dicing really big files (around 5 GB) with Excel, in other words the equivalent of being able to use pivot tables and graphs with that much data.
I just got Power Query and Power Pivot installed on my laptop (remember I'm running Excel 2010, 32-bit) and saved the huge .csv file as a connection with Power Query. However, I can't add it to my data model and use it from Power Pivot (apparently that is a limitation in Excel 2010). I tried to get around it by clicking Power Pivot -> existing connections, but then it tries to import everything and I run out of memory or hit some Excel limit.
To me, the idea should be that the data is never fully loaded: it is kept as a connection (where you only store the query) and data is loaded "lazily", only what you need, after you set up the Power Pivot report (otherwise I don't see how Power Query and Power Pivot help with big files that wouldn't otherwise fit in Excel).
What can I do to add the connection to the huge file to the data model so that I can continue working until I can set up a report (with Power Pivot) and see the results?
If there is a software package I am missing (such as Power BI) that would help me fix the problem, please let me know. If it's free (like Power Query and Power Pivot) I could have it installed.
Thank you very much in advance and regards
Since it sounds like loading the CSV directly into Power Pivot failed because it imports everything, one option would be loading the CSV into an intermediary database such as an Access file (or SQL Server Express) while cleaning up the data to improve memory usage; a rough sketch of this idea follows the notes below.
Note: Power Pivot uses more memory while it is updating the model.
Link: Creating a memory-efficient model, which has some tips on how to design Power Pivot models to be more memory efficient.
Also note that the uniqueness of the data drastically affects how much of it can fit in memory. Columns with fewer distinct values consume less memory; something unique like a row ID on a fact table consumes a great deal.
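To make the intermediary-database suggestion concrete, here is a rough sketch using SQLite as a stand-in for the Access or SQL Server Express file mentioned above, with pandas chunked reads so the CSV never has to fit in RAM. File names and the aggregation columns are made up for illustration.

```python
import sqlite3

import pandas as pd

CSV_PATH = "huge_file.csv"   # the ~5 GB source file
DB_PATH = "staging.sqlite"   # on-disk staging database

with sqlite3.connect(DB_PATH) as conn:
    # Only one 100,000-row chunk is ever held in RAM at a time.
    for chunk in pd.read_csv(CSV_PATH, chunksize=100_000):
        chunk.to_sql("staging", conn, if_exists="append", index=False)

    # Aggregate in SQL so only the summarised result goes back to Excel.
    # "category" and "amount" are placeholder column names.
    summary = pd.read_sql_query(
        "SELECT category, SUM(amount) AS total FROM staging GROUP BY category",
        conn,
    )
    summary.to_csv("summary.csv", index=False)
```

The point is the same as the answer above: the raw rows never have to land in the workbook, only the aggregated slice does.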

List view threshold in Reporting Services integrated mode in SharePoint

In one of my SharePoint sites I have a document library with 7,000 document sets; counting the files inside, there are 21,000 files in total.
In the beginning we had some views, but as they grew we ran into list view threshold issues. What I did was remove some of those views and use search results web parts to get the results the users want. For me, incrementing the threshold is not a solution because this document library grows fast (about 2,000 items per month).
This solved the problem for some time.
However, some users do need to export to Excel to build pivots on this data. The only way I can think of is using Reporting Services in SharePoint integrated mode, because I can export reports to Excel and then they can do their pivots.
The question is, will I have the same threshold problem when I make a report based on list data?
What other options do I have?
I have exported files to Excel with 600,000+ rows. If the data you are pulling reaches a certain size you will have to switch to .csv files, as Excel has its own limit (1,048,576 rows per sheet). The main issues you will run into on very large data sets are timeouts, which can be managed by configuring your HTTP and SSRS timeouts; however, this leads to other issues, including reports that run for 15+ minutes and heavy bandwidth usage.
I would recommend testing your scenario with a simple report to pull your data and see where the limit is reached. Also, look into filtering with parameters to reduce the amount of data returned to the client; a sketch of the URL-access approach follows. If it becomes unmanageable, you may want to look into SSIS or some of the data-warehousing features. SSRS also has cached reports that can shift the processing burden to off hours if real-time data is not a necessity.
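As one hedged example of the parameter-filtering idea: SSRS reports can be rendered server-side to CSV through URL access, with report parameters appended to the URL so only a slice of the data comes back per request. The server name, report path, parameter names, and NTLM credentials below are placeholders for your environment.

```python
import requests
from requests_ntlm import HttpNtlmAuth  # pip install requests_ntlm

# Hypothetical report server, report path, and parameters.
url = (
    "http://reportserver/ReportServer?/Warehouse/ListExport"
    "&rs:Command=Render&rs:Format=CSV"          # CSV is far cheaper than the Excel renderer
    "&StartDate=2023-01-01&EndDate=2023-03-31"  # hypothetical filter parameters
)

resp = requests.get(
    url,
    auth=HttpNtlmAuth("DOMAIN\\user", "password"),
    stream=True,
    timeout=900,
)
resp.raise_for_status()

with open("list_export.csv", "wb") as f:
    for block in resp.iter_content(chunk_size=1 << 20):  # write 1 MB at a time
        f.write(block)
```

The same URL works from a browser, so users can bookmark a filtered export without a custom tool.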

SSRS - cannot export large files to Excel

I need to give users the option to download a report into Excel.
The report includes about 70,000 records (and about 50 fields).
The report generates within about 1.5 minutes, but when the user tries to download, nothing happens.
(In Explorer, a second tab opens and the progress wheel keeps turning...)
I have tried on a filtered list of about 10,000 and it downloads quickly without issue.
However, at 30,000 it already has issues.
(I tried this from the server itself, so network issues were ruled out.)
I also tried sending out the report by email subscription, but this also failed.
Is there a way to go around this size limitation?
Or is there a way to give the user similar flexibility on the report server itself, as he would have in Excel, without building every possible filter into the report? (Probably too big a question for here.)
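One way around the size limit, sketched below under the assumption that the report has (or can be given) a date parameter: render the report in smaller parameter-bounded slices through SSRS URL access and recombine them into a single workbook on the client. The server path, credentials, parameter names, and slice boundaries are all made up for illustration.

```python
import calendar
import io

import pandas as pd
import requests
from requests_ntlm import HttpNtlmAuth  # pip install requests_ntlm

BASE = "http://reportserver/ReportServer?/Finance/FullExport"  # hypothetical report path
AUTH = HttpNtlmAuth("DOMAIN\\user", "password")

frames = []
for month in range(1, 13):  # one slice per month, for example
    last_day = calendar.monthrange(2023, month)[1]
    url = (
        f"{BASE}&rs:Command=Render&rs:Format=CSV"
        f"&StartDate=2023-{month:02d}-01&EndDate=2023-{month:02d}-{last_day:02d}"
    )
    resp = requests.get(url, auth=AUTH, timeout=900)
    resp.raise_for_status()
    frames.append(pd.read_csv(io.BytesIO(resp.content)))

# 70,000 rows x 50 columns is well within Excel's 1,048,576-row sheet limit,
# so recombining the slices into one workbook is cheap (needs openpyxl installed).
pd.concat(frames).to_excel("full_export.xlsx", index=False)
```

Each individual render stays around the size that already exports without trouble, while the user still ends up with one complete workbook.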

SharePoint SQL Reporting Services - OutOfMemory exception on large reports. How to solve?

We have a bunch of reports on SharePoint, using SQL Reporting Services.
The statistics reports, the ones that aggregate data and display a few hundred to a few thousand records, load fine.
However, we also have reports that display raw records from the database. These reports usually have tens or hundreds of thousands of records, sometimes even millions, and most of the time they do not load but throw OutOfMemory errors.
The queries for those reports are very simple SELECTs with some WHERE conditions (sometimes a few small tables are joined to the huge one). In SQL Server Management Studio the query completes in 5-10 seconds.
I'm frustrated because the clients are asking for the report, but I can't find a solution. I googled a lot, but the best advice I could find was "get rid of the report or try to minimize the amount of data in it", which doesn't really solve anything; the clients insist that they need the ENTIRE report.
Is it possible to solve this somehow?
Move to 64-bit for the reporting server?
Chances are the users need the ENTIRE report because they are scraping the data into Excel or some other format and using it elsewhere. If you can get away with it, coding a web page or similar that returns the query result as plain text/CSV may be more effective than a report; a sketch of that idea follows.
In other words, the best advice you could find really is the best advice.
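To make the "simple page instead of a report" suggestion concrete, here is a minimal sketch of a web endpoint that streams the raw-record query as CSV, fetching in batches so the whole result set never sits in web-server memory. The connection string, route, and table name are placeholders, not details from the original question.

```python
import csv
import io

import pyodbc
from flask import Flask, Response

app = Flask(__name__)
CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes;"
)

@app.route("/export/raw-records.csv")
def export_raw_records():
    def generate():
        with pyodbc.connect(CONN_STR) as conn:
            cursor = conn.cursor()
            cursor.execute("SELECT * FROM dbo.RawRecords")  # the report's raw-record query
            buf = io.StringIO()
            writer = csv.writer(buf)
            writer.writerow(col[0] for col in cursor.description)
            yield buf.getvalue()                 # header row first
            while True:
                rows = cursor.fetchmany(10_000)  # stream in batches
                if not rows:
                    break
                buf.seek(0)
                buf.truncate(0)
                writer.writerows(rows)
                yield buf.getvalue()
    return Response(generate(), mimetype="text/csv")

if __name__ == "__main__":
    app.run()
```

Excel can open or import the resulting CSV directly, which is usually what the users scraping the report wanted in the first place.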
