Wednesday, March 28, 2012

Problem with Large data amounts

I have a dataset with 300,000 records and I'm getting the following error with MS Reporting Services: "An error has occurred during report processing. Exception of type System.OutOfMemoryException was thrown." Any help with this would be highly appreciated.

Unfortunately, Reporting Services has some memory limitations in the current releases (SQL Server 2000 and 2005). Because we support dataset-wide aggregates (you can have sum(x) on the first page of your report), we materialize the entire dataset. Some things you might try:

Do you need the entire data set displayed in the report, or could you do some grouping / aggregation in the query?

|||

I am having a similar problem with large data amounts. My reporting server is a Dual Xeon 3GHz machine with 2.5GB RAM and still has issues.

What I find is that the ReportViewer object does not release memory after the reports are generated, viewed, and closed. The application recycles several times if I try to do an export to Excel.

Is there any way to make sure the memory allocated by the ReportViewer gets released? I can generate one report, close it, wait, open another, and the memory allocation just keeps growing and growing.

Any help is appreciated.

|||

I seem to be having a similar problem with a Xeon and 4 GB of memory. In my case, after the out-of-memory error happens, remote connections stop working, reporting that there is insufficient memory. In other words, I can't even run a SELECT statement in Management Studio after it hits the memory error. I have to restart the SQL Server service.

HELP!!!

|||

First off, a human does not look at 300,000 records. Most of the time when I have seen this, it has been for an export to Excel. When exporting for Excel, export as CSV/ASCII instead (although Excel can't handle more than 65,536 rows per worksheet anyway).

Note that this is about records returned, not records in the base table. I go against tables with 150 million records, but I only return the data needed.

I suggest looking at designing drill-through reports to limit the data.
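The drill-through idea above can be sketched as a pair of queries. Table and column names here (GLTransactions, AccountCode, and so on) are hypothetical placeholders, not anything from the original posts:

```sql
-- Summary report query: aggregate in the database so only a few
-- hundred grouped rows reach Reporting Services, not 300,000 details.
SELECT
    AccountCode,
    COUNT(*)    AS TransactionCount,
    SUM(Amount) AS TotalAmount
FROM GLTransactions
WHERE PostingDate BETWEEN @StartDate AND @EndDate
GROUP BY AccountCode
ORDER BY AccountCode;

-- Drill-through detail query: the summary report links to the detail
-- report, passing the account code, so the detail report only ever
-- materializes one account's transactions at a time.
SELECT TransactionID, PostingDate, Description, Amount
FROM GLTransactions
WHERE AccountCode = @AccountCode
  AND PostingDate BETWEEN @StartDate AND @EndDate
ORDER BY PostingDate;
```

In the report designer, you would set a "Jump to report" navigation action on the AccountCode textbox of the summary report and pass the clicked value into the detail report's @AccountCode parameter.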

|||

Is this still a problem with Reporting Services? We are having a similar problem with .NET choking on large datasets and were considering Reporting Services as an alternative. Is Microsoft an enterprise player or not?

We are required to periodically return several hundred thousand rows from SQL Server, and I'm tired of being told that this is a design problem. Unfortunately, when our client says he needs to print a detail report from his general ledger for his auditor, we don't have the luxury of telling him Microsoft doesn't think he needs that large a report. If he can't get the data he needs from us, he'll be happy to go down the street to get it (read: Oracle).

If I can't serialize it through .net or get it out of Reporting Services, how am I supposed to extract large data sets from SQL Server?
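For raw extraction (as opposed to rendering a formatted report), one workaround is to pull the rows out in keyed batches rather than as one giant result set. A sketch, assuming a hypothetical GLDetail table with a clustered key TransactionID:

```sql
-- Pull 50,000 rows per round trip, keyed on the clustered primary key,
-- so neither the client nor the server has to hold the full result set.
DECLARE @LastID int;
SET @LastID = 0;

-- The client loops, re-running this batch query until it returns no rows:
SELECT TOP 50000
    TransactionID, PostingDate, Description, Amount
FROM GLDetail
WHERE TransactionID > @LastID
ORDER BY TransactionID;
-- ...after each batch, the client sets @LastID to the highest
-- TransactionID it received and issues the next batch.
```

For a pure export, the bcp command-line utility (e.g. `bcp "SELECT ..." queryout ledger.csv -c`) streams rows straight to disk without buffering the whole set in memory, bypassing both .NET serialization and Reporting Services entirely.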

Will 64 bit processors and multi-GB ram implementations help?

Thanks.
