Generate PivotTable in Excel from MSSQL with 8 million records


I have a table in MSSQL that currently holds 8 million records.

I use Pentaho to analyze this data, and I am trying to use Excel as well. If I configure Excel to fetch the data and feed it directly into a PivotTable, the 1-million-row worksheet limit does not apply: it reported having loaded more than 2 million records.

However, it was consuming 1.5 GB of RAM when it stopped with the error "Excel cannot complete this task with available resources. Select less data or close other applications." The PC has 7.7 GB of RAM, of which 6 GB is in use.

Is there any way to make Excel, instead of loading all the data into RAM, fetch only the metadata and then issue GROUP BY queries on demand, the way Pentaho does?
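The idea of pushing the GROUP BY into the database instead of pulling all rows into the client can be sketched as follows. This is a minimal illustration using Python's built-in sqlite3 as a stand-in for SQL Server; the table and column names are hypothetical:

```python
import sqlite3

# Stand-in for the MSSQL database; the "sales" table and its columns
# are hypothetical examples, not the asker's actual schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("North", 10.0), ("North", 5.0), ("South", 7.5)],
)

# Rather than fetching all 8 million rows into the client, let the
# server aggregate: only the small grouped result travels back.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total FROM sales "
    "GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('North', 15.0), ('South', 7.5)]
```

With SQL Server the same pattern applies through an ODBC connection; the point is that the aggregation runs where the data lives, which is what OLAP front ends like Pentaho do.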

asked by anonymous 27.01.2017 / 17:41

2 answers


I used PowerPivot, and with it I was able to load the original star schema instead of the hyper-denormalized table I had been trying to build.

But in the end I prefer to keep using Pentaho for the analysis and then export to Excel when appropriate.

12.04.2017 / 20:51

Apparently Excel keeps the loaded data in memory without any kind of compression. Compressing it is exactly what BI tools do, which is why they can work with so much data.
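The kind of compression columnar BI engines apply can be illustrated with simple dictionary encoding: each distinct value of a column is stored once, and the column itself becomes a list of small integer codes. A minimal sketch (not any tool's actual algorithm):

```python
def dict_encode(values):
    """Dictionary-encode a column: keep each distinct value once,
    and represent the column as integer codes into that dictionary."""
    dictionary = []
    index = {}
    codes = []
    for v in values:
        if v not in index:
            index[v] = len(dictionary)
            dictionary.append(v)
        codes.append(index[v])
    return dictionary, codes

# A low-cardinality column (e.g. a region name repeated across
# millions of rows) compresses extremely well this way.
column = ["North", "South", "North", "North", "South"]
dictionary, codes = dict_encode(column)
print(dictionary)  # ['North', 'South']
print(codes)       # [0, 1, 0, 0, 1]
```

On a real fact table with 8 million rows but only a handful of distinct values per dimension column, the dictionary stays tiny and the codes fit in a few bits each, which is roughly why QlikView-style in-memory engines cope where a raw row store does not.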

I do not think this is possible with Excel; otherwise it would be an excellent substitute for BI tools.

I work with QlikView, which compresses the data aggressively and keeps it in memory for speed, and we constantly have headaches with people who insist on using only Excel.

Can Pentaho export tables in Excel format? When I need a specific table in Excel while using QlikView, I generate it in the tool and export it to Excel, and that way I can display the data in Microsoft's much-loved tool.
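The export-the-summary approach can be sketched like this: write only the already-aggregated result to a CSV file that Excel opens directly, instead of exporting the raw rows. The data and file name below are illustrative:

```python
import csv

# Hypothetical aggregated result produced by the BI tool or database;
# this is the small summary, not the 8 million raw rows.
summary = [("North", 15.0), ("South", 7.5)]

# Write a CSV that Excel opens directly.
with open("summary.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["region", "total"])
    writer.writerows(summary)
```

A two-row summary file like this stays far below Excel's limits no matter how large the source table grows.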

02.02.2017 / 18:37