SuperTables handles massive datasets in Tableau.
Big tables.
With the help of server-side row models, SuperTables can now handle huge datasets, from millions of rows to well beyond 10 million, without slowing down dashboards. This is powered by a new data fetching method built specifically for scale.

Built for big data performance.
SuperTables now has the option to use WriteBackExtreme V6 as a data service, all within your governed and secure environment.
Instead of querying the database over and over… and over again, data is fetched through a high-performance data engine designed for large volumes.
The results.
When datasets grow beyond 10,000 rows, WriteBackExtreme automatically applies caching for grand totals. This avoids unnecessary database queries and keeps performance consistently high.
SuperTables detects when data changes and updates totals automatically. For use cases that require more frequent refreshes, full control over data loading is available.
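To make the idea concrete, here is a minimal sketch of version-aware grand-total caching. This is not the WriteBackExtreme API; the class, the 10,000-row threshold behavior, and the version number used for invalidation are all illustrative assumptions based on the description above.

```typescript
// Illustrative sketch only (not the WriteBackExtreme API): cache grand totals
// for large datasets and recompute only when the data version changes.

type Totals = Record<string, number>;

class TotalsCache {
  private cached: { version: number; totals: Totals } | null = null;
  private computeCount = 0; // how many times we actually re-aggregated

  // Below the threshold, totals are cheap enough to recompute every time.
  constructor(private threshold = 10_000) {}

  getTotals(rows: number[], version: number): Totals {
    const small = rows.length < this.threshold;
    if (!small && this.cached && this.cached.version === version) {
      return this.cached.totals; // served from cache, no database round trip
    }
    this.computeCount += 1;
    const totals: Totals = {
      sum: rows.reduce((a, b) => a + b, 0),
      count: rows.length,
    };
    if (!small) this.cached = { version, totals };
    return totals;
  }

  get computations(): number {
    return this.computeCount;
  }
}
```

Bumping the version number, which stands in for "SuperTables detects when data changes," is what forces a fresh aggregation; repeated requests against an unchanged dataset never hit the database again.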
Lazy loading:
Only load what you need.
SuperTables uses lazy loading, meaning only the rows needed for the current analysis are loaded. Instead of pulling in millions of records at once, data is fetched on demand as users interact with the table.
Combined with smart caching, lazy loading makes it possible to work with extreme data volumes without overwhelming Tableau or the underlying database.
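The lazy-loading pattern described above can be sketched as a block-based row fetcher: rows are requested in windows as the user scrolls, and each fetched block is cached. The names below (`BlockSource`, `LazyTable`) are hypothetical and do not come from the SuperTables API; this is only a sketch of the server-side row model idea under those assumptions.

```typescript
// Hypothetical sketch of lazy (on-demand) row loading with a block cache.

type Row = Record<string, unknown>;

// A data source that returns one "block" of rows at a time, the way a
// server-side row model requests only the window the user can see.
interface BlockSource {
  getRows(startRow: number, endRow: number): Row[];
}

class LazyTable {
  private cache = new Map<number, Row[]>(); // block index -> rows
  constructor(private source: BlockSource, private blockSize = 100) {}

  // Fetch a single row, loading (and caching) its block on first access.
  getRow(index: number): Row {
    const block = Math.floor(index / this.blockSize);
    if (!this.cache.has(block)) {
      const start = block * this.blockSize;
      this.cache.set(block, this.source.getRows(start, start + this.blockSize));
    }
    return this.cache.get(block)![index % this.blockSize];
  }

  get blocksLoaded(): number {
    return this.cache.size;
  }
}
```

Even if the underlying source holds millions of rows, scrolling through the first screen loads a single block of 100, which is what keeps Tableau and the database from being overwhelmed.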
Big Tables. Smooth Experience.
This new data service removes the traditional limits of large tables in Tableau. SuperTables is now built for scale: fast, responsive, and ready for serious data. If large datasets have been holding back table use cases in Tableau, that limitation is now gone.