View CSVs with 100 Million Rows
Large CSV Viewer handles CSV files with 100 million rows by uploading them in chunks to a server-side DuckDB engine. DuckDB indexes the entire dataset, and the browser fetches only the rows currently visible. Every filter, sort, and search query runs server-side at SQL speed across all 100 million rows, and results return in under a second for most queries.
If you're working with database exports, log files, or scientific datasets in the tens or hundreds of millions of rows, standard tools fall short: Excel stops at 1,048,576 rows, and most text editors will crash or freeze trying to open a file that large. Large CSV Viewer was built specifically for this scale, using a DuckDB backend and chunked streaming to handle datasets of any size.
Key features
- Tested on files with 100M+ rows
- DuckDB backend — SQL-grade query performance
- Infinite scroll renders only the rows on screen
- Sort, filter, and search operate on the full dataset
How it works
- Upload the massive file. Your file is sent in 15 MB chunks; the upload continues in the background while you begin previewing data.
- Index with DuckDB. Once fully uploaded, DuckDB indexes the entire dataset. From this point, every query runs at SQL speed against all 100 million rows.
- Navigate efficiently. Use infinite scroll, column sorting, and search to navigate the dataset. The browser renders only the visible rows, so performance stays smooth.
Frequently asked questions
How long does it take to load a 100-million-row file?
Typical upload time is 2–5 minutes depending on file size and your connection. Once uploaded, queries and page loads are near-instant because DuckDB indexes the data.
Can I sort a 100-million-row file?
Yes. Sorting is executed server-side by DuckDB on the full dataset, not just the visible page.
Can I search across 100 million rows instantly?
Yes. Search and filter queries run server-side and typically return results in under a second for most column types.
What types of files have 100 million rows?
Common examples include server log exports, financial transaction datasets, scientific sensor readings, e-commerce event streams, and large database table dumps.
Is there a maximum row count?
There is no enforced row limit. Performance depends on your upload connection speed and file size. Files with more than 100 million rows have been tested successfully.