Optimize a huge multi-MB web page

We have a lot of dynamic data to display on our web page (just one image; the rest is text). Since querying the database for this data while the page is waiting to load would take too long, we export the data into JSON files once a day. From those files we prebuild the HTML before displaying it, so that once the page has loaded there is no delay when filtering or browsing. The resulting HTML page is now 16 MB, which we compress with mod_deflate down to 3-4 MB. The initial load is still slow, around 10-20 seconds. Can anyone suggest ways to speed this up?

Over what connection? 4 MB on a 1.5 Mb/s DSL connection would take about 21 seconds under optimal conditions (ignoring overhead, slow start, etc.). 16 MB of content is a LOT. Can it (or should it) be delivered incrementally or paginated? If you're injecting it into the DOM through JavaScript (it's not clear what you're doing with the JSON), that will be significantly slower as well.
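To illustrate the incremental/paginated idea, here is a minimal client-side sketch. It assumes a hypothetical paged endpoint (`/data?page=N&size=500`) returning a JSON array, and placeholder field names (`item.title`, `item.body`); adjust both to whatever your real data looks like.

```javascript
// Sketch: fetch the data in pages and append each batch to the DOM,
// instead of shipping one 16 MB document up front.
async function loadPage(page, size = 500) {
  const response = await fetch(`/data?page=${page}&size=${size}`); // assumed endpoint
  return response.json();
}

async function renderIncrementally(container) {
  let page = 0;
  while (true) {
    const items = await loadPage(page);
    if (items.length === 0) break;            // no more data
    const fragment = document.createDocumentFragment();
    for (const item of items) {
      const row = document.createElement('div');
      row.textContent = `${item.title}: ${item.body}`; // placeholder fields
      fragment.appendChild(row);
    }
    container.appendChild(fragment);          // one DOM insertion per batch
    page += 1;
  }
}

renderIncrementally(document.getElementById('results'));
```

The user sees the first batch almost immediately, and the later batches stream in while they are already reading.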

If the data is held in the database, then creating a snapshot table daily would be far more beneficial, as it will utilise whatever internal caching is configured, so you're delivering the data directly from memory rather than (as Patrick alluded to) via this weird JSON approach.
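A rough sketch of such a daily snapshot refresh, assuming PostgreSQL and the Node `pg` client; the table names (`live_data`, `daily_snapshot`) are placeholders and the same idea works with any RDBMS or scheduler:

```javascript
// Sketch: materialise today's result set once, so page requests read a
// small, cache-friendly table instead of re-running the expensive query.
const { Client } = require('pg');

async function refreshSnapshot() {
  const client = new Client();                // connection details from env vars
  await client.connect();
  try {
    await client.query('BEGIN');
    await client.query('DROP TABLE IF EXISTS daily_snapshot');
    await client.query('CREATE TABLE daily_snapshot AS SELECT * FROM live_data');
    await client.query('COMMIT');
  } catch (err) {
    await client.query('ROLLBACK');
    throw err;
  } finally {
    await client.end();
  }
}

refreshSnapshot().catch(console.error);
```

Run that from cron once a day in place of the JSON export, and point the page's queries at `daily_snapshot`.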

Raw ASCII data should compress to roughly 10% of its original size if you're careful. In fact, since this page is effectively static for a whole day, a decent web server should be able to serve a pre-gzipped version of it, so delivering the page costs the server almost no effort at all.
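As a sketch of what "pre-gzipped" means in practice, here is a small Node example using only built-in modules; the file names (`page.html`, `page.html.gz`) and port are assumptions. Apache users can get the same effect by serving a precompressed `.gz` file alongside the original instead of letting mod_deflate recompress on every request.

```javascript
// Sketch: compress the daily page once, then serve the .gz file directly
// with Content-Encoding: gzip, so no compression happens per request.
const http = require('http');
const fs = require('fs');
const zlib = require('zlib');

// Done once a day, after the HTML is rebuilt from the daily data dump.
fs.writeFileSync('page.html.gz', zlib.gzipSync(fs.readFileSync('page.html')));

http.createServer((req, res) => {
  const acceptsGzip = /\bgzip\b/.test(req.headers['accept-encoding'] || '');
  if (acceptsGzip) {
    res.writeHead(200, {
      'Content-Type': 'text/html; charset=utf-8',
      'Content-Encoding': 'gzip',
    });
    fs.createReadStream('page.html.gz').pipe(res);
  } else {
    res.writeHead(200, { 'Content-Type': 'text/html; charset=utf-8' });
    fs.createReadStream('page.html').pipe(res);
  }
}).listen(8080);
```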

But that's all addressing the TTFB... you still have 4 MB to download. Since nobody can take in 16 MB of data at once, can you not logically split that data set into an overview page and accompanying detail pages?
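One way to do that split at build time is sketched below: a small Node script that turns the daily dataset into a lightweight overview file plus one small detail file per record. The file names and the `id`/`title` fields are assumptions about your JSON structure.

```javascript
// Sketch: split the daily dump into an overview list plus per-record
// detail files, so the initial page only loads the overview.
const fs = require('fs');

const records = JSON.parse(fs.readFileSync('daily-data.json', 'utf8'));

// Overview: just enough to list and link each record (a few hundred KB at most).
const overview = records.map(r => ({ id: r.id, title: r.title }));
fs.writeFileSync('overview.json', JSON.stringify(overview));

// Details: one small file per record, fetched only when the user drills in.
fs.mkdirSync('details', { recursive: true });
for (const record of records) {
  fs.writeFileSync(`details/${record.id}.json`, JSON.stringify(record));
}
```

The overview page then loads in a fraction of a second, and each detail view pulls only the record the user actually asked for.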