While we cannot optimize the load time of individual connections, we can make sure nginx is configured to handle high-traffic situations well. By high traffic I mean several hundred requests per second; the vast majority of people don't need to tweak this, but if you do, are curious, or simply want to be prepared, read on.
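As a starting point, the worker settings are usually the first thing to look at when preparing nginx for a high request rate. The values below are illustrative assumptions, not settings from this article; tune them to your own hardware and traffic.

```nginx
# Sketch of common nginx worker tuning for high request rates.
worker_processes auto;          # spawn one worker per CPU core

events {
    worker_connections 4096;    # connections each worker may hold open
    multi_accept on;            # accept multiple new connections per wakeup
}
```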
This is part two in my caching series. Part one covered the concept behind full page caching as well as potential problems to keep in mind. This part will focus on implementing the concept in actual PHP code. By the end you'll have a working implementation that can cache full pages and invalidate them intelligently when an update happens.
Caching in PHP is usually done on a per-object basis: people cache a query result or a CPU-intensive calculation to avoid redoing that expensive work on every request. This can get you a long way. I have an old site which uses this method and gets 105 requests per second on really old hardware. The method I propose will net you a solid 12,000 requests per second.
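To make the per-object approach concrete, here is a minimal sketch of the idea: an expensive result is computed once, stored under a key with a lifetime, and reused until it expires. The class and method names are my own for illustration; in production the in-process array would typically be replaced by a shared backend such as APCu or memcached so the cache survives across requests.

```php
<?php
// Minimal per-object cache sketch (illustrative, not the article's code).
class ObjectCache {
    private array $store = [];

    // Return the cached value for $key, or run $compute once,
    // store the result for $ttl seconds, and return it.
    public function remember(string $key, int $ttl, callable $compute) {
        $now = time();
        if (isset($this->store[$key]) && $this->store[$key]['expires'] > $now) {
            return $this->store[$key]['value'];   // cache hit
        }
        $value = $compute();                      // cache miss: do the work
        $this->store[$key] = ['value' => $value, 'expires' => $now + $ttl];
        return $value;
    }
}

// Usage: the expensive query only runs when the entry is missing or stale.
$cache = new ObjectCache();
$posts = $cache->remember('popular_posts', 300, function () {
    return expensive_query();                     // hypothetical query helper
});
```

Full page caching, covered next, takes this a step further: instead of caching individual objects, the entire rendered response is stored and served without touching PHP at all.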