IT:AD:Patterns:Cache Is King Strategy

Summary

Communicating over the wire between tiers in any shape or form (whether from the web browser in the Client Tier to the Presentation Tier, from the Presentation Tier to the Application Logic Tier, or from the Application Logic Tier to the DBMS) is expensive as heck (many thousands of times slower than a same-process operation).

So keep the number of times you do it to a minimum: IT:AD:Patterns:Chunky over Chatty Strategy

Better yet…don't do it if you don't have to.

Cache as much information as is reasonable in the upper tiers. What counts as reasonable has to be ascertained on a case-by-case basis, but you can safely cache lookup data for 30 seconds or longer (even as little as 30 seconds of caching substantially improves performance). It depends on the complexity of the query and the number of rows returned, of course.
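As a sketch of the idea, a minimal time-based cache for lookup data might look like the following (the `load_countries` loader is a hypothetical stand-in for the expensive over-the-wire call):

```python
import time

class TtlCache:
    """Caches a loader's result for a fixed number of seconds."""

    def __init__(self, loader, ttl_seconds=30):
        self._loader = loader      # function that does the expensive fetch
        self._ttl = ttl_seconds
        self._value = None
        self._expires_at = 0.0     # monotonic time when the value goes stale

    def get(self):
        now = time.monotonic()
        if now >= self._expires_at:
            # Stale (or never loaded): go over the wire once, then reuse.
            self._value = self._loader()
            self._expires_at = now + self._ttl
        return self._value

# Hypothetical expensive lookup, e.g. "SELECT * FROM Countries"
calls = 0
def load_countries():
    global calls
    calls += 1
    return ["Canada", "France", "Japan"]

countries_cache = TtlCache(load_countries, ttl_seconds=30)
countries_cache.get()
countries_cache.get()   # served from cache; the loader ran only once
```

Every request thread within the 30-second window shares the one cached result instead of issuing its own query.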

Suitable candidates are lookup tables, etc.

Cache as Close to Usable Format as Possible

Another point to keep in mind is to cache the data in a usable format. If you are going to cache a list of Country entities, only to convert them to a dropdown list every single time, then consider doing the work once and caching only the dropdown list HTML. Note that this only works if every single visitor will see the same list, and you don't need logical statements to remove some countries from the list. In that case, the best you can do is cache the entity list.
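A minimal sketch of caching the rendered markup rather than the entities (the country list and the 30-second TTL are assumed for illustration):

```python
import time

def render_country_dropdown(countries):
    """Render the <select> once; the resulting string is what gets cached."""
    options = "".join(f'<option value="{c}">{c}</option>' for c in countries)
    return f'<select name="country">{options}</select>'

_cached_html = None
_expires_at = 0.0

def get_country_dropdown_html():
    global _cached_html, _expires_at
    now = time.monotonic()
    if now >= _expires_at:
        countries = ["Canada", "France", "Japan"]   # stand-in for the DB call
        _cached_html = render_country_dropdown(countries)  # do the work once
        _expires_at = now + 30
    return _cached_html
```

Subsequent page renders within the window skip both the query and the HTML generation; they just emit the cached string.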

Cache Chain

Your caching can be nuanced…you may use one cache to call another cache…which finally calls the repository. In the case above, maybe you cache the HTML for the full list of countries (usable by most visitors) in a front cache, which in turn calls a second cache of Country entities. The visitors that need logic applied use that second cache directly. Best of both worlds. Squeeze performance out of what you can.
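The chain described above might be sketched like this (the repository function and entity shape are hypothetical):

```python
import time

repo_calls = 0
def repository_load_countries():
    """Back end of the chain: the only layer that actually hits the DB."""
    global repo_calls
    repo_calls += 1
    return [{"code": "CA", "name": "Canada"}, {"code": "FR", "name": "France"}]

def cached(loader, ttl=30):
    """Wrap a loader so repeat calls within `ttl` seconds reuse the result."""
    state = {"value": None, "expires": 0.0}
    def get():
        now = time.monotonic()
        if now >= state["expires"]:
            state["value"] = loader()
            state["expires"] = now + ttl
        return state["value"]
    return get

# Second cache: Country entities, for visitors that need logic applied.
get_entities = cached(repository_load_countries)

# Front cache: pre-rendered HTML for everyone else. Note it calls the
# entity cache, not the repository: html cache -> entity cache -> repository.
def render_all():
    options = "".join(f'<option value="{c["code"]}">{c["name"]}</option>'
                      for c in get_entities())
    return f'<select name="country">{options}</select>'

get_html = cached(render_all)

get_html()                                                    # most visitors
filtered = [c for c in get_entities() if c["code"] != "FR"]   # logic path
```

Both paths end up sharing the single repository call: the front cache is populated from the entity cache, and the filtered view reads the entity cache directly.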

Cache Duration

People set cache durations to inane values like 20 minutes. Unless we are talking about highly complicated queries that take ages, this is nonsense for small direct lookup tables, which are 90 percent of what is generally being cached. Don't forget these reference materials are being shared by all threads. So by caching for just 30 seconds, you're saving thousands of db queries (total ballpark 30 secs x 400 page requests/server) over the wire.
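Putting rough numbers on that ballpark (assuming the figure above means 400 page requests per second per server, which is an interpretation, not stated explicitly):

```python
requests_per_second = 400      # assumed per-server load from the text
cache_ttl_seconds = 30
pages_per_window = requests_per_second * cache_ttl_seconds

# Without the cache, each of those pages issues its own lookup query;
# with it, the whole 30-second window costs one DB round trip.
queries_saved = pages_per_window - 1
print(queries_saved)   # 11999
```

That's roughly twelve thousand over-the-wire queries avoided per server, per 30-second window, for one small lookup table.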

30 seconds is a looooong time when talking about the speed of a processor…