We need caching to smooth out the occasional read latency from Cloud Files and from other parts of the infrastructure that tend to be slow. I think putting an HTTP cache like Varnish in front of the presenter would yield the biggest benefit for users, since it would hide the latency of both the Cloud Files reads and the presenter's own fetches.
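A minimal sketch of what the Varnish side could look like, assuming the presenter listens locally on port 8080 (the host, port, and TTL here are all placeholders, not settled values):

```vcl
vcl 4.0;

# Assumed presenter backend; adjust host/port to the real deployment.
backend presenter {
    .host = "127.0.0.1";
    .port = "8080";
}

sub vcl_backend_response {
    # The site is essentially static, so cache successful responses
    # for a very long time and rely on explicit flushes on publish.
    if (beresp.status == 200) {
        set beresp.ttl = 365d;
    }
}
```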
Clearing the cache, however, is a little more complicated. Because our site is essentially static, we could have Varnish cache responses forever and flush the cache whenever new content is submitted. If we ever do time-sensitive operations on the back-end, that strategy wouldn't work, and we'd have to settle for a reasonable expiry time instead.
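One way to implement the flush-on-submit strategy is a ban request that the publish step sends to Varnish. This is only a sketch: the `BAN` method name and the ACL are assumptions, not anything we've agreed on yet:

```vcl
# Only trusted hosts (e.g. the publisher) may flush the cache.
acl purgers {
    "127.0.0.1";
}

sub vcl_recv {
    if (req.method == "BAN") {
        if (!client.ip ~ purgers) {
            return (synth(403, "Forbidden"));
        }
        # Invalidate everything for this host; cheap enough because
        # the site is small and essentially static.
        ban("req.http.host == " + req.http.host);
        return (synth(200, "Banned"));
    }
}
```

The publish hook would then issue something like `curl -X BAN http://localhost/` after new content is accepted.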
Bonus points: we could use request statistics from the ELK stack to determine the most common requests and intelligently warm the cache after a flush.
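The warming step could be as simple as re-fetching the hottest paths through Varnish right after a flush. A rough sketch, assuming we can export per-URL hit counts from Elasticsearch (e.g. via a terms aggregation) as `(url, count)` pairs; `warm()` and the sample data are hypothetical:

```python
from urllib.request import urlopen

def top_requests(stats, n=10):
    """Return the n most frequently requested paths from (path, count) pairs."""
    return [path for path, _ in sorted(stats, key=lambda p: p[1], reverse=True)[:n]]

def warm(base_url, paths):
    """Re-fetch popular paths through Varnish so they re-enter the cache."""
    for path in paths:
        urlopen(base_url + path).read()

# Example counts, shaped like the output of an ELK aggregation query.
stats = [("/", 950), ("/talks/intro", 420), ("/assets/logo.png", 1300)]
print(top_requests(stats, 2))  # the two hottest paths, most popular first
```

Running this after each flush from a small cron job or the publish hook itself would keep the common pages hot without much machinery.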