ruby-on-rails, performance, caching, shopify, russian-doll-caching

Rails fragment caching of collection


I have a Rails 4.1 app that, on a particular page, retrieves a list of orders and lists them in a table. It's important to note that the list differs depending on the logged-in user.

To improve the performance of this page, I am looking to cache the partial for each order row. I am considering doing it like this:

_order_list.html.erb

<% cache(@orders) do %>
  <%= render @orders %>
<% end %>

_order.html.erb

<% cache(order) do %>
  ...view code for order here
<% end %>

However, I'm unsure about the caching of the collection (@orders). Will all users then be served the same set of cached @orders (which is not desired)?

In other words, how can I ensure that the entire collection of @orders is cached individually for each user?


Solution

  • Will all users then be served the same set of cached @orders (which is not desired)?

    Actually, cache_digests does not cache @orders itself. It caches the HTML fragment of the page for a particular object or set of objects (e.g. @orders). Each time a user requests the page, the @orders variable is set in the controller action and its digest is compared against the cached one.

    So, assuming we retrieve @orders like this:

    def index
      @orders = Order.where(id: [1, 20, 34])
    end
    

    What we get is a cached view with a key like this:

    views/orders/1-20131202075718784548000/orders/20-20131220073309890261000/orders/34-20131223112753448151000/6da080fdcd3e2af29fab811488a953d0

    Note that the ids and update timestamps of the retrieved orders appear in that key, so each user with his or her own unique set of orders gets an individual cached view.
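    To see where the segments of that key come from, you can inspect cache_key on a single record. Here is a minimal sketch from a Rails console, reusing the first timestamp above (the exact precision depends on config.active_record.cache_timestamp_format):

    # Each record's cache_key is built from the model name, the id
    # and the updated_at timestamp:
    order = Order.find(1)
    order.cache_key
    # => "orders/1-20131202075718784548000"

    # cache(@orders) joins the cache_key of every record in the
    # collection, plus a digest of the template, into the single
    # fragment key shown above.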

    But your approach has some significant downsides:

    1. With the default FileStore, fragment caches are stored on disk, and the cache key becomes part of the filename. That means the key cannot be arbitrarily long: as soon as you retrieve a sizeable batch of orders at once, you exceed your OS's filename limit (e.g. 255 bytes on Linux) and end up with a runtime error.
    2. Orders are dynamic content. As soon as even one of them is updated, the whole collection's cache becomes invalid. Generating a fragment and writing it to disk is a fairly expensive operation, so it would be better to cache each order individually (see the sketch after this list); then you only have to regenerate the cache for the single changed order instead of the entire collection's cache.
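
    Given those downsides, here is a minimal sketch of the per-order variant: drop the collection-wide cache block and keep only the per-row cache(order) call already present in _order.html.erb (the row markup below is illustrative):

    <%# _order_list.html.erb: no collection-wide cache block %>
    <%= render @orders %>

    <%# _order.html.erb: each order caches only its own row %>
    <% cache(order) do %>
      <tr>
        <td><%= order.id %></td>
        <td><%= order.total %></td> <%# illustrative column %>
      </tr>
    <% end %>

    An update to one order then invalidates only that row's fragment, and each fragment key stays well under the filename limit. (For what it's worth, Rails 5 later added render partial: 'order', collection: @orders, cached: true to fetch all such row fragments in a single cache read, but that is not available in 4.1.)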