Tags: database, node.js, orm, redis, jugglingdb

New node.js app using legacy database, new database, and Redis caching layer


We're developing a new version of our site using Node, but we need to keep using a legacy MySQL database as-is, add new fields to some models via new tables in a new database, AND add a caching layer. What's the best way to do this? We were thinking of using Jugglingdb and writing our own adapter. It would need to do several things:

  1. round-robin select from several servers in our db herd.
  2. cache into Redis for read-only connections
  3. know which fields are in the legacy database and which are in the new database.
  4. connect directly to the databases for CRUD operations.
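For what it's worth, requirements 1 and 3 are mostly plain logic that could live inside any custom adapter. Here is a rough, dependency-free sketch of those two pieces — a round-robin host picker and a field router that knows which columns belong to which database. All class and field names here are hypothetical illustrations, not jugglingdb API:

```javascript
// RoundRobin cycles through replica hosts for read queries.
// FieldRouter splits a payload by which database owns each field.
class RoundRobin {
  constructor(hosts) {
    this.hosts = hosts;
    this.next = 0;
  }
  pick() {
    const host = this.hosts[this.next];
    this.next = (this.next + 1) % this.hosts.length;
    return host;
  }
}

class FieldRouter {
  // legacyFields / newFields: column names per model in each database
  constructor(legacyFields, newFields) {
    this.legacy = new Set(legacyFields);
    this.modern = new Set(newFields);
  }
  // Split a create/update payload into per-database pieces
  split(data) {
    const out = { legacy: {}, modern: {} };
    for (const [k, v] of Object.entries(data)) {
      if (this.legacy.has(k)) out.legacy[k] = v;
      else if (this.modern.has(k)) out.modern[k] = v;
    }
    return out;
  }
}

const pool = new RoundRobin(['db1.internal', 'db2.internal', 'db3.internal']);
const router = new FieldRouter(['id', 'name'], ['avatar_url']);

console.log(pool.pick()); // db1.internal
console.log(pool.pick()); // db2.internal
console.log(router.split({ name: 'x', avatar_url: 'y' }));
// { legacy: { name: 'x' }, modern: { avatar_url: 'y' } }
```

The hard part an adapter would still have to solve is requirement 4: issuing the two resulting queries (one per database) inside a single model save.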

Is this something theoretically doable with a jugglingdb adapter? Or does anyone recommend a better technique and/or a completely different ORM package?

There's an adapter, jugglingdb-redis-hq, that has a "backyard" feature that is almost what we want, except that it seems to be a sort of backwards caching, i.e. it persists expired Redis data over to the database. We don't want to touch the database on reads at all; it should only be hit when we're changing or inserting something.
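In other words, what we want is an ordinary read-through cache with write-invalidate: reads try Redis first and only fall through to MySQL on a miss; writes go straight to the database and drop the cached copy. A minimal sketch of that flow, where a `Map` stands in for the Redis client and a plain object for the database (all names hypothetical):

```javascript
// Read-through cache with write-invalidate.
const redis = new Map();                        // stand-in for a Redis client
const db = { 'user:1': { id: 1, name: 'Ada' } }; // stand-in for a MySQL pool

function find(key) {
  if (redis.has(key)) return redis.get(key);  // cache hit: DB untouched
  const row = db[key];                        // cache miss: hit the DB
  if (row !== undefined) redis.set(key, row); // populate for the next read
  return row;
}

function update(key, changes) {
  db[key] = { ...db[key], ...changes }; // the write goes to the DB only
  redis.delete(key);                    // drop the stale cached copy
  return db[key];
}

console.log(find('user:1'));         // loads from the DB, caches in Redis
update('user:1', { name: 'Grace' }); // writes to the DB, invalidates cache
console.log(find('user:1').name);    // 'Grace', re-cached on the way out
```

The backyard feature runs this arrow the other way round, which is why it doesn't fit.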


Solution

  • It's amazing that it's been 3 years since I posted this question. What we ended up doing, and we're finally almost live with it, is this stack:

    Crucially, Sequelize did not make it easy to have connections to 2 different databases, so we made the decision to just only add new tables to the old schema, and not make any changes to the old tables. We've since ended up making a couple minor ALTER TABLEs when we really had to. Am still curious if we could have done this part another way, if another ORM would have let us more easily meld the 2 databases under the hood.