Tags: database, caching, redis

Performance comparison of using Redis hashes vs many keys


Okay, I'm currently planning on using Redis as a front-end cache to my NoSQL database. I will be storing a lot of frequently used user data in Redis. I was wondering whether it would be better to make a key-value entry for each user, or to use a single Redis hash where each field is a user ID and the value is a large JSON object. What do you think would be better?

I found this article, which sort of answers the question, but it doesn't discuss the limitations on value size.
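
Concretely, the two options I'm weighing would look roughly like this (the key names and the JSON are just placeholders):

    # Option 1: one string key per user, holding the JSON blob
    127.0.0.1:6379> set user:1 '{"name":"john","surname":"wick"}'
    OK
    127.0.0.1:6379> get user:1
    "{\"name\":\"john\",\"surname\":\"wick\"}"

    # Option 2: one hash, where each field is a user id and the value is the JSON blob
    127.0.0.1:6379> hset users 1 '{"name":"john","surname":"wick"}'
    (integer) 1
    127.0.0.1:6379> hget users 1
    "{\"name\":\"john\",\"surname\":\"wick\"}"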


Solution

  • Choosing a hash over a string has a number of benefits and some drawbacks depending on the use case. If you are going to choose a hash, it is better to model your JSON object as hash fields and values, like this:

    127.0.0.1:6379> hset user:1 ssn 10101010101 name john surname wick date 2020-02-02 location continental
    (integer) 5
    127.0.0.1:6379> hgetall user:1
     1) "ssn"
     2) "10101010101"
     3) "name"
     4) "john"
     5) "surname"
     6) "wick"
     7) "date"
     8) "2020-02-02"
     9) "location"
    10) "continental"
    

    Here are the benefits of hashes over strings when you do proper data modeling.

    Speed of RAM and memory bandwidth seem less critical for global performance, especially for small objects. For large objects (>10 KB), it may become noticeable though.

    Hashes, Lists, Sets composed of just integers, and Sorted Sets, when smaller than a given number of elements, and up to a maximum element size, are encoded in a very memory-efficient way that uses up to 10 times less memory (with 5 times less memory used being the average saving).
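
    This compact encoding is controlled by configuration thresholds, so a rough way to check whether a given hash actually benefits from it is to inspect it from redis-cli. Note that the parameter names depend on the Redis version (Redis 7+ calls the encoding listpack, older releases ziplist), and the numbers below are only illustrative:

    127.0.0.1:6379> config get hash-max-listpack-entries
    1) "hash-max-listpack-entries"
    2) "128"
    127.0.0.1:6379> config get hash-max-listpack-value
    1) "hash-max-listpack-value"
    2) "64"
    127.0.0.1:6379> object encoding user:1
    "listpack"
    127.0.0.1:6379> memory usage user:1
    (integer) 104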

    On the other hand, depending on your use case(s):