
Redis-python add multiple values of type set to a redis hash


Redis n00b here so don't shoot!

I need to store a collection of sets in Redis, indexed by key. I can do:

import redis
r = redis.Redis()

r.sadd("A:B:123", *{456, 789})
r.sadd("A:B:124", *{666, 777})

but then if I want the collection "A:B" I have to collect the keys manually as in:

{k.rsplit(b':', 1)[-1]: r.smembers(k) for k in r.scan_iter("A:B:*") }
# {b'124': {b'666', b'777'}, b'123': {b'456', b'789'}}

This seems awfully slow (note also the rsplit)

I have been trying to use hmset to do the trick:

r.hmset("A:B", mapping={123: 'X', 124: 'Z'})

but I can't find a way of substituting my sets for 'X' and 'Z'.

(Note that ideally the set elements should stay of type int, as passed in. Note also that these collections are meant to be read-only, so I want to optimize lookup time, not insertion time.)


Solution

  • Original code

    import redis
    r = redis.Redis()

    r.sadd("A:B:123", *{456, 789})
    r.sadd("A:B:124", *{666, 777})
    

    New code

    import redis
    r = redis.Redis()
    
    # store each "inner set" as a comma-separated string in one hash field
    r.hmset("A:B", {"123": ",".join(map(str, {456, 789}))})
    r.hmset("A:B", {"124": ",".join(map(str, {666, 777}))})
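
    For what it's worth, newer redis-py versions deprecate hmset() in favour of hset() with a mapping argument, so (if I'm reading the API right) the same write can be done in one call:

    # same comma-separated encoding, written in a single round trip;
    # hset(name, mapping=...) is the non-deprecated spelling in recent redis-py
    r.hset("A:B", mapping={
        "123": ",".join(map(str, {456, 789})),
        "124": ",".join(map(str, {666, 777})),
    })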
    

    Print All Elements

    print({k: set(map(int, v.split(b","))) for k, v in r.hgetall("A:B").items()})
    

    Use a single hash to store the related keys; for each field, store the concatenated values as a string.
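
    Since the stated goal is fast read-only lookups, a single field can also be fetched on its own with HGET (same comma-separated encoding as above; "123" is just the example field):

    # look up one "inner set" without pulling the whole hash
    raw = r.hget("A:B", "123")              # -> e.g. b'456,789' (element order may vary)
    print(set(map(int, raw.split(b","))))   # {456, 789}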

    NOTE: Adding elements to a set stored this way is not atomic, since the client has to read the field, deserialize it, and write the result back, unless a Lua script is used.
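
    If atomic appends were ever needed, one option (a rough sketch; this naive version does not deduplicate, so it is not full set semantics) is to push the read-modify-write into a Lua script via register_script:

    # Lua script that appends a value to the comma-separated field on the server side
    APPEND_TO_FIELD = """
    local cur = redis.call('HGET', KEYS[1], ARGV[1])
    if cur then
        redis.call('HSET', KEYS[1], ARGV[1], cur .. ',' .. ARGV[2])
    else
        redis.call('HSET', KEYS[1], ARGV[1], ARGV[2])
    end
    return 1
    """
    append_to_field = r.register_script(APPEND_TO_FIELD)
    append_to_field(keys=["A:B"], args=["123", "999"])  # appends 999 to field "123"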

    Using SUNION

    r.sadd("A:B:123", *{456, 789})
    r.sadd("A:B:members", "A:B:123")
    r.sadd("A:B:124", *{666, 777})
    r.sadd("A:B:members", "A:B:124")
    

    Print Method

    print(r.sunion(r.smembers("A:B:members")))
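
    Note that SUNION only returns the combined membership of all the sets. To rebuild the per-key mapping without SCAN, one option (a sketch, reusing the "A:B:members" registry set above) is to read the registry and pipeline the SMEMBERS calls:

    # one round trip for the key names, one pipelined round trip for their members
    keys = [k.decode() for k in r.smembers("A:B:members")]
    pipe = r.pipeline()
    for k in keys:
        pipe.smembers(k)
    results = pipe.execute()
    print({k.rsplit(":", 1)[-1]: set(map(int, members))
           for k, members in zip(keys, results)})
    # e.g. {'123': {456, 789}, '124': {666, 777}}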