What's the fastest way to compress Python objects (list, dict, string, etc.) before saving them to cache, and to decompress them after reading from the cache?
I'm using Django, and I'd like to add compression/decompression support directly in Django's cache backend so that it is available to all my Django apps.
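Right now the only approach I can think of is to do the pickling and zlib compression by hand before the value goes into the cache, roughly like this (the helper names are just my own sketch):

    import pickle
    import zlib

    def compress_obj(obj):
        # pickle the object, then zlib-compress the resulting string
        return zlib.compress(pickle.dumps(obj, pickle.HIGHEST_PROTOCOL))

    def decompress_obj(blob):
        # reverse the two steps: decompress, then unpickle
        return pickle.loads(zlib.decompress(blob))

    data = {'numbers': list(range(1000)), 'text': 'hello' * 100}
    blob = compress_obj(data)
    assert decompress_obj(blob) == data

But calling helpers like these in every app is exactly what I want to avoid; I'd rather hide the compression inside the cache backend.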
I looked into django/core/cache/backends/memcached.py:
    import cmemcache as memcache

    class CacheClass(BaseCache):
        def __init__(self, server, params):
            BaseCache.__init__(self, params)
            self._cache = memcache.Client(server.split(';'))

        def get(self, key, default=None):
            val = self._cache.get(smart_str(key))
            if val is None:
                return default
            return val

        def set(self, key, value, timeout=0):
            self._cache.set(smart_str(key), value, self._get_memcache_timeout(timeout))
It looks like pickling/unpickling is done by the cmemcache library, and I don't know where to put the compress/decompress code.
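One idea, though I'm not sure it's the right place, would be to subclass that backend and compress the pickled value myself in set() and reverse it in get(). A rough, untested sketch (the subclass name and import alias are mine; it assumes the CacheClass shown above can be imported like this):

    import pickle
    import zlib

    from django.core.cache.backends.memcached import CacheClass as MemcachedCacheClass

    class CompressedCacheClass(MemcachedCacheClass):
        # hypothetical subclass: pickle + zlib-compress on set, reverse on get
        def set(self, key, value, timeout=0):
            blob = zlib.compress(pickle.dumps(value, pickle.HIGHEST_PROTOCOL))
            MemcachedCacheClass.set(self, key, blob, timeout)

        def get(self, key, default=None):
            blob = MemcachedCacheClass.get(self, key)
            if blob is None:
                return default
            return pickle.loads(zlib.decompress(blob))

The obvious downside is that the memcache client would then pickle my already-pickled, compressed string a second time, which seems wasteful.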
I looked further into python-memcache's source code, and it already supports compressing values with zlib before sending them to memcached:
    def _set(self, cmd, key, val, time, min_compress_len = 0):
        ...
        lv = len(val)
        # We should try to compress if min_compress_len > 0 and we could
        # import zlib and this string is longer than our min threshold.
        if min_compress_len and _supports_compress and lv > min_compress_len:
            comp_val = compress(val)
            # Only retain the result if the compression result is smaller
            # than the original.
            if len(comp_val) < lv:
                flags |= Client._FLAG_COMPRESSED
                val = comp_val
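So if I were talking to python-memcache directly I could apparently just pass min_compress_len myself, for example (the server address, timeout and threshold below are made up):

    import memcache

    mc = memcache.Client(['127.0.0.1:11211'])

    big_value = {'rows': [{'id': i, 'name': 'row %d' % i} for i in range(5000)]}

    # ask the client to zlib-compress the pickled value if it is larger
    # than 1 KB before sending it to memcached
    mc.set('big_value', big_value, time=300, min_compress_len=1024)

    restored = mc.get('big_value')  # decompression/unpickling happens automatically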
Here is Django's implementation of the "set" command in its memcached backend:

    def set(self, key, value, timeout=0):
        self._cache.set(smart_str(key), value, self._get_memcache_timeout(timeout))

Apparently it does not expose a min_compress_len parameter.
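The only workaround I can see is to subclass the backend and call the underlying client's set() with a threshold myself, something like this (untested; the class name and threshold are mine, and it assumes self._cache is a python-memcache Client rather than cmemcache):

    from django.core.cache.backends.memcached import CacheClass as MemcachedCacheClass
    from django.utils.encoding import smart_str

    class MinCompressCacheClass(MemcachedCacheClass):
        # same as the parent's set(), but forwards a compression threshold so
        # that python-memcache zlib-compresses values above it
        MIN_COMPRESS_LEN = 1024  # bytes; arbitrary threshold I picked

        def set(self, key, value, timeout=0):
            self._cache.set(smart_str(key), value,
                            self._get_memcache_timeout(timeout),
                            self.MIN_COMPRESS_LEN)

Is subclassing the cache backend like this the right way to get compression for all my apps, or is there a faster/cleaner approach?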