Redis can store structured JSON documents and retrieve them by content rather than by key (this can, for example, power an autocomplete search experience). I am doing this, and it works fine:
test = "{'key': 'glooler-61245', 'val': 'sighgh'}"
r.json().set("a:glooler-61245", Path.root_path(), ast.literal_eval(test))
rs.search(Query(f"glooler"))
Redis also supports pipelines, where many commands can be queued up and sent to the server in one big batch, like this:
pipe = r.pipeline()
pipe.set("a", "a value")
pipe.set("b", "b value")
pipe.execute()  # sends both SETs in a single round trip
pipe.get("a")   # this only queues a GET; the value comes back from the next execute()
However, the pipeline lets me retrieve only by key, unlike the JSON example, where I can retrieve by content. How can I merge these two, so that I can push 10k structured JSON documents like in the first example, execute the pipeline in one go, and then retrieve by value?
I tried this:
pipe.set('a:glooler-61246', "{'key': 'glooler-61246', 'val': 'loltest'}")
pipe.set('a:glooler-61247', "{'key': 'glooler-61247', 'val': 'loltest2'}")
pipe.set('a:glooler-61248', "{'key': 'glooler-61248', 'val': 'loltest3'}")
pipe.set('a:glooler-61249', "{'key': 'glooler-61249', 'val': 'loltest4'}")
pipe.execute()
However, rs.search(Query("glooler"))
still returns only the first value I inserted earlier.
I suspect the problem is that when pipelining you are inserting the values as plain strings (SET) instead of JSON documents (JSON.SET), so they never get added to the search index and nothing matches during the search. Try this:
pipe = r.pipeline()
pipe.json().set("a:glooler-61246", Path.root_path(), ast.literal_eval("{'key': 'glooler-61246', 'val': 'loltest1'}"))
pipe.json().set("a:glooler-61247", Path.root_path(), ast.literal_eval("{'key': 'glooler-61247', 'val': 'loltest2'}"))
pipe.execute()  # all queued JSON.SET commands run here and the documents get indexed
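For the scaling part of the question, here is a sketch of what a 10k-document bulk load could look like, reusing the r and rs clients from above; the generated docs list and the chunk size of 1,000 are illustrative assumptions, not values from the question:
from redis.commands.json.path import Path
from redis.commands.search.query import Query

# Illustrative data; r.json().set() accepts a Python dict directly, so ast.literal_eval is not needed here
docs = [{"key": f"glooler-{i}", "val": f"loltest{i}"} for i in range(61246, 71246)]

pipe = r.pipeline()
for i, doc in enumerate(docs, 1):
    pipe.json().set(f"a:{doc['key']}", Path.root_path(), doc)
    if i % 1000 == 0:   # flush in chunks so the queued commands don't grow unbounded
        pipe.execute()
pipe.execute()          # flush whatever is left in the last partial chunk

# Once executed, the documents are indexed and can be retrieved by content
results = rs.search(Query("glooler"))
print(results.total)
Executing in chunks is optional; a single execute() after queuing all 10k JSON.SET calls also works, chunking just bounds client-side memory and keeps each batch's reply small.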