Tags: python, scraperwiki

Appending data to ScraperWiki datastore


Here is a simple Python script to store some data in ScraperWiki:

import scraperwiki
scraperwiki.sqlite.save(unique_keys=["a"], data={"a":1, "b":"Foo"})
scraperwiki.sqlite.save(unique_keys=["a"], data={"a":1, "c":"Bar"})

The result is the following table in the datastore:

a  b  c
1     Bar

This is annoying: in my second sqlite.save call I did not pass "b":"" or anything else that should blank the contents of the "b" column for row 1. In other words, my desired result is to end up with the following table in the datastore:

a  b    c
1  Foo  Bar

So my question is: when making successive save calls to the ScraperWiki datastore, what is the best way to append data to an existing row without overwriting its other columns, so that I end up with the sort of result outlined above?
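For context, this blanking behaviour is what you would see if (as seems likely) save implements unique_keys via SQLite's INSERT OR REPLACE, which replaces the entire conflicting row rather than merging into it. A minimal reproduction with the standard-library sqlite3 module (the swdata table name and schema here are illustrative, not ScraperWiki's actual internals):

```python
import sqlite3

# In-memory database standing in for the ScraperWiki datastore.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE swdata (a INTEGER PRIMARY KEY, b TEXT, c TEXT)")

# Two saves keyed on "a", mirroring the scraperwiki.sqlite.save calls above.
conn.execute("INSERT OR REPLACE INTO swdata (a, b) VALUES (1, 'Foo')")
conn.execute("INSERT OR REPLACE INTO swdata (a, c) VALUES (1, 'Bar')")

# The second statement replaced the whole row, so "b" is now NULL.
row = conn.execute("SELECT a, b, c FROM swdata").fetchone()
print(row)  # (1, None, 'Bar')
```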


Solution

  • My solution to this problem was to replace the successive save operations with successive modifications to a Python dictionary of dictionaries: one sub-dictionary for each intended row of the datastore. Using a dictionary of dictionaries rather than a list of dictionaries makes it easy to update the relevant sub-dictionary in place; the main minor annoyance is having to convert it back to a list of dictionaries before saving.

    NB. For a significant number of data rows, saving a list of dictionaries to the datastore in a single operation is much faster than iterating over those dictionaries and saving them one at a time.

    Code example:

    import scraperwiki

    superdictionary = {}

    # Build each row up in its own sub-dictionary, keyed by row id;
    # update() merges new columns into the existing row instead of replacing it.
    superdictionary['1'] = {"a": 1, "b": "Foo"}
    superdictionary['1'].update({"c": "Bar"})
    superdictionary['2'] = {"a": 2, "b": "Grue", "c": "Gnu"}

    # Convert back to a list of dictionaries and save everything in one operation.
    superlist = list(superdictionary.values())
    scraperwiki.sqlite.save(unique_keys=["a"], data=superlist)
    

    should produce:

    a  b     c
    1  Foo   Bar
    2  Grue  Gnu
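As an aside, if you have direct SQL access, the same column-preserving merge can be expressed in SQLite itself (version 3.24 or later) with an upsert, which updates only the columns you name on conflict. A sketch using Python's standard sqlite3 module, with an illustrative swdata table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE swdata (a INTEGER PRIMARY KEY, b TEXT, c TEXT)")

# On a key conflict, update only the named columns; "b" survives untouched.
conn.execute(
    "INSERT INTO swdata (a, b) VALUES (1, 'Foo') "
    "ON CONFLICT(a) DO UPDATE SET b = excluded.b"
)
conn.execute(
    "INSERT INTO swdata (a, c) VALUES (1, 'Bar') "
    "ON CONFLICT(a) DO UPDATE SET c = excluded.c"
)

row = conn.execute("SELECT a, b, c FROM swdata").fetchone()
print(row)  # (1, 'Foo', 'Bar')
```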