indexeddb

Adding many items to IndexedDB


I am looking for best practices for adding many items to an IndexedDB database. The upper bound on the number of items is usually around 100k, and each item is a JavaScript object of roughly 350 bytes.

I maintain a list of pending items (actions) to be taken on the database (each action is either "add", "put", or "delete"), and each item could belong to a different object store.

I simply take the first item, apply it to the appropriate objectStore, and once that's done (either successfully or not), I move on to the next item.

Is there a better (more efficient) way to do this? And are there any scoping issues w.r.t. transaction lifetime I need to be concerned with in the snippet below?

function flushPendingDbActions() {
    var transaction = db.transaction(storeNames, "readwrite");

    var addNext = function() {
        var nextItem = pendingDbActions[0];
        if (nextItem) {
            pendingDbActions.shift();

            var objectStore = transaction.objectStore(nextItem.store),
                params,
                request;

            switch(nextItem.action) {
            case 'add':
            case 'put':
                params = nextItem;
                break;
            case 'delete':
                params = nextItem.key;
                break;
            }

            request = objectStore[nextItem.action](params);
            request.onsuccess = request.onerror = addNext;
        }
    };

    addNext();
}

Solution

  • Looks alright to me, but a couple of things:

    Edit - added content:

    According to https://www.w3.org/TR/IndexedDB-2/#transaction-lifetime-concept:

    Unless otherwise defined, requests must be executed in the order in which they were made against the transaction. Likewise, their results must be returned in the order the requests were placed against a specific transaction.

    Therefore, you should be able to fire off your requests concurrently instead of serially, which should avoid the TransactionInactiveError. This is true even if you fire off requests that do the equivalent of add item > delete item.
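    A minimal sketch of that concurrent version, reusing the `db`, `storeNames`, and `pendingDbActions` names from the question (the per-request `onerror` handler is an addition here: it calls `preventDefault` so a single failed request, e.g. a constraint violation on an `add`, does not abort the whole transaction):

```javascript
function flushPendingDbActionsConcurrently() {
    var transaction = db.transaction(storeNames, "readwrite");

    // Fire every request up front; per the spec, requests execute in
    // the order in which they were made against the transaction.
    pendingDbActions.splice(0).forEach(function(item) {
        var objectStore = transaction.objectStore(item.store);
        var request = item.action === 'delete' ?
            objectStore.delete(item.key) :
            objectStore[item.action](item);

        // Swallow per-request errors so one failure does not
        // abort the whole transaction.
        request.onerror = function(event) {
            event.preventDefault();
        };
    });

    // Settle when the transaction itself finishes.
    return new Promise(function(resolve, reject) {
        transaction.oncomplete = function() { resolve(); };
        transaction.onabort = function() { reject(transaction.error); };
    });
}
```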

    With very large batches you can then run into memory pressure from the sheer number of requests pending at once, so consider buffering in that case: process the pendingDbActions array in chunks. Take, say, 1000 items at a time, pass them to a processChunk helper that fires off those 1000 requests concurrently, wait for that chunk's transaction to complete, and then move on to the next chunk, until all chunks are processed.
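    One way to sketch that chunking approach, again reusing `db`, `storeNames`, and `pendingDbActions` from the question (the helper names `toChunks`, `processChunk`, and `flushInChunks` are illustrative, and the 1000-item chunk size is just a starting point to tune):

```javascript
var CHUNK_SIZE = 1000; // tune: larger chunks mean fewer transactions but more pending requests

// Split the pending-actions array into fixed-size chunks.
function toChunks(items, size) {
    var chunks = [];
    for (var i = 0; i < items.length; i += size) {
        chunks.push(items.slice(i, i + size));
    }
    return chunks;
}

// One transaction per chunk; the requests inside it are fired
// concurrently, and the promise settles when the transaction
// completes or aborts.
function processChunk(chunk) {
    return new Promise(function(resolve, reject) {
        var transaction = db.transaction(storeNames, "readwrite");
        transaction.oncomplete = function() { resolve(); };
        transaction.onabort = function() { reject(transaction.error); };

        chunk.forEach(function(item) {
            var objectStore = transaction.objectStore(item.store);
            if (item.action === 'delete') {
                objectStore.delete(item.key);
            } else {
                objectStore[item.action](item);
            }
        });
    });
}

// Process one chunk at a time, waiting for each chunk's
// transaction to finish before starting the next.
async function flushInChunks() {
    var chunks = toChunks(pendingDbActions.splice(0), CHUNK_SIZE);
    for (var i = 0; i < chunks.length; i++) {
        await processChunk(chunks[i]);
    }
}
```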