I'm trying to store CSRF-protected (query string + cookie) API POST requests for later replay when a web app comes back online.
To do this, I want to save the Request object (Fetch API) in IndexedDB, but IDBObjectStore.put() fails with a DataCloneError ("An object could not be cloned").
The Request object has a simple JSON body with no binary data, just strings.
This is running in a service worker (a type of web worker) environment.
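Roughly what I'm doing, and where the error is thrown (the database and store names here are simplified placeholders, not my real schema):

```
// Inside the service worker: try to queue the outgoing POST for later replay.
function queueRequest(request) {
  const openReq = indexedDB.open('replay-queue', 1);

  openReq.onupgradeneeded = () => {
    openReq.result.createObjectStore('requests', { autoIncrement: true });
  };

  openReq.onsuccess = () => {
    const tx = openReq.result.transaction('requests', 'readwrite');
    // This put() throws DataCloneError ("An object could not be cloned"):
    // Request objects aren't serializable by the structured clone algorithm.
    tx.objectStore('requests').put(request);
  };
}

self.addEventListener('fetch', (event) => {
  if (event.request.method === 'POST') {
    // Clone so the original request can still be handled/fetched normally.
    queueRequest(event.request.clone());
  }
});
```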
Is there any reason why the structured clone algorithm would not clone a Request object? [Answer: yes.] If so, what are my best options for dehydrating/rehydrating this object in lieu of structured cloning?
I really want to avoid having to know/access individual properties of the Request object. The parts of the Request I'll need are url, headers, body, and cookie (but again, I don't want the code to have to know about that).
Thanks in advance for any advice.
Are you sure you need to store the auth cookie and CSRF parameter in IndexedDB, as opposed to just regenerating them when the Request is replayed?
We faced a similar situation in the Google I/O 2015 Web App and ended up just storing the basic request info (URL + method; a serialized JSON body would be conceptually the same) in IndexedDB. Each time the page was loaded and valid credentials were available, we checked IndexedDB for queued replay requests and, if there were any, sent them to the server with fresh credentials.
We didn't have much of a choice, since the credentials we were using expired after one hour, but in general it seemed like a reasonable pattern to follow whenever there's a chance that the credentials you're using might go stale.
(You'd obviously want to clear out queued requests in IndexedDB if the user logs out, but that would be necessary regardless.)
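A rough sketch of that pattern is below. This is not the I/O app's actual code; getFreshCredentials() and the csrf query parameter name are placeholders for however your app regenerates its credentials:

```
// Open (or create) the queue database. Names are illustrative.
function openQueue() {
  return new Promise((resolve, reject) => {
    const openReq = indexedDB.open('replay-queue', 1);
    openReq.onupgradeneeded = () => {
      openReq.result.createObjectStore('requests', { autoIncrement: true });
    };
    openReq.onsuccess = () => resolve(openReq.result);
    openReq.onerror = () => reject(openReq.error);
  });
}

// Store only the plain, structured-cloneable parts of the request.
async function queueForReplay(request) {
  const db = await openQueue();
  const entry = {
    url: request.url,
    method: request.method,
    body: await request.clone().text(), // simple JSON/string body
  };
  db.transaction('requests', 'readwrite')
    .objectStore('requests')
    .put(entry);
}

// On page load, replay anything queued, attaching fresh credentials
// rather than whatever was current when the request was first made.
async function replayQueued(getFreshCredentials) {
  const db = await openQueue();
  const { csrfToken } = await getFreshCredentials(); // placeholder helper
  const store = db
    .transaction('requests', 'readwrite')
    .objectStore('requests');
  const getAll = store.getAll();
  getAll.onsuccess = () => {
    for (const entry of getAll.result) {
      const url = new URL(entry.url);
      url.searchParams.set('csrf', csrfToken); // parameter name is a placeholder
      fetch(url, {
        method: entry.method,
        body: entry.body,
        credentials: 'include', // send the fresh auth cookie
      });
    }
    // In real code you'd remove each entry only after its replay succeeds.
    store.clear();
  };
}
```

The key point is that only plain, cloneable data ever goes into IndexedDB; the credentials are reconstructed at replay time rather than persisted alongside the request.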