What's the proper way to deal with a web page that returns results of a search that could have differing results from one moment to another?
That is, running the query the first time could return different results than when the user clicks on page 2 and the query runs again.
How do most people deal with this scenario? Generally I'm working with internal ASP.Net applications (where security/bandwidth aren't huge concerns), so I'll store the results in the ViewState and, on postbacks, work with that data as opposed to querying the database again.
What's the proper methodology for external WWW use? My best guess is to store the results in a temporary database table (not literally a temp table; I guess 'staging' might be more accurate), but I would think that table would get hammered with inserts/deletes/etc., and you'd need a process to clean it up, which doesn't seem like a very elegant solution.
Am I anywhere close?
Most applications don't deal with it... they assume the results won't change enough to warrant some sort of caching mechanism. However, if you're working with highly real-time data (Twitter results, for instance), your paging links would most likely look like this:
?q=your+query&olderthan={last result shown}&limit=10
...where {last result shown} is the ID of the last result on the current page. That ID lets you query for results older than the specified ID: SELECT * FROM table WHERE id < {last result shown}.
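
As a minimal sketch (assuming a hypothetical results table with an auto-incrementing id column and MySQL/PostgreSQL-style LIMIT syntax; SQL Server would use TOP or OFFSET/FETCH instead), the query behind such a link might look like:

    -- Keyset ("seek") pagination: fetch the page of results strictly
    -- older than the last ID the user has already seen.
    SELECT *
    FROM results                     -- hypothetical table name
    WHERE id < @lastResultShown      -- placeholder for the {last result shown} ID from the link
    ORDER BY id DESC                 -- newest first, so "older" means smaller IDs
    LIMIT 10;                        -- matches limit=10 in the query string

Because each page is anchored to a fixed ID rather than a row offset, new rows inserted at the top don't shift what the user sees when they click through to page 2.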