android, android-listview, android-cursor, android-cursoradapter

How to implement an endless CursorAdapter?


At our company we are developing an app that displays a timeline. We want to let the user scroll it (almost) indefinitely.

Now there are 2 facts to consider:

  1. the query should stay reasonably small, otherwise it becomes slow;
  2. a cursor can only hold a limited amount of data in memory (reportedly about 1MB).

In the current implementation we load 40 items by default; then, when the user scrolls beyond a certain threshold, we repeat the query with the limit increased to 40+20 items, and so on.
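
For reference, the current scheme boils down to re-running a single query with a growing LIMIT. A rough sketch (the timeline table, column names and helper class are hypothetical):

    import android.database.Cursor;
    import android.database.sqlite.SQLiteDatabase;
    import android.widget.CursorAdapter;

    // Hypothetical sketch of the current approach: the first call loads 40 rows,
    // every later call re-runs the whole query with a limit increased by 20.
    public class GrowingLimitLoader {

        private final SQLiteDatabase db;
        private final CursorAdapter adapter;
        private int limit = 40;

        public GrowingLimitLoader(SQLiteDatabase db, CursorAdapter adapter) {
            this.db = db;
            this.adapter = adapter;
        }

        // Call once at startup and again whenever the scroll threshold is crossed.
        public void loadMore() {
            Cursor cursor = db.rawQuery(
                    "SELECT * FROM timeline ORDER BY created_at DESC LIMIT " + limit, null);
            adapter.changeCursor(cursor); // closes the previous, smaller cursor
            limit += 20;                  // the next reload fetches 60, 80, 100, ... rows
        }
    }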

However, this approach seems quite weak because it clashes with both of the facts stated above: the query will eventually become fairly large, and at some point the cursor might hit the 1MB memory limit (we load a lot of strings).

Now we're thinking about using a MergeCursor and proceeding like so (a rough sketch follows the list):

  1. Load a cursor of 40 items the first time.
  2. When the user scrolls beyond a certain threshold, load another cursor containing the next 40 items and give the adapter a MergeCursor that concatenates the new cursor with the previous one.
  3. Continue with this approach for at most X steps (X to be determined by testing) to avoid hitting an OutOfMemoryError. In the end the timeline cursor will be the concatenation of X cursors.
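
A minimal sketch of this MergeCursor idea, again with hypothetical table/column names and a plain CursorAdapter:

    import android.database.Cursor;
    import android.database.MergeCursor;
    import android.database.sqlite.SQLiteDatabase;
    import android.widget.CursorAdapter;

    import java.util.ArrayList;
    import java.util.List;

    // Each page is a separate 40-row cursor; the adapter always gets a
    // MergeCursor over all pages loaded so far.
    public class TimelinePager {

        private static final int PAGE_SIZE = 40;

        private final SQLiteDatabase db;
        private final CursorAdapter adapter;
        private final List<Cursor> pages = new ArrayList<>();

        public TimelinePager(SQLiteDatabase db, CursorAdapter adapter) {
            this.db = db;
            this.adapter = adapter;
        }

        // Call when the user scrolls past the threshold.
        public void loadNextPage() {
            Cursor page = db.rawQuery(
                    "SELECT * FROM timeline ORDER BY created_at DESC"
                            + " LIMIT " + PAGE_SIZE
                            + " OFFSET " + (pages.size() * PAGE_SIZE), null);
            pages.add(page);

            // swapCursor() returns the old cursor WITHOUT closing it. That is
            // deliberate: closing the old MergeCursor would also close the page
            // cursors it wraps, which the new MergeCursor still needs.
            adapter.swapCursor(new MergeCursor(pages.toArray(new Cursor[0])));
        }
    }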

What do you think about this approach? Are there any weaknesses (besides the overhead, which should be small)?

If so, can you point to or describe better solutions?

Thanks in advance


Solution

  • In the comments, pskink suggests using AbstractWindowedCursor.

    I was not familiar with this class and investigated it a bit. It turns out that SQLiteCursor extends it already. The documentation states this:

    The cursor owns the cursor window it uses. When the cursor is closed, its window is also closed. Likewise, when the window used by the cursor is changed, its old window is closed. This policy of strict ownership ensures that cursor windows are not leaked.

    This means that at any given moment only a tiny portion of the data queried from the DB is actually kept in memory. This is the interesting part of the code in SQLiteCursor:

    @Override
    public boolean onMove(int oldPosition, int newPosition) {
        // Make sure the row at newPosition is present in the window
        if (mWindow == null || newPosition < mWindow.getStartPosition() ||
                newPosition >= (mWindow.getStartPosition() + mWindow.getNumRows())) {
            fillWindow(newPosition);
        }
    
        return true;
    }
    
    @Override
    public int getCount() {
        if (mCount == NO_COUNT) {
            fillWindow(0);
        }
        return mCount;
    }
    
    private void fillWindow(int requiredPos) {
        clearOrCreateWindow(getDatabase().getPath());
    
        try {
            if (mCount == NO_COUNT) {
                int startPos = DatabaseUtils.cursorPickFillWindowStartPosition(requiredPos, 0);
                mCount = mQuery.fillWindow(mWindow, startPos, requiredPos, true);
                mCursorWindowCapacity = mWindow.getNumRows();
                if (Log.isLoggable(TAG, Log.DEBUG)) {
                    Log.d(TAG, "received count(*) from native_fill_window: " + mCount);
                }
            } else {
                int startPos = DatabaseUtils.cursorPickFillWindowStartPosition(requiredPos,
                        mCursorWindowCapacity);
                mQuery.fillWindow(mWindow, startPos, requiredPos, false);
            }
        } catch (RuntimeException ex) {
            // Close the cursor window if the query failed and therefore will
            // not produce any results.  This helps to avoid accidentally leaking
            // the cursor window if the client does not correctly handle exceptions
            // and fails to close the cursor.
            closeWindow();
            throw ex;
        }
    }
    

    This means 2 things:

    1. it should be safe to load the entire dataset, as it will not be fully kept in memory: only a portion of it (a CursorWindow) is in memory at any given time. The 1MB size limit is possibly a myth, or it refers to the CursorWindow object, in which case it is a safe size
    2. Performance shouldn't be an issue since, again, the cursor always works on a fixed amount of data. The initial fill (which also calculates the total size of the data set, stored in the mCount variable) might have some impact on perceived performance; I need to test this further.

    In conclusion, most likely there is no need to use the MergeCursor trick or to worry excessively about OOM.
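
    In other words, it should be enough to query the whole timeline at once and let the cursor's windowing take care of memory. A minimal sketch, assuming a hypothetical timeline table with _id, message and created_at columns:

        import android.content.Context;
        import android.database.Cursor;
        import android.database.sqlite.SQLiteDatabase;
        import android.widget.SimpleCursorAdapter;

        public class TimelineAdapterFactory {

            // The full result set is queried in one go, but SQLiteCursor only
            // materializes one CursorWindow of rows at a time, so memory stays
            // bounded while the user scrolls.
            public static SimpleCursorAdapter create(Context context, SQLiteDatabase db) {
                Cursor cursor = db.rawQuery(
                        "SELECT _id, message FROM timeline ORDER BY created_at DESC", null);
                return new SimpleCursorAdapter(
                        context,
                        android.R.layout.simple_list_item_1,
                        cursor,
                        new String[] { "message" },
                        new int[] { android.R.id.text1 },
                        0); // no flags: no content observer, no auto-requery
            }
        }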

    I could have investigated the source code more carefully, but I was a bit misled by what I read on the web.