I keep hitting this annoying message!
When the data finishes inserting into the local database, I see this message:
JNI critical lock held for 30.083ms on Thread[27,tid=23883,Runnable,Thread*=0xce150a00,peer=0x12cc0190,"Sqflite"]
If I select data from the table, I get
W/CursorWindow(23809): Window is full: requested allocation 3095146 bytes, free space 2096696 bytes, window size 2097152 bytes
E/SQLiteQuery(23809): exception: Row too big to fit into CursorWindow requiredPos=0, totalRows=1; query: SELECT * FROM description_table;
I/flutter (23809): DatabaseException(Row too big to fit into CursorWindow requiredPos=0, totalRows=1) sql 'SELECT * FROM defect_description_table;' args []}
I don't have any blob data; all columns are Strings. So why does this happen?
Here is the JSON structure:
[{ "id": 1, "name": "Descriptions","data": [{..},{...} ... ]},{....},{....}]
I think the issue is in the data list, because it contains a lot of data (it has 10298 items)?
What is the solution for this?
My insert method:
Future<CategoryTableData> insertData(
    BuildContext context, String urls, String accessToken) async {
  CategoryTableData categoryData = CategoryTableData();
  try {
    final desc = <Desc>[]; // `List<Desc>()` is deprecated; use a list literal
    final headers = {
      'authorization': 'Bearer $accessToken',
      'Accept': 'application/json',
    };
    var url = xxx;
    final response = await http.get(url, headers: headers);
    final res = json.decode(response.body);
    for (final i in res) {
      if (i['name'] == "Descriptions") {
        desc.add(Desc(id: i['id'], name: i['name'], data: i['data']));
      }
    }
    // Use the same `_dao` instance throughout (the original mixed `dao` and `_dao`).
    await _dao.batch((b) {
      b.insertAll(_dao.descTable, desc);
    });
    categoryData = await _dao.selectAllCategories();
    return categoryData;
  } catch (e) {
    print(e);
    categoryData = await _dao.selectAllCategories();
    return categoryData;
  }
}
Desc
class Desc extends Table {
  IntColumn get id => integer().nullable()();
  TextColumn get name => text().named("name").nullable()();
  TextColumn get data => text().map(const ListConverter()).nullable()();
}
The message Window is full: requested allocation 3095146 bytes, free space 2096696 bytes, window size 2097152 bytes
is telling you that you are trying to fit a row (* = ALL columns) of 3,095,146 bytes into a CursorWindow (buffer) that has only 2,096,696 bytes free of the 2,097,152 bytes (2 MiB) available.
Simply put, the row holds too much data for it to be extracted. This is not an issue when you insert the data, because inserts do not pass through the intermediary, relatively limited buffer (the CursorWindow); only queries do.
You often see this when attempts are made to retrieve stored images.
There are various ways that could be used to circumvent the issue.
With such large rows, size wise, you could store the actual data as a file and store a path to the file in the database (this is recommended for images).
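For illustration, here is a minimal sketch of the file-based approach using Python's sqlite3 module (the table and column names are made up for the example; the same idea applies from Flutter/Dart): the large payload lives on disk, and the row stores only its small path, so a SELECT never drags megabytes through the cursor buffer.

```python
import sqlite3
import tempfile
from pathlib import Path

# Stand-in for a ~3 MB JSON string that would overflow a CursorWindow.
payload = "x" * 3_000_000

base = Path(tempfile.mkdtemp())
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE desc_table (id INTEGER, name TEXT, data_path TEXT)")

# Write the big data to a file and store only the (tiny) path in the row.
data_file = base / "desc_1.json"
data_file.write_text(payload)
con.execute("INSERT INTO desc_table VALUES (?, ?, ?)",
            (1, "Descriptions", str(data_file)))

# Reading back: the row itself is a few bytes; the payload comes from disk.
(path,) = con.execute("SELECT data_path FROM desc_table WHERE id = 1").fetchone()
restored = Path(path).read_text()
```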
You may have some luck by reducing the columns extracted, i.e. by not using SELECT * but instead specifying only the columns that you require.
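As a sketch of the column-selection idea (again in Python's sqlite3 for brevity; Python has no 2 MB CursorWindow, so this only shows the query shape that would keep each fetched row small on Android):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE desc_table (id INTEGER, name TEXT, data TEXT)")
# A ~3 MB text column, the kind that overflows a CursorWindow on Android.
con.execute("INSERT INTO desc_table VALUES (1, 'Descriptions', ?)",
            ("x" * 3_000_000,))

# SELECT * would drag the huge `data` column along with every row;
# naming only the columns you need avoids that.
rows = con.execute("SELECT id, name FROM desc_table").fetchall()
```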
It may be that you can extract the data in multiple portions using multiple queries, if it is the accumulated size of the column data that is the issue.
You could select portions of the data utilising the length(column_name) and substr(column_name, start, length) functions, perhaps using the CASE WHEN THEN ELSE END expression/construct, and then build the full data once it has been retrieved.
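The length()/substr() chunking idea can be sketched as follows (Python's sqlite3 for illustration; the chunk size and names are assumptions for the example). length() tells you how big the value is, and each query pulls one slice that fits comfortably under the 2 MiB window:

```python
import sqlite3

CHUNK = 1_000_000  # fetch at most ~1 MB of the column per query

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE desc_table (id INTEGER, data TEXT)")
big = "ab" * 1_500_000  # 3 MB value, too large for one Android CursorWindow
con.execute("INSERT INTO desc_table VALUES (1, ?)", (big,))

# First find out how many chunks are needed.
(total,) = con.execute(
    "SELECT length(data) FROM desc_table WHERE id = 1").fetchone()

# Then extract one slice per query; SQLite's substr() is 1-based.
parts = []
for start in range(1, total + 1, CHUNK):
    (piece,) = con.execute(
        "SELECT substr(data, ?, ?) FROM desc_table WHERE id = 1",
        (start, CHUNK)).fetchone()
    parts.append(piece)

# Rebuild the full value client-side.
rebuilt = "".join(parts)
```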
It may be that you are inadvertently storing more data, e.g. accidentally concatenating data in a loop.
You would need knowledge of the data stored to know which of the above may be of use.