I have 500000+ records in Ignite, in a table called MYTABLE.
I have to read them from my C++ program, for which I am using the C++ thin client, which I have configured successfully within my application.
Initially I was able to run basic SQL queries, e.g. fetching 1000 records and writing them to a file.
For querying and processing all the records I tried two approaches, and both fail with the same error:
org.apache.ignite3.lang.IgniteException: IGN-TX-13 TraceId:613fb353-aa2f-4cca-bde1-87f661c45ad9 org.apache.ignite.sql.SqlException: Transaction is already finished () [txId=0196d8d1-26ae-0000-e5f1-efe300000001, readOnly=true].
Context: I am a beginner with Ignite. The Ignite engine is running locally as a single default node, with no custom configuration, exactly as extracted from the release distribution. The C++ application also runs locally.
The first approach was a plain query like "SELECT DATA FROM MYTABLE", which fetches pages whose size is implicitly limited by Ignite to 1024 records. I iterate over the records of the current page and write them to a file as per my use case, then at the end check whether more pages exist and loop again by fetching the next page:
result_set result = client.get_sql().execute(nullptr, {"SELECT DATA FROM MYTABLE"}, std::vector<primitive>{});
do {
    std::vector<ignite_tuple> page;
    try {
        page = result.current_page();
    } catch (const std::exception& e) {
        std::cerr << "Error fetching current page: " << e.what() << std::endl;
        break;
    }
    if (page.empty())
        break;
    for (const auto& row : page) {
        try {
            auto data = row.get("DATA");
            {hidden processing}
        } catch (const std::exception& e) {
            std::cerr << "Error extracting row: " << e.what() << std::endl;
        }
    }
    try {
        if (result.has_more_pages())
            result.fetch_next_page();
        else
            break;
    } catch (const std::exception& e) {
        std::cerr << "Error fetching next page: " << e.what() << std::endl;
        break;
    }
} while (result.has_more_pages());
This approach works for a few iterations, but after one or two pages I get the error mentioned above.
In the second approach I tried SQL-style keyset pagination, providing a record limit and using the ID column as the cursor, saving the last ID read. The query looked like:
SELECT DATA FROM "MYTABLE" WHERE ID > lastIDSaved ORDER BY ID ASC LIMIT batchSize
This query failed immediately; even running it in the GridGain console returns the same error.
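In code, the second approach was roughly the following sketch (assumes ID is a BIGINT column; the batch size is inlined into the query string, and my actual processing is elided):

```cpp
// Keyset pagination: each batch is its own short query, so no single
// server-side cursor has to stay open for the whole scan.
std::int64_t last_id = 0;
const int batch_size = 1024;
while (true) {
    std::string query = "SELECT ID, DATA FROM MYTABLE WHERE ID > "
        + std::to_string(last_id) + " ORDER BY ID ASC LIMIT "
        + std::to_string(batch_size);
    result_set result = client.get_sql().execute(nullptr, {query}, std::vector<primitive>{});
    std::vector<ignite_tuple> page = result.current_page();
    if (page.empty())
        break;
    for (const auto& row : page) {
        // Assumes a typed get<T> accessor on ignite_tuple.
        last_id = row.get<std::int64_t>("ID");
        auto data = row.get("DATA");
        // {hidden processing}
    }
}
```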
Expectation: I need a way to read all 500000+ records from Ignite and process them within my application.
This is most likely caused by the transaction timing out. In Ignite 3.0, read-only transactions time out after 10 seconds by default (this default is configured via the ignite.transaction.timeout configuration property).
So you can either specify a timeout when creating a transaction explicitly, or increase the default transaction timeout via configuration.
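With the Ignite 3 CLI, raising the cluster-wide default could look something like this sketch (the property path ignite.transaction.timeout comes from the note above, and the value is assumed to be in milliseconds; verify both against your version with `cluster config show` first):

```shell
# Inspect the current transaction configuration subtree first:
ignite3 cluster config show ignite.transaction

# Then raise the default transaction timeout (assumed milliseconds):
ignite3 cluster config update "ignite.transaction.timeout=300000"
```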
In Ignite 3.0 it doesn't yet seem to be possible to pass transaction options when creating an explicit transaction from C++, but in Java it would look like:
Transaction tx = client.transactions().begin(new TransactionOptions().readOnly(true).timeoutMillis(1000000));
var rs = client.sql().execute(tx, queryString);
...
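The C++ client can still begin an explicit transaction, just without options, so once the default timeout is raised via configuration the scan could be wrapped like this sketch (I'm assuming the accessor is named `get_transactions()` by analogy with `get_sql()`; check the header for your client version):

```cpp
// Sketch: run the whole scan inside one explicit transaction. Per-transaction
// options (e.g. a custom timeout) cannot be passed from C++ yet, so the
// cluster-wide default timeout still governs how long this may run.
transaction tx = client.get_transactions().begin();
result_set result = client.get_sql().execute(&tx,
    {"SELECT DATA FROM MYTABLE"}, std::vector<primitive>{});
// ... iterate over pages exactly as in the question's loop ...
tx.commit();
```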