I have a table with 50 million records, but a compact structure (id, int1, int1, int2, int3). All relevant indexes are implemented.
I need to query it about 50 times for each user interaction. This is taking about 5 seconds, using regular prepared queries against the db server. All the queries are simple selects.
My question is: what can I do to speed this up, even at the cost of significantly more memory? The Locate method on query datasets is not flexible enough, and applying filters directly to the query is slow.
The main query I am running is:

select knowledge_id, knowledge_family_id, tag_level, tag_order,
       total_title_direct_words, total_title_parenthesis_words
from knowledge_tags
where dictionary_word_id = XX
order by tag_level, tag_order
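Each lookup is just that select with the word id bound as a parameter, roughly like this (illustrative sketch; the FireDAC TFDQuery is only an example, the same pattern applies with other data-access components):

// The query is created and prepared once, then reused for each of the
// ~50 lookups per user interaction (Qry: TFDQuery, WordId: Integer).
Qry.SQL.Text :=
  'select knowledge_id, knowledge_family_id, tag_level, tag_order, ' +
  'total_title_direct_words, total_title_parenthesis_words ' +
  'from knowledge_tags ' +
  'where dictionary_word_id = :word_id ' +
  'order by tag_level, tag_order';
Qry.Prepare;

// Per lookup:
Qry.Close;
Qry.ParamByName('word_id').AsInteger := WordId;
Qry.Open;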
Can anyone suggest a strategy? Would TVirtualTable increase speed?
In my opinion it would be much faster to load the entire data set into a TClientDataSet and use FindKey to seek those records.
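A minimal sketch of the one-time load, assuming the data is pulled through a TDataSetProvider (SqlAll and ProviderAll are illustrative names; keep in mind that holding 50 million rows in a TClientDataSet needs a lot of RAM, even with a compact record):

// One-time load at start-up.
// SqlAll is any source dataset selecting the whole table;
// ProviderAll is a TDataSetProvider with ProviderAll.DataSet := SqlAll.
SqlAll.SQL.Text :=
  'select dictionary_word_id, knowledge_id, knowledge_family_id, tag_level, ' +
  'tag_order, total_title_direct_words, total_title_parenthesis_words ' +
  'from knowledge_tags';
Cds.SetProvider(ProviderAll);
Cds.PacketRecords := -1;   // fetch all records in a single packet
Cds.Open;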
To use FindKey you must define an index like this:
Cds.IndexDefs.Add('IDX1', 'FieldA;FieldB', []);
Cds.IndexName := 'IDX1';

if Cds.FindKey([A, B]) then
begin
  // Do something
end;
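Since your main query returns every row for a dictionary_word_id, ordered by tag_level and tag_order, a ranged scan may fit better than a single-record FindKey. Here is a sketch using SetRange on a matching in-memory index (WordId is the value you are looking up):

Cds.IndexDefs.Add('IDX_WORD', 'dictionary_word_id;tag_level;tag_order', []);
Cds.IndexName := 'IDX_WORD';

// Restrict the in-memory dataset to one dictionary_word_id; the rows are
// already ordered by tag_level, tag_order because of the index definition.
Cds.SetRange([WordId], [WordId]);
try
  Cds.First;
  while not Cds.Eof do
  begin
    // read Cds.FieldByName('knowledge_id').AsInteger, etc.
    Cds.Next;
  end;
finally
  Cds.CancelRange;
end;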
You can also create multiple indexes and switch between them whenever you want:
Cds.IndexDefs.Add('IDX1', 'FieldA;FieldB', []);
Cds.IndexDefs.Add('IDX2', 'FieldD;FieldB', []);
Cds.IndexDefs.Add('IDX3', 'FieldA;FieldC', []);

if SeekAB then
  Cds.IndexName := 'IDX1'
else if SeekDB then
  Cds.IndexName := 'IDX2'
else
  Cds.IndexName := 'IDX3';
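If you prefer not to predefine everything, TClientDataSet can also add an index after the data is loaded, via AddIndex (sketch; the field names are placeholders):

Cds.AddIndex('IDX4', 'FieldB;FieldC', []);
Cds.IndexName := 'IDX4';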