I am new to chronicle-map. I am trying to model an off-heap map using chronicle-map where the key is a primitive short and the value is a primitive long array. The maximum size of the long-array value is known for a given map. However, I will have multiple maps of this kind, each of which may have a different maximum size for its long-array value. My question relates to the serialisation/deserialisation of the key and value.
From reading the documentation, I understand that for the key I can use the value type ShortValue and reuse an instance of the implementation of that interface. Regarding the value, I have found the page about DataAccess and SizedReader, which gives an example for byte[], but I'm unsure how to adapt this to a long[]. One additional requirement: I need to get and set values at arbitrary indices in the long array without paying the cost of a full serialisation/deserialisation of the entire value each time.
So my question is: how can I model the value type when constructing the map, and what serialisation/deserialisation code do I need for a long[] if the maximum size is known per map and I need to read and write random indices without serialising/deserialising the entire value payload each time? Ideally the long[] would be encoded/decoded directly to/from off-heap memory, without an intermediate on-heap conversion to a byte[], and the chronicle-map code would not allocate at runtime. Thank you.
First, I recommend using some kind of LongList interface abstraction instead of long[]; it will make it easier to deal with size variability, provide alternative flyweight implementations, etc.
If you want to read/write just single elements in large lists, you should use the advanced contexts API:
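A minimal sketch of such an abstraction (the names LongList and ArrayLongList are illustrative, not part of the Chronicle Map API; a heap-backed implementation is shown, and a flyweight implementation could instead wrap an off-heap BytesStore region):

```java
// Hypothetical fixed-capacity list-of-longs abstraction.
interface LongList {
    int size();
    long get(int index);
    void set(int index, long value);
}

// Simple heap-backed implementation; an off-heap flyweight
// implementation of the same interface could be swapped in.
final class ArrayLongList implements LongList {
    private final long[] values;

    ArrayLongList(int capacity) {
        this.values = new long[capacity];
    }

    @Override
    public int size() { return values.length; }

    @Override
    public long get(int index) { return values[index]; }

    @Override
    public void set(int index, long value) { values[index] = value; }
}
```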
/** This method is entirely garbage-free, deserialization-free, and thread-safe. */
void putOneValue(ChronicleMap<ShortValue, LongList> map, ShortValue key, int index,
        long element) {
    if (index < 0) throw new IndexOutOfBoundsException(...);
    try (ExternalMapQueryContext<ShortValue, LongList, ?> c = map.queryContext(key)) {
        c.writeLock().lock(); // (1)
        MapEntry<ShortValue, LongList> entry = c.entry();
        if (entry != null) {
            Data<LongList> value = entry.value();
            BytesStore valueBytes = (BytesStore) value.bytes(); // (2)
            long valueBytesOffset = value.offset();
            long valueBytesSize = value.size();
            int valueListSize = (int) (valueBytesSize / Long.BYTES); // (3)
            if (index >= valueListSize) throw new IndexOutOfBoundsException(...);
            valueBytes.writeLong(valueBytesOffset + ((long) index) * Long.BYTES,
                    element);
            ((ChecksumEntry) entry).updateChecksum(); // (4)
        } else {
            // there is no entry for the given key
            throw ...
        }
    }
}
Notes:

1. Acquire writeLock() from the beginning, because otherwise readLock() is going to be acquired automatically when you call the context.entry() method, and you won't be able to upgrade the read lock to a write lock later. Please read the HashQueryContext javadoc carefully.
2. Data.bytes() formally returns RandomDataInput, but you can be sure (it's specified in the Data.bytes() javadoc) that it's actually an instance of BytesStore (which is the combination of RandomDataInput and RandomDataOutput).
3. This works if proper SizedReader and SizedWriter (or DataAccess) are provided. Note that the "bytes/element joint size" technique is used, the same as in the example given in the SizedReader and SizedWriter doc section, PointListSizeMarshaller. You could base your LongListMarshaller on that example class.
4. See the ChecksumEntry javadoc and the section about checksums in the doc. If you have a purely in-memory (not persisted) Chronicle Map, or have turned checksums off, this call could be omitted.

Implementation of a single-element read is similar.