Tags: arrays, computer-science, complexity-theory, theory

Why is array element referencing a constant time operation?


Let me briefly explain how I think array element referencing works.

Start address of the array + (data type size * index of the element to be fetched) = Address of the required array element
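
Here is a rough C sketch of what I believe happens under the hood; the array b, its double element type, and the index i are just illustrative choices, not anything from a specific program:

```c
#include <stdio.h>
#include <stdint.h>
#include <inttypes.h>

int main(void) {
    double b[8];   /* example array; each element is sizeof(double) = 8 bytes */
    size_t i = 5;  /* index of the desired element */

    /* start address + data type size * index = address of b[i] */
    uintptr_t start = (uintptr_t)&b[0];
    uintptr_t addr  = start + sizeof(double) * i;

    printf("computed address: 0x%" PRIxPTR "\n", addr);
    printf("&b[i]:            0x%" PRIxPTR "\n", (uintptr_t)&b[i]);
    return 0;
}
```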

Basically, this computation is performed whenever b[i] is evaluated (where b is an array and i is the index of the desired element). This computation is supposed to be a constant-time operation. However, I think otherwise. Consider the following examples.

Example 1

579015713945 (start address) + 16 (datatype size) * 2789280 (index of the required element) = Address of the required element.

Example 2

579015713945 (start address) + 16 (datatype size) * 1 (index of the required element) = Address of the required element.

Example 3

100 (start address) + 8 (datatype size) * 1 (index of the required element) = Address of the required element.

Will a machine not be able to compute examples 2 and 3 faster than example 1? So how can the operation be constant time when the amount of time needed varies?

I computed the values of examples 1 and 3 on Colab, and both took 0 seconds to compute (haha). The particular numbers do not matter here; I could have made them much bigger to widen the execution-time difference between example 1 and example 3. Then example 1 would probably have taken 1 second and example 3 would have taken 0.1 seconds. At any rate, I think I have conveyed my general question.


Solution

  • ALUs in CPUs are fixed size and operate on a fixed number of bits -- on a 64-bit machine, the ALUs will be 64 bits wide and will do an add or multiply¹ of 64-bit values in constant time. Thus, as long as your addresses and indexes are less than 2⁶⁴ (which they must be to be valid addresses), the time will be constant; see the sketch after the footnote below.


    ¹ On some machines, multiplies of certain corner-case values (such as multiplying by 0) may take less time, but this is usually ignored when talking about complexity or Big-O time. The distinction may be important in other domains (such as cryptography), so you need to be careful about exactly what "constant" means in your domain. It varies.
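
    To make this concrete, here is a minimal C sketch (an illustration, not taken from the question) of a plain array access. On a typical 64-bit target, compilers fold the base + size * index arithmetic into the addressing mode of a single load, and that arithmetic is the same fixed-width operation no matter how large the index value is.

```c
#include <stddef.h>

/* Fetch b[i]. On a 64-bit target the address arithmetic
 * (base + 8 * i) is done with fixed-width 64-bit operations,
 * so it costs the same whether i is 1 or 2789280 -- typically a
 * single indexed load such as "movsd xmm0, [rdi + rsi*8]" on x86-64
 * (the exact output depends on the compiler and optimization level).
 */
double fetch(const double *b, size_t i) {
    return b[i];
}
```

    The operand values only change which bits flow through the adder and shifter, not how many of them there are, which is why the three examples in the question all cost the same on such hardware.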