Tags: c++, eigen, eigen3

About the default storage index type for Eigen::SparseMatrix


For dense matrices in Eigen, the storage index type (Eigen::Index) is std::ptrdiff_t by default, but it can be adjusted with the preprocessor define EIGEN_DEFAULT_DENSE_INDEX_TYPE (which affects all code).
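For reference, this is roughly how that dense macro is used (a sketch; choosing int here is only an example, not what I actually want):

```cpp
// Sketch: the macro must be defined before the first Eigen header is
// included, and it affects every translation unit that sees it.
// Choosing int here is only for illustration.
#define EIGEN_DEFAULT_DENSE_INDEX_TYPE int

#include <Eigen/Dense>
#include <type_traits>

// Eigen::Index (the type returned by rows(), cols(), size(), ...) now
// follows the macro.
static_assert(std::is_same<Eigen::Index, int>::value,
              "dense index type follows EIGEN_DEFAULT_DENSE_INDEX_TYPE");
```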

For sparse matrices, however, the storage index type is set per matrix as a template argument and defaults to int. There doesn't seem to be a #define that changes this default across the whole codebase.
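By contrast, overriding the sparse index has to be done on each type, through the third template parameter, roughly like this (the 64-bit index is just an example):

```cpp
#include <cstdint>
#include <Eigen/Sparse>

// Default: the storage index is int.
Eigen::SparseMatrix<double> a;

// Per-matrix override through the third template argument; the Options
// parameter (Eigen::ColMajor here, which is the default) has to be
// spelled out in order to reach the index parameter.
Eigen::SparseMatrix<double, Eigen::ColMajor, std::int64_t> b;
```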

I was wondering what the reasons for this difference are, and whether there is a way I missed to globally set the default storage index for sparse matrices so that it is consistent with dense matrices?

To give a bit of context, my code uses a mix of dense and sparse matrices, and I get a lot of compiler warnings due to integer type conversions. To get rid of them, the only option at the moment is to always specify the storage index type of sparse matrices, which is a bit verbose (although I guess a typedef could help as a last resort).
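The typedef workaround I have in mind would look roughly like this (the alias name is my own; it simply makes every sparse matrix use the dense Eigen::Index as its storage index):

```cpp
#include <Eigen/Sparse>

// Hypothetical project-wide alias: sparse matrices declared through it use
// Eigen::Index (std::ptrdiff_t by default) as their storage index, matching
// the dense side.
template <typename Scalar, int Options = Eigen::ColMajor>
using SparseMat = Eigen::SparseMatrix<Scalar, Options, Eigen::Index>;

SparseMat<double> m;  // StorageIndex is Eigen::Index
```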


Solution

  • In a dense matrix there are only two index variables, one for the number of rows and one for the number of columns, so their memory size is rather unimportant. Meanwhile, any index variable may be used directly to address memory, and ptrdiff_t is the natural choice for that. Using int may cause the compiler to emit additional instructions to sign-extend to a 64-bit ptrdiff_t.

    For sparse matrices, however, you need to record the location of each non-zero element. Eigen uses the compressed sparse row/column (CSR/CSC) format. In CSC, each non-zero scalar is paired with one index for its row. Additionally, there is a fixed overhead of one (or two, in the non-compressed form) index variables per column, regardless of the sparsity. Switching to ptrdiff_t would therefore roughly double the memory used for index storage (see the back-of-the-envelope sketch after this list). At the same time, the chance that you need more than 2 billion rows, columns, or non-zero entries is rather low. That is why this tradeoff makes sense, and, as you noted, you can always make your own typedefs.

    Rather than hurting your performance by using the wrong index type, I suggest you adjust your compiler warning settings or add explicit casts.
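To make the overhead concrete, here is a rough back-of-the-envelope sketch (assumed sizes: 4-byte int, 8-byte std::ptrdiff_t; the dimensions are made up for illustration):

```cpp
#include <cstddef>
#include <cstdio>

int main() {
    // Hypothetical CSC matrix: 1 million columns, 10 million non-zeros.
    const std::size_t ncols = 1000000, nnz = 10000000;

    // Compressed CSC stores one inner (row) index per non-zero plus
    // ncols + 1 outer pointers, on top of one scalar value per non-zero.
    const std::size_t idx32  = (nnz + ncols + 1) * sizeof(int);            // int indices
    const std::size_t idx64  = (nnz + ncols + 1) * sizeof(std::ptrdiff_t); // ptrdiff_t indices
    const std::size_t values = nnz * sizeof(double);                       // values (unchanged)

    std::printf("values: %zu MB, int indices: %zu MB, ptrdiff_t indices: %zu MB\n",
                values >> 20, idx32 >> 20, idx64 >> 20);
    // Typical 64-bit output: values: 76 MB, int indices: 41 MB, ptrdiff_t indices: 83 MB
}
```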