Tags: c++, algorithm, lru

LRU implementation in production code


I have some C++ code where I need to implement cache replacement using the LRU technique.
So far I know of two methods to implement LRU cache replacement:

  1. Storing a timestamp each time the cached data is accessed, then comparing the timestamps at replacement time (roughly sketched below).
  2. Keeping a stack of cached items and moving an item to the top whenever it is accessed, so the bottom always holds the LRU candidate.

So, which of these is better to use in production code?
Are there any other, better methods?
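
For concreteness, here is a rough sketch of method 1 (the class and member names are illustrative, not taken from any real codebase): every access stamps the slot with a monotonically increasing counter, and eviction scans for the smallest stamp, so replacement costs O(n).

    #include <cstddef>
    #include <map>
    #include <string>
    
    class TimestampCache
    {
        struct Slot
        {
            std::string   mValue;
            unsigned long mStamp;
        };
    
        typedef std::map< std::string, Slot > SlotMap;
    
        SlotMap       mSlots;
        std::size_t   mMaxEntries;
        unsigned long mClock;
    
    public:
        explicit TimestampCache( std::size_t aMaxEntries )
            : mMaxEntries( aMaxEntries ), mClock( 0 ) {}
    
        // returns the cached value or NULL; refreshes the stamp on a hit
        std::string* get( const std::string& aKey )
        {
            SlotMap::iterator iIter = mSlots.find( aKey );
            if ( iIter == mSlots.end() )
                return NULL;
    
            iIter->second.mStamp = ++mClock;
            return &iIter->second.mValue;
        }
    
        // inserts or overwrites a value, evicting the oldest stamp when full
        void put( const std::string& aKey, const std::string& aValue )
        {
            if ( !mSlots.empty() && mSlots.size() >= mMaxEntries
                 && mSlots.find( aKey ) == mSlots.end() )
            {
                // linear scan for the least recently used slot
                SlotMap::iterator iOldest = mSlots.begin();
                for ( SlotMap::iterator i = mSlots.begin(); i != mSlots.end(); ++i )
                {
                    if ( i->second.mStamp < iOldest->second.mStamp )
                        iOldest = i;
                }
                mSlots.erase( iOldest );
            }
    
            Slot iSlot;
            iSlot.mValue = aValue;
            iSlot.mStamp = ++mClock;
            mSlots[ aKey ] = iSlot;
        }
    };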


Solution

  • Recently I implemented an LRU cache using a linked list spread over a hash map.

        #include <list>
        #include <string>
        #include <utility>
        #include <boost/unordered_map.hpp>
    
        /// Entry is the application's cached value type, defined elsewhere
    
        /// Typedef for URL/Entry pair
        typedef std::pair< std::string, Entry > EntryPair;
    
        /// Typedef for Cache list
        typedef std::list< EntryPair > CacheList;
    
        /// Typedef for URL-indexed map into the CacheList
        typedef boost::unordered_map< std::string, CacheList::iterator > CacheMap;
    
        /// Cache LRU list
        CacheList mCacheList;
    
        /// Cache map into the list
        CacheMap mCacheMap;
    

    It has the advantage of being O(1) for all the important operations (lookup, insertion and eviction).

    The insertion algorithm:

    // create new entry
    Entry iEntry( ... );
    
    // push it to the front of the LRU list
    mCacheList.push_front( std::make_pair( aURL, iEntry ) );
    
    // add it to the cache map
    mCacheMap[ aURL ] = mCacheList.begin();
    
    // increase count of entries
    mEntries++;
    
    // check if it's time to remove the last element
    if ( mEntries > mMaxEntries )
    {
        // erase the last cache list element's key from the map
        mCacheMap.erase( mCacheList.back().first );
    
        // erase it from the list
        mCacheList.pop_back();
    
        // decrease count
        mEntries--;
    }
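
    A lookup that refreshes recency can reuse the same structures. The following is only a sketch (it assumes the members above, that aURL is the key being looked up, and that a pointer to the cached Entry is the desired result); std::list::splice moves the hit element in O(1) and keeps its iterator valid, so the map needs no update:

    // find the URL in the cache map
    CacheMap::iterator iIter = mCacheMap.find( aURL );
    
    // not cached
    if ( iIter == mCacheMap.end() )
        return NULL;
    
    // move the hit element to the front of the LRU list in O(1);
    // the list iterator stays valid, so mCacheMap needs no update
    mCacheList.splice( mCacheList.begin(), mCacheList, iIter->second );
    
    // return a pointer to the cached entry
    return &( iIter->second->second );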