A high performance version of java.util.LinkedHashMap for use as a software cache.
Design

A concurrent linked list runs through a ConcurrentHashMap to provide eviction ordering. It supports both insertion-ordered and access-ordered eviction policies (FIFO, LRU, and Second Chance).

Status

Testing new algorithm.

Original Algorithm

The Sept. 2008 jar contains the initial version, which uses a concurrent deque that runs through a ConcurrentHashMap. For simplicity, the implementation uses a lazy removal strategy so that effort could focus on validating the doubly linked list. A backtracking eager removal strategy was initially prototyped, but it was not adopted due to concerns over managing its data races.
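The lazy removal idea can be sketched as follows. This is a minimal illustration with hypothetical names, not the project's actual API: a removed entry is only marked dead; physical unlinking from the eviction-order list is deferred until the list is next walked.

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicBoolean;

// Illustrative sketch: map entries double as list nodes; removal is lazy.
class LazyCache<K, V> {
    static final class Node<K, V> {
        final K key;
        volatile V value;
        volatile Node<K, V> prev, next;           // eviction-order links
        final AtomicBoolean alive = new AtomicBoolean(true);
        Node(K key, V value) { this.key = key; this.value = value; }
    }

    private final ConcurrentHashMap<K, Node<K, V>> map = new ConcurrentHashMap<>();

    public void put(K key, V value) {
        Node<K, V> node = new Node<>(key, value);
        map.put(key, node);
        // (linking the node onto the tail of the deque is omitted for brevity)
    }

    public V remove(K key) {
        Node<K, V> node = map.remove(key);
        if (node == null || !node.alive.compareAndSet(true, false)) {
            return null;                          // absent or already logically dead
        }
        return node.value;                        // physical unlinking happens later
    }

    // A traversal (e.g., the evictor) skips and unlinks dead nodes lazily.
    boolean isLive(Node<K, V> node) { return node.alive.get(); }
}
```

The compareAndSet on the alive flag makes logical removal the single linearization point, which is what lets the list links be cleaned up at leisure.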
In Greg Luck's benchmarks, this map showed up to a 92.5x performance gain over the previous version of Ehcache (v1.5).
Note: A rare race condition was uncovered by Greg Luck (Ehcache). This algorithm is deprecated.
Current Algorithm

The design document describes a newer approach based on per-node spin locks.
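To give a feel for the per-node spin-lock approach, here is a hedged sketch (names and structure are illustrative, not taken from the design document): unlinking a node acquires the locks of its predecessor, itself, and its successor in list order, so concurrent unlinks of neighboring nodes cannot deadlock.

```java
import java.util.concurrent.atomic.AtomicBoolean;

// Illustrative per-node spin lock: unlinking locks prev, node, next in
// list order, then re-validates that the links did not change underneath.
class SpinNode<E> {
    E item;
    volatile SpinNode<E> prev, next;
    private final AtomicBoolean lock = new AtomicBoolean();

    void lock()   { while (!lock.compareAndSet(false, true)) Thread.onSpinWait(); }
    void unlock() { lock.set(false); }
}

class SpinList<E> {
    // Unlinks node, assuming node.prev and node.next are non-null
    // (e.g., a list bracketed by sentinel head and tail nodes).
    void unlink(SpinNode<E> node) {
        while (true) {
            SpinNode<E> prev = node.prev;
            prev.lock();
            node.lock();
            if (node.prev == prev) {             // links stable: safe to unlink
                SpinNode<E> next = node.next;
                next.lock();
                prev.next = next;
                next.prev = prev;
                next.unlock();
                node.unlock();
                prev.unlock();
                return;
            }
            node.unlock();                       // prev changed concurrently: retry
            prev.unlock();
        }
    }
}
```

Holding a node's lock prevents its neighbors from being relinked around it, which is the data race the backtracking eager-removal prototype had to reason about.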
Note: The algorithm needs further testing and is not deemed production ready. It is functional under concurrent tests, but needs additional load testing to validate its correctness.
Production Version

Due to popular demand and accidental usage in production environments, a certified production-ready version is provided for download. This version provides the desired performance without using experimental code intended for personal education. The modifications are documented outside of the repository.
Future

After the current algorithm is released, work will begin on maturing it into a lock-free algorithm. A lock-free algorithm is non-blocking: it maintains the list in an acceptable but not always perfect state and coerces it toward the desired state. This shifts the thought process from trying to block concurrent operations from corrupting shared state to making them work together to arrive at a final state.
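The "acceptable but not always perfect state" idea is the same one used by classic lock-free queues: a shared pointer may lag behind the true end of the list, and any thread that observes the lag helps swing it forward. A minimal sketch of that helping pattern (Michael–Scott style enqueue; this is a generic illustration, not this project's planned algorithm):

```java
import java.util.concurrent.atomic.AtomicReference;

// Sketch of lock-free "helping": the tail may temporarily lag behind the
// last node; any thread that notices the lag coerces it forward.
class HelpingQueue<E> {
    static final class Node<E> {
        final E item;
        final AtomicReference<Node<E>> next = new AtomicReference<>();
        Node(E item) { this.item = item; }
    }

    private final AtomicReference<Node<E>> tail =
        new AtomicReference<>(new Node<>(null));  // sentinel node

    void offer(E item) {
        Node<E> node = new Node<>(item);
        while (true) {
            Node<E> last = tail.get();
            Node<E> next = last.next.get();
            if (next == null) {
                if (last.next.compareAndSet(null, node)) {
                    // Linking succeeded; advancing the tail may fail if
                    // another thread already helped -- that is fine.
                    tail.compareAndSet(last, node);
                    return;
                }
            } else {
                tail.compareAndSet(last, next);   // help: swing the lagging tail
            }
        }
    }

    E lastItem() { return tail.get().item; }      // for illustration only
}
```

Note that no thread ever waits for another: a stalled thread leaves the structure in a state every other thread knows how to repair.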
License

The algorithms and code are provided as-is under the Apache License, version 2.0 (APL-2). They were built primarily for fun and as a hobby project. The main motivation was that the puzzle of how to implement a concurrent linked hash map sounded quite enjoyable (and educational).
To figure it out for myself, I avoided venturing into certain academic literature. While this may slow progress, it lets me understand the algorithms rather than copy them. That said, others are free to use this structure as a template and make the changes necessary to implement one of those algorithms.