Added LRU Cache #2138
Conversation
Fascinating approach to use a doubly linked list. More Pythonic to use a dict. It would be cool if this could support the decorator calling conventions of https://docs.python.org/3/library/functools.html#functools.lru_cache
I'll add it as a decorator on Monday, but I don't know how to do it using a dictionary in O(1) time.
Skip the dict for now. Let's land it as a doubly linked list implementation.
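For reference, the functools decorator calling conventions mentioned above look like this (a standard-library example, not part of this PR):

```python
from functools import lru_cache


@lru_cache(maxsize=100)
def fib(n: int) -> int:
    """Naive recursive Fibonacci made fast by memoization."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)


print(fib(30))           # 832040
print(fib.cache_info())  # CacheInfo(hits=28, misses=31, maxsize=100, currsize=31)
```

Note the `cache_info()` accessor attached to the decorated function; the PR later mirrors this interface.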
other/lru_cache.py
Outdated
@@ -0,0 +1,127 @@
class Double_Linked_List_Node():
Do not use snake_case for class naming. Also, no `()` needed as you are not subclassing from any class. You might want to try:
class DoubleLinkedListNode:
other/lru_cache.py
Outdated
    Double Linked List Node built specifically for LRU Cache
    '''

    def __init__(self, key, val):
Type hints, please.
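A sketch of what the type-hinted constructor might look like, assuming the renamed class and integer keys/values (illustrative, not the PR's final code):

```python
from typing import Optional


class DoubleLinkedListNode:
    """Double Linked List Node built specifically for LRU Cache."""

    def __init__(self, key: int, val: int) -> None:
        self.key = key
        self.val = val
        # Links are filled in later by the containing list
        self.next: Optional["DoubleLinkedListNode"] = None
        self.prev: Optional["DoubleLinkedListNode"] = None
```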
other/lru_cache.py
Outdated
        self.prev = None


class Double_Linked_List():
Same as above
other/lru_cache.py
Outdated
    '''

    def __init__(self):
        self.head = Double_Linked_List_Node(None, None)
How about a sensible default in the method declaration?
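A sketch of the suggestion: with `None` defaults in the node constructor, the sentinel head and rear can be created without arguments (names follow the renaming suggested earlier; this is an illustration, not the PR's code):

```python
from typing import Optional


class DoubleLinkedListNode:
    # Defaults let sentinel nodes be created as DoubleLinkedListNode()
    def __init__(self, key: Optional[int] = None, val: Optional[int] = None) -> None:
        self.key = key
        self.val = val
        self.next: Optional["DoubleLinkedListNode"] = None
        self.prev: Optional["DoubleLinkedListNode"] = None


class DoubleLinkedList:
    def __init__(self) -> None:
        # Sentinel head and rear simplify add/remove logic
        self.head = DoubleLinkedListNode()
        self.rear = DoubleLinkedListNode()
        self.head.next, self.rear.prev = self.rear, self.head
```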
other/lru_cache.py
Outdated
        self.rear = Double_Linked_List_Node(None, None)
        self.head.next, self.rear.prev = self.rear, self.head

    def add(self, node: Double_Linked_List_Node) -> None:
Needs doctests
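A minimal sketch of what a doctest for `add` could look like, using the renamed classes (illustrative only):

```python
class DoubleLinkedListNode:
    def __init__(self, key=None, val=None):
        self.key, self.val = key, val
        self.next = self.prev = None


class DoubleLinkedList:
    def __init__(self):
        # Sentinel head and rear bracket the real nodes
        self.head = DoubleLinkedListNode()
        self.rear = DoubleLinkedListNode()
        self.head.next, self.rear.prev = self.rear, self.head

    def add(self, node: DoubleLinkedListNode) -> None:
        """
        Insert node just before the rear sentinel (most recently used end).

        >>> dll = DoubleLinkedList()
        >>> node = DoubleLinkedListNode(1, 1)
        >>> dll.add(node)
        >>> dll.head.next is node
        True
        >>> dll.rear.prev is node
        True
        """
        temp = self.rear.prev
        temp.next, node.prev = node, temp
        self.rear.prev, node.next = node, self.rear


if __name__ == "__main__":
    import doctest

    doctest.testmod()
```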
other/lru_cache.py
Outdated
        temp.next, node.prev = node, temp
        self.rear.prev, node.next = node, self.rear

    def remove(self, node: Double_Linked_List_Node) -> Double_Linked_List_Node:
Needs doctests
other/lru_cache.py
Outdated
        return node


class Lru_Cache:
Naming convention as suggested above
Are there any changes required for the decorator, @cclauss?
Thanks a lot for the review @onlinejudge95, I completely overlooked the type hints. I didn't add doctests for the linked lists as they don't have any utility of their own (they were made specifically for the cache).
other/lru_cache.py
Outdated
...     res = fib(i)

>>> fib.cache_info()
'CacheInfo(hits=194, misses=99, capacity=100, current size=99)'
AWESOME!!!
other/lru_cache.py
Outdated
@@ -114,11 +132,46 @@ def set(self, key: int, value: int) -> None:
        node.val = value
        self.list.add(node)

    def has_key(self, key: int) -> bool:
    def cache_info(self) -> str:
def cache_info(self) -> str:
def __str__(self) -> str:
This allows us to just `print(cache)`.
Use `__repr__` so that `>>> cache` can be used in doctests.
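A minimal sketch of the `__repr__` idea, assuming the stat attribute names used elsewhere in the PR (`hits`, `miss`, `capacity`, `num_keys`); only the stats needed for the string are shown:

```python
class LruCache:
    """Minimal sketch: only the counters needed by __repr__."""

    def __init__(self, capacity: int) -> None:
        self.capacity = capacity
        self.num_keys = 0
        self.hits = 0
        self.miss = 0

    def __repr__(self) -> str:
        # Matches the CacheInfo-style string shown in the doctests above
        return (
            f"CacheInfo(hits={self.hits}, misses={self.miss}, "
            f"capacity={self.capacity}, current size={self.num_keys})"
        )


cache = LruCache(100)
print(cache)  # CacheInfo(hits=0, misses=0, capacity=100, current size=0)
```

Since no `__str__` is defined, `print(cache)` falls back to `__repr__`, so both `print(cache)` and a bare `>>> cache` in a doctest produce the same string.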
other/lru_cache.py
Outdated
        if result is not None:
            return result

        result = func(*args, **kwargs)
        LruCache.decorator_function_to_instance_map[func].set(args[0], result)
        if result is not None:
            return result
        result = func(*args, **kwargs)
        LruCache.decorator_function_to_instance_map[func].set(args[0], result)
        if not result:
            result = func(*args, **kwargs)
            LruCache.decorator_function_to_instance_map[func].set(args[0], result)
Use `result is None` instead, as the result might also be 0.
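A quick illustration of why `result is None` matters here: a legitimately cached value of `0` is falsy, so `if not result` would misclassify a cache hit as a miss:

```python
# Hypothetical value previously stored in the cache for fib(0)
cached_result = 0

if not cached_result:
    print("treated as a miss")  # wrong: 0 is a real cached value

if cached_result is None:
    print("miss")
else:
    print("hit")                # correct: only None signals absence
```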
        self.capacity = capacity
        self.num_keys = 0
        self.hits = 0
        self.miss = 0
        self.cache = {}
Why have both a cache and a decorator_function_to_instance_map? Could we have one instead of two?
We need both, as the cache maps the keys to the Double Linked List nodes.
        self.capacity = capacity
        self.num_keys = 0
        self.hits = 0
        self.miss = 0
        self.cache = {}
def __contains__(self, key: int) -> bool:
    """
    >>> cache = LruCache(1)
    >>> 1 in cache
    False
    >>> cache.set(1, 1)
    >>> 1 in cache
    True
    """
    return key in self.cache
`__contains__()` is the modern version of `has_key()`. It enables the use of `in`.
One of the coolest submissions ever!!
Thanks a lot :) 👍
Traceback (most recent call last):
...
ValueError: Key '2' not found in cache
>>> cache.get(2)  # None returned
Why get rid of the exception?
Now the caller is blind to whether 2 was in cache and set to None vs. 2 was not in cache.
It's given in the type hint: the function will return an integer, or None if the key is absent.
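A small illustration of the ambiguity being discussed, using a hypothetical dict-backed mini-cache (not the PR's code): returning `None` on a miss is indistinguishable from a stored `None`, whereas raising makes the miss explicit:

```python
# key 1 is present, with the value None deliberately stored
store = {1: None}


def get_returning_none(key):
    # Miss and stored-None both yield None: the caller cannot tell them apart
    return store.get(key)


def get_raising(key):
    # A missing key raises, so a returned None always means a stored None
    if key not in store:
        raise KeyError(f"Key '{key}' not found in cache")
    return store[key]


print(get_returning_none(1) == get_returning_none(2))  # True: indistinguishable
print(get_raising(1))  # None (a real stored value); get_raising(2) would raise
```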
other/lru_cache.py
Outdated
        def cache_info():
            if func not in LruCache.decorator_function_to_instance_map:
                return "Cache for function not initialized"
Should this raise an Exception?
Yes, but I later realized it's an impossible case, so I removed it.
One of the coolest submissions ever!!
@cclauss won't it be merged? And should I add an LFU Cache too? (It runs in O(capacity) time, not O(1).)
I approved this PR as it is. @onlinejudge95 should approve it as well before it is merged. My sense is that an LFU cache is not really needed. If you want to add it, you can, but in a separate Python file in a separate pull request.
* Added LRU Cache
* Optimized the program
* Added Cache as Decorator + Implemented suggestions
* Implemented suggestions