Added LRU Cache #2138

Merged
merged 4 commits into master on Jun 25, 2020

Conversation

ruppysuppy
Member

Describe your change:

  • Add an algorithm?
  • Fix a bug or typo in an existing algorithm?
  • Documentation change?

Checklist:

  • I have read CONTRIBUTING.md.
  • This pull request is all my own work -- I have not plagiarized.
  • I know that pull requests will not be merged if they fail the automated tests.
  • This PR only changes one algorithm file. To ease review, please open separate PRs for separate algorithms.
  • All new Python files are placed inside an existing directory.
  • All filenames are in all lowercase characters with no spaces or dashes.
  • All functions and variable names follow Python naming conventions.
  • All function parameters and return values are annotated with Python type hints.
  • All functions have doctests that pass the automated testing.
  • All new algorithms have a URL in its comments that points to Wikipedia or other similar explanation.
  • If this pull request resolves one or more open issues then the commit message contains Fixes: #{$ISSUE_NO}.

@cclauss
Member

cclauss commented Jun 20, 2020

Fascinating approach to use a doubly linked list. More Pythonic to use a dict.

It would be cool if this could support the decorator calling conventions of https://docs.python.org/3/library/functools.html#functools.lru_cache
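For reference, the decorator calling convention of functools.lru_cache looks like this (a minimal sketch; the fib function is just an illustration, not code from this PR):

```python
from functools import lru_cache


@lru_cache(maxsize=100)
def fib(n: int) -> int:
    """Naive recursive Fibonacci, memoized by the decorator."""
    return n if n < 2 else fib(n - 1) + fib(n - 2)


print(fib(10))           # 55
print(fib.cache_info())  # CacheInfo(hits=8, misses=11, maxsize=100, currsize=11)
```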

@ruppysuppy
Member Author

I'll add it as a decorator on Monday, but I don't know how to do it with a dictionary in O(1) time.
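For what it's worth, a dict-based O(1) LRU is possible because Python dicts (and collections.OrderedDict) preserve insertion order. A hypothetical sketch, not part of this PR:

```python
from collections import OrderedDict


class DictLruCache:
    """Hypothetical O(1) LRU cache built on an OrderedDict (not this PR's code)."""

    def __init__(self, capacity: int) -> None:
        self.capacity = capacity
        self.cache = OrderedDict()  # key -> value, least recently used first

    def get(self, key: int):
        if key not in self.cache:
            return None
        self.cache.move_to_end(key)  # O(1): mark as most recently used
        return self.cache[key]

    def set(self, key: int, value: int) -> None:
        if key in self.cache:
            self.cache.move_to_end(key)
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)  # O(1): evict least recently used
```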

@cclauss
Member

cclauss commented Jun 20, 2020

Skip the dict for now. Let’s land it as a doubly linked list implementation.

@@ -0,0 +1,127 @@
class Double_Linked_List_Node():
Collaborator

Do not use snake_case for class names. Also, no () is needed, as you are not subclassing anything.

You might wanna try

class DoubleLinkedListNode:

Double Linked List Node built specifically for LRU Cache
'''

def __init__(self, key, val):
Collaborator

Type hints please

self.prev = None


class Double_Linked_List():
Collaborator

Same as above

'''

def __init__(self):
self.head = Double_Linked_List_Node(None, None)
Collaborator

How about a sensible default in the method declaration?

self.rear = Double_Linked_List_Node(None, None)
self.head.next, self.rear.prev = self.rear, self.head

def add(self, node: Double_Linked_List_Node) -> None:
Collaborator

Needs doctests

temp.next, node.prev = node, temp
self.rear.prev, node.next = node, self.rear

def remove(self, node: Double_Linked_List_Node) -> Double_Linked_List_Node:
Collaborator

Needs doctests

return node


class Lru_Cache:
Collaborator

Naming convention as suggested above

@ruppysuppy (Member Author) left a comment

Are there any changes required for the decorator, @cclauss?

@ruppysuppy
Member Author

Thanks a lot for the review @onlinejudge95, I completely overlooked the type hints. I didn't add doctests for the linked list, as it has no utility of its own (it was made specifically for the cache).

... res = fib(i)

>>> fib.cache_info()
'CacheInfo(hits=194, misses=99, capacity=100, current size=99)'
Member

AWESOME!!!

@@ -114,11 +132,46 @@ def set(self, key: int, value: int) -> None:
node.val = value
self.list.add(node)

def has_key(self, key: int) -> bool:
def cache_info(self) -> str:
@cclauss (Member), Jun 21, 2020

Suggested change
def cache_info(self) -> str:
def __str__(self) -> str:

This allows us to just print(cache).

Member Author

Using __repr__, so that doctests can use >>> cache directly.
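To illustrate the point about repr in doctests (a toy class, not the PR's code): evaluating a bare expression at the >>> prompt displays its repr(), so defining __repr__ makes >>> cache work without an explicit print:

```python
class CacheInfoExample:
    """Toy class showing why __repr__ is handy in doctests.

    >>> CacheInfoExample(hits=194, misses=99)
    CacheInfo(hits=194, misses=99)
    """

    def __init__(self, hits: int, misses: int) -> None:
        self.hits = hits
        self.misses = misses

    def __repr__(self) -> str:
        # doctest compares the repr() of a bare expression against the expected output
        return f"CacheInfo(hits={self.hits}, misses={self.misses})"
```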

Comment on lines 157 to 162

if result is not None:
return result

result = func(*args, **kwargs)
LruCache.decorator_function_to_instance_map[func].set(args[0], result)
@cclauss (Member), Jun 21, 2020

Suggested change
if result is not None:
return result
result = func(*args, **kwargs)
LruCache.decorator_function_to_instance_map[func].set(args[0], result)
if not result:
result = func(*args, **kwargs)
LruCache.decorator_function_to_instance_map[func].set(args[0], result)

Member Author

Using result is None, as the result might also legitimately be 0.
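A quick standalone illustration of why the is None check matters (not the PR's code):

```python
# A cache holding a legitimately falsy value, e.g. fib(0) == 0
cache = {0: 0}

result = cache.get(0)  # a real hit: the cached value is 0

is_hit_with_none_check = result is not None  # correct: 0 is not None
is_hit_with_truthiness = bool(result)        # wrong: 0 is falsy, so the hit looks like a miss

assert is_hit_with_none_check is True
assert is_hit_with_truthiness is False
```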

self.capacity = capacity
self.num_keys = 0
self.hits = 0
self.miss = 0
self.cache = {}
@cclauss (Member), Jun 21, 2020

Why have both a cache and a decorator_function_to_instance_map? Could we have one instead of two?

Member Author

We need both: cache maps keys to Double Linked List Nodes, while decorator_function_to_instance_map maps each decorated function to its cache instance.

self.capacity = capacity
self.num_keys = 0
self.hits = 0
self.miss = 0
self.cache = {}

Member

def __contains__(self, key: int) -> bool:
    """
    >>> cache = LruCache(1)
    >>> 1 in cache
    False
    >>> cache.set(1, 1)
    >>> 1 in cache
    True
    """
    return key in self.cache

@cclauss (Member), Jun 21, 2020

__contains__() is the modern version of has_key(). It enables the use of in.

Member Author

One of the coolest submissions ever!!

Thanks a lot :) 👍

Traceback (most recent call last):
...
ValueError: Key '2' not found in cache
>>> cache.get(2) # None returned
Member

Why get rid of the exception?
Now the caller is blind to whether 2 was in cache and set to None vs. 2 was not in cache.
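One way to preserve that distinction without raising on every miss is a sentinel default (a standalone sketch, not what the PR does):

```python
_MISSING = object()  # unique sentinel; no cached value can be identical to it


def strict_get(cache: dict, key):
    """Distinguish 'key maps to None' from 'key absent'."""
    value = cache.get(key, _MISSING)
    if value is _MISSING:
        raise KeyError(f"Key '{key}' not found in cache")
    return value


store = {2: None}
assert strict_get(store, 2) is None  # present, its value just happens to be None
```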

Member Author

It's given in the type hint: the function returns an integer, or None if the key is absent.


def cache_info():
if func not in LruCache.decorator_function_to_instance_map:
return "Cache for function not initialized"
Member

Should this raise an Exception?

Member Author

Yes, but I later realized it's an impossible case, so I removed it.

@cclauss (Member) left a comment

One of the coolest submissions ever!!

@ruppysuppy
Member Author

@cclauss won't it be merged? And should I add an LFU Cache too? (It runs in O(capacity) time, not O(1).)

@cclauss
Member

cclauss commented Jun 23, 2020

I approved this PR as it is. @onlinejudge95 should approve it as well before it is merged.

My sense is that an LFU cache is not really needed. If you want to add it, you can, but put it in a separate Python file in a separate pull request.

@cclauss cclauss merged commit 27dde06 into TheAlgorithms:master Jun 25, 2020
stokhos pushed a commit to stokhos/Python that referenced this pull request Jan 3, 2021
* Added LRU Cache

* Optimized the program

* Added Cache as Decorator + Implemented suggestions

* Implemented suggestions

3 participants