Conversation
Fascinating approach to use a doubly linked list. More Pythonic to use a dict. It would be cool if this could support the decorator calling conventions of https://docs.python.org/3/library/functools.html#functools.lru_cache
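For reference, the `functools.lru_cache` decorator mentioned above is used like this (a standalone illustration, not the PR's code; the `maxsize` value and the `fib` function are arbitrary choices):

```python
from functools import lru_cache


@lru_cache(maxsize=100)
def fib(n: int) -> int:
    # repeated subproblems are served from the cache in O(1)
    return n if n < 2 else fib(n - 1) + fib(n - 2)


print(fib(30))           # 832040
print(fib.cache_info())  # CacheInfo(hits=28, misses=31, maxsize=100, currsize=31)
```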
I'll add it as the decorator on Monday. But I don't know how to do it using a dictionary in O(1) time.
Skip the dict for now. Let’s land it as a doubly linked list implementation.
other/lru_cache.py
Outdated
@@ -0,0 +1,127 @@
class Double_Linked_List_Node():
Do not use snake_case for class names. Also, no () needed as you are not subclassing from any class.
You might wanna try
class DoubleLinkedListNode:
other/lru_cache.py
Outdated
    Double Linked List Node built specifically for LRU Cache
    '''

    def __init__(self, key, val):
other/lru_cache.py
Outdated
        self.prev = None


class Double_Linked_List():
other/lru_cache.py
Outdated
    '''

    def __init__(self):
        self.head = Double_Linked_List_Node(None, None)
How about a sensible default in the method declaration?
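A sketch of what that might look like (hypothetical code following the rename suggested earlier, not the PR's exact implementation):

```python
from typing import Any


class DoubleLinkedListNode:
    """Node with sensible defaults, so the sentinel head/rear
    nodes can be created without passing explicit None values."""

    def __init__(self, key: Any = None, val: Any = None) -> None:
        self.key = key
        self.val = val
        self.prev: Any = None
        self.next: Any = None


head = DoubleLinkedListNode()  # instead of DoubleLinkedListNode(None, None)
print(head.key, head.val)      # None None
```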
other/lru_cache.py
Outdated
        self.rear = Double_Linked_List_Node(None, None)
        self.head.next, self.rear.prev = self.rear, self.head

    def add(self, node: Double_Linked_List_Node) -> None:
other/lru_cache.py
Outdated
        temp.next, node.prev = node, temp
        self.rear.prev, node.next = node, self.rear

    def remove(self, node: Double_Linked_List_Node) -> Double_Linked_List_Node:
other/lru_cache.py
Outdated
        return node


class Lru_Cache:
Naming convention as suggested above
ruppysuppy
left a comment
Are there any changes required for the decorator @cclauss?
Thanks a lot for the review @onlinejudge95, I completely overlooked the type hints. I didn't add doctests for the linked lists as they don't have any utility of their own (they were made for the specific use case of the cache).
other/lru_cache.py
Outdated
...     res = fib(i)

>>> fib.cache_info()
'CacheInfo(hits=194, misses=99, capacity=100, current size=99)'
other/lru_cache.py
Outdated
        self.list.add(node)

    def has_key(self, key: int) -> bool:
    def cache_info(self) -> str:
-    def cache_info(self) -> str:
+    def __str__(self) -> str:
This allows us to just print(cache).
Use __repr__ instead, so that >>> cache works in doctests.
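A minimal sketch of the difference (the class layout here is illustrative, not the PR's exact code): doctest compares output against what the interactive prompt echoes, which is `repr(obj)`, while `print(obj)` uses `str(obj)`, which falls back to `__repr__` when `__str__` is not defined — so defining only `__repr__` covers both cases.

```python
class LRUCache:
    """
    >>> cache = LRUCache(100)
    >>> cache                    # the prompt (and doctest) echoes repr(cache)
    CacheInfo(hits=0, misses=0, capacity=100, current size=0)
    >>> print(cache)             # str() falls back to __repr__
    CacheInfo(hits=0, misses=0, capacity=100, current size=0)
    """

    def __init__(self, capacity: int) -> None:
        self.capacity = capacity
        self.hits = 0
        self.miss = 0
        self.num_keys = 0

    def __repr__(self) -> str:
        return (
            f"CacheInfo(hits={self.hits}, misses={self.miss}, "
            f"capacity={self.capacity}, current size={self.num_keys})"
        )
```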
other/lru_cache.py
Outdated
        if result is not None:
            return result

        result = func(*args, **kwargs)
        LruCache.decorator_function_to_instance_map[func].set(args[0], result)
-    if result is not None:
-        return result
-    result = func(*args, **kwargs)
-    LruCache.decorator_function_to_instance_map[func].set(args[0], result)
+    if not result:
+        result = func(*args, **kwargs)
+    LruCache.decorator_function_to_instance_map[func].set(args[0], result)
Using result is None instead, as the cached result might also be 0 (which is falsy).
        self.num_keys = 0
        self.hits = 0
        self.miss = 0
        self.cache = {}
Why have both a cache and a decorator_function_to_instance_map? Could we have one instead of two?
We need both, as the cache maps the keys to the Double Linked List nodes.
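An illustrative sketch of the two maps (the names follow the PR, but the code here is a simplified reconstruction, not the actual implementation): `decorator_function_to_instance_map` routes each decorated function to its own `LruCache` instance, while each instance's `cache` maps a key straight to its node in the linked list, which is what makes O(1) moves to the rear possible.

```python
class DoubleLinkedListNode:
    def __init__(self, key=None, val=None):
        self.key, self.val = key, val
        self.prev = self.next = None


class LruCache:
    # class-level map: decorated function -> its LruCache instance
    decorator_function_to_instance_map: dict = {}

    def __init__(self, capacity: int) -> None:
        self.capacity = capacity
        self.cache: dict = {}  # per-instance map: key -> linked-list node


cache = LruCache(10)
node = DoubleLinkedListNode(1, "one")
cache.cache[1] = node      # O(1) jump from key to node
print(cache.cache[1].val)  # one
```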
        self.hits = 0
        self.miss = 0
        self.cache = {}
def __contains__(self, key) -> bool:
    """
    >>> cache = LruCache(1)
    >>> 1 in cache
    False
    >>> cache.set(1, 1)
    >>> 1 in cache
    True
    """
    return key in self.cache
__contains__() is the modern version of has_key(). It enables the use of in.
One of the coolest submissions ever!!
Thanks a lot :) 👍
Traceback (most recent call last):
    ...
ValueError: Key '2' not found in cache
>>> cache.get(2)  # None returned
Why get rid of the exception?
Now the caller is blind to whether 2 was in cache and set to None vs. 2 was not in cache.
It's given in the type hint: the function will return an integer, or None if the key is absent.
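If distinguishing the two cases ever matters, a common pattern (shown here as a hypothetical sketch, not part of the PR) is a unique sentinel default, the same idea behind `dict.get`-style APIs:

```python
_MISSING = object()  # unique sentinel; never equal to any stored value


class Cache:
    def __init__(self) -> None:
        self._data = {}

    def set(self, key, value) -> None:
        self._data[key] = value

    def get(self, key, default=_MISSING):
        value = self._data.get(key, _MISSING)
        if value is not _MISSING:
            return value  # may legitimately be None
        if default is _MISSING:
            raise KeyError(f"Key {key!r} not found in cache")
        return default


cache = Cache()
cache.set(1, None)
print(cache.get(1))            # None -- genuinely stored
print(cache.get(2, "absent"))  # absent -- missing key, default returned
```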
other/lru_cache.py
Outdated
def cache_info():
    if func not in LruCache.decorator_function_to_instance_map:
        return "Cache for function not initialized"
Should this raise an Exception?
Yes, but I later realized it's an impossible case, so I removed it.
cclauss
left a comment
One of the coolest submissions ever!!
@cclauss won't it be merged? And should I add an LFU Cache too? (It runs in O(capacity) time, not O(1).)
I approved this PR as it is. @onlinejudge95 should approve it as well before it is merged. My sense is that LFU cache is not really needed. If you want to add it, you can, but in a separate Python file in a separate pull request.
* Added LRU Cache
* Optimized the program
* Added Cache as Decorator + Implemented suggestions
* Implemented suggestions