
I'd like to activate or deactivate a "cache" in some class method during execution.

I found a way to activate it with something like this:

(...)
setattr(self, "_greedy_function", my_cache_decorator(self._cache)(getattr(self, "_greedy_function")))
(...)

where self._cache is a cache object of my own that stores the results of self._greedy_function.

It works fine, but what if I now want to deactivate the cache and "undecorate" _greedy_function?

I see a possible solution: storing a reference to _greedy_function before decorating it. But maybe there is a way to retrieve the original from the decorated function, which would be better.
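The "store a reference first" approach can be sketched roughly like this (a minimal, hypothetical example: `simple_cache`, `Worker`, `enable_cache`, and `disable_cache` are stand-ins for the question's decorator and class, not actual names from it):

```python
from functools import wraps

def simple_cache(cache):
    """Stand-in for the question's cache decorator."""
    def decorator(func):
        @wraps(func)
        def wrapped(*args):
            if args not in cache:
                cache[args] = func(*args)
            return cache[args]
        return wrapped
    return decorator

class Worker:
    def __init__(self):
        self._cache = {}
        self.calls = 0  # counts real (uncached) invocations

    def _greedy_function(self, x):
        self.calls += 1
        return x * 2

    def enable_cache(self):
        # save the undecorated bound method before wrapping it...
        self._original_greedy_function = self._greedy_function
        self._greedy_function = simple_cache(self._cache)(self._greedy_function)

    def disable_cache(self):
        # ...so it can simply be restored later
        self._greedy_function = self._original_greedy_function
```

After `disable_cache()`, calls go straight to the original method again (the cache dict survives, so re-enabling could reuse it).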

As requested, here are the decorator and the cache object I'm using to cache results of my class functions:

import logging
from collections import OrderedDict, namedtuple
from functools import wraps

logging.basicConfig(
    level=logging.WARNING,
    format='%(asctime)s %(name)s %(levelname)s %(message)s'
)

logger = logging.getLogger(__name__)
logger.setLevel(logging.INFO)

CacheInfo = namedtuple("CacheInfo", "hits misses maxsize currsize")

def lru_cache(cache):
    """
    A replacement for functools.lru_cache() built on a custom LRU class.
    It can cache class methods.
    """
    def decorator(func):
        logger.debug("assigning cache %r to function %s" % (cache, func.__name__))
        @wraps(func)
        def wrapped_func(*args, **kwargs):
            try:
                ret = cache[args]
                logger.debug("cached value returned for function %s" % func.__name__)
                return ret
            except KeyError:
                ret = func(*args, **kwargs)
                logger.debug("cache updated for function %s" % func.__name__)
                cache[args] = ret
                return ret
        return wrapped_func
    return decorator

class LRU(OrderedDict):
    """
    Custom implementation of an LRU cache, built on top of an OrderedDict.
    """
    __slots__ = "_hits", "_misses", "_maxsize"

    def __new__(cls, maxsize=128):
        if maxsize is None:
            return None
        return super().__new__(cls, maxsize=maxsize)

    def __init__(self, maxsize=128, *args, **kwargs):
        self.maxsize = maxsize
        self._hits = 0
        self._misses = 0
        super().__init__(*args, **kwargs)

    def __getitem__(self, key):
        try:
            value = super().__getitem__(key)
        except KeyError:
            self._misses += 1
            raise
        else:
            self.move_to_end(key)
            self._hits += 1
            return value

    def __setitem__(self, key, value):
        super().__setitem__(key, value)
        if len(self) > self._maxsize:
            oldest = next(iter(self))
            super().__delitem__(oldest)

    def __delitem__(self, key):
        try:
            super().__delitem__(key)
        except KeyError:
            pass

    def __repr__(self):
        return "<%s object at %s: %s>" % (self.__class__.__name__, hex(id(self)), self.cache_info())

    def cache_info(self):
        return CacheInfo(self._hits, self._misses, self._maxsize, len(self))

    def clear(self):
        super().clear()
        self._hits, self._misses = 0, 0

    @property
    def maxsize(self):
        return self._maxsize

    @maxsize.setter
    def maxsize(self, maxsize):
        if not isinstance(maxsize, int):
            raise TypeError("maxsize must be an integer")
        elif maxsize < 2:
            raise ValueError("maxsize must be >= 2")
        elif maxsize & (maxsize - 1) != 0:
            logger.warning("LRU feature performs best when maxsize is a power-of-two, maybe.")
        while maxsize < len(self):
            oldest = next(iter(self))
            super().__delitem__(oldest)
        self._maxsize = maxsize

Edit: I've updated my code using the __wrapped__ attribute suggested in the comments, and it works fine! The whole thing is here: https://gist.github.com/fbparis/b3ddd5673b603b42c880974b23db7cda (kik.set_cache() method...)
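The __wrapped__ approach mentioned in this edit can be sketched roughly as follows (a minimal example, not the gist's actual code; `cache_decorator` and `Thing` are hypothetical names). It relies on functools.wraps, which stores the wrapped callable as `__wrapped__` on the wrapper it creates:

```python
from functools import wraps

def cache_decorator(cache):
    def decorator(func):
        @wraps(func)  # wraps() sets wrapped.__wrapped__ = func
        def wrapped(*args):
            if args not in cache:
                cache[args] = func(*args)
            return cache[args]
        return wrapped
    return decorator

class Thing:
    def __init__(self):
        self._cache = {}

    def _greedy_function(self, x):
        return x + 1

    def set_cache(self):
        self._greedy_function = cache_decorator(self._cache)(self._greedy_function)

    def unset_cache(self):
        # recover the original bound method from the wrapper
        self._greedy_function = self._greedy_function.__wrapped__
```

No separate reference to the original needs to be kept: the wrapper itself carries it.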

  • "maybe there is a way to retrieve it from the decorated function" Maybe, unfortunately we don't know what the decorated function is. Commented Mar 18, 2019 at 0:12
  • @Goyo The "decorated function" is _greedy_function, which is generated by my_cache_decorator. So the question is already clearly defined. Although it would be better if the OP could provide more context for the decorator.
    – gdlmx
    Commented Mar 18, 2019 at 0:25
  • How is lru_cache used in your class? I don't see any reference to it after declaration.
    – gdlmx
    Commented Mar 18, 2019 at 0:36
  • 2
    Usually wrappers provide the __wrapped__ attribute for this purpose. That said, I’d recommend using/making a wrapper that provides a switch, rather than removing and reinstating the wrapper. Commented Mar 18, 2019 at 2:25
  • 1
    There’s no reason to use setattr/getattr with an identifier as a string literal. Commented Mar 19, 2019 at 6:52

2 Answers


You have made things too complicated. The decorator can be simply removed by del self._greedy_function. There's no need for a __wrapped__ attribute.

Here is a minimal implementation of the set_cache and unset_cache methods:

class LRU(OrderedDict):
    def __init__(self, maxsize=128, *args, **kwargs):
        # ...
        self._cache = dict()
        super().__init__(*args, **kwargs)

    def _greedy_function(self):
        time.sleep(1)
        return time.time()

    def set_cache(self):
        self._greedy_function = lru_cache(self._cache)(self._greedy_function)

    def unset_cache(self):
        del self._greedy_function

Using your decorator lru_cache, here are the results:

o = LRU()
o.set_cache()
print('First call', o._greedy_function())
print('Second call', o._greedy_function()) # Here it prints out the cached value
o.unset_cache()
print('Third call', o._greedy_function()) # The cache is not used

Outputs

First call 1552966668.735025
Second call 1552966668.735025
Third call 1552966669.7354007
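Why does `del self._greedy_function` work here? Because `set_cache` assigns the wrapper as an *instance* attribute, which shadows the class-level method during attribute lookup; deleting the instance attribute simply re-exposes the original method on the class. A minimal sketch of that mechanism (`Demo` is a hypothetical class, not from the answer):

```python
class Demo:
    def method(self):
        return "original"

d = Demo()
# assigning on the instance shadows the class method...
d.method = lambda: "wrapped"
assert d.method() == "wrapped"
# ...and deleting the instance attribute removes only the shadow,
# so lookup falls back to the class again
del d.method
assert d.method() == "original"
```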
  • 1
    That’s certainly a clean approach when the function in question is a method. Commented Mar 19, 2019 at 3:43
  • You put the _cache and _greedy_function in the LRU class, but my LRU class is not intended for this; the stuff I want to cache is in other classes, which is why it's complicated... But maybe you're right, so I'll review my code first...
    – fbparis
    Commented Mar 19, 2019 at 9:29
  • @fbparis Sorry that I didn't read through the 976 lines of code in your GitHub gist. I just took a look at your kik._set_cache(...), which sets decorators on other methods of kik. My technique should work in that scenario, because the cache content is basically irrelevant: it can be located anywhere and set to anything.
    – gdlmx
    Commented Mar 20, 2019 at 16:19
  • @gdlmx Thanks for your answer anyway; I think next time I'll go your way (the 976 lines of unreadable code I've posted are only there to provide context, I didn't expect people to read them :D)
    – fbparis
    Commented Mar 21, 2019 at 1:37
  • The common use case is a decorator applied with @decorator above a function definition. You don't actually show how to remove that. That would be really helpful!
    – rjurney
    Commented May 12, 2021 at 22:09

Modern versions of functools.wraps install the original function as a __wrapped__ attribute on the wrappers they create. (One could also search through __closure__ on the nested functions typically used for this purpose, though other callable types could be used as well.) It’s reasonable to expect any given wrapper to follow this convention.
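Both retrieval routes can be seen on a plain decorated function (a minimal sketch; `decorator` and `f` are illustrative names):

```python
from functools import wraps

def decorator(func):
    @wraps(func)  # wraps() sets wrapper.__wrapped__ = func
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@decorator
def f(x):
    return x * x

# route 1: the convention installed by functools.wraps
original = f.__wrapped__
assert original(3) == 9

# route 2: dig the original out of the wrapper's closure cells
assert original in [c.cell_contents for c in f.__closure__]
```

Route 1 is the one to rely on; the closure search only works when the wrapper is a nested function that closes over the original.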

An alternative is to have a permanent wrapper that can be controlled by a flag, so that it can be enabled and disabled without removing and reinstating it. This has the advantage that the wrapper can keep its state (here, the cached values). The flag can be a separate variable (e.g., another attribute on an object bearing the wrapped function, if any) or can be an attribute on the wrapper itself.
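A sketch of that alternative, with the flag stored as an attribute on the wrapper itself (hypothetical names `switchable_cache` and `square`, not from the question):

```python
from functools import wraps

def switchable_cache(func):
    cache = {}
    @wraps(func)
    def wrapper(*args):
        if wrapper.enabled:
            if args not in cache:
                cache[args] = func(*args)
            return cache[args]
        return func(*args)  # cache bypassed, but its contents survive
    wrapper.enabled = True  # the switch: a flag on the wrapper itself
    return wrapper

calls = []

@switchable_cache
def square(x):
    calls.append(x)  # record real (uncached) invocations
    return x * x
```

Flipping `square.enabled = False` bypasses the cache without discarding it; setting it back to True resumes serving the previously cached values.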
