A while ago I asked this question, Memoizing decorator that can retry, and then promptly forgot about it. More recently I saw Python decorator for retrying w/exponential backoff and wanted to add backoff, but figured I should improve what I already have first. I have taken some of the suggestions on the original, but really I've completely reimagined it (and if I do say so myself, rather cleverly too). My biggest issue is that my current distinction between "exceptions that get suppressed but not remembered" and "exceptions that get remembered" seems like it might be a bit confusing - is there a friendlier interface here?
import functools
from types import MappingProxyType as FrozenDict


class Memoizer:
    def __init__(self, retry_times=0, suppressed_exceptions=tuple(), capture_exceptions=False):
        self.retry = retry
        self.retry_times = retry_times
        self.suppressed_exceptions = suppressed_exceptions
        self.capture_exceptions = capture_exceptions

    def __call__(self, function):
        d = {}
        @functools.wraps(function)
        def wrapper(*args, **kwargs, __forget=False):
            key = (args, FrozenDict(kwargs))
            if key not in d or __forget:
                i = self.retry_times
                while i > 1:
                    try:
                        d[key] = function(*args, **kwargs)
                    except self.suppressed_exceptions:
                        continue
                    except Exception as e:
                        if self.capture_exceptions:
                            d[key] = e
                            break
                        raise
                    else:
                        break
                    i -= 1
                else:
                    # If we didn't already break out, the last attempt shouldn't suppress exceptions
                    d[key] = function(*args, **kwargs)
            return d[key]
        return wrapper
3 Answers
That's a nice looking decorator you have there!
However, architecture-wise it does not look like a good idea, because your decorator has two responsibilities.
I see use cases for a memoizing decorator and I see use cases for a retry decorator. However, there are far fewer use cases for a memoizing retry decorator.
I'd recommend splitting it into two decorators you would be more likely to reuse later: a memoizing decorator that does not suppress any exceptions and lets a higher level handle them, and a retry decorator that knows nothing about memoization - only about exceptions.
I'd also recommend creating a base class for well-behaved decorators and inheriting the boilerplate from there.
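A minimal sketch of what such a base class might look like (the names `Decorator`, `call`, and `Doubled` are my own illustration, not from the post):

```python
import functools

class Decorator:
    """Base class holding the boilerplate for well-behaved decorators.

    Subclasses override `call` and inherit the wrapping logic,
    including functools.wraps, from here.
    """
    def __call__(self, function):
        @functools.wraps(function)
        def wrapper(*args, **kwargs):
            return self.call(function, *args, **kwargs)
        return wrapper

    def call(self, function, *args, **kwargs):
        # Default behaviour: delegate straight to the wrapped function.
        return function(*args, **kwargs)


class Doubled(Decorator):
    # Example subclass: doubles whatever the wrapped function returns.
    def call(self, function, *args, **kwargs):
        return 2 * function(*args, **kwargs)


@Doubled()
def add(a, b):
    return a + b
```

With this, a retry decorator and a memoizing decorator would each only have to override `call`.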
Your code is broken:

- `__forget=False` needs to be before `**kwargs`.
- `retry` is undefined. (But it is not used.)
- `MappingProxyType` is not hashable, and so you can't memoize with it.

To fix the problem with `MappingProxyType` you can create a hashable dict, which seems kinda hacky, but is better than nothing. The other two are easy fixes.
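The "hashable dict" fix could be sketched like this (my own illustration; note it only works if every value in the dict is itself hashable):

```python
class HashableDict(dict):
    # Hash by the frozenset of items so that equal dicts hash equally.
    # This is the "kinda hacky" part: it assumes all values are hashable.
    def __hash__(self):
        return hash(frozenset(self.items()))


def make_key(args, kwargs):
    # Cache key combining positional and keyword arguments.
    return (args, HashableDict(kwargs))
```

Two calls with the same arguments now produce equal, hashable keys, so they can be used for lookups in a plain dict cache.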
I don't see why you're using a class; the only benefit you get from it is that you can hide implementation details in it. And `self.` everywhere is an annoyance in your current implementation, in my opinion.
Design-wise, I still don't get the name `retry_times` - let's actually retry that amount rather than try that amount of times. And I'd split the memoizer away from the retry decorator. Being able to use, say, `functools.lru_cache` would be nice.
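For instance, once retrying is split out, the caching half could be nothing more than the standard library's `functools.lru_cache` (illustrative example; the call counter is mine):

```python
import functools

calls = []  # tracks how many times the body actually runs

@functools.lru_cache(maxsize=None)
def square(n):
    calls.append(n)
    return n * n

square(4)
square(4)  # served from the cache; the body does not run again
```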
And so I'd make the retry aspect a function, removing where it currently adds to the cache and changing the while loop to a for loop. This should get you:
def retry(retry_times=0, suppressed_exceptions=tuple(), capture_exceptions=False):
    def inner(function):
        @functools.wraps(function)
        def wrapper(*args, **kwargs):
            for _ in range(retry_times):
                try:
                    return function(*args, **kwargs)
                except suppressed_exceptions:
                    continue
                except Exception as e:
                    if capture_exceptions:
                        return e
                    raise
            else:
                return function(*args, **kwargs)
        return wrapper
    return inner
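To see this retry decorator in action, here is a quick usage sketch with a deliberately flaky function (the function and counter are mine, for illustration; the decorator is reproduced so the example is self-contained):

```python
import functools

def retry(retry_times=0, suppressed_exceptions=tuple(), capture_exceptions=False):
    def inner(function):
        @functools.wraps(function)
        def wrapper(*args, **kwargs):
            for _ in range(retry_times):
                try:
                    return function(*args, **kwargs)
                except suppressed_exceptions:
                    continue
                except Exception as e:
                    if capture_exceptions:
                        return e
                    raise
            else:
                # Loop exhausted: one last attempt, no suppression.
                return function(*args, **kwargs)
        return wrapper
    return inner


attempts = {"n": 0}

@retry(retry_times=3, suppressed_exceptions=(ValueError,))
def flaky():
    # Fails twice with ValueError, then succeeds.
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ValueError("not yet")
    return "ok"

result = flaky()  # succeeds on the third attempt
```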
And now you can use the retry part without the memoizing part!

After this, I'd just make a generic memoizer, without caching the `**kwargs`. Say:
def memoize(fn):
    cache = {}
    @functools.wraps(fn)
    def wrapper(*args, _forget=False):
        key = args
        if key not in cache or _forget:
            cache[key] = fn(*args)
        return cache[key]
    return wrapper
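A quick sketch of how the `_forget` flag behaves (the call counter is mine, for illustration; the memoizer is reproduced so the example is self-contained):

```python
import functools

def memoize(fn):
    cache = {}
    @functools.wraps(fn)
    def wrapper(*args, _forget=False):
        key = args
        if key not in cache or _forget:
            cache[key] = fn(*args)
        return cache[key]
    return wrapper


calls = []

@memoize
def double(n):
    calls.append(n)
    return n * 2

double(2)                 # computed: the body runs
double(2)                 # cached: the body does not run
double(2, _forget=True)   # recomputed: _forget bypasses the cache
```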
- Thank you for pointing those out! Luckily I fixed all of those in the latest iteration... (Dan Oberlam, Jul 11, 2016 at 20:47)
The code is good; I can find no big issues.
I can suggest using `for _ in range(self.retry_times)` as it is simpler than `while` (one less variable to think about) and makes it clear that you just want to repeat the block of code `retry_times` times and don't use the iteration counter in the code.
- The problem is that I don't want to repeat it `retry_times` times, I want to repeat it `retry_times - 1` times, which looked weird to me. (Dan Oberlam, Jul 8, 2016 at 19:35)
- @Dannnno Why do you retry one time less than the user asks? (Caridorc, Jul 8, 2016 at 19:41)
- We try the last time, but it isn't in the loop. The last one happens outside, and we don't suppress any exceptions. (Dan Oberlam, Jul 8, 2016 at 20:08)
- @Dannnno Ok, so use `retry_times - 1` (Caridorc, Jul 8, 2016 at 20:56)
- I intentionally chose not to do that - it is a bit of cognitive dissonance to use `retry_times - 1`. Do you have an idea for a better name, or do you think that the ease of the for loop is worth the weird `retry_times - 1`? (Dan Oberlam, Jul 8, 2016 at 21:11)
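The compromise discussed in these comments - a `for` loop over `retry_times - 1` with the final, unsuppressed attempt outside the loop - can be sketched like this (the helper name and the flaky function are mine, for illustration):

```python
def try_n_times(function, times, suppressed_exceptions):
    # All but the last attempt suppress the given exceptions...
    for _ in range(times - 1):
        try:
            return function()
        except suppressed_exceptions:
            continue
    # ...while the last attempt lets any exception propagate.
    return function()


state = {"n": 0}

def flaky():
    # Fails twice, then succeeds.
    state["n"] += 1
    if state["n"] < 3:
        raise ValueError("not yet")
    return "done"

result = try_n_times(flaky, 3, (ValueError,))
```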
- Why is `d` defined inside the function? I think this should erase its contents each time the function is called? I am confused as to why this works...
- `d` is changed each time, but the individual `wrapper` functions are closures over `d`, so they maintain their own copies.
- Why not have `d` defined in `__init__` if you do want to set it to empty just once?
- With `m = Memoizer()` you'd share the cache between functions with your suggestion, whereas this way the cache is made when you pass the function: `fn = Memoizer()(fn)`.
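The closure behaviour discussed here can be checked directly: every pass through the decorator creates a fresh cache dict, so two decorated functions never share entries (the `_cache` attribute below is mine, exposed only to inspect the closure):

```python
import functools

def memoize(fn):
    cache = {}  # a brand-new dict for each decorated function
    @functools.wraps(fn)
    def wrapper(*args):
        if args not in cache:
            cache[args] = fn(*args)
        return cache[args]
    wrapper._cache = cache  # exposed only so we can inspect the closure
    return wrapper


@memoize
def inc(n):
    return n + 1

@memoize
def dec(n):
    return n - 1

inc(5)
dec(5)
# inc and dec each populated their own, separate cache dict
```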