Although throwing exceptions for control flow is a controversial topic, there are some quite popular examples of this anti-pattern, such as C#'s async routines throwing OperationCanceledException to cancel a task, or Python raising StopIteration to control iterators.
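For reference, this is the StopIteration pattern in plain Python: the exception is what ends the loop rather than signalling a failure (a minimal sketch, unrelated to the package below).

class CountDown:
    """Iterator that yields start, start - 1, ..., 1."""

    def __init__(self, start: int) -> None:
        self.current = start

    def __iter__(self) -> "CountDown":
        return self

    def __next__(self) -> int:
        if self.current <= 0:
            raise StopIteration  # consumed by the for-loop machinery, not treated as an error
        value = self.current
        self.current -= 1
        return value


print(list(CountDown(3)))  # [3, 2, 1]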
I thought I'd try using such an exception with my logger decorator package (GitHub). I call it ContinuationError. In general, the decorator handles logging of the following states:

- started - before entering a function
- completed - when a function executed successfully
- canceled - when a function exited prematurely
- faulted - when something unexpected occurred
The exception supports the canceled state by replacing the somewhat spammy logging call with a raised error:
Before:
logger.canceled(reason="No luck!")
return 5
After:
raise ContinuationError("No luck!", 5)
It expects a reason for why the cancellation was necessary, optionally a return value if the function is expected to return something, and optionally other arguments that are later rendered into a JSON message.
class ContinuationError(Exception):
    """Raise this error to gracefully handle a function cancellation."""

    def __new__(cls, *args, **details) -> Any:
        instance = super().__new__(cls)
        instance.details = details | dict(reason=args[0])
        if len(args) > 1:
            instance.result = args[1]
        return instance

    def __init__(self, message: str, result: Optional[Any] = None, **details):
        super().__init__(message)
The decorator takes care of handling it by checking whether a result was provided and returning it if necessary. The decorator also accepts a lambda for creating the started details and another lambda for logging the result.
def telemetry(on_started: Optional[OnStarted] = None, on_completed: Optional[OnCompleted] = None, **kwargs):
    """Provides flow telemetry for the decorated function."""

    on_started = on_started or (lambda _: {})
    on_completed = on_completed or (lambda _: {})

    def factory(decoratee):
        @contextlib.contextmanager
        def logger_scope() -> Logger:
            logger = Logger(
                module=inspect.getmodule(decoratee).__name__,
                scope=decoratee.__name__,
                attachment=kwargs.pop("attachment", None),
                parent=_scope.get()
            )
            token = _scope.set(logger)
            try:
                yield logger
            except Exception:
                logger.faulted()
                raise
            finally:
                _scope.reset(token)

        def inject_logger(logger: Logger, d: Dict):
            """Injects Logger if required."""
            for n, t in inspect.getfullargspec(decoratee).annotations.items():
                if t is Logger:
                    d[n] = logger

        def params(*decoratee_args, **decoratee_kwargs) -> Dict[str, Any]:
            # Zip arg names and their indexes up to the number of args of the decoratee_args.
            arg_pairs = zip(inspect.getfullargspec(decoratee).args, range(len(decoratee_args)))
            # Turn arg_pairs into a dictionary and combine it with decoratee_kwargs.
            return {t[0]: decoratee_args[t[1]] for t in arg_pairs} | decoratee_kwargs

        if asyncio.iscoroutinefunction(decoratee):
            @functools.wraps(decoratee)
            async def decorator(*decoratee_args, **decoratee_kwargs):
                with logger_scope() as scope:
                    inject_logger(scope, decoratee_kwargs)
                    scope.started(**on_started(params(*decoratee_args, **decoratee_kwargs)))
                    try:
                        result = await decoratee(*decoratee_args, **decoratee_kwargs)
                        scope.completed(**on_completed(result))
                        return result
                    except ContinuationError as e:
                        if hasattr(e, "result"):
                            scope.canceled(**(on_completed(e.result) | e.details))
                            return e.result
                        else:
                            scope.canceled(**e.details)
        else:
            @functools.wraps(decoratee)
            def decorator(*decoratee_args, **decoratee_kwargs):
                with logger_scope() as scope:
                    inject_logger(scope, decoratee_kwargs)
                    scope.started(**on_started(params(*decoratee_args, **decoratee_kwargs)))
                    try:
                        result = decoratee(*decoratee_args, **decoratee_kwargs)
                        scope.completed(**on_completed(result))
                        return result
                    except ContinuationError as e:
                        if hasattr(e, "result"):
                            scope.canceled(**(on_completed(e.result) | e.details))
                            return e.result
                        else:
                            scope.canceled(**e.details)

        decorator.__signature__ = inspect.signature(decoratee)
        return decorator

    return factory
Later one of the logging APIs checks for the exception and decides whether to log a normal message or an actual error:
def _log(self, **kwargs):
    status = inspect.stack()[1][3]
    details = Logger.serialize_details(**kwargs)
    with _create_log_record(
        functools.partial(_set_module_name, name=self.module),
        functools.partial(_set_func_name, name=self.scope)
    ):
        # Ignore the ContinuationError as an actual error.
        is_error = all(sys.exc_info()) and sys.exc_info()[0] is not ContinuationError
        self._logger.log(level=self._logger.level, msg=None, exc_info=is_error, extra={
            "parent": self.parent.id if self.parent else None,
            "node": self.id,
            "status": status,
            "elapsed": self.elapsed,
            "details": details,
            "attachment": self.attachment
        })
Internally, the package uses Python's standard logging library.
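Since everything ends up in standard LogRecords, consuming the output is a matter of ordinary handler configuration. A minimal sketch, assuming a plain stream handler and a format string of my own choosing (status, elapsed and details mirror the extra dict in _log above and only exist on records emitted by the decorator):

import logging

logging.basicConfig(
    level=logging.INFO,
    # status/elapsed/details are the extra fields set in _log above
    format="%(asctime)s %(module)s.%(funcName)s [%(status)s] %(elapsed)ss %(details)s",
)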
Example
I use it like this:
import wiretap.src.wiretap as wiretap  # because it's from the test environment


@wiretap.telemetry(on_started=lambda p: {"value": p["value"], "bar": p["bar"]}, on_completed=lambda r: {"count": r})
def foo(value: int, logger: wiretap.Logger = None, **kwargs) -> int:
    logger.running(test=f"{value}")
    raise wiretap.ContinuationError("No luck!", 0, foo="bar")
    return 3


if __name__ == "__main__":
    print(foo(1, bar="baz"))  # <-- prints: 0
What do you think of this idea? I guess I should probably check whether the decorated function is expected to return something and raise an invalid-operation error when a return value wasn't provided.
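For what it's worth, here is a rough sketch of how that check could look, reading the decoratee's return annotation; the helper name and the choice of TypeError are just placeholders:

import inspect

def _ensure_result(decoratee, error) -> None:
    """Fail loudly when the decoratee promises a result but the ContinuationError carries none."""
    annotation = inspect.signature(decoratee).return_annotation
    returns_something = annotation not in (inspect.Signature.empty, None)
    if returns_something and not hasattr(error, "result"):
        raise TypeError(
            f"{decoratee.__name__} is annotated to return a value, "
            f"but the ContinuationError did not provide one."
        )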
1 Answer
Your decorator has almost no type hints.
You can use ParamSpec to hint your code.
Here's an example typed closure wrapper decorator:
import functools
from typing import Callable, ParamSpec, TypeVar

P = ParamSpec("P")
TRet = TypeVar("TRet")


def typed_decorator() -> Callable[[Callable[P, TRet]], Callable[P, TRet]]:
    def wrapper(fn: Callable[P, TRet]) -> Callable[P, TRet]:
        @functools.wraps(fn)
        def inner(*args: P.args, **kwargs: P.kwargs) -> TRet:
            return fn(*args, **kwargs)
        return inner
    return wrapper


def untyped_decorator():
    def wrapper(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            return fn(*args, **kwargs)
        return inner
    return wrapper


@typed_decorator()
def foo(bar: str) -> int:
    return 1


@untyped_decorator()
def bar(bar: str) -> int:
    return 1


reveal_type(foo)
reveal_type(bar)
$ mypy --strict foo.py
foo.py:36: note: Revealed type is "def (bar: builtins.str) -> builtins.int"
foo.py:37: note: Revealed type is "Any"
As you can see, we basically can just say the parameters are P and the return type is just TRet.
Next, let's get the DI working.
I'll be copying and modifying your inject_logger code.
Since I'm focusing on type hints at the moment, I'm going to have to ask you to ignore the contents of the function.
The function does the same thing yours does: it injects the logger.
I'm going to change foo to take Logger as an argument, with a default.
We need to provide a default for the type hints to work as intended.
Otherwise you'll attempt to call the function without a logger and get an error.
The logger has some nonsense code just to make determining which logger we're interacting with easier to reason about.
import functools
import inspect
from typing import TYPE_CHECKING, Any, Callable, ParamSpec, TypeVar

P = ParamSpec("P")
TRet = TypeVar("TRet")
TArgs = TypeVar("TArgs", bound=tuple)  # type: ignore
TKwargs = TypeVar("TKwargs", bound=dict)  # type: ignore


class Logger:
    ID = 0
    id: int

    def __init__(self) -> None:
        self.id = self.ID
        type(self).ID += 1

    def __repr__(self) -> str:
        return f"<{type(self).__name__} {self.id}>"


def inject_logger(
    fn: Callable[..., Any],
    logger: Logger,
    args: TArgs,
    kwargs: TKwargs,
) -> tuple[TArgs, TKwargs]:
    """Injects Logger if required."""
    sig = inspect.signature(fn)
    bound = sig.bind_partial(*args, **kwargs)
    for param in sig.parameters.values():
        if issubclass(Logger, param.annotation):
            bound.arguments.setdefault(param.name, logger)
    bound.apply_defaults()
    return bound.args, bound.kwargs  # type: ignore


def typed_decorator() -> Callable[[Callable[P, TRet]], Callable[P, TRet]]:
    def wrapper(fn: Callable[P, TRet]) -> Callable[P, TRet]:
        @functools.wraps(fn)
        def inner(*args: P.args, **kwargs: P.kwargs) -> TRet:
            args, kwargs = inject_logger(fn, Logger(), args, kwargs)
            print(args, kwargs)
            return fn(*args, **kwargs)
        inner.__signature__ = fn.__signature__ = inspect.signature(fn)  # type: ignore
        return inner
    return wrapper


@typed_decorator()
def foo(bar: str, logger: Logger = Logger(), /) -> int:
    return 1


if TYPE_CHECKING:
    reveal_type(foo)

if __name__ == "__main__":
    foo("test")
$ python foo.py
('test', <Logger 1>) {}
$ mypy --strict foo.py
foo.py:58: note: Revealed type is "def (builtins.str, foo.Logger =) -> builtins.int"
We can avoid the need to specify a default value by using Concatenate.
The limitation with Concatenate is that the syntax only works with arguments on the left-hand side of the function.
def typed_decorator() -> Callable[[Callable[Concatenate[Logger, P], TRet]], Callable[P, TRet]]:
    def wrapper(fn: Callable[Concatenate[Logger, P], TRet]) -> Callable[P, TRet]:
        @functools.wraps(fn)
        def inner(*args: P.args, **kwargs: P.kwargs) -> TRet:
            return fn(Logger(), *args, **kwargs)
        fn.__signature__ = sig = inspect.signature(fn)  # type: ignore
        params = iter(sig.parameters.values())
        next(params, None)
        inner.__signature__ = inspect.Signature(list(params), return_annotation=sig.return_annotation)  # type: ignore
        return inner
    return wrapper


@typed_decorator()
def foo(logger: Logger, bar: str) -> int:
    return 1


if TYPE_CHECKING:
    reveal_type(foo)
$ mypy --strict foo.py
foo.py:56: note: Revealed type is "def (bar: builtins.str) -> builtins.int"
We can support both options by using a Protocol with overload.
As you can possibly see from the increased number of # type: ignore comments, the approach is somewhat janky internally.
def inject_logger(
    fn: Callable[..., Any],
    logger: Logger,
    args: TArgs,
    kwargs: TKwargs,
) -> tuple[TArgs, TKwargs]:
    """Injects Logger if required."""
    sig = inspect.signature(fn)
    if ((param := next(iter(sig.parameters.values()), None)) is not None
            and issubclass(param.annotation, Logger)
    ):
        args = (logger,) + args  # type: ignore
    bound = sig.bind_partial(*args, **kwargs)
    for param in sig.parameters.values():
        if issubclass(param.annotation, Logger):
            bound.arguments.setdefault(param.name, logger)
    bound.apply_defaults()
    return bound.args, bound.kwargs  # type: ignore


class PTypedDecorator(Protocol):
    @overload
    def __call__(self, /, fn: Callable[Concatenate[Logger, P], TRet]) -> Callable[P, TRet]: ...
    @overload
    def __call__(self, /, fn: Callable[P, TRet]) -> Callable[P, TRet]: ...


def typed_decorator() -> PTypedDecorator:
    def wrapper(fn: Callable[Concatenate[Logger, P], TRet] | Callable[P, TRet]) -> Callable[P, TRet]:
        @functools.wraps(fn)
        def inner(*args: P.args, **kwargs: P.kwargs) -> TRet:
            args, kwargs = inject_logger(fn, Logger(), args, kwargs)
            print(args, kwargs)
            return fn(*args, **kwargs)
        fn.__signature__ = sig = inspect.signature(fn)  # type: ignore
        params = iter(sig.parameters.values())
        if ((param := next(params, None)) is not None
                and issubclass(param.annotation, Logger)
        ):
            sig = inspect.Signature(list(params), return_annotation=sig.return_annotation)
        inner.__signature__ = sig  # type: ignore
        return inner
    return wrapper


@typed_decorator()
def foo(logger: Logger, bar: str, /) -> int:
    return 1


@typed_decorator()
def bar(bar: str, *, logger: Logger = Logger()) -> int:
    return 1


if TYPE_CHECKING:
    reveal_type(foo)
    reveal_type(bar)

if __name__ == "__main__":
    print(inspect.signature(foo))
    foo("test")
    print(inspect.signature(bar))
    bar("test")
$ python foo.py
(bar: str, /) -> int
(<Logger 1>, 'test') {}
(bar: str, *, logger: __main__.Logger = <Logger 0>) -> int
('test',) {'logger': <Logger 2>}
$ mypy --strict foo.py
foo.py:73: note: Revealed type is "def (bar: builtins.str) -> builtins.int"
foo.py:74: note: Revealed type is "def (bar: builtins.str, *, logger: foo.Logger =) -> builtins.int"
Review
Your existing code doesn't really support type hints.
-
      @wiretap.telemetry(on_started=lambda p: {"value": p["value"], "bar": p["bar"]}, on_completed=lambda r: {"count": r})
      def foo(value: int, logger: wiretap.Logger = None, **kwargs) -> int:

  The only reason logger: wiretap.Logger = None would type correctly is if you're not running your static analysis tool in strict mode. By default mypy runs in non-strict mode. The mode is useful for showing glaring typing issues, like passing an int to a function expecting a str. The benefit is that by hiding some of the more pedantic static analysis issues, users with large Python code bases can ease into static Python.

- You have a lot of functions in your factory closure which could just be global; look at my inject_logger.

- Your current inject_logger cannot support positional-only parameters. To add the logger you mutate the kwargs; however, the logger can be provided before the / as a positional-only argument.

- Neither on_started nor on_completed is properly typed (see the sketch after this list). I can appreciate you may think using * and ** in a lambda can be ugly for on_started, so having an untyped interface may be preferable.

      >>> (lambda *_, foo, **__: foo)(foo="foo")
      'foo'
      >>> (lambda *_, foo, **__: foo)(bar="foo")
      Traceback (most recent call last):
        File "<stdin>", line 1, in <module>
      TypeError: <lambda>() missing 1 required keyword-only argument: 'foo'
If you ignore typing, your code is good. It seems to do the job and doesn't seem to be poorly designed, from what I've seen.
As you may be aware, in Python using exceptions for control flow is expected.
However, I can't really comment on whether using exceptions for control flow is good here; I would need to see real-world usage of your library to see if another option would be better.
That said, your use of exceptions here seems to be a good way to get what you want to achieve with decorators. An alternative would be to return, say, a LogObject which has the exact same interface as ContinuationError.
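A rough sketch of that alternative, purely as an illustration (the class name comes from the suggestion above; the fields and the isinstance check in the decorator are assumptions):

from dataclasses import dataclass, field
from typing import Any, Dict, Optional


@dataclass
class LogObject:
    """Returned instead of raising ContinuationError; carries the same data."""
    reason: str
    result: Optional[Any] = None
    details: Dict[str, Any] = field(default_factory=dict)


# In the decorator, the ContinuationError handler would become something like:
#     result = decoratee(*decoratee_args, **decoratee_kwargs)
#     if isinstance(result, LogObject):
#         scope.canceled(reason=result.reason, **result.details)
#         return result.result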