I've started to learn Python today. I'm trying to implement a decoupled middleware or chain of responsibility pattern.
In my app I have two classes:
- an abstract class or interface named Processable
- a class ProcessorChain, which must locate implementations of Processable, create objects of those implementations and call methods on those objects
In client code, the user can:
- create his own class based on Processable
- write the full path to his class in some global setting
- and be sure that it will be processed in a queue by my app's processor chain :-)
This is the same thing we get from a service container when we call services by tag. In Symfony, for example, we use service tags and a CompilerPass to process a group of decoupled objects that implement one interface.
The question is: how do I get the same in Python?
I've already learnt about the __subclasses__ method, but it only works if the subclasses have been imported. So I learnt about importlib.import_module, and then I got what I wanted.
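To illustrate that point, here is a minimal standalone sketch (not part of the app itself) showing why __subclasses__ only sees subclasses that have been imported:

class Base:
    pass

# No subclass has been defined or imported yet, so the list is empty.
print(Base.__subclasses__())   # []

class Child(Base):
    pass

# Defining (or importing) a subclass registers it with the base class.
print(Base.__subclasses__())   # [<class '__main__.Child'>]

# For subclasses living in other modules, importing the module is what
# executes their class statements and registers them, e.g.:
#     import importlib
#     importlib.import_module('patterns.Chain.processor')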
The folder structure is:
- patterns
  - main.py
  - settings.py
  - Chain
    - chain.py
    - processor.py
patterns/Chain/chain.py:
import importlib
from abc import ABCMeta, abstractmethod

from patterns import settings


class Processable(metaclass=ABCMeta):
    """Meta class defining required properties and methods."""

    @abstractmethod
    def execute(self) -> str:
        return 'Base class processing...'


class ProcessorChain:

    @staticmethod
    def process_directly():
        # load processors from the settings list
        for definition in settings.CHAIN_PROCESSORS:
            # split the dotted path into module path and class name
            m, p = definition.rsplit('.', 1)
            module = importlib.import_module(m)
            # get the class from the module
            cls = getattr(module, p)
            # check that the class is a subclass of Processable
            if issubclass(cls, Processable):
                # create an instance of the class
                processor = cls()
                # and do something
                print(processor.execute())

    @staticmethod
    def process_subclass():
        # still requires dynamic imports
        for definition in settings.CHAIN_PROCESSORS:
            # split off the module path from the dotted path
            m, _ = definition.rsplit('.', 1)
            # import the module so that its classes get registered
            importlib.import_module(m)
        # load all subclasses of Processable
        for processor in Processable.__subclasses__():
            # and do something
            print(processor().execute())
patterns/Chain/processor.py:
from .chain import Processable


class FirstProcessor(Processable):
    """Provides required properties and methods"""

    context = 1

    def execute(self):
        return super().execute() + ' (implementation from meta class)'


class SecondProcessor(Processable):
    """Provides required properties and methods"""

    context = 2

    def execute(self):
        return str(self.context) + ' processing...' + ' (its own implementation)'
patterns/settings.py:
CHAIN_PROCESSORS = [
    'patterns.Chain.processor.FirstProcessor',
    'patterns.Chain.processor.SecondProcessor',
]
patterns/main.py:
from patterns.Chain.chain import ProcessorChain
ProcessorChain().process_directly()
Running main.py prints, as expected:
Base class processing... (implementation from meta class)
2 processing... (its own implementation)
https://github.com/xcono/py3-trials/tree/master/patterns
But I actually don't know whether this is the Pythonic way. Maybe there is a better way to achieve a decoupled chain? Maybe it would be better to use event listeners/dispatchers?
Dear Pythonistas, please tell the truth: how do you cook it?
1 Answer
This looks unfinished to me and far from ready for review.
- The docstring for the class Processable doesn't describe the class. What kind of thing is an instance of Processable?
- The class Processable is misleadingly named: its instances are things that can process, not things that can be processed, so a name like Processor would be better.
- The execute method has no docstring. What does it do?
- The purpose of the execute method is surely to process a request, but in that case shouldn't it take the request as a parameter?
- Why does the execute method return a string? This seems to be an unnecessary limitation: a processing chain might need to return a more complex object.
- In the chain of responsibility pattern, a processor's default behaviour is to forward the request to the next processor in the chain. But this seems to be omitted from the implementation here. Where is the next processor stored? How is the request forwarded?
- It is not clear what happens if none of the processors in the chain want to handle a request. (A sketch addressing this point and the previous one follows.)
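Here is a minimal sketch of what the forwarding and the unhandled case could look like (the names Processor, handle, successor and request are illustrative, not taken from the code under review):

class Processor:
    """A link in a chain of responsibility."""

    def __init__(self, successor=None):
        # The next processor in the chain, or None at the end.
        self.successor = successor

    def handle(self, request):
        """Handle the request, or forward it to the next processor."""
        if self.successor is None:
            # No processor wanted the request; fail loudly rather
            # than silently dropping it.
            raise LookupError('no processor handled {!r}'.format(request))
        return self.successor.handle(request)


class EvenProcessor(Processor):
    """Handles even numbers and forwards everything else."""

    def handle(self, request):
        if request % 2 == 0:
            return '{} handled by EvenProcessor'.format(request)
        return super().handle(request)

chain = EvenProcessor()
print(chain.handle(2))   # 2 handled by EvenProcessor
# chain.handle(3)        # raises LookupError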
- staticmethod is nearly always an anti-pattern in Python. In some languages (like Java) all functions have to belong to a class. But in Python you can just write a function, and there is no need to make a class whose only job is to act as a container for some functions.
- There are no docstrings for process_directly and process_subclass. What do these functions do?
- process_directly is coupled tightly to settings.CHAIN_PROCESSORS. It would be better for this to be a parameter, in case the application needs to construct its processor chain in some other way.
- process_directly constructs a Processable instance and calls its execute method. But doesn't there need to be a request somewhere?
- process_directly throws away the Processable instance after it is finished. This means that if you call it again it has to go through the construction logic again, which is a waste of time, and it makes it hard to implement stateful processors (because the objects don't survive from one execution to the next).
- process_directly calls print on the result. It seems unlikely that this is what real processor chains will want to do with the result.
- If the test issubclass(cls, Processable) fails, nothing happens. This means that typos and other mistakes in the setting will be silently ignored. It would be better to raise an exception.
- I don't understand the purpose of process_subclass.
- There's duplicated code (for loading a class from a module) between process_directly and process_subclass. (One way to factor this out is sketched below.)