The company I'm working at has strict policies that insist that all business logic is contained in Domain Models as part of DDD.
I can see how this approach could work well for something like a desktop app, but I'm having issues with making this fit with building a web app and was hoping someone could suggest some patterns / techniques for making it fit better.
Specifically, I'm having issues with PUT/POST APIs.
The DTO models received by the API have almost exactly the same properties as the domain models (with just a few missing) - the JSON is automatically deserialized into the DTO models.
I then need to load, update and save the domain models.
In the past, I've tended to put the domain logic into a service class, which allowed me to load the domain model, map the updated properties across from the DTO using AutoMapper (while leaving the properties that don't exist in the API models alone) and then saving the model again.
This approach isn't allowed as everything has to be done in the domain models, doubly so as the company has banned the use of AutoMapper and other mapping tools.
When creating objects, I have to call constructors, manually mapping all the API properties to constructor parameters. I don't see any way around this given the restrictions imposed by company patterns: all objects have to be valid at all times, so we can't have an intermediate stage while building the object where it is invalid, i.e. everything has to be created and set in one go.
The only approach I can see for updates is to write code that manually copies each property from the DTO to the domain model, one at a time. This is extremely time-consuming, tedious, error-prone, and a maintenance nightmare, as some of our models have a few hundred properties across multiple layers of nested classes. It's made worse because we are also banned from reusing a DTO model across two APIs, so we often have copies of identical classes that all need to be maintained as we add, update, or remove properties from the domain classes.
We often have to keep the setters on domain model properties private so that the model can't get into an invalid state, which means calling methods with many parameters on the domain models that update multiple properties at the same time to keep things valid (I used to handle this by simply validating my DTO models before mapping).
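To illustrate, here is a cut-down, invented sketch of the kind of domain model I mean (all names are made up):

// Invented example of the pattern: private setters, a full constructor,
// and a multi-parameter method so the object can never be observed in
// an invalid state.
public class Order
{
    public string CustomerName { get; private set; }
    public string ShippingAddress { get; private set; }
    public string BillingAddress { get; private set; }

    public Order(string customerName, string shippingAddress, string billingAddress)
    {
        // ...validate everything up front...
        CustomerName = customerName;
        ShippingAddress = shippingAddress;
        BillingAddress = billingAddress;
    }

    // Both addresses must change together to stay consistent, so the
    // setters stay private and this method takes multiple parameters.
    public void ChangeAddresses(string shippingAddress, string billingAddress)
    {
        // ...validate...
        ShippingAddress = shippingAddress;
        BillingAddress = billingAddress;
    }
}

Now imagine this with a few hundred properties across nested classes.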
Are there any patterns that I can use to help relieve some of this boilerplate hell?
If it makes a difference to any answers: I'm developing in C# / ASP.NET 4.7.2 (Framework).
- The standards you describe, while extreme for some organizations, are bang on good object oriented design: SOLID, encapsulation, data hiding, etc. – Greg Burghardt, Jun 11 at 12:19
- I understand that theory; it's just when that theory hits reality. I end up having to manually write hundreds of lines of boilerplate, and tasks that used to take a few minutes now take hours. I also now have huge blocks of code that are very difficult to test thoroughly, and we are getting far more bugs than I've ever encountered in the past, largely caused by all the mapping code where a dev accidentally copies the wrong property (humans are fundamentally bad at repetitive tasks and make mistakes). – Mog0, Jun 11 at 12:30
- We had the same issue, but over a year the DTO and the internal data structure diverged. The internal data structure got more data that was not available from the outside -- in my case a config object. First they were the same object and we had to do some magic to filter fields from the web API; later I separated the two. Now there is a clean separation between the internal model and the outside world. – Apollo, Jun 12 at 5:44
- DDD is tuned for web. Desktop world is all about mutable state, working copies and low latency, which are all not business related but have to be included in the domain. – Basilevs, Jun 13 at 12:21
- This question is similar to: Reflection: Is using reflection still "bad" or "slow"? What has changed with reflection since 2002?. If you believe it's different, please edit the question, make it clear how it's different and/or how the answers on that question are not helpful for your problem. – Basilevs, Jun 13 at 12:31
3 Answers
Problem scenarios like these often don't have a clear smoking gun; they tend to stem from incongruous design decisions and/or developer expectations. I intend for this answer mostly to point out the incongruities so you can revisit them and reconsider whether one of them has to give in order to make your life easier and to resolve or mitigate the problem scenario.
Are there any patterns that I can use to help relieve some of this boilerplate hell?
Firstly, I want to point out that boilerplate hell is a very subjective term. I've seen it used both to refer to scenarios of truly excessive boilerplating, but I've also seen it used by developers who abhor any kind of DTO layer and hate the idea of mapping between data types in general. Based on your question it is not fully clear which side of the fence you're leaning on, because I don't have a good read on your actual codebase.
I can make an educated guess that your reliance on AutoMapper (don't get me wrong, I love the library) and your apparent distaste for any manual mapping process suggest you've not been leveraging the power of having distinct domain models and DTOs. Your application seems to very much thrive on the idea of partial DTOs, yet mandates that fields be named identically so that no mapping ever has to be configured manually.
If my educated guess is anywhere near correct, that mandate of identical property names strongly hinders the value you derive from having a DTO layer, since the layer must then exactly mirror the structure and property names of the domain model.
When creating objects, I have to call constructors
Mandating constructors as the value-setting methodology as opposed to publicly settable properties (or equivalent setter methods if your language does not have publicly settable properties) is a choice.
I don't like that choice for most data-driven classes, but that's a long and subjective discussion on different developers' approaches to solving problems in certain ways.
The reason I point out that it is a choice is that the problems you are pointing out are caused (in part) by this choice. And regardless of subjective opinion on the specific choice being made, making any kind of choice and enforcing it inherently means accepting the knock-on effects that will arise because of that choice.
In other words, those who made the choice should be asked for advice on how to concretely implement it in the scenario you're faced with. Maybe your scenario is one they did not foresee, and they will revisit their choice. Maybe they did foresee your scenario, and they actively intend for you to do it this way (i.e. they expect you to change your expectations instead of trying to subvert their explicit design decision).
the company has banned the use of AutoMapper and other mapping tools
some of our models have a few hundred properties across multiple layers of nested classes
If you put these two together, the company has effectively mandated that you manually map hundreds of properties. That's not an unintended consequence; it's the intended goal. The alleged problem you point out isn't really a problem in that sense; it's just that you don't like doing what the company has decided you should do (or how you should do it).
There are a few possible outs here. One possibility is that the blanket rule is enforced by an idealist who has no real awareness of the kinds of domain models you are working with, and therefore did not understand the friction that their design choice has imposed on the developers who have to implement it.
Another possible out here is that this is a conscious design choice, not an undesirable side effect to be worked around. Your question is based on the latter, as you're trying to find a way to not have to manually map so many fields when the enforced design decisions specifically orient you towards manually mapping them.
some of our models have a few hundred properties across multiple layers of nested classes
which allowed me to load the domain model, map the updated properties across from the DTO using AutoMapper
The core issue I would address here is the sheer size of these domain models. I'm not saying that it definitely needs to be broken down, but I would definitely re-evaluate if it can be broken down. In other words, I would not leave the models to be of this size unless there was a damn good reason for it. And if there were, then I would have to accept the knock-on consequences from having such large models.
It feels to me that your reliance on AutoMapper has not been an act of clean design, but rather a workaround to avoid the drudgery of having to manage such large and unwieldy domain models.
Yes, I agree that generally speaking you shouldn't be writing logic to handle such a volume of properties manually.
No, that doesn't mean that AutoMapper was the right response to this issue - it feels more like an avoidance tactic of not wanting to address the elephant in the room: the inflated domain models.
the company has banned the use of AutoMapper and other mapping tools
Did you investigate the reason for the ban?
Because I could build an "AutoMapper Lite" version from scratch that enables things like convention-based rules via reflection (this can be done between properties and constructor parameters if you design it that way) and allows for nested mapping profiles to resolve any kind of deeply nested class structure.
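To make that concrete, here's a minimal sketch of such a convention-based mapper, assuming a single public constructor on the target type and name-based matching between source properties and constructor parameters (all names here are hypothetical, and error handling is omitted):

using System;
using System.Linq;

// Minimal "AutoMapper Lite" sketch: match source properties to target
// constructor parameters by name, case-insensitively. A sketch, not a
// drop-in implementation.
public static class ConventionMapper
{
    public static TTarget Map<TTarget>(object source)
    {
        // Assumes exactly one public constructor on the target type.
        var ctor = typeof(TTarget).GetConstructors().Single();

        var values = source.GetType().GetProperties()
            .ToDictionary(p => p.Name, p => p.GetValue(source),
                          StringComparer.OrdinalIgnoreCase);

        // Throws if a parameter has no matching property, which is
        // exactly the kind of failure a unit test should catch.
        var args = ctor.GetParameters()
            .Select(p => values[p.Name])
            .ToArray();

        return (TTarget)ctor.Invoke(args);
    }
}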
The question here is: would those who banned AutoMapper accept this? I can't answer that, they have to answer that question.
It's possible their ban is purely a response to an unhealthy dependency on it in the past by developers. It could stem from some oversimplified "third party library bad, homebrew code good" sentiment. Or it might not be about it being a library at all, but rather a way to drive developers to design things the way the tech leadership wants them designed.
Based on the reason for the ban, how you can/cannot resolve this situation will differ. The most direct solution to your problem would be to roll something vaguely like AutoMapper and rely on the same kind of pattern to help mitigate the issue - but this might be exactly what the tech leadership will not allow.
Ask your leadership about the reasons for the ban, check for their awareness of your concrete situation, and ask if they have considered (or have any advice) on how you should approach your work going forward when working with these new restrictions.
- I would go further and ban all reflection. A field should never be set if the access point can not be found by "grep". – Basilevs, Jun 13 at 12:26
- @Basilevs While reflection can be used to circumvent access modifiers and I agree this should not be done (barring possibly serialization), that is not the sole use case for reflection. Outright banning reflection seems an overzealous blanket ban to me. – Flater, Jun 13 at 22:30
- All benefits from reflection are tactical. Meanwhile I never saw a use of reflection in an active project that did not cause major issues in the long run. They are listed in yours and linked answers. Most frustrating are: 1) runtime binding 2) accidental coupling 2.1) freeze of a model by a backward compatible serialization format / reflective UI framework, etc. 3) impotent static analysis 4) security problems. I'd need to find a strategically valid use case for reflection, and nothing comes to mind at the moment. – Basilevs, Jun 14 at 11:37
- @Basilevs The vast majority of the problems you point out can be mitigated or resolved via unit tests. For example, AutoMapper comes out of the box with a specific method intended to be called in a unit test which confirms that all mappings can be performed without any binding issues or missing registrations. I do agree, however, that with the advent of source generators there is likely a better solution to be found there, because then you also get the compiler benefit on top. But tools like AutoMapper far predate any reliable source code generation, short of a clunky third-party dependency. – Flater, Jun 15 at 8:00
The only approach that I can see for updates is to write code to manually copy each property in the API model across to the domain model one at a time, but this is extremely time consuming / tedious / error prone / a maintenance nightmare as some of our models have a few hundred properties across multiple layers of nested classes.
If your language does not support named parameters, then the only solution is to use the builder pattern. Having a large number of positional arguments in a single method is a code smell: hard to maintain, hard to test, error-prone.
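For illustration, a minimal sketch of the builder idea, assuming a hypothetical Order domain class whose constructor demands everything up front (all names invented):

using System;

// Hypothetical builder: it is allowed to be temporarily incomplete
// while values are collected from the DTO, and Build() is the single
// point where validation runs and a valid Order is constructed.
public class OrderBuilder
{
    private string _customerName;
    private string _shippingAddress;

    public OrderBuilder WithCustomerName(string name)
    {
        _customerName = name;
        return this;
    }

    public OrderBuilder WithShippingAddress(string address)
    {
        _shippingAddress = address;
        return this;
    }

    public Order Build()
    {
        if (string.IsNullOrWhiteSpace(_customerName))
            throw new InvalidOperationException("Customer name is required.");
        return new Order(_customerName, _shippingAddress);
    }
}

// Usage:
// var order = new OrderBuilder()
//     .WithCustomerName(dto.CustomerName)
//     .WithShippingAddress(dto.ShippingAddress)
//     .Build();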
It's made worse as we are also banned from using a model with two APIs so they often have copies of identical classes that then need to be maintained as we add / update / remove properties from the domain classes.
Why do you add properties to your API right after you add them to the model? You should only expand an API when it is needed. So if you have n APIs covering the same model, you rarely need to expand all of them.
We often have to keep setters on properties on the domain models private so that it can't get into an invalid state so it means calling methods with many parameters on the domain models that update multiple properties at the same time to keep things valid (I used to handle this by simply validating my API models before mapping).
Hmm, I think you have a big mismatch between your API and your DDD model. In DDD you should not "update state"; you should perform business actions. If you go with a "dumb" HTTP API and DDD inside, then you will need to interpret HTTP API calls as business events. This might be tricky if multiple properties change in a single API call.
A better way is to have a command/event API that accepts many different payloads (DomainObjectDeleted, DomainObjectCancelled, DomainObjectApproved) instead of a "dumb" REST API that just has a single {..., "status": "CANCELLED", ...} changed among tens of other properties.
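A minimal sketch of what such a command-style endpoint could look like in ASP.NET Web API (the controller, repository, and domain method names are all invented):

using System.Web.Http;

// Hypothetical command payload: one per business action, instead of
// one giant PUT that overwrites every property.
public class CancelOrderCommand
{
    public string Reason { get; set; }
}

public class OrdersController : ApiController
{
    private readonly IOrderRepository _repository; // hypothetical repository

    public OrdersController(IOrderRepository repository)
    {
        _repository = repository;
    }

    [HttpPost]
    [Route("api/orders/{id}/cancel")]
    public IHttpActionResult Cancel(int id, CancelOrderCommand command)
    {
        var order = _repository.Load(id);
        order.Cancel(command.Reason); // a business action, not a property write
        _repository.Save(order);
        return Ok();
    }
}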
(added after @Mog0's comment)
If your use case really involves big updates made manually by users, then you can also have a big update command that is created from the DTO (the builder pattern will be useful here too). Then your domain object only needs a single update(UpdateXCommand) method, and the update can be validated inside the domain object. You still need to map each property twice: API DTO into domain command, and command into domain object state. This way you don't need arbitrary update methods on the domain object, just a single one that updates the whole state (this is only acceptable if this is how users actually work with these particular domain objects and you still want DDD; the benefits of DDD are almost gone here).
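Sketched out, again with invented names, the single update command could look like this:

using System;

// Hypothetical whole-state update command built from the API DTO.
public class UpdateOrderCommand
{
    public string CustomerName { get; set; }
    public string ShippingAddress { get; set; }
    // ...the rest of the user-editable properties...
}

public class Order
{
    public string CustomerName { get; private set; }
    public string ShippingAddress { get; private set; }

    // The single update entry point: the whole command is validated
    // before any state changes, so the object is never half-updated.
    public void Update(UpdateOrderCommand command)
    {
        if (string.IsNullOrWhiteSpace(command.CustomerName))
            throw new ArgumentException("Customer name is required.");

        CustomerName = command.CustomerName;
        ShippingAddress = command.ShippingAddress;
        // ...still one assignment per property, but in exactly one place
    }
}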
To sum up:
- builder pattern for creating big objects
- "dumb" PUT API changed into command/event POST API with multiple payload shapes depending on command/event OR
- "dumb" PUT API DTO mapped to big domain update object through builder pattern and validated inside domain object just before updating it
- To clarify, we have a SPA that allows the user to edit something and then we have an API that needs to save all this data into the domain models. The business action is essentially overwrite almost everything :) This is where I'm struggling with it a bit - with a desktop app, this idea of only doing actions on your models makes 100% sense, but on a web app, many of the actions are done on the client side and then the results are pushed back to the model. It feels like this whole concept of pushing the data back, which is fundamental to web apps, breaks many of the principles of DDD. – Mog0, Jun 11 at 15:13
- 100% this is why I recommend the ADM approach with methods on services for APIs – Ewan, Jun 11 at 15:18
- Builder pattern is the way to go here – Anton Pastukhov, Jun 11 at 17:20
- @Mog0 I have updated the answer to include CRUD-like DDD design – Sankozi, Jun 12 at 6:33
my reaction reading through: ok ... yes.. urg i guess but..
...the company has banned the use of AutoMapper and other mapping tools...
wtf?
Obviously, if you have an HTTP REST API endpoint, the data comes in as text and needs to be "mapped" to an object.
Your main problem seems to stem from your use of DTOs rather than just sending the domain objects over the wire. But even if you do that, you are still "mapping" the incoming JSON to the Domain Object's properties.
Use of a DTO and AutoMapper instead of just sending the Domain Object is an anti-pattern, and I would agree with the principle they are trying to force.
BUT, if you also want to put business logic methods on your Domain Objects and hide their state in an OOP way, you are kinda forcing the use of DTOs.
... It's made worse as we are also banned from using a model with two APIs..
!!*&&*DS???
The whole point of a Domain is that it's the Domain and shared?
Obviously we only see your side of the story, so it's easy to criticize these decisions, but it does seem like following their rules won't result in good software.
Are there any patterns that I can use to help relieve some of this boilerplate hell?
Serialisation libraries do offer ways of using constructors during deserialization, e.g. Json.NET's JsonConstructorAttribute:
https://www.newtonsoft.com/json/help/html/JsonConstructorAttribute.htm
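For example (class and property names invented), marking a constructor with [JsonConstructor] makes Json.NET build the object through it, matching JSON property names to constructor parameter names, so the object comes out of deserialization already valid:

using Newtonsoft.Json;

public class OrderDto
{
    public string CustomerName { get; }
    public string ShippingAddress { get; }

    // Json.NET calls this constructor during deserialization,
    // matching JSON properties to parameters by name.
    [JsonConstructor]
    public OrderDto(string customerName, string shippingAddress)
    {
        CustomerName = customerName;
        ShippingAddress = shippingAddress;
    }
}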
If yours doesn't, you could make your own generic converter which deserializes the incoming JSON string to a dictionary and then uses reflection to populate the constructor. Essentially a "mapper", but call it a... "DomainFactory" instead, yeah that looks cool.
You probably need the DTOs anyway just to define the endpoints of your API, so an alternative would be to add a method to the base DTO class which serialises it to a dictionary of properties, and to add constructors to your domain objects that accept such a dictionary.
Then you can have the same copy-pasted constructor on all domain objects, where it just pulls its properties from the dictionary parameter, and have:
public object PostMyObject(DTOMyObject myDto)
{
    var domain = new DomainMyObject(myDto.ToDic());
    return domain.DoThing();
}

public abstract class BaseDTO
{
    public Dictionary<string, object> ToDic()
    {
        // Use reflection to loop through the public properties.
        var myDic = new Dictionary<string, object>();
        foreach (var prop in GetType().GetProperties())
        {
            myDic.Add(prop.Name, prop.GetValue(this));
        }
        return myDic;
    }
}

public class DomainMyObject
{
    private string privateProp;

    public DomainMyObject(Dictionary<string, object> dic)
    {
        this.privateProp = (string)dic["privateProp"];
    }
}
You can see how you might go even further and just have a generic mapp... DOMAINFACTORY that can take any object and use its properties to populate any other object's private properties. But at some point the DOMAINFACTORY will just get too complicated, so I would recommend stopping at this level, where you can still justify a dictionary-style constructor on domain objects and a serialise-to-dictionary method on DTOs.
It's all just a hack to get around the rules though. You should be going full ADM and writing...
public SomeOtherDomainObject DoThing(MyDomainObject myObj) => this.domainService.DoThing(myObj);
No DTOs, no mapping layer, perfection!
- To clarify, the JSON is deserialized into DTOs automatically. The problem is then getting the data from the DTOs into the Domain Models. It's normal to use DTOs for the API interface as it helps prevent accidentally allowing writing to certain props / leaking info by sending others you don't intend. – Mog0, Jun 11 at 15:04
- The boss says that he doesn't like AutoMapper as it's "magic" and can be abused so we're not allowed to use it. – Mog0, Jun 11 at 15:04
- To clarify, it's one DTO per API. He likes everything to be siloed with vertical slicing through the app (in addition to the horizontal slicing of layers) - kind of cubed :) Multiple DTOs (often identical) are used to update a single domain model so the mapping code has to be written multiple times, and each time we update the domain model, we typically have to update 5 or 6 other related models to keep everything in sync. – Mog0, Jun 11 at 15:05
-