I have two tables:
area (
id int PK autoincrement
code varchar
)
products (
id int PK autoincrement
name varchar
area_id int FK to area
...
)
The classes are persisted using eclipselink like this:
@Entity
class Product {
...
private Long id;
...
private String name;
...
@JoinColumn(name = "area_id", referencedColumnName = "id")
@ManyToOne
@Expose
private Area area;
...
}
This works just fine. But I'm building a REST service for an API, with a simple PUT to http://x.x.x.x/product where the JSON data should look like this:
{
    "id": xxx,
    "name": "xxxx",
    "area": "xxxx"
}
As you can see, I want the area field to be sent differently. id and name are the same, but here the area field is the String stored in the area.code column.
Apparently this cannot be done with JPA (I asked about it here: https://stackoverflow.com/questions/45086458/linking-two-object-by-code-not-id-using-eclipselink-jpa ), but there someone said the following:
Don't mix your entity and webservices classes and you won't have problems like this.
So, I was thinking if I should have two Product classes. One for the service layer, the one that the customer will use when they call the API, like this:
public class Product implements Serializable {
private Long id;
private String name;
private String area;
}
And then, when I'm handling the PUT/GET methods, just convert this class to the JPA one. Something like this:
x.jpa.Product jpaProduct = new x.jpa.Product();
jpaProduct.setId(product.getId());
jpaProduct.setName(product.getName());
// Build the Area reference from its code before attaching it;
// calling getArea() on a freshly created entity would return null.
x.jpa.Area jpaArea = new x.jpa.Area();
jpaArea.setId(getAreaIdByCode(product.getArea()));
jpaProduct.setArea(jpaArea);
...
em.persist(jpaProduct);
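The GET direction would be the same idea in reverse: flatten the Area relation into its code string. A minimal, self-contained sketch (the class and helper names here are illustrative, not from the question; the JPA annotations are omitted so the mapping logic stands alone):

```java
// Sketch of mapping between the API-facing Product and the JPA entity.
public class ProductMapper {

    // API-facing representation: area is the plain code string.
    public static class ProductDto {
        public Long id;
        public String name;
        public String area;
    }

    // Stand-ins for the JPA entities (annotations omitted for brevity).
    public static class Area {
        public Long id;
        public String code;
        public Area(Long id, String code) { this.id = id; this.code = code; }
    }

    public static class Product {
        public Long id;
        public String name;
        public Area area;
    }

    // GET direction: flatten the relation into area.code.
    public static ProductDto toDto(Product entity) {
        ProductDto dto = new ProductDto();
        dto.id = entity.id;
        dto.name = entity.name;
        dto.area = entity.area != null ? entity.area.code : null;
        return dto;
    }
}
```

This keeps the conversion in one place, so the service layer never hands the JPA entity to the client directly.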
Comments:
What are you asking? – Samuel, Jul 21, 2017 at 11:30
Should I have two Product classes? – Laiv, Jul 22, 2017 at 21:22
2 Answers
I did solve this issue, so this approach is not necessary.
Here is how I did it:
Using transformers. So the field area is defined like this:
@Transformation(fetch = FetchType.EAGER, optional = false)
@ReadTransformer(transformerClass = AreaAttributeTransformer.class)
@WriteTransformers({
@WriteTransformer(
transformerClass = AreaFieldTransformer.class,
column = @Column(name = "area_id", nullable = false))
})
@Expose
private String area;
Then those classes work like this:
AreaAttributeTransformer
import java.util.List;
import javax.persistence.EntityManager;
import org.eclipse.persistence.internal.helper.DatabaseField;
import org.eclipse.persistence.mappings.foundation.AbstractTransformationMapping;
import org.eclipse.persistence.mappings.transformers.AttributeTransformer;
import org.eclipse.persistence.sessions.Record;
import org.eclipse.persistence.sessions.Session;

public class AreaAttributeTransformer implements AttributeTransformer {

    private AbstractTransformationMapping mapping;

    @Override
    public void initialize(AbstractTransformationMapping abstractTransformationMapping) {
        this.mapping = abstractTransformationMapping;
    }

    // Reads area_id from the row and resolves it to the Area's code string.
    @Override
    public Object buildAttributeValue(Record record, Object o, Session session) {
        for (DatabaseField field : mapping.getFields()) {
            if (field.getName().contains("area_id")) {
                EntityManager em = MyEntityManagerFactory.getENTITY_MANAGER_FACTORY().createEntityManager();
                try {
                    List results = em.createNamedQuery("Areas.findById")
                            .setParameter("id", record.get(field))
                            .getResultList();
                    if (!results.isEmpty())
                        return ((Area) results.get(0)).getCode();
                } finally {
                    em.close(); // don't leak the EntityManager
                }
            }
        }
        return null;
    }
}
AreaFieldTransformer
import java.util.List;
import javax.persistence.EntityManager;
import org.eclipse.persistence.mappings.foundation.AbstractTransformationMapping;
import org.eclipse.persistence.mappings.transformers.FieldTransformer;
import org.eclipse.persistence.sessions.Session;

public class AreaFieldTransformer implements FieldTransformer {

    private AbstractTransformationMapping mapping;

    @Override
    public void initialize(AbstractTransformationMapping abstractTransformationMapping) {
        this.mapping = abstractTransformationMapping;
    }

    // Resolves the code string back to the Area's id for the area_id column.
    @Override
    public Object buildFieldValue(Object o, String s, Session session) {
        if (o instanceof Area) { // was "RouSub" in the original paste, a copy-paste leftover
            EntityManager em = MyEntityManagerFactory.getENTITY_MANAGER_FACTORY().createEntityManager();
            try {
                List results = em.createNamedQuery("Area.findByCode")
                        .setParameter("area", ((Area) o).getCode())
                        .getResultList();
                if (!results.isEmpty())
                    return ((Area) results.get(0)).getId();
            } finally {
                em.close(); // don't leak the EntityManager
            }
        }
        return null;
    }
}
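For completeness: both transformers call named queries (Areas.findById and Area.findByCode) that aren't shown above. They would have to be declared somewhere, presumably on the Area entity, roughly like this (only the query and parameter names come from the transformer code; the query bodies and entity fields are my assumption):

```java
@Entity
@NamedQueries({
    @NamedQuery(name = "Areas.findById",
                query = "SELECT a FROM Area a WHERE a.id = :id"),
    @NamedQuery(name = "Area.findByCode",
                query = "SELECT a FROM Area a WHERE a.code = :area")
})
public class Area {
    @Id
    private Long id;
    private String code;
    // getters/setters...
}
```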
Comments:
However, you still expose the persistence model to the API. Your data model might change at any time and you will have no chance to make the changes backward compatible. Once an API is published, changing its interface and model is just hell on Earth. Unless backward compatibility is not a problem at all. Just curious, what kind of clients consume the API? – Laiv, Jul 22, 2017 at 21:29
So, I was thinking if I should have two Product classes. One for the service layer, the one that the customer will use when they call the API.
Well, I think nobody is in a position to say what you should do. It depends pretty much on your requirements, but having different representations is usually a good idea.
For example, the physical representation 1 and the logical one 2 are not necessarily the same. And the public representation 3 could be a totally different one too.
The reason is simple: decoupling. Having different representations allows us to model every layer with a certain degree of independence from the others, in such a way that, if we change the physical representation, the change won't necessarily affect the public one.
This is especially important for public interfaces (APIs), because once in production, changing interfaces and models leads us to challenging situations. API design is especially hard for this reason. We will have to bear the burden of such decisions, for good or for bad.
Right now, you are tightly coupling your physical representation with everything else. Note that if you ever change Product, the API consumers will suffer the consequences too. At the moment, there's no way for you to guarantee backward compatibility if Product changes. Not even versioning would help, unless you map the tables twice.
Whether all this matters or not depends on your specific situation, but at least it's worth a mention.
1: The one we store
2: The domain data model
3: The public one