
I want to have an enum as a field of my entity.

My application looks like this:

Spring Boot version:

plugins {
    id 'org.springframework.boot' version '2.6.2' apply false
}

repository:

@Repository
public interface MyEntityRepository extends PagingAndSortingRepository<MyEntity, UUID> {
    ...
}

entity:

@Table("my_entity")
public class MyEntity {
    ...
    private FileType fileType;
    // get + set
}

enum declaration:

public enum FileType {
    TYPE_1(1),
    TYPE_2(2);

    int databaseId;

    public static FileType byDatabaseId(Integer databaseId) {
        return Arrays.stream(values()).findFirst().orElse(null);
    }

    FileType(int databaseId) {
        this.databaseId = databaseId;
    }

    public int getDatabaseId() {
        return databaseId;
    }
}

My attempt:

I've found the following answer and tried to follow it: https://stackoverflow.com/a/53296199/2674303

So I've added a bean:

@Bean
public JdbcCustomConversions jdbcCustomConversions() {
    return new JdbcCustomConversions(asList(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter()));
}

converters:

@WritingConverter
public class FileTypeToDatabaseIdConverter implements Converter<FileType, Integer> {
    @Override
    public Integer convert(FileType source) {
        return source.getDatabaseId();
    }
}

@ReadingConverter
public class DatabaseIdToFileTypeConverter implements Converter<Integer, FileType> {
    @Override
    public FileType convert(Integer databaseId) {
        return FileType.byDatabaseId(databaseId);
    }
}

But I see error:

The bean 'jdbcCustomConversions', defined in class path resource [org/springframework/boot/autoconfigure/data/jdbc/JdbcRepositoriesAutoConfiguration$SpringBootJdbcConfiguration.class], could not be registered. A bean with that name has already been defined in my.pack.Main and overriding is disabled.

I've tried renaming the method jdbcCustomConversions() to myJdbcCustomConversions(). That avoided the error above, but the converter is not invoked during entity persistence, and I see another error: the application tries to save a String while the database column type is bigint.

20:39:10.689 DEBUG [main] o.s.jdbc.core.StatementCreatorUtils: JDBC getParameterType call failed - using fallback method instead: org.postgresql.util.PSQLException: ERROR: column "file_type" is of type bigint but expression is of type character varying
 Hint: You will need to rewrite or cast the expression.
 Position: 174 

I also tried to use the latest (at the time of writing) version of Spring Boot:

id 'org.springframework.boot' version '2.6.2' apply false

But it didn't help.

What have I missed? How can I map the enum to an integer column properly?

P.S.

I use the following code for testing:

@SpringBootApplication
@EnableJdbcAuditing
@EnableScheduling
public class Main {

    public static void main(String[] args) {
        ConfigurableApplicationContext applicationContext = SpringApplication.run(Main.class, args);
        MyEntityRepository repository = applicationContext.getBean(MyEntityRepository.class);
        MyEntity entity = new MyEntity();
        ...
        entity.setFileType(FileType.TYPE_2);
        repository.save(entity);
    }

    @Bean
    public ModelMapper modelMapper() {
        ModelMapper mapper = new ModelMapper();
        mapper.getConfiguration()
                .setMatchingStrategy(MatchingStrategies.STRICT)
                .setFieldMatchingEnabled(true)
                .setSkipNullEnabled(true)
                .setFieldAccessLevel(PRIVATE);
        return mapper;
    }

    @Bean
    public AbstractJdbcConfiguration jdbcConfiguration() {
        return new MySpringBootJdbcConfiguration();
    }

    @Configuration
    static class MySpringBootJdbcConfiguration extends AbstractJdbcConfiguration {
        @Override
        protected List<?> userConverters() {
            return asList(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter());
        }
    }
}

UPDATE

My code is:

@SpringBootApplication
@EnableJdbcAuditing
@EnableScheduling
public class Main {

    public static void main(String[] args) {
        ConfigurableApplicationContext applicationContext = SpringApplication.run(Main.class, args);
        MyEntityRepository repository = applicationContext.getBean(MyEntityRepository.class);
        MyEntity entity = new MyEntity();
        ...
        entity.setFileType(FileType.TYPE_2);
        repository.save(entity);
    }

    @Bean
    public ModelMapper modelMapper() {
        ModelMapper mapper = new ModelMapper();
        mapper.getConfiguration()
                .setMatchingStrategy(MatchingStrategies.STRICT)
                .setFieldMatchingEnabled(true)
                .setSkipNullEnabled(true)
                .setFieldAccessLevel(PRIVATE);
        return mapper;
    }

    @Bean
    public AbstractJdbcConfiguration jdbcConfiguration() {
        return new MySpringBootJdbcConfiguration();
    }

    @Configuration
    static class MySpringBootJdbcConfiguration extends AbstractJdbcConfiguration {

        @Override
        protected List<?> userConverters() {
            return asList(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter());
        }

        @Bean
        public JdbcConverter jdbcConverter(JdbcMappingContext mappingContext,
                NamedParameterJdbcOperations operations,
                @Lazy RelationResolver relationResolver,
                JdbcCustomConversions conversions,
                Dialect dialect) {
            JdbcArrayColumns arrayColumns = dialect instanceof JdbcDialect ? ((JdbcDialect) dialect).getArraySupport()
                    : JdbcArrayColumns.DefaultSupport.INSTANCE;
            DefaultJdbcTypeFactory jdbcTypeFactory = new DefaultJdbcTypeFactory(operations.getJdbcOperations(),
                    arrayColumns);
            return new MyJdbcConverter(
                    mappingContext,
                    relationResolver,
                    conversions,
                    jdbcTypeFactory,
                    dialect.getIdentifierProcessing()
            );
        }
    }

    static class MyJdbcConverter extends BasicJdbcConverter {

        MyJdbcConverter(
                MappingContext<? extends RelationalPersistentEntity<?>, ? extends RelationalPersistentProperty> context,
                RelationResolver relationResolver,
                CustomConversions conversions,
                JdbcTypeFactory typeFactory,
                IdentifierProcessing identifierProcessing) {
            super(context, relationResolver, conversions, typeFactory, identifierProcessing);
        }

        @Override
        public int getSqlType(RelationalPersistentProperty property) {
            if (FileType.class.equals(property.getActualType())) {
                return Types.BIGINT;
            } else {
                return super.getSqlType(property);
            }
        }

        @Override
        public Class<?> getColumnType(RelationalPersistentProperty property) {
            if (FileType.class.equals(property.getActualType())) {
                return Long.class;
            } else {
                return super.getColumnType(property);
            }
        }
    }
}

But I get this error:

Caused by: org.postgresql.util.PSQLException: Cannot convert an instance of java.lang.String to type long
 at org.postgresql.jdbc.PgPreparedStatement.cannotCastException(PgPreparedStatement.java:925)
 at org.postgresql.jdbc.PgPreparedStatement.castToLong(PgPreparedStatement.java:810)
 at org.postgresql.jdbc.PgPreparedStatement.setObject(PgPreparedStatement.java:561)
 at org.postgresql.jdbc.PgPreparedStatement.setObject(PgPreparedStatement.java:931)
 at com.zaxxer.hikari.pool.HikariProxyPreparedStatement.setObject(HikariProxyPreparedStatement.java)
 at org.springframework.jdbc.core.StatementCreatorUtils.setValue(StatementCreatorUtils.java:414)
 at org.springframework.jdbc.core.StatementCreatorUtils.setParameterValueInternal(StatementCreatorUtils.java:231)
 at org.springframework.jdbc.core.StatementCreatorUtils.setParameterValue(StatementCreatorUtils.java:146)
 at org.springframework.jdbc.core.PreparedStatementCreatorFactory$PreparedStatementCreatorImpl.setValues(PreparedStatementCreatorFactory.java:283)
 at org.springframework.jdbc.core.PreparedStatementCreatorFactory$PreparedStatementCreatorImpl.createPreparedStatement(PreparedStatementCreatorFactory.java:241)
 at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:649)
 ... 50 more
Caused by: java.lang.NumberFormatException: For input string: "TYPE_2"
 at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
 at java.lang.Long.parseLong(Long.java:589)
 at java.lang.Long.parseLong(Long.java:631)
 at org.postgresql.jdbc.PgPreparedStatement.castToLong(PgPreparedStatement.java:792)
 ... 59 more
asked Jan 20, 2022 at 17:17

1 Answer

Try the following instead:

@Bean
public AbstractJdbcConfiguration jdbcConfiguration() {
 return new MySpringBootJdbcConfiguration();
}
@Configuration
static class MySpringBootJdbcConfiguration extends AbstractJdbcConfiguration {
 @Override
 protected List<?> userConverters() {
 return List.of(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter());
 }
}

Explanation:

Spring complains that jdbcCustomConversions from the auto-configuration class clashes with a bean of the same name already defined in your Main class, and bean overriding is disabled.

JdbcRepositoriesAutoConfiguration has changed a few times; in Spring Boot 2.6.2 it contains:

@Configuration(proxyBeanMethods = false)
@ConditionalOnMissingBean(AbstractJdbcConfiguration.class)
static class SpringBootJdbcConfiguration extends AbstractJdbcConfiguration {
}

In turn, AbstractJdbcConfiguration has:

@Bean
public JdbcCustomConversions jdbcCustomConversions() {
    try {
        Dialect dialect = applicationContext.getBean(Dialect.class);
        SimpleTypeHolder simpleTypeHolder = dialect.simpleTypes().isEmpty() ? JdbcSimpleTypes.HOLDER
                : new SimpleTypeHolder(dialect.simpleTypes(), JdbcSimpleTypes.HOLDER);
        return new JdbcCustomConversions(
                CustomConversions.StoreConversions.of(simpleTypeHolder, storeConverters(dialect)), userConverters());
    } catch (NoSuchBeanDefinitionException exception) {
        LOG.warn("No dialect found. CustomConversions will be configured without dialect specific conversions.");
        return new JdbcCustomConversions();
    }
}

As you can see, JdbcCustomConversions is not conditional in any way, so defining your own caused a conflict. Fortunately, it provides an extension point, userConverters(), which can be overridden to supply your own converters.

Update

As discussed in comments:

  • FileType.byDatabaseId is broken: it ignores its input parameter

  • as the column type in the DB is BIGINT, your converters must convert from Long, not from Integer; this addresses read queries

  • for writes, there is an open bug: https://github.com/spring-projects/spring-data-jdbc/issues/629. There is a hardcoded assumption that enums are converted to Strings, and only Enum -> String converters are checked. As we want to convert to Long, we need to amend BasicJdbcConverter by subclassing it and registering the subclass as a @Bean.
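Putting the first two points together, here is a minimal sketch of the corrected enum: byDatabaseId now filters by its argument instead of ignoring it, and the lookup key is Long to match the BIGINT column. The Spring wrappers are omitted for brevity; in the real application the reading converter would become Converter<Long, FileType> delegating to byDatabaseId, and the writing converter Converter<FileType, Long> delegating to getDatabaseId.

```java
import java.util.Arrays;

// Sketch of the corrected enum (names follow the question's FileType).
enum FileType {
    TYPE_1(1),
    TYPE_2(2);

    private final int databaseId;

    FileType(int databaseId) {
        this.databaseId = databaseId;
    }

    public long getDatabaseId() {
        return databaseId;
    }

    // Fixed: filter on the parameter; the original returned the first
    // constant regardless of the input.
    public static FileType byDatabaseId(Long databaseId) {
        if (databaseId == null) {
            return null;
        }
        return Arrays.stream(values())
                .filter(type -> type.databaseId == databaseId)
                .findFirst()
                .orElse(null);
    }
}
```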

You need to override two methods

  • public int getSqlType(RelationalPersistentProperty property)
  • public Class<?> getColumnType(RelationalPersistentProperty property)

I hardcoded the Enum type and corresponding column types, but you may want to get more fancy with that.

@Bean
public AbstractJdbcConfiguration jdbcConfiguration() {
    return new MySpringBootJdbcConfiguration();
}

@Configuration
static class MySpringBootJdbcConfiguration extends AbstractJdbcConfiguration {

    @Override
    protected List<?> userConverters() {
        return List.of(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter());
    }

    @Bean
    public JdbcConverter jdbcConverter(JdbcMappingContext mappingContext,
            NamedParameterJdbcOperations operations,
            @Lazy RelationResolver relationResolver,
            JdbcCustomConversions conversions,
            Dialect dialect) {
        JdbcArrayColumns arrayColumns = dialect instanceof JdbcDialect ? ((JdbcDialect) dialect).getArraySupport()
                : JdbcArrayColumns.DefaultSupport.INSTANCE;
        DefaultJdbcTypeFactory jdbcTypeFactory = new DefaultJdbcTypeFactory(operations.getJdbcOperations(),
                arrayColumns);
        return new MyJdbcConverter(
                mappingContext,
                relationResolver,
                conversions,
                jdbcTypeFactory,
                dialect.getIdentifierProcessing()
        );
    }
}

static class MyJdbcConverter extends BasicJdbcConverter {

    MyJdbcConverter(
            MappingContext<? extends RelationalPersistentEntity<?>, ? extends RelationalPersistentProperty> context,
            RelationResolver relationResolver,
            CustomConversions conversions,
            JdbcTypeFactory typeFactory,
            IdentifierProcessing identifierProcessing) {
        super(context, relationResolver, conversions, typeFactory, identifierProcessing);
    }

    @Override
    public int getSqlType(RelationalPersistentProperty property) {
        if (FileType.class.equals(property.getActualType())) {
            return Types.BIGINT;
        } else {
            return super.getSqlType(property);
        }
    }

    @Override
    public Class<?> getColumnType(RelationalPersistentProperty property) {
        if (FileType.class.equals(property.getActualType())) {
            return Long.class;
        } else {
            return super.getColumnType(property);
        }
    }
}
answered Jan 20, 2022 at 19:04

18 Comments

  • During startup I see the error: Parameter 0 of method setMappingContext in org.springframework.data.jdbc.repository.support.JdbcRepositoryFactoryBean required a bean of type 'org.springframework.data.relational.core.mapping.RelationalMappingContext' that could not be found.

  • See update - I added @Configuration on MySpringBootJdbcConfiguration (so that Spring uses the beans defined in it); similarly, SpringBootJdbcConfiguration was annotated with @Configuration.

  • Thank you! I see another issue now, but I think it is not related to the current topic... will let you know once I've made sure it works.

  • Eventually it started, but I still see the same error: Caused by: org.postgresql.util.PSQLException: ERROR: column "file_type" is of type bigint but expression is of type character varying Hint: You will need to rewrite or cast the expression. Position: 141 (this is with Spring Boot 2.6.2).

  • Also added a bit more detail to the question. Maybe it will become clearer.
