I am trying to configure a couple of datasources within Spring Batch. On startup, Spring Batch is throwing the following exception:
To use the default BatchConfigurer the context must contain no more than one DataSource, found 2
Snippet from Batch Configuration
@Configuration
@EnableBatchProcessing
public class BatchJobConfiguration {
@Primary
@Bean(name = "baseDatasource")
public DataSource dataSource() {
// first datasource definition here
}
@Bean(name = "secondaryDataSource")
public DataSource dataSource2() {
// second datasource definition here
}
...
}
I'm not sure why I am seeing this exception, because I have seen XML-based configurations for Spring Batch that declare multiple datasources. I am using Spring Batch core version 3.0.1.RELEASE with Spring Boot version 1.1.5.RELEASE. Any help would be greatly appreciated.
6 Answers
You must provide your own BatchConfigurer. Spring does not want to make that decision for you.
@Configuration
@EnableBatchProcessing
public class BatchConfig {
@Bean
BatchConfigurer configurer(@Qualifier("batchDataSource") DataSource dataSource){
return new DefaultBatchConfigurer(dataSource);
}
...
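Once that BatchConfigurer bean is present, the rest of the job definition does not change; the JobBuilderFactory and StepBuilderFactory registered by @EnableBatchProcessing will be backed by the datasource you passed in. A minimal sketch (the job, step, and tasklet names are illustrative, not part of the original answer):
@Bean
public Job sampleJob(JobBuilderFactory jobs, StepBuilderFactory steps) {
    // The JobRepository behind these builders uses the "batchDataSource" chosen above
    Step step = steps.get("sampleStep")
            .tasklet((contribution, chunkContext) -> RepeatStatus.FINISHED)
            .build();
    return jobs.get("sampleJob").start(step).build();
}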
AbstractBatchConfiguration tries to look up a BatchConfigurer in the container first; if none is found, it tries to create one itself, and this is where the IllegalStateException is thrown when there is more than one DataSource bean in the container.
The approach to solving the problem is to prevent AbstractBatchConfiguration from creating the DefaultBatchConfigurer bean itself.
To do that, we hint the Spring container to create the DefaultBatchConfigurer for us by annotating a subclass of it with @Component.
We then annotate the configuration class that carries @EnableBatchProcessing with @ComponentScan, pointing it at the package that contains the empty class derived from DefaultBatchConfigurer:
package batch_config;
...
@EnableBatchProcessing
@ComponentScan(basePackageClasses = MyBatchConfigurer.class)
public class MyBatchConfig {
...
}
The full code of that empty derived class is here:
package batch_config.components;
import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
import org.springframework.stereotype.Component;
@Component
public class MyBatchConfigurer extends DefaultBatchConfigurer {
}
In this configuration, the @Primary annotation works for the DataSource bean as in the example below:
@Configuration
public class BatchTestDatabaseConfig {
@Bean
@Primary
public DataSource dataSource()
{
return .........;
}
}
This works with Spring Batch version 3.0.3.RELEASE.
The simplest solution to make the @Primary annotation on a DataSource work might be just adding @ComponentScan(basePackageClasses = DefaultBatchConfigurer.class) along with the @EnableBatchProcessing annotation:
@Configuration
@EnableBatchProcessing
@ComponentScan(basePackageClasses = DefaultBatchConfigurer.class)
public class MyBatchConfig {
    ...
}
I would like to provide a solution here, which is very similar to the one answered by @vanarchi, but I managed to put all the necessary configurations into one class.
For the sake of completeness, the solution here assumes that the primary datasource is HSQL.
@Configuration
@EnableBatchProcessing
public class BatchConfiguration extends DefaultBatchConfigurer {
@Bean
@Primary
public DataSource batchDataSource() {
// no need to shut down explicitly; EmbeddedDatabaseFactoryBean will take care of this
EmbeddedDatabaseBuilder builder = new EmbeddedDatabaseBuilder();
EmbeddedDatabase embeddedDatabase = builder
.addScript("classpath:org/springframework/batch/core/schema-drop-hsqldb.sql")
.addScript("classpath:org/springframework/batch/core/schema-hsqldb.sql")
.setType(EmbeddedDatabaseType.HSQL) //.H2 or .DERBY
.build();
return embeddedDatabase;
}
@Override
protected JobRepository createJobRepository() throws Exception {
JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
factory.setDataSource(batchDataSource());
factory.setTransactionManager(transactionManager());
factory.afterPropertiesSet();
return (JobRepository) factory.getObject();
}
private ResourcelessTransactionManager transactionManager() {
return new ResourcelessTransactionManager();
}
// NOTE: the code below just gives the developer an easy way to access the in-memory HSQL datasource, since we configured it as the primary datasource for storing batch-job-related data. Default username: sa, password: ''
@PostConstruct
public void getDbManager(){
DatabaseManagerSwing.main(
new String[] { "--url", "jdbc:hsqldb:mem:testdb", "--user", "sa", "--password", ""});
}
}
THREE key points in this solution:
- This class is annotated with @EnableBatchProcessing and @Configuration, and it extends DefaultBatchConfigurer. By doing this, we instruct spring-batch to use our customized batch configurer when AbstractBatchConfiguration tries to look up a BatchConfigurer;
- The batchDataSource bean is annotated as @Primary, which instructs spring-batch to use this datasource for storing the 9 job-related tables;
- We override the protected JobRepository createJobRepository() throws Exception method, which makes the jobRepository bean use the primary datasource, as well as a transactionManager instance different from the one used by the other datasource(s).
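With the job repository pinned to the primary datasource, the other datasource(s) are still free to be injected by qualifier into the readers and writers that handle your business data. A minimal sketch, assuming a hypothetical secondaryDataSource bean and a Person class mapped from a person table (neither appears in the answer above):
@Bean
public JdbcCursorItemReader<Person> personReader(
        @Qualifier("secondaryDataSource") DataSource secondaryDataSource) {
    // Business data comes from the secondary datasource; the batch meta-data
    // tables stay in the primary (HSQL) datasource configured above
    JdbcCursorItemReader<Person> reader = new JdbcCursorItemReader<>();
    reader.setDataSource(secondaryDataSource);
    reader.setSql("SELECT id, name FROM person");
    reader.setRowMapper(new BeanPropertyRowMapper<>(Person.class));
    return reader;
}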
The simplest solution is to extend the DefaultBatchConfigurer and autowire your datasource via a qualifier:
@Component
public class MyBatchConfigurer extends DefaultBatchConfigurer {
/**
* Initialize the BatchConfigurer to use the datasource of your choosing
* @param firstDataSource
*/
@Autowired
public MyBatchConfigurer(@Qualifier("firstDataSource") DataSource firstDataSource) {
super(firstDataSource);
}
}
Side Note (as this also deals with the use of multiple data sources): If you use autoconfig to run data initialization scripts, you may notice that it's not initializing on the datasource you'd expect. For that issue, take a look at this: https://github.com/spring-projects/spring-boot/issues/9528
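If you hit that, one way to keep control (a sketch of a possible workaround, not something taken from the linked issue) is to run the initialization script explicitly against the datasource you intend, for example via Spring's DataSourceInitializer; the bean name, qualifier, and script name below are illustrative:
@Bean
public DataSourceInitializer secondDataSourceInitializer(
        @Qualifier("secondDataSource") DataSource secondDataSource) {
    // Populate exactly the datasource you intend instead of relying on auto-configuration
    ResourceDatabasePopulator populator = new ResourceDatabasePopulator();
    populator.addScript(new ClassPathResource("second-datasource-data.sql"));
    DataSourceInitializer initializer = new DataSourceInitializer();
    initializer.setDataSource(secondDataSource);
    initializer.setDatabasePopulator(populator);
    return initializer;
}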
You can define the beans below and make sure your application.properties file has the entries they need:
@Configuration
@PropertySource("classpath:application.properties")
public class DataSourceConfig {
@Primary
@Bean(name = "abcDataSource")
@ConfigurationProperties(prefix = "abc.datasource")
public DataSource dataSource() {
return DataSourceBuilder.create().type(HikariDataSource.class).build();
}
@Bean(name = "xyzDataSource")
@ConfigurationProperties(prefix = "xyz.datasource")
public DataSource xyzDataSource() {
return DataSourceBuilder.create().type(HikariDataSource.class).build();
}
}
application.properties
abc.datasource.jdbc-url=XXXXX
abc.datasource.username=XXXXX
abc.datasource.password=xxxxx
abc.datasource.driver-class-name=org.postgresql.Driver
...........
...........
...........
...........
For reference, see: Spring Boot Configure and Use Two DataSources
First, create a custom BatchConfigurer
@Configuration
@Component
public class TwoDataSourcesBatchConfigurer implements BatchConfigurer {
@Autowired
@Qualifier("dataSource1")
DataSource dataSource;
@Override
public JobExplorer getJobExplorer() throws Exception {
...
}
@Override
public JobLauncher getJobLauncher() throws Exception {
...
}
@Override
public JobRepository getJobRepository() throws Exception {
JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
// use the autowired data source
factory.setDataSource(dataSource);
factory.setTransactionManager(getTransactionManager());
factory.afterPropertiesSet();
return factory.getObject();
}
@Override
public PlatformTransactionManager getTransactionManager() throws Exception {
...
}
}
Then,
@Configuration
@EnableBatchProcessing
@ComponentScan("package")
public class JobConfig {
// define job, step, ...
}
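For completeness, the methods elided in the configurer above could be implemented along these lines; this is only a sketch, assuming the batch meta-data should also live in dataSource1 and that a plain DataSourceTransactionManager is acceptable:
@Override
public JobExplorer getJobExplorer() throws Exception {
    // Read-only access to the job meta-data, backed by the same datasource
    JobExplorerFactoryBean factory = new JobExplorerFactoryBean();
    factory.setDataSource(dataSource);
    factory.afterPropertiesSet();
    return factory.getObject();
}
@Override
public JobLauncher getJobLauncher() throws Exception {
    // Simple synchronous launcher on top of the repository defined above
    SimpleJobLauncher launcher = new SimpleJobLauncher();
    launcher.setJobRepository(getJobRepository());
    launcher.afterPropertiesSet();
    return launcher;
}
@Override
public PlatformTransactionManager getTransactionManager() throws Exception {
    return new DataSourceTransactionManager(dataSource);
}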