In this post, we will show you how to configure a Spring Batch job to read data from a MySQL database and write it to an XML file.
Project structure
This is the directory structure of a standard Gradle project.
Project dependencies
buildscript {
  repositories {
    mavenLocal()
    jcenter()
  }
  dependencies {
    classpath "org.springframework.boot:spring-boot-gradle-plugin:1.5.2.RELEASE"
  }
}

apply plugin: 'java'
apply plugin: 'eclipse'
apply plugin: 'org.springframework.boot'

sourceCompatibility = 1.8

repositories {
  mavenLocal()
  mavenCentral()
}

dependencies {
  compile 'org.springframework.boot:spring-boot-starter-data-jpa:1.5.2.RELEASE'
  compile 'org.springframework:spring-oxm:4.3.7.RELEASE'
  compileOnly 'org.projectlombok:lombok:1.16.12'
  runtime 'mysql:mysql-connector-java'
  compile 'org.springframework.boot:spring-boot-starter-batch:1.5.2.RELEASE'
  testCompile 'org.springframework.boot:spring-boot-starter-test:1.5.2.RELEASE'
}

task wrapper(type: Wrapper) {
  gradleVersion = '3.2.1'
}
application.properties file
spring.datasource.url=jdbc:mysql://localhost:3306/spring_batch
spring.datasource.username=root
spring.datasource.password=santosh
spring.jpa.hibernate.ddl-auto=update
Spring Batch Jobs
Create a job that reads from the MySQL database using a named query on the user table and writes the result to an XML file.
package com.walking.techie.mysqltoxml.jobs;

import com.walking.techie.mysqltoxml.model.User;
import com.walking.techie.mysqltoxml.orm.JpaQueryProviderImpl;
import java.util.HashMap;
import java.util.Map;
import javax.persistence.EntityManagerFactory;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.database.JpaPagingItemReader;
import org.springframework.batch.item.xml.StaxEventItemWriter;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.FileSystemResource;
import org.springframework.oxm.xstream.XStreamMarshaller;

@Configuration
@EnableBatchProcessing
public class ReadFromDB {

  @Autowired
  private JobBuilderFactory jobBuilderFactory;

  @Autowired
  private StepBuilderFactory stepBuilderFactory;

  @Autowired
  private EntityManagerFactory entityManagerFactory;

  @Bean
  public Job readUser() throws Exception {
    return jobBuilderFactory.get("readUser").flow(step1()).end().build();
  }

  @Bean
  public Step step1() throws Exception {
    return stepBuilderFactory.get("step1").<User, User>chunk(10).reader(reader()).writer(writer())
        .build();
  }

  @Bean
  public JpaPagingItemReader<User> reader() throws Exception {
    JpaPagingItemReader<User> databaseReader = new JpaPagingItemReader<>();
    databaseReader.setEntityManagerFactory(entityManagerFactory);
    JpaQueryProviderImpl<User> jpaQueryProvider = new JpaQueryProviderImpl<>();
    jpaQueryProvider.setQuery("User.findAll");
    // The query provider validates that both the named query and the entity
    // class are set, so supply the entity class here as well.
    jpaQueryProvider.setEntityClass(User.class);
    databaseReader.setQueryProvider(jpaQueryProvider);
    databaseReader.setPageSize(1000);
    databaseReader.afterPropertiesSet();
    return databaseReader;
  }

  @Bean
  public StaxEventItemWriter<User> writer() {
    StaxEventItemWriter<User> writer = new StaxEventItemWriter<>();
    writer.setResource(new FileSystemResource("xml/user.xml"));
    writer.setMarshaller(userMarshaller());
    writer.setRootTagName("users");
    return writer;
  }

  @Bean
  public XStreamMarshaller userMarshaller() {
    // This bean marshals (object to XML), so it is named accordingly.
    XStreamMarshaller marshaller = new XStreamMarshaller();
    Map<String, Class> aliases = new HashMap<String, Class>();
    aliases.put("user", User.class);
    marshaller.setAliases(aliases);
    return marshaller;
  }
}
Each record from the MySQL user table is mapped to a User object and written to the XML file.
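For intuition about what StaxEventItemWriter produces, here is a rough, framework-free sketch using the JDK's StAX API that emits the same <users>/<user> document shape. The class and method names here are illustrative only; they are not part of the batch job.

```java
import java.io.StringWriter;
import javax.xml.stream.XMLOutputFactory;
import javax.xml.stream.XMLStreamWriter;

public class UserXmlSketch {

  // Builds the same shape the batch writer produces: a "users" root tag
  // (setRootTagName) containing one "user" fragment per item (the XStream alias).
  public static String writeUser(int id, String username, String password, int age)
      throws Exception {
    StringWriter out = new StringWriter();
    XMLStreamWriter w = XMLOutputFactory.newInstance().createXMLStreamWriter(out);
    w.writeStartDocument("UTF-8", "1.0");
    w.writeStartElement("users");                                         // root tag
    w.writeStartElement("user");                                          // one item
    w.writeStartElement("id");       w.writeCharacters(String.valueOf(id));  w.writeEndElement();
    w.writeStartElement("username"); w.writeCharacters(username);            w.writeEndElement();
    w.writeStartElement("password"); w.writeCharacters(password);            w.writeEndElement();
    w.writeStartElement("age");      w.writeCharacters(String.valueOf(age)); w.writeEndElement();
    w.writeEndElement();                                                  // </user>
    w.writeEndElement();                                                  // </users>
    w.writeEndDocument();
    w.close();
    return out.toString();
  }

  public static void main(String[] args) throws Exception {
    System.out.println(writeUser(1, "santosh", "password", 30));
  }
}
```

In the real job, the writer streams one such fragment per item in each committed chunk, so the root element stays open until the step finishes.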
A Java model class
package com.walking.techie.mysqltoxml.model;

import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.NamedQueries;
import javax.persistence.NamedQuery;
import javax.persistence.Table;
import lombok.Data;

@NamedQueries({
    @NamedQuery(name = "User.findAll", query = "select u from User u")
})
@Table(name = "user")
@Data
@Entity
public class User {

  @Id
  int id;
  String username;
  String password;
  int age;
}
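Lombok's @Data generates the getters, setters, equals/hashCode, and toString at compile time. If you prefer not to depend on Lombok, a minimal hand-written equivalent might look like this (JPA annotations omitted here for brevity; keep them on the real entity):

```java
// Plain-Java sketch of the Lombok @Data model above, without Lombok.
public class UserPojo {
  private int id;
  private String username;
  private String password;
  private int age;

  public int getId() { return id; }
  public void setId(int id) { this.id = id; }
  public String getUsername() { return username; }
  public void setUsername(String username) { this.username = username; }
  public String getPassword() { return password; }
  public void setPassword(String password) { this.password = password; }
  public int getAge() { return age; }
  public void setAge(int age) { this.age = age; }
}
```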
JpaQueryProviderImpl provides a method to create the named query using JPA.
package com.walking.techie.mysqltoxml.orm;

import javax.persistence.Query;
import org.springframework.batch.item.database.orm.AbstractJpaQueryProvider;
import org.springframework.util.Assert;
import org.springframework.util.StringUtils;

public class JpaQueryProviderImpl<E> extends AbstractJpaQueryProvider {

  private Class<E> entityClass;
  private String query;

  @Override
  public Query createQuery() {
    return getEntityManager().createNamedQuery(query, entityClass);
  }

  public void setQuery(String query) {
    this.query = query;
  }

  public void setEntityClass(Class<E> entityClazz) {
    this.entityClass = entityClazz;
  }

  @Override
  public void afterPropertiesSet() throws Exception {
    Assert.isTrue(StringUtils.hasText(query), "Query cannot be empty");
    Assert.notNull(entityClass, "Entity class cannot be NULL");
  }
}
Run Application
package com.walking.techie;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class Application {

  public static void main(String[] args) {
    SpringApplication.run(Application.class, args);
  }
}
Output
The output of the application will be stored in xml/user.xml.
<?xml version="1.0" encoding="UTF-8"?><users><user><id>1</id><username>santosh</username><password>password</password><age>30</age></user><user><id>2</id><username>santosh</username><password>password</password><age>24</age></user><user><id>3</id><username>santosh</username><password>password</password><age>22</age></user><user><id>4</id><username>santosh</username><password>password</password><age>28</age></user><user><id>5</id><username>santosh</username><password>password</password><age>34</age></user><user><id>6</id><username>santosh</username><password>password</password><age>20</age></user></users>
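To sanity-check a generated document, the JDK's built-in DOM parser is enough. Below is a small sketch (class name illustrative) that counts the <user> elements in a document shaped like the output above:

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

public class OutputCheck {

  // Parses the XML string and counts <user> elements.
  public static int countUsers(String xml) throws Exception {
    Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
        .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
    return doc.getElementsByTagName("user").getLength();
  }

  public static void main(String[] args) throws Exception {
    String sample = "<?xml version=\"1.0\" encoding=\"UTF-8\"?><users>"
        + "<user><id>1</id><username>santosh</username><age>30</age></user>"
        + "<user><id>2</id><username>santosh</username><age>24</age></user>"
        + "</users>";
    System.out.println(countUsers(sample));
  }
}
```

For the real run you would parse the xml/user.xml file instead of an inline string; the sample output above should yield a count of 6.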
Output on the console:
2017年03月26日 20:42:44.548 INFO 8027 --- [ main] o.s.b.c.l.support.SimpleJobLauncher : Job: [FlowJob: [name=readUser]] launched with the following parameters: [{}]
2017年03月26日 20:42:44.608 INFO 8027 --- [ main] o.s.batch.core.job.SimpleStepHandler : Executing step: [step1]
2017年03月26日 20:42:44.725 INFO 8027 --- [ main] o.s.b.c.l.support.SimpleJobLauncher : Job: [FlowJob: [name=readUser]] completed with the following parameters: [{}] and the following status: [COMPLETED]
Note: This code has been compiled and run on a Mac notebook using IntelliJ IDEA.
1 comment:
Hello,
I took the same code as yours, except that I added a scheduler part (which runs every 10 minutes).
I customized the writer with this part of code:
DateFormat dateFormat = new SimpleDateFormat(PnpConstants.DATE_FORMAT_PATTERN, Locale.US);
final String timestamp = dateFormat.format(new Date());
final String exportFilePath = Constants.OUTGOING_DIRECTORY_PATH + timestamp + "_" + Constants.FILE_NAME;
FileSystemResource resource = new FileSystemResource(exportFilePath);
StaxEventItemWriter staxItemWriter = new StaxEventItemWriterBuilder()
    .name("pnpWriter")
    .marshaller(pnpUnmarshaller())
    .resource(resource)
    .rootTagName("PNPs")
    .overwriteOutput(false)
    .shouldDeleteIfEmpty(true)
    .transactional(true)
    .encoding(StandardCharsets.UTF_8.toString())
    .build();
staxItemWriter.afterPropertiesSet();
return staxItemWriter;
The goal is to generate an output file at each job execution.
The first time, the file is generated correctly; however, the second time, I get the following error:
[[org.springframework.batch.core.step.AbstractStep Encountered an error executing step step1 in job read org.springframework.batch.item.ItemStreamException: File already exists]]
When I debugged the code, I saw that the resource does not change; it stays the same (with the timestamp from the first time the job started). In addition, the second time, it does not run my job at all. Is that normal?
Thanks for your help.
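A likely cause of both symptoms: a plain @Bean writer is a singleton, so the timestamp and FileSystemResource are computed once at application startup and reused on every scheduled run, which is why the second run hits "File already exists". A common fix is to annotate the writer with @StepScope and pass the timestamp in as a job parameter, so the resource is rebuilt per execution; launching with fresh JobParameters also matters, because relaunching with the same parameters makes Spring Batch treat the JobInstance as already complete and skip it. The path-building part can be isolated and verified on its own; below is a framework-free sketch (directory and file names are placeholders, not the commenter's actual constants):

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;

public class ExportPath {

  // Builds a fresh timestamped export path each time it is called.
  // In the batch job, put this inside a @StepScope writer bean so it runs
  // per step execution rather than once at application startup.
  public static String exportFilePath(String dir, String fileName, Date now) {
    SimpleDateFormat fmt = new SimpleDateFormat("yyyyMMddHHmmss", Locale.US);
    return dir + fmt.format(now) + "_" + fileName;
  }

  public static void main(String[] args) {
    System.out.println(exportFilePath("out/", "user.xml", new Date()));
  }
}
```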