This guide demonstrates how to read data from a SQL database and write it to a `.dat` file using Spring Batch in a Spring Boot application. The `.dat` file will contain comma-separated values (CSV format), and Spring Batch handles the job configuration, reading, and writing. The examples use the Spring Boot 2.x / Spring Batch 4.x APIs (`javax.persistence`, `JobBuilderFactory`, `StepBuilderFactory`); on Spring Boot 3.x you would switch to `jakarta.persistence` and build jobs and steps with `JobBuilder`/`StepBuilder` instead of the factories.
📦 Step 1: Add Dependencies (Maven)
Add the following to your `pom.xml`:

```xml
<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-batch</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-data-jpa</artifactId>
    </dependency>
    <dependency>
        <groupId>com.h2database</groupId>
        <artifactId>h2</artifactId>
        <scope>runtime</scope>
    </dependency>
</dependencies>
```
⚙️ Step 2: Configure application.properties
```properties
spring.datasource.url=jdbc:h2:mem:testdb
spring.datasource.driverClassName=org.h2.Driver
spring.datasource.username=sa
spring.datasource.password=password
spring.jpa.database-platform=org.hibernate.dialect.H2Dialect
# true is the default; Job beans in the context are launched on startup
spring.batch.job.enabled=true
```
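Spring Batch also needs its metadata tables (BATCH_JOB_INSTANCE, BATCH_JOB_EXECUTION, and so on). With an embedded database such as H2, Spring Boot creates them automatically. If you later point the job at an external database, you can request explicit initialization; the property below is the Spring Boot 2.x name (Boot 3.x renames it to `spring.batch.jdbc.initialize-schema`):

```properties
# Create the Spring Batch metadata tables on startup if they are missing
spring.batch.initialize-schema=always
```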
🧱 Step 3: Define Entity and Repository
MyEntity.java
```java
import javax.persistence.Entity;
import javax.persistence.Id;

@Entity
public class MyEntity {

    @Id
    private Long id;
    private String name;

    public Long getId() { return id; }
    public void setId(Long id) { this.id = id; }

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}
```
MyEntityRepository.java
```java
import org.springframework.data.jpa.repository.JpaRepository;

public interface MyEntityRepository extends JpaRepository<MyEntity, Long> {
}
```
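The batch job only produces output if the `MyEntity` table actually contains rows. If you have no other data source, one way to seed the in-memory H2 database is a small component that inserts a couple of rows while the application context starts (so the data is in place before Spring Boot launches the job). This is just an illustrative sketch; the class name and sample values are assumptions, not part of the original setup.

```java
import java.util.Arrays;

import javax.annotation.PostConstruct;

import org.springframework.stereotype.Component;

@Component
public class SampleDataLoader {

    private final MyEntityRepository repository;

    public SampleDataLoader(MyEntityRepository repository) {
        this.repository = repository;
    }

    // Runs during context startup, i.e. before the batch job is launched
    @PostConstruct
    public void seed() {
        MyEntity first = new MyEntity();
        first.setId(1L);
        first.setName("John Doe");

        MyEntity second = new MyEntity();
        second.setId(2L);
        second.setName("Jane Smith");

        repository.saveAll(Arrays.asList(first, second));
    }
}
```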
🔁 Step 4: Batch Configuration
BatchConfig.java
```java
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.launch.support.RunIdIncrementer;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.batch.item.database.JpaPagingItemReader;
import org.springframework.batch.item.file.FlatFileItemWriter;
import org.springframework.batch.item.file.transform.DelimitedLineAggregator;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.FileSystemResource;

import javax.persistence.EntityManagerFactory;

@Configuration
@EnableBatchProcessing
public class BatchConfig {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    private EntityManagerFactory entityManagerFactory;

    @Bean
    public Job job() {
        return jobBuilderFactory.get("sqlToDatFileJob")
                .incrementer(new RunIdIncrementer())
                .start(step1())
                .build();
    }

    @Bean
    public Step step1() {
        return stepBuilderFactory.get("step1")
                .<MyEntity, MyEntity>chunk(10)
                .reader(reader())
                .processor(processor())
                .writer(writer())
                .build();
    }

    @Bean
    public ItemReader<MyEntity> reader() {
        JpaPagingItemReader<MyEntity> reader = new JpaPagingItemReader<>();
        // Name is used as the prefix for the reader's keys in the ExecutionContext
        reader.setName("myEntityReader");
        reader.setEntityManagerFactory(entityManagerFactory);
        reader.setQueryString("SELECT e FROM MyEntity e");
        reader.setPageSize(10);
        return reader;
    }

    @Bean
    public ItemProcessor<MyEntity, MyEntity> processor() {
        return item -> item; // Pass-through: no transformation
    }

    @Bean
    public ItemWriter<MyEntity> writer() {
        FlatFileItemWriter<MyEntity> writer = new FlatFileItemWriter<>();
        // Name is used as the prefix for the writer's keys in the ExecutionContext
        writer.setName("myEntityWriter");
        writer.setResource(new FileSystemResource("output.dat"));

        DelimitedLineAggregator<MyEntity> aggregator = new DelimitedLineAggregator<>();
        aggregator.setDelimiter(",");
        aggregator.setFieldExtractor(item -> new Object[]{item.getId(), item.getName()});
        writer.setLineAggregator(aggregator);

        return writer;
    }
}
```
📝 Explanation
- Job: The batch job is composed of a single step.
- Step: Configured to read entities from the database, process them (optionally), and write them to a `.dat` file.
- JpaPagingItemReader: Reads entities from the database using JPA, one page at a time.
- ItemProcessor: Currently a pass-through; you can modify it to transform data (see the sketch after this list).
- FlatFileItemWriter: Writes each record as a line to `output.dat`, using comma-separated fields.
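As a minimal sketch of a non-trivial processor (the upper-casing rule here is just an assumed example), the `processor()` bean could be replaced with:

```java
@Bean
public ItemProcessor<MyEntity, MyEntity> processor() {
    return item -> {
        // Example transformation: upper-case the name before it reaches the writer
        item.setName(item.getName().toUpperCase());
        return item;
    };
}
```

Returning `null` from an `ItemProcessor` filters the item out, so it never reaches the writer.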
▶️ Running the Job
Run the Spring Boot application (its `main` method; a minimal entry point is sketched below), and Spring Batch will:
- Read records from the SQL database (the `MyEntity` table).
- Write them to a `.dat` file named `output.dat`.
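If the project does not already define an entry point, a minimal application class looks like this (the class name is an arbitrary choice):

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class SqlToDatFileApplication {

    public static void main(String[] args) {
        SpringApplication.run(SqlToDatFileApplication.class, args);
    }
}
```

With `spring.batch.job.enabled=true`, the `sqlToDatFileJob` bean is launched automatically once the application context has started.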
Each line in `output.dat` will look like:

```text
1,John Doe
2,Jane Smith
...
```
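A common follow-up tweak, not part of the configuration above: if the consumer of the `.dat` file expects a header line, `FlatFileItemWriter` accepts a header callback. A minimal sketch:

```java
// Inside the writer() bean, before returning the writer
writer.setHeaderCallback(headerWriter -> headerWriter.write("id,name"));
```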