1. Introduction
Unlike other Spring-based applications, testing batch jobs comes with some specific challenges, mostly due to the asynchronous nature of how jobs are executed.
In this tutorial, we’re going to explore the various alternatives for testing a Spring Batch job.
2. Required Dependencies
We’re using spring-boot-starter-batch, so first let’s set up the required dependencies in our pom.xml:
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-batch</artifactId>
<version>2.7.2</version>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<version>2.7.2</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.springframework.batch</groupId>
<artifactId>spring-batch-test</artifactId>
<version>4.3.0.RELEASE</version>
<scope>test</scope>
</dependency>
We included spring-boot-starter-test and spring-batch-test, which bring in some necessary helper methods, listeners, and runners for testing Spring Batch applications.
3. Defining the Spring Batch Job
Let’s create a simple application to show how Spring Batch solves some of the testing challenges.
Our application uses a two-step Job that reads a CSV input file with structured book information and outputs books and book details.
3.1. Defining the Job Steps
The two subsequent Steps extract specific information from BookRecords and then map these to Books (step1) and BookDetails (step2):
@Bean
public Step step1(
ItemReader<BookRecord> csvItemReader, ItemWriter<Book> jsonItemWriter) throws IOException {
return stepBuilderFactory
.get("step1")
.<BookRecord, Book> chunk(3)
.reader(csvItemReader)
.processor(bookItemProcessor())
.writer(jsonItemWriter)
.build();
}
@Bean
public Step step2(
ItemReader<BookRecord> csvItemReader, ItemWriter<BookDetails> listItemWriter) {
return stepBuilderFactory
.get("step2")
.<BookRecord, BookDetails> chunk(3)
.reader(csvItemReader)
.processor(bookDetailsItemProcessor())
.writer(listItemWriter)
.build();
}
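The bookItemProcessor() and bookDetailsItemProcessor() beans referenced above aren't shown in this article. A minimal sketch of what they could look like, assuming straightforward getter/setter mappings between BookRecord, Book, and BookDetails (the BookDetails setters are hypothetical):

```java
// Sketch only: the real processors aren't shown in the article.
// The BookDetails setter names are assumptions.
@Bean
public ItemProcessor<BookRecord, Book> bookItemProcessor() {
    return record -> {
        Book book = new Book();
        book.setName(record.getBookName());
        book.setAuthor(record.getBookAuthor());
        return book;
    };
}

@Bean
public ItemProcessor<BookRecord, BookDetails> bookDetailsItemProcessor() {
    return record -> {
        BookDetails details = new BookDetails();
        details.setBookFormat(record.getBookFormat());
        details.setBookISBN(record.getBookISBN());
        details.setPublishingYear(record.getPublishingYear());
        return details;
    };
}
```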
3.2. Defining the Input Reader and Output Writer
Let’s now configure the CSV file input reader using a FlatFileItemReader to de-serialize the structured book information into BookRecord objects:
private static final String[] TOKENS = {
"bookname", "bookauthor", "bookformat", "isbn", "publishyear" };
@Bean
@StepScope
public FlatFileItemReader<BookRecord> csvItemReader(
@Value("#{jobParameters['file.input']}") String input) {
FlatFileItemReaderBuilder<BookRecord> builder = new FlatFileItemReaderBuilder<>();
FieldSetMapper<BookRecord> bookRecordFieldSetMapper = new BookRecordFieldSetMapper();
return builder
.name("bookRecordItemReader")
.resource(new FileSystemResource(input))
.delimited()
.names(TOKENS)
.fieldSetMapper(bookRecordFieldSetMapper)
.build();
}
There are a couple of important things in this definition that will have implications for the way we test.
First of all, we annotated the FlatFileItemReader bean with @StepScope; as a result, this object will share its lifetime with the StepExecution.
This also allows us to inject dynamic values at runtime, so that we can pass our input file from the JobParameters via the @Value expression. In contrast, the tokens used for the BookRecordFieldSetMapper are configured at compile-time.
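For reference, an input file matching the TOKENS order above could look like this (the record is illustrative, loosely based on the values asserted later in the tests):

```csv
Foundation,Asimov I.,hardcover,ISBN 12839,2018
```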
We then similarly define the JsonFileItemWriter output writer:
@Bean
@StepScope
public JsonFileItemWriter<Book> jsonItemWriter(
@Value("#{jobParameters['file.output']}") String output) throws IOException {
JsonFileItemWriterBuilder<Book> builder = new JsonFileItemWriterBuilder<>();
JacksonJsonObjectMarshaller<Book> marshaller = new JacksonJsonObjectMarshaller<>();
return builder
.name("bookItemWriter")
.jsonObjectMarshaller(marshaller)
.resource(new FileSystemResource(output))
.build();
}
For the second Step, we use a Spring Batch-provided ListItemWriter that simply writes the items to an in-memory list.
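A possible bean definition for this writer, not shown in the article but likely as simple as:

```java
// Sketch: ListItemWriter collects written items into an in-memory list.
@Bean
public ListItemWriter<BookDetails> listItemWriter() {
    return new ListItemWriter<>();
}
```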
3.3. Defining the Custom JobLauncher
Next, let’s disable the default Job launching configuration of Spring Boot Batch by setting spring.batch.job.enabled=false in our application.properties.
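Our application.properties might then contain something like the following (the file paths are illustrative assumptions; only the spring.batch.job.enabled flag comes from the article):

```properties
spring.batch.job.enabled=false
file.input=input/books.csv
file.output=output/books.json
```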
We configure our own JobLauncher to pass a custom JobParameters instance when launching the Job:
@SpringBootApplication
public class SpringBatchApplication implements CommandLineRunner {
// autowired jobLauncher and transformBooksRecordsJob
@Value("${file.input}")
private String input;
@Value("${file.output}")
private String output;
@Override
public void run(String... args) throws Exception {
JobParametersBuilder paramsBuilder = new JobParametersBuilder();
paramsBuilder.addString("file.input", input);
paramsBuilder.addString("file.output", output);
jobLauncher.run(transformBooksRecordsJob, paramsBuilder.toJobParameters());
}
// other methods (main etc.)
}
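The transformBooksRecordsJob bean itself isn't listed in the article; based on the job name asserted in the tests, it could be assembled roughly like this (a sketch assuming an autowired jobBuilderFactory):

```java
// Sketch: assemble the two Steps into a sequential Job.
@Bean
public Job transformBooksRecordsJob(Step step1, Step step2) {
    return jobBuilderFactory
      .get("transformBooksRecords")
      .start(step1)
      .next(step2)
      .build();
}
```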
4. Testing the Spring Batch Job
The spring-batch-test dependency provides a set of useful helper methods and listeners that can be used to configure the Spring Batch context during testing.
Let’s create a basic structure for our test:
@RunWith(SpringRunner.class)
@SpringBatchTest
@EnableAutoConfiguration
@ContextConfiguration(classes = { SpringBatchConfiguration.class })
@TestExecutionListeners({ DependencyInjectionTestExecutionListener.class,
DirtiesContextTestExecutionListener.class})
@DirtiesContext(classMode = ClassMode.AFTER_CLASS)
public class SpringBatchIntegrationTest {
// other test constants
@Autowired
private JobLauncherTestUtils jobLauncherTestUtils;
@Autowired
private JobRepositoryTestUtils jobRepositoryTestUtils;
@After
public void cleanUp() {
jobRepositoryTestUtils.removeJobExecutions();
}
private JobParameters defaultJobParameters() {
JobParametersBuilder paramsBuilder = new JobParametersBuilder();
paramsBuilder.addString("file.input", TEST_INPUT);
paramsBuilder.addString("file.output", TEST_OUTPUT);
return paramsBuilder.toJobParameters();
}
The @SpringBatchTest annotation provides the JobLauncherTestUtils and JobRepositoryTestUtils helper classes. We use them to trigger the Job and Steps in our tests.
Our application uses Spring Boot auto-configuration, which enables a default in-memory JobRepository. As a result, running multiple tests in the same class requires a cleanup step after each test run.
Finally, if we want to run multiple tests from several test classes, we need to mark our context as dirty. This is required to avoid the clashing of several JobRepository instances using the same data source.
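The test constants referenced in the class above might be defined along these lines (the paths and file names are illustrative assumptions):

```java
// Sketch: example test constants; actual paths are not shown in the article.
private static final String TEST_INPUT = "src/test/resources/input/test-input.csv";
private static final String TEST_OUTPUT = "src/test/resources/output/actual-output.json";
private static final String EXPECTED_OUTPUT = "src/test/resources/output/expected-output.json";
```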
4.1. Testing the End-To-End Job
The first thing we’ll test is a complete end-to-end Job with a small data-set input.
We can then compare the results with an expected test output:
@Test
public void givenReferenceOutput_whenJobExecuted_thenSuccess() throws Exception {
// given
FileSystemResource expectedResult = new FileSystemResource(EXPECTED_OUTPUT);
FileSystemResource actualResult = new FileSystemResource(TEST_OUTPUT);
// when
JobExecution jobExecution = jobLauncherTestUtils.launchJob(defaultJobParameters());
JobInstance actualJobInstance = jobExecution.getJobInstance();
ExitStatus actualJobExitStatus = jobExecution.getExitStatus();
// then
assertThat(actualJobInstance.getJobName(), is("transformBooksRecords"));
assertThat(actualJobExitStatus.getExitCode(), is("COMPLETED"));
AssertFile.assertFileEquals(expectedResult, actualResult);
}
Spring Batch Test provides a useful file comparison method for verifying outputs using the AssertFile class.
4.2. Testing Individual Steps
Sometimes it’s quite expensive to test the complete Job end-to-end and so it makes sense to test individual Steps instead:
@Test
public void givenReferenceOutput_whenStep1Executed_thenSuccess() throws Exception {
// given
FileSystemResource expectedResult = new FileSystemResource(EXPECTED_OUTPUT);
FileSystemResource actualResult = new FileSystemResource(TEST_OUTPUT);
// when
JobExecution jobExecution = jobLauncherTestUtils.launchStep(
"step1", defaultJobParameters());
Collection<StepExecution> actualStepExecutions = jobExecution.getStepExecutions();
ExitStatus actualJobExitStatus = jobExecution.getExitStatus();
// then
assertThat(actualStepExecutions.size(), is(1));
assertThat(actualJobExitStatus.getExitCode(), is("COMPLETED"));
AssertFile.assertFileEquals(expectedResult, actualResult);
}
@Test
public void whenStep2Executed_thenSuccess() {
// when
JobExecution jobExecution = jobLauncherTestUtils.launchStep(
"step2", defaultJobParameters());
Collection<StepExecution> actualStepExecutions = jobExecution.getStepExecutions();
ExitStatus actualExitStatus = jobExecution.getExitStatus();
// then
assertThat(actualStepExecutions.size(), is(1));
assertThat(actualExitStatus.getExitCode(), is("COMPLETED"));
actualStepExecutions.forEach(stepExecution -> {
assertThat(stepExecution.getWriteCount(), is(8));
});
}
Notice that we use the launchStep method to trigger specific steps.
Remember that we also designed our ItemReader and ItemWriter to use dynamic values at runtime, which means we can pass our I/O parameters to the JobExecution via defaultJobParameters().
For the first Step test, we compare the actual output with an expected output.
On the other hand, in the second test, we verify the StepExecution for the expected written items.
4.3. Testing Step-scoped Components
Let’s now test the FlatFileItemReader. Recall that we exposed it as a @StepScope bean, so we’ll want to use Spring Batch’s dedicated support for this:
// previously autowired itemReader
@Test
public void givenMockedStep_whenReaderCalled_thenSuccess() throws Exception {
// given
StepExecution stepExecution = MetaDataInstanceFactory
.createStepExecution(defaultJobParameters());
// when
StepScopeTestUtils.doInStepScope(stepExecution, () -> {
BookRecord bookRecord;
itemReader.open(stepExecution.getExecutionContext());
while ((bookRecord = itemReader.read()) != null) {
// then
assertThat(bookRecord.getBookName(), is("Foundation"));
assertThat(bookRecord.getBookAuthor(), is("Asimov I."));
assertThat(bookRecord.getBookISBN(), is("ISBN 12839"));
assertThat(bookRecord.getBookFormat(), is("hardcover"));
assertThat(bookRecord.getPublishingYear(), is("2018"));
}
itemReader.close();
return null;
});
}
The MetaDataInstanceFactory creates a custom StepExecution that is needed to inject our Step-scoped ItemReader.
Because of this, we can check the behavior of the reader with the help of the doInStepScope method.
Next, let’s test the JsonFileItemWriter and verify its output:
@Test
public void givenMockedStep_whenWriterCalled_thenSuccess() throws Exception {
// given
FileSystemResource expectedResult = new FileSystemResource(EXPECTED_OUTPUT_ONE);
FileSystemResource actualResult = new FileSystemResource(TEST_OUTPUT);
Book demoBook = new Book();
demoBook.setAuthor("Grisham J.");
demoBook.setName("The Firm");
StepExecution stepExecution = MetaDataInstanceFactory
.createStepExecution(defaultJobParameters());
// when
StepScopeTestUtils.doInStepScope(stepExecution, () -> {
jsonItemWriter.open(stepExecution.getExecutionContext());
jsonItemWriter.write(Arrays.asList(demoBook));
jsonItemWriter.close();
return null;
});
// then
AssertFile.assertFileEquals(expectedResult, actualResult);
}
Unlike the previous tests, we are now in full control of our test objects. As a result, we’re responsible for opening and closing the I/O streams.
5. Conclusion
In this tutorial, we’ve explored the various approaches of testing a Spring Batch job.
End-to-end testing verifies the complete execution of the job. Testing individual steps may help in complex scenarios.
Finally, when it comes to Step-scoped components, we can use a number of helper methods provided by spring-batch-test. They will assist us in stubbing and mocking Spring Batch domain objects.
As usual, we can explore the complete codebase over on GitHub.