
Commit 1046862

fmbenhassine authored and mminella committed
Update documentation for version 4.1.0
1 parent 547533f commit 1046862

3 files changed (+242, -97 lines)

spring-batch-docs/asciidoc/index.adoc

Lines changed: 1 addition & 1 deletion
@@ -10,7 +10,7 @@ The reference documentation is divided into several sections:
 [horizontal]
 <<spring-batch-intro.adoc#spring-batch-intro,Spring Batch Introduction>> :: Background, usage
 scenarios and general guidelines.
-<<whatsnew.adoc#whatsNew,What's new in Spring Batch 4.0>> :: New features introduced in version 4.0.
+<<whatsnew.adoc#whatsNew,What's new in Spring Batch 4.1>> :: New features introduced in version 4.1.
 <<domain.adoc#domainLanguageOfBatch,The Domain Language of Batch>> :: Core concepts and abstractions
 of the Batch domain language.
 <<job.adoc#configureJob,Configuring and Running a Job>> :: Job configuration, execution and

spring-batch-docs/asciidoc/readersAndWriters.adoc

Lines changed: 87 additions & 30 deletions
@@ -15,7 +15,7 @@ out. Spring Batch provides three key interfaces to help perform bulk reading and
 `ItemReader`, `ItemProcessor`, and `ItemWriter`.

 [[itemReader]]
-=== ItemReader
+=== `ItemReader`

 Although a simple concept, an `ItemReader` is the means for providing data from many
 different types of input. The most general examples include:
@@ -63,7 +63,7 @@ exception to be thrown. For example, a database `ItemReader` that is configured
 query that returns 0 results returns `null` on the first invocation of read.

 [[itemWriter]]
-=== ItemWriter
+=== `ItemWriter`

 `ItemWriter` is similar in functionality to an `ItemReader` but with inverse operations.
 Resources still need to be located, opened, and closed but they differ in that an
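The chunk-oriented nature of `ItemWriter` (one `write` call receives a whole batch of items, rather than one call per item) can be sketched as follows. The `ItemWriter` interface and the collecting writer below are simplified local stand-ins for illustration, not Spring Batch's actual classes:

```java
import java.util.ArrayList;
import java.util.List;

public class Main {
    // Simplified local stand-in for Spring Batch's ItemWriter contract:
    // write() receives a whole chunk of items, not one item at a time,
    // so the implementation can batch the underlying output operation.
    interface ItemWriter<T> {
        void write(List<? extends T> items);
    }

    // Stages the chunk and counts one "flush" per write call, mirroring
    // how a JDBC or Hibernate writer batches before returning.
    static class CollectingItemWriter implements ItemWriter<String> {
        final List<String> output = new ArrayList<>();
        int flushCount = 0;

        @Override
        public void write(List<? extends String> items) {
            output.addAll(items); // stage the whole chunk
            flushCount++;         // one flush per chunk, not per item
        }
    }

    public static void main(String[] args) {
        CollectingItemWriter writer = new CollectingItemWriter();
        writer.write(List.of("a", "b", "c"));
        System.out.println(writer.output + " flushed " + writer.flushCount + " time(s)");
    }
}
```

The point of the batched signature is that a single flush can cover the whole chunk, which is what makes transactional, high-volume output practical.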
@@ -93,7 +93,7 @@ one for each item. The writer can then call `flush` on the hibernate session bef
 returning.

 [[itemProcessor]]
-=== ItemProcessor
+=== `ItemProcessor`

 The `ItemReader` and `ItemWriter` interfaces are both very useful for their specific
 tasks, but what if you want to insert business logic before writing? One option for both
@@ -358,7 +358,7 @@ the `ItemProcessor` and only updating the
 instance that is the result.

 [[itemStream]]
-=== ItemStream
+=== `ItemStream`

 Both `ItemReaders` and `ItemWriters` serve their individual purposes well, but there is a
 common concern among both of them that necessitates another interface. In general, as
@@ -480,7 +480,7 @@ Delimited files are those in which fields are separated by a delimiter, such as
 Fixed Length files have fields that are a set length.

 [[fieldSet]]
-==== The FieldSet
+==== The `FieldSet`

 When working with flat files in Spring Batch, regardless of whether it is for input or
 output, one of the most important classes is the `FieldSet`. Many architectures and
@@ -510,7 +510,7 @@ potentially unexpected ways, it can be consistent, both when handling errors cau
 format exception, or when doing simple data conversions.

 [[flatFileItemReader]]
-==== FlatFileItemReader
+==== `FlatFileItemReader`

 A flat file is any type of file that contains at most two-dimensional (tabular) data.
 Reading flat files in the Spring Batch framework is facilitated by the class called
@@ -560,7 +560,7 @@ the input resource does not exist. Otherwise, it logs the problem and continues.
 |===============

 [[lineMapper]]
-===== LineMapper
+===== `LineMapper`

 As with `RowMapper`, which takes a low-level construct such as `ResultSet` and returns
 an `Object`, flat file processing requires the same construct to convert a `String` line
@@ -585,7 +585,7 @@ gets you halfway there. The line must be tokenized into a `FieldSet`, which can
 mapped to an object, as described later in this document.

 [[lineTokenizer]]
-===== LineTokenizer
+===== `LineTokenizer`

 An abstraction for turning a line of input into a `FieldSet` is necessary because there
 can be many formats of flat file data that need to be converted to a `FieldSet`. In
@@ -614,7 +614,7 @@ width". The width of each field must be defined for each record type.
 tokenizers should be used on a particular line by checking against a pattern.

 [[fieldSetMapper]]
-===== FieldSetMapper
+===== `FieldSetMapper`

 The `FieldSetMapper` interface defines a single method, `mapFieldSet`, which takes a
 `FieldSet` object and maps its contents to an object. This object may be a custom DTO, a
@@ -634,7 +634,7 @@ public interface FieldSetMapper<T> {
 The pattern used is the same as the `RowMapper` used by `JdbcTemplate`.

 [[defaultLineMapper]]
-===== DefaultLineMapper
+===== `DefaultLineMapper`

 Now that the basic interfaces for reading in flat files have been defined, it becomes
 clear that three basic steps are required:
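The three-step flow this section describes (tokenize the line, wrap the tokens in a `FieldSet`, map the `FieldSet` to a domain object) can be sketched end to end. The `FieldSet` class and field names below are simplified, hypothetical stand-ins for the real Spring Batch types, shown only to illustrate the flow:

```java
import java.util.HashMap;
import java.util.Map;

public class Main {
    // Steps 1 and 2: split the raw line on a delimiter (cf. LineTokenizer)
    // and pair each token with a configured column name, giving a minimal
    // FieldSet with typed access to the tokens.
    static class FieldSet {
        private final Map<String, String> fields = new HashMap<>();
        FieldSet(String line, String delimiter, String... names) {
            String[] tokens = line.split(delimiter);
            for (int i = 0; i < names.length; i++) {
                fields.put(names[i], tokens[i]);
            }
        }
        String readString(String name) { return fields.get(name); }
        int readInt(String name) { return Integer.parseInt(fields.get(name)); }
    }

    // Step 3: a FieldSetMapper-style method turns the FieldSet into a
    // domain object (a hypothetical Player DTO here).
    record Player(String id, String lastName, int birthYear) {}

    static Player mapFieldSet(FieldSet fs) {
        return new Player(fs.readString("ID"), fs.readString("lastName"), fs.readInt("birthYear"));
    }

    public static void main(String[] args) {
        FieldSet fs = new FieldSet("AbadeD00,Abadie,1973", ",", "ID", "lastName", "birthYear");
        System.out.println(mapFieldSet(fs));
    }
}
```

`DefaultLineMapper` exists precisely to wire these two collaborators (tokenizer and mapper) together so callers deal only with the raw line in and the object out.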
@@ -1039,7 +1039,7 @@ file. `FlatFileFormatException` is thrown by implementations of the `LineTokeniz
 interface and indicates a more specific error encountered while tokenizing.

 [[incorrectTokenCountException]]
-====== IncorrectTokenCountException
+====== `IncorrectTokenCountException`

 Both `DelimitedLineTokenizer` and `FixedLengthLineTokenizer` have the ability to specify
 column names that can be used for creating a `FieldSet`. However, if the number of column
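The mismatch described here (a tokenizer configured with four column names meeting a line with only three tokens) can be illustrated with a simplified check. The exception class below is a hypothetical stand-in that mimics how `IncorrectTokenCountException` carries the expected and actual counts:

```java
public class Main {
    // Hypothetical stand-in for IncorrectTokenCountException: thrown when
    // the number of tokens in a line does not match the configured names.
    static class TokenCountMismatch extends RuntimeException {
        final int expected, actual;
        TokenCountMismatch(int expected, int actual) {
            super("expected " + expected + " tokens but found " + actual);
            this.expected = expected;
            this.actual = actual;
        }
    }

    static String[] tokenize(String line, String[] columnNames) {
        String[] tokens = line.split(",");
        if (tokens.length != columnNames.length) {
            throw new TokenCountMismatch(columnNames.length, tokens.length);
        }
        return tokens;
    }

    public static void main(String[] args) {
        String[] names = {"A", "B", "C", "D"}; // 4 configured column names
        try {
            tokenize("a,b,c", names);          // but only 3 tokens on the line
        } catch (TokenCountMismatch e) {
            System.out.println(e.getMessage());
        }
    }
}
```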
@@ -1064,7 +1064,7 @@ Because the tokenizer was configured with 4 column names but only 3 tokens were
 the file, an `IncorrectTokenCountException` was thrown.

 [[incorrectLineLengthException]]
-====== IncorrectLineLengthException
+====== `IncorrectLineLengthException`

 Files formatted in a fixed-length format have additional requirements when parsing
 because, unlike a delimited format, each column must strictly adhere to its predefined
@@ -1110,14 +1110,14 @@ line lengths when tokenizing the line. A `FieldSet` is now correctly created and
 returned. However, it contains only empty tokens for the remaining values.

 [[flatFileItemWriter]]
-==== FlatFileItemWriter
+==== `FlatFileItemWriter`

 Writing out to flat files has the same problems and issues that reading in from a file
 must overcome. A step must be able to write either delimited or fixed length formats in a
 transactional manner.

 [[lineAggregator]]
-===== LineAggregator
+===== `LineAggregator`

 Just as the `LineTokenizer` interface is necessary to take an item and turn it into a
 `String`, file writing must have a way to aggregate multiple fields into a single string
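The aggregation direction described here (item fields in, a single `String` line out, the inverse of tokenizing) can be sketched with a minimal delimited join. This is an illustration of the `LineAggregator` idea, not the `DelimitedLineAggregator` implementation itself:

```java
import java.util.List;
import java.util.StringJoiner;

public class Main {
    // Minimal sketch of the LineAggregator idea: take the fields of an
    // item and join them into the single line that will be written out.
    static String aggregate(List<?> fields, String delimiter) {
        StringJoiner line = new StringJoiner(delimiter);
        for (Object field : fields) {
            line.add(String.valueOf(field));
        }
        return line.toString();
    }

    public static void main(String[] args) {
        // The logical inverse of tokenizing "UK21341EAH45,978,98.34" apart.
        System.out.println(aggregate(List.of("UK21341EAH45", 978, 98.34), ","));
    }
}
```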
@@ -1138,7 +1138,7 @@ The `LineAggregator` is the logical opposite of `LineTokenizer`. `LineTokenizer
 `String`.

 [[PassThroughLineAggregator]]
-====== PassThroughLineAggregator
+====== `PassThroughLineAggregator`

 The most basic implementation of the `LineAggregator` interface is the
 `PassThroughLineAggregator`, which assumes that the object is already a string or that
@@ -1205,7 +1205,7 @@ public FlatFileItemWriter itemWriter() {
 ----

 [[FieldExtractor]]
-===== FieldExtractor
+===== `FieldExtractor`

 The preceding example may be useful for the most basic uses of writing to a file.
 However, most users of the `FlatFileItemWriter` have a domain object that needs to be
@@ -1242,7 +1242,7 @@ of the provided object, which can then be written out with a delimiter between t
 elements or as part of a fixed-width line.

 [[PassThroughFieldExtractor]]
-====== PassThroughFieldExtractor
+====== `PassThroughFieldExtractor`

 There are many cases where a collection, such as an array, `Collection`, or `FieldSet`,
 needs to be written out. "Extracting" an array from one of these collection types is very
@@ -1252,7 +1252,7 @@ the object passed in is not a type of collection, then the `PassThroughFieldExtr
 returns an array containing solely the item to be extracted.

 [[BeanWrapperFieldExtractor]]
-====== BeanWrapperFieldExtractor
+====== `BeanWrapperFieldExtractor`

 As with the `BeanWrapperFieldSetMapper` described in the file reading section, it is
 often preferable to configure how to convert a domain object to an object array, rather
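The extractor idea (turn a domain object into an ordered `Object[]` by configured property names, rather than hand-writing a converter) can be sketched with reflection over accessors. This is a hypothetical simplification of what `BeanWrapperFieldExtractor` does via Spring's `BeanWrapper`; the `Trade` type and the record-style accessors are assumptions of the sketch:

```java
import java.lang.reflect.Method;
import java.util.Arrays;

public class Main {
    record Trade(String isin, int quantity) {} // hypothetical domain object

    // Sketch of field extraction: look up an accessor for each configured
    // property name and collect the values into an ordered array.
    static Object[] extract(Object item, String... names) throws Exception {
        Object[] values = new Object[names.length];
        for (int i = 0; i < names.length; i++) {
            Method accessor = item.getClass().getMethod(names[i]); // record-style accessor
            values[i] = accessor.invoke(item);
        }
        return values;
    }

    public static void main(String[] args) throws Exception {
        Object[] fields = extract(new Trade("UK21341EAH45", 978), "isin", "quantity");
        System.out.println(Arrays.toString(fields));
    }
}
```

The extracted array is exactly what a delimited or fixed-width `LineAggregator` then turns into the output line.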
@@ -1474,7 +1474,7 @@ With an introduction to OXM and how one can use XML fragments to represent recor
 can now more closely examine readers and writers.

 [[StaxEventItemReader]]
-==== StaxEventItemReader
+==== `StaxEventItemReader`

 The `StaxEventItemReader` configuration provides a typical setup for the processing of
 records from an XML input stream. First, consider the following set of XML records that
@@ -1509,8 +1509,7 @@ To be able to process the XML records, the following is needed:

 * Root Element Name: The name of the root element of the fragment that constitutes the
 object to be mapped. The example configuration demonstrates this with the value of trade.
-* Resource: A Spring Resource that represents the file to be
-read.
+* Resource: A Spring Resource that represents the file to read.
 * `Unmarshaller`: An unmarshalling facility provided by Spring OXM for mapping the XML
 fragment to an object.

@@ -1631,7 +1630,7 @@ while (hasNext) {
 ----

 [[StaxEventItemWriter]]
-==== StaxEventItemWriter
+==== `StaxEventItemWriter`

 Output works symmetrically to input. The `StaxEventItemWriter` needs a `Resource`, a
 marshaller, and a `rootTagName`. A Java object is passed to a marshaller (typically a
@@ -1748,6 +1747,64 @@ trade.setCustomer("Customer1");
 staxItemWriter.write(trade);
 ----

+[[jsonReadingWriting]]
+=== JSON Item Readers
+
+Spring Batch provides support for reading JSON resources in the following format:
+
+[source, json]
+----
+[
+  {
+    "isin": "123",
+    "quantity": 1,
+    "price": 1.2,
+    "customer": "foo"
+  },
+  {
+    "isin": "456",
+    "quantity": 2,
+    "price": 1.4,
+    "customer": "bar"
+  }
+]
+----
+
+It is assumed that the JSON resource is an array of JSON objects corresponding to
+individual items. Spring Batch is not tied to any particular JSON library.
+
+[[JsonItemReader]]
+==== `JsonItemReader`
+
+The `JsonItemReader` delegates JSON parsing and binding to implementations of the
+`org.springframework.batch.item.json.JsonObjectReader` interface. This interface
+is intended to be implemented by using a streaming API to read JSON objects
+in chunks. Two implementations are currently provided:
+
+* link:$$https://github.com/FasterXML/jackson$$[Jackson] through the `org.springframework.batch.item.json.JacksonJsonObjectReader`
+* link:$$https://github.com/google/gson$$[Gson] through the `org.springframework.batch.item.json.GsonJsonObjectReader`
+
+To be able to process JSON records, the following is needed:
+
+* `Resource`: A Spring Resource that represents the JSON file to read.
+* `JsonObjectReader`: A JSON object reader to parse and bind JSON objects to items
+
+The following example shows how to define a `JsonItemReader` that works with the
+previous JSON resource `org/springframework/batch/item/json/trades.json` and a
+`JsonObjectReader` based on Jackson:
+
+[source, java]
+----
+@Bean
+public JsonItemReader<Trade> jsonItemReader() {
+    return new JsonItemReaderBuilder<Trade>()
+            .jsonObjectReader(new JacksonJsonObjectReader<>(Trade.class))
+            .resource(new ClassPathResource("trades.json"))
+            .name("tradeJsonItemReader")
+            .build();
+}
+----
+
 [[multiFileInput]]
 === Multi-File Input

@@ -1835,7 +1892,7 @@ which is the `Foo` with an ID of 3. The results of these reads are written out a
 maintaining references to them).

 [[JdbcCursorItemReader]]
-===== JdbcCursorItemReader
+===== `JdbcCursorItemReader`

 `JdbcCursorItemReader` is the JDBC implementation of the cursor-based technique. It works
 directly with a `ResultSet` and requires an SQL statement to run against a connection
@@ -2250,7 +2307,7 @@ fetches a portion of the results. We refer to this portion as a page. Each query
 specify the starting row number and the number of rows that we want returned in the page.

 [[JdbcPagingItemReader]]
-===== JdbcPagingItemReader
+===== `JdbcPagingItemReader`

 One implementation of a paging `ItemReader` is the `JdbcPagingItemReader`. The
 `JdbcPagingItemReader` needs a `PagingQueryProvider` responsible for providing the SQL
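The paging behavior described in this hunk's context (each query specifies a starting row and a page size) can be sketched as plain SQL string generation. The LIMIT/OFFSET form below is just one dialect's way of expressing a page, which is the kind of variation a `PagingQueryProvider` abstracts over; the method and table names are illustrative:

```java
public class Main {
    // Sketch: build the query for one page. Production paging readers sort
    // by a unique key and typically key off the last-seen value rather than
    // OFFSET, but LIMIT/OFFSET shows the starting-row + page-size idea.
    static String pageQuery(String select, String sortKey, int pageSize, int page) {
        return select + " ORDER BY " + sortKey
                + " LIMIT " + pageSize
                + " OFFSET " + (page * pageSize);
    }

    public static void main(String[] args) {
        System.out.println(pageQuery("SELECT id, name, credit FROM customer", "id", 1000, 0));
        System.out.println(pageQuery("SELECT id, name, credit FROM customer", "id", 1000, 1));
    }
}
```

Because each page is a fresh query, no cursor or connection is held open between chunks, which is the main operational difference from the cursor-based readers.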
@@ -2339,7 +2396,7 @@ match the name of the named parameter. If you use a traditional '?' placeholder,
 key for each entry should be the number of the placeholder, starting with 1.

 [[JpaPagingItemReader]]
-===== JpaPagingItemReader
+===== `JpaPagingItemReader`

 Another implementation of a paging `ItemReader` is the `JpaPagingItemReader`. JPA does
 not have a concept similar to the Hibernate `StatelessSession`, so we have to use other
@@ -2645,7 +2702,7 @@ implementations. This section shows, by using a simple example, how to create a
 writer restartable.

 [[customReader]]
-==== Custom ItemReader Example
+==== Custom `ItemReader` Example

 For the purpose of this example, we create a simple `ItemReader` implementation that
 reads from a provided list. We start by implementing the most basic contract of
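The list-backed reader this example describes, together with the read-until-`null` contract, can be sketched as follows. The `ItemReader` interface here is a simplified local stand-in for the one in `org.springframework.batch.item`:

```java
import java.util.List;

public class Main {
    // Simplified local stand-in for the ItemReader contract.
    interface ItemReader<T> {
        T read();
    }

    // Reads sequentially from a provided list; returns null once exhausted,
    // which is how a reader signals the end of its input.
    static class CustomItemReader<T> implements ItemReader<T> {
        private final List<T> items;
        private int currentIndex = 0;

        CustomItemReader(List<T> items) {
            this.items = items;
        }

        @Override
        public T read() {
            return currentIndex < items.size() ? items.get(currentIndex++) : null;
        }
    }

    public static void main(String[] args) {
        ItemReader<String> reader = new CustomItemReader<>(List.of("1", "2", "3"));
        System.out.println(reader.read()); // "1"
        System.out.println(reader.read()); // "2"
        System.out.println(reader.read()); // "3"
        System.out.println(reader.read()); // null
    }
}
```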
@@ -2691,7 +2748,7 @@ assertNull(itemReader.read());
 ----

 [[restartableReader]]
-===== Making the ItemReader Restartable
+===== Making the `ItemReader` Restartable

 The final challenge is to make the `ItemReader` restartable. Currently, if processing is
 interrupted and begins again, the `ItemReader` must start at the beginning. This is
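The restart mechanics this section introduces can be sketched by persisting the reader's position in a key-value context on `update` and restoring it in `open`. The `ExecutionContext` below is a plain-map stand-in for Spring Batch's class, and the `current.index` key name is illustrative:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class Main {
    // Plain-map stand-in for Spring Batch's ExecutionContext.
    static class ExecutionContext {
        final Map<String, Object> data = new HashMap<>();
        void putInt(String key, int value) { data.put(key, value); }
        int getInt(String key, int defaultValue) {
            return (int) data.getOrDefault(key, defaultValue);
        }
    }

    static class RestartableListReader {
        private static final String CURRENT_INDEX = "current.index"; // illustrative key
        private final List<String> items;
        private int currentIndex;

        RestartableListReader(List<String> items) { this.items = items; }

        // ItemStream-style open(): restore position from a previous run.
        void open(ExecutionContext context) {
            currentIndex = context.getInt(CURRENT_INDEX, 0);
        }

        // ItemStream-style update(): record position so a restart can resume.
        void update(ExecutionContext context) {
            context.putInt(CURRENT_INDEX, currentIndex);
        }

        String read() {
            return currentIndex < items.size() ? items.get(currentIndex++) : null;
        }
    }

    public static void main(String[] args) {
        List<String> items = List.of("1", "2", "3");
        ExecutionContext context = new ExecutionContext();

        RestartableListReader reader = new RestartableListReader(items);
        reader.open(context);
        reader.read();            // consume "1"
        reader.update(context);   // checkpoint after one item

        // Simulate a restart: a fresh reader resumes where the last stopped.
        RestartableListReader restarted = new RestartableListReader(items);
        restarted.open(context);
        System.out.println(restarted.read()); // "2"
    }
}
```

As the surrounding text notes, a framework that shares one context across many streams needs uniquely prefixed key names, which is why the real components let the key name be overridden.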
@@ -2780,7 +2837,7 @@ output), a more unique name is needed. For this reason, many of the Spring Batch
 key name be overridden.

 [[customWriter]]
-==== Custom ItemWriter Example
+==== Custom `ItemWriter` Example

 Implementing a Custom `ItemWriter` is similar in many ways to the `ItemReader` example
 above but differs in enough ways as to warrant its own example. However, adding
@@ -2805,7 +2862,7 @@ public class CustomItemWriter<T> implements ItemWriter<T> {
 ----

 [[restartableWriter]]
-===== Making the ItemWriter Restartable
+===== Making the `ItemWriter` Restartable

 To make the `ItemWriter` restartable, we would follow the same process as for the
 `ItemReader`, adding and implementing the `ItemStream` interface to synchronize the
@@ -2876,7 +2933,7 @@ Batch provides a `ClassifierCompositeItemWriterBuilder` to construct an instance
 `ClassifierCompositeItemWriter`.

 [[classifierCompositeItemProcessor]]
-===== ClassifierCompositeItemProcessor
+===== `ClassifierCompositeItemProcessor`
 The `ClassifierCompositeItemProcessor` is an `ItemProcessor` that calls one of a
 collection of `ItemProcessor` implementations, based on a router pattern implemented
 through the provided `Classifier`. Spring Batch provides a
