Today I found an article about Java 8's Fork/Join framework and its use in the parallel streams implementation. While I understand the article, I'm not entirely sure what to make of it.
Basically, what it says is that F/J in conjunction with streams is next to useless, especially so in the context of JEE applications. Quite a few specific arguments are listed, such as:
- it needs a massive volume of easily separable data (an aggregate),
- it creates copious threads without regard for others,
- it has a high potential for stack overflows,
- it has a high potential for massive memory usage,
- it has a very, very narrow performance window,
- it is designed to handle only one request at a time.
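To make the "one request at a time" point concrete: by default, every parallel stream in the JVM runs on the single shared `ForkJoinPool.commonPool()`, so concurrent requests compete for the same workers. A workaround I've seen suggested (it relies on an implementation detail, not documented behavior) is to submit the stream from a task running in a dedicated pool; the names here are my own and the pool size of 2 is arbitrary:

```java
import java.util.concurrent.ForkJoinPool;
import java.util.stream.IntStream;

public class IsolatedPoolDemo {
    public static void main(String[] args) throws Exception {
        // Give this one computation its own small pool instead of letting it
        // compete for the JVM-wide common pool shared by all parallel streams.
        ForkJoinPool pool = new ForkJoinPool(2);
        try {
            // The parallel stream's tasks run in the pool that the submitting
            // task belongs to (an undocumented implementation detail).
            int sum = pool.submit(
                    () -> IntStream.rangeClosed(1, 100).parallel().sum()
            ).get();
            System.out.println(sum); // 5050
        } finally {
            pool.shutdown();
        }
    }
}
```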
Moreover, it gives these arguments against F/J's recursive decomposition approach:
Recursive decomposition has an even narrower performance window. In addition to the above dynamic decomposition, recursive decomposition optimized for dyadic recursive division only works well:
- on balanced tree structures (Directed Acyclic Graphs)
- where there are no cyclic dependencies
- where the computation duration is neither too short nor too long
- where there is no blocking.
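For context on what "dyadic recursive division" means here, this is my own minimal sketch of the classic F/J pattern the article is criticizing: a `RecursiveTask` that splits the input in two halves until a threshold, then computes sequentially. The class name and the threshold value are illustrative, and the threshold is exactly the "neither too short nor too long" knob the article complains about:

```java
import java.util.concurrent.RecursiveTask;

public class SumTask extends RecursiveTask<Long> {
    // The "performance window": too small means forking overhead dominates,
    // too large means not enough parallelism. 1000 is an arbitrary guess.
    static final int THRESHOLD = 1_000;
    final long[] data;
    final int lo, hi;

    SumTask(long[] data, int lo, int hi) {
        this.data = data; this.lo = lo; this.hi = hi;
    }

    @Override
    protected Long compute() {
        if (hi - lo <= THRESHOLD) {        // base case: sum sequentially
            long sum = 0;
            for (int i = lo; i < hi; i++) sum += data[i];
            return sum;
        }
        int mid = (lo + hi) >>> 1;          // dyadic split into two halves
        SumTask left = new SumTask(data, lo, mid);
        SumTask right = new SumTask(data, mid, hi);
        left.fork();                         // left half runs asynchronously
        return right.compute() + left.join();// right half here, then join left
    }

    public static void main(String[] args) {
        long[] data = new long[100_000];
        for (int i = 0; i < data.length; i++) data[i] = i + 1;
        // invoke() from a non-pool thread submits to the common pool
        System.out.println(new SumTask(data, 0, data.length).invoke());
    }
}
```

The split at `(lo + hi) >>> 1` is why the approach favors balanced, cheaply divisible data: each level of recursion assumes the two halves carry roughly equal work.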
Since this is the only source I could find that complains about F/J, I'm not sure whether it can be taken seriously. Are the points cited above, or other similar ones, a valid concern?
More specifically, does Oracle have an official position regarding the limitations of the F/J Framework as applied to the parallelization of streams processing? If so, does it have plans to do something about them?