Structured Concurrency and Project Loom
Author: Eric Kolotyluk
Created: Nov 2022
Updated: Nov 2022

Writing concurrent code can be very challenging to get right, but over the years we have learned a lot more about how to design and implement it more effectively.
Recently I started a Proof Of Concept project I call Loom Lab, or Loom Laboratory. If you are curious, feel free to explore and comment on this page.
Java Concurrency
Starting in 1995 I began playing around with Java, and in particular the Java Thread API. Java made concurrent programming seem so approachable that some of us hot-shot programmers thought it would be easy. Wow, were we ever wrong!
Eventually, people like Brian Goetz wrote Java Concurrency in Practice, and Doug Lea wrote Concurrent Programming in Java. After reading these, I felt very humbled learning from experts. Mostly I learned how hard it is to design and implement concurrency correctly.
By 2022, the most important lesson I had learned about concurrency is that years of research and development have produced much better insights and frameworks for dealing with it. Project Loom is a prime example, bringing a generational improvement in concurrency: better actual performance, and a lower cognitive load when reasoning about concurrent code.
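To make the performance claim concrete, here is a minimal sketch (assuming Java 21, where Virtual Threads are final; they were a preview feature in Java 19). It submits ten thousand blocking tasks to a virtual-thread-per-task executor; the task count and sleep duration are arbitrary choices for illustration:

```java
import java.util.concurrent.Executors;
import java.util.concurrent.atomic.AtomicInteger;

public class VirtualThreadsDemo {
    public static void main(String[] args) {
        var completed = new AtomicInteger();
        // Each submitted task runs on its own cheap virtual thread;
        // blocking (sleep, I/O) parks the virtual thread without
        // pinning an OS carrier thread.
        try (var executor = Executors.newVirtualThreadPerTaskExecutor()) {
            for (int i = 0; i < 10_000; i++) {
                executor.submit(() -> {
                    Thread.sleep(10);            // cheap to block here
                    completed.incrementAndGet();
                    return null;
                });
            }
        } // close() does not return until every task has finished
        System.out.println("completed = " + completed.get());
    }
}
```

Blocking ten thousand platform threads this way would be prohibitively expensive; with virtual threads the same simple, blocking style of code scales.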
Reactive Systems
I dallied in Reactive Programming for a while, and it really did improve performance, but at the cost of a much higher cognitive load, and implementations that were harder to troubleshoot when problems arose. See also Akka (https://akka.io) and Project Reactor.
Experience with Project Loom shows it to have the same performance capabilities as Reactive Systems, but with much lower cognitive load and better troubleshooting capabilities. This means it is easier to write high-performance concurrent applications that are safer and more robust.
Architecture
Architecture is largely about making important decisions that are difficult and/or expensive to change. Architecturally, the choice is between Reactive Systems and what Loom has to offer.
Loom is still new. As of Java 19, Virtual Threads are a preview feature and Structured Concurrency is an incubating feature. Both APIs may still change, but as an incubating feature, Structured Concurrency is the more likely to change.
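Because the incubating StructuredTaskScope API may change between releases, here is a sketch of the discipline it enforces using only stable virtual-thread executors (assuming Java 21): subtasks are forked inside a lexical scope, and the scope cannot exit until they complete, so no subtask outlives its parent. The findUser and fetchTotal helpers are hypothetical stand-ins for real blocking calls:

```java
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class StructuredSketch {
    // Hypothetical subtasks, stand-ins for real blocking calls.
    static String findUser()   { return "alice"; }
    static int    fetchTotal() { return 42; }

    public static void main(String[] args) throws Exception {
        String summary;
        // The try-with-resources block is the "scope": subtasks are
        // forked inside it, and close() will not return until they
        // are done, so their lifetimes are confined to this block.
        try (var scope = Executors.newVirtualThreadPerTaskExecutor()) {
            Future<String>  user  = scope.submit(StructuredSketch::findUser);
            Future<Integer> total = scope.submit(StructuredSketch::fetchTotal);
            summary = user.get() + " owes " + total.get();
        }
        System.out.println(summary);
    }
}
```

The real incubating API goes further than this sketch: for example, StructuredTaskScope.ShutdownOnFailure cancels the sibling subtasks as soon as one of them fails, which a plain executor does not do for you.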
Latency vs. Throughput
One thing to keep in mind is that Project Loom benefits vertically scaled environments: many CPU cores, large caches, and lots of RAM, so performance is capped by those resources. For larger scalability, we need horizontally scaled environments such as Apache Beam. However, vertically scaled environments, while capped, offer lower latency, whereas horizontally scaled environments trade higher latency for better overall throughput.