The JVM automagically swaps blocking network calls executing in a virtual thread for a non-blocking implementation under the hood, so potentially many other requests can be served at the same time.
So you get the benefit of async frameworks with none of the negatives: the blocking model is trivial to reason about, read, and maintain, and it works perfectly with tooling (you won’t get an exception in some WhateverScheduler, but at the exact line where it actually happened; you can also step through with a debugger).
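A minimal sketch of what this looks like in practice (Java 21+, using the standard Executors.newVirtualThreadPerTaskExecutor and java.net.http.HttpClient; the URLs are placeholders): each task is ordinary blocking code, and the runtime unmounts the virtual thread from its carrier while it waits on I/O.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.List;
import java.util.concurrent.Executors;

public class VirtualThreadDemo {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        // Placeholder URLs, just for illustration
        List<String> urls = List.of("https://example.com/", "https://example.org/");

        // One cheap virtual thread per task; the try-with-resources
        // close() waits for all submitted tasks to finish.
        try (var executor = Executors.newVirtualThreadPerTaskExecutor()) {
            for (String url : urls) {
                executor.submit(() -> {
                    // Plain blocking call: stack traces point at this line,
                    // and debugger stepping works as with platform threads.
                    HttpResponse<String> resp = client.send(
                            HttpRequest.newBuilder(URI.create(url)).build(),
                            HttpResponse.BodyHandlers.ofString());
                    System.out.println(url + " -> " + resp.statusCode());
                    return null;
                });
            }
        }
    }
}
```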
Thanks. Any good books that cover the current Java performance landscape but are grounded in the history of what came before?
I used to be heavily involved in Java-based server performance and scalability work back in the 90s and 00s, but I have been working on other things for the last decade. It would be fun to learn more about where things stand now.