> And most devs hating on Java are using an IDE written mainly in Java (all the JetBrains ones): the irony of that one gives me the giggles.
Guilty as charged! I hate using Java because everything written in java seems to blend into the same indistinguishable swamp of classes with meaningless names, full of methods that constantly find new and interesting ways to obscure what your program is actually trying to do. Debugging very large Java codebases feels like living through Terry Gilliam's 1985 film Brazil.
I think the problem is cultural, not technological. It seems like there's a lot of people in the Java community who still think OO is a great idea. Who think Bob Martin's awful, muddled code examples in Clean Code are something to aspire towards. People who claim the path of enlightenment requires them to replace concrete classes with interfaces - even when those interfaces only have 1 implementation.
Anyone who writes class names like FactoryBuilderFactoryImpl and claims to walk in the light is at best a scoundrel, and at worst a follower of a religion so dark we don't name it in polite company.
This is what makes IntelliJ so impressive. It takes a master craftsman to forge something beautiful from such an unholy material. VS Code pulls off the same feat on top of Electron. In each case I can't tell whether to be horrified or impressed! Either way, I'm a huge fan.
Does anybody know where I can find a good, substantiated, critique of the common Java coding patterns, including on Android? This niche is such a huge mess that simply documenting all the bad things in a single codebase (along with explanation why they're bad) took me a month. It's tiring and mentally draining to do this: every other line of code you read makes you go "Why. Please, just tell me why anybody could ever think good code should look like this". Worse yet, there's hardly a discussion to be found about these things - it's "Uncle Bob" & co and the horde of their followers all the way, no dissenters. It looks like an echo chamber so hermetic that the most basic principles like DRY or YAGNI have a hard time penetrating it.
What's most painful for me in all this - other than it being 99% self-inflicted and caused by the culture rather than the language (you could argue the language encourages it, but the ultimate cause is cultural) - is that it has infected Kotlin code. Kotlin was built to increase the expressive power of the language, doing away with many of Java's limitations and offering lots of modern-ish features on top. The community looks split in half: the Kotlin community tries to get the most out of Kotlin, while the Android community does everything in its power to turn Kotlin back into Java, writing code as if the limitations were still in place and the new features didn't exist. I know that the churn and "production readiness" of things on Android generally favor a more conservative approach, but it's still too much. I cry tears of blood every other code review I'm forced to do.
If there's a single, definitive resource that I could point my coworkers to and eventually turn it into enforced guidelines, the author can count on a serious donation from me.
Hah. I’ve spent several years writing javascript for a living. You can always tell when code was written by someone who’s arrived fresh from Java or C++. Their code is full of hundreds of lines of useless classes which can often be replaced by a few simple object literals. Unlike class instances, object literals can be easily JSON stringified and parsed, too! You can torture people like this in code review: “This isn’t idiomatic. Please rewrite this code without the class keyword”. I’ve seen people make a face like I just had their child expelled from kindergarten.
I don’t know any good resources unfortunately. I feel like we need a “motherfuckingwebsite” equivalent for this - “just use a motherfucking function”. I want to link it to whoever insisted on adding a useless TextDecoder class in javascript that you have to instantiate, instead of just calling textDecode(…, “utf8”) using a global function like the rest of the standard library.
I think part of the problem is that most people who hate enterprise java just learn a different language, set their resume on fire and start over somewhere better. That’s certainly what I did. I’m writing rust at the moment, and thankfully the lack of classes and distance from the JVM seems to keep most of this nonsense out. But having all the doubters leave makes the problem within the java ecosystem worse.
> I want to link it to whoever insisted on adding a useless TextDecoder class in javascript that you have to instantiate, instead of just calling textDecode(…, “utf8”) using a global function like the rest of the standard library.
I for one would rather punch the person who proposes such a global function as the only mechanism for conversion, because charset conversion is a reasonable thing to do on chunked partial inputs, and maintaining the state for conversion yourself is actually quite painful. Wrapping a stream converter into a one-shot function is much easier than the reverse, wrapping a one-shot function in a stream converter.
You can have the global function and make it work on "chunked partial inputs" while maintaining the state between calls (that state can be hidden in a closure or made explicit as an argument or a receiver - functions in JS can be called on objects even if they were defined outside of them). C's `strtok` is the cautionary example of hiding that state badly, but it's pretty typical in JS to encapsulate the state in a closure to get similar functionality.
Said another way: a functional, simplified interface doesn't mean you have to settle for simplified or missing functionality. Haskell wouldn't exist if that were the case. The simplified interface can provide as much functionality as the more complex one because the expressive power of JS is leagues above Java's - porting Java patterns that emerged from Java's shortcomings (some call them "design decisions", and they're also right) to JavaScript is simply not a good use of JS as a language.
This is what GP is talking about. Chunking can be done cleanly with very few internal functions (and without changing the call signature, if you want), but you're implying it must be a class and already thinking about hypothetical hacks.
> I for one would rather punch the person who proposes such a global function as the only mechanism for conversion, because charset conversion is a reasonable thing to do on chunked partial inputs
Fight me.
Javascript has a separate TextDecoderStream class if you want to support chunked, partial inputs. The TextDecoder class doesn't support streaming input at all. And it never will, thanks to the existence of TextDecoderStream.
TextDecoder only provides one method - TextDecoder.decode(). And the method is pure! So you could just make it a global function without any loss of functionality. The entire class has no mutable state whatsoever. It's just a container for some options, passed into the constructor. Those options could just have been passed straight to a global decode() method.
This might be idiomatic C++, but it's terrible javascript. I had a friend who worked on the Chrome team years ago. He said a lot of the browser engineers who design javascript APIs are C++ programmers who don't know javascript properly. So of course they port C++ ideas into javascript directly. They don't know any better. This is why javascript's standard library is an incoherent mess.
I see a benefit from having this options container: it gives you a central place to set the options, and then you only pass down the decoder, so the user of the function doesn't have to bother with the configuration.
Regardless, it’s a strictly worse api in the general case. And it’s on par with passing around an options object in the case you want to share the same options in multiple places.
If the current api is either the same or worse compared to a pure function version in all cases, I’d prefer the pure function version thanks.
Because a pure function is idiomatic, concise and clear. It's easier to use, easier to document and easier to understand. The only benefit of a class is that it encapsulates its state behind an interface. That makes sense for a BTree or a database. But TextDecoder has no mutable state. It has no state at all except for its passed-in configuration.
Decoding text is a verb, not a noun. In programming, verbs translate to functions, not classes. You don't have an "adder object" to add things. You don't have to instantiate a JSON stringifier before calling JSON.stringify. If that's not obvious, you may have spent too long in OO languages.
> “This isn’t idiomatic. Please rewrite this code without the class keyword”. I’ve seen people make a face like I just had their child expelled from kindergarten.
Lol true
Don't forget all the getters and setters merely updating/reading a variable
Python says "explicit is better than implicit" but Java goes too far with it, and in the most verbose/inflexible ways possible
> Don't forget all the getters and setters merely updating/reading a variable
> and in the most verbose/inflexible ways possible
It's actually extra flexibility meant for two things: being able to override the getter/setter in a subclass, and keeping a consistent interface so users don't need to change how it's called if there was a refactor that adds something to the getter/setter (such as transforming the value because a different representation was more useful internally; particularly useful for libraries).
Python has @property to maintain that interface if need be, but these Java conventions started when no such thing existed in the language. I haven't done Java in a long time, so I don't know if it has it even now...
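For the second point, a minimal Java sketch (names invented): the internal representation changes, but callers never notice, because they only ever touched the accessors.

    // Hypothetical refactor: v1 stored the temperature in Fahrenheit,
    // v2 stores Celsius internally. Callers are unaffected.
    class Thermostat {
        private double degreesCelsius;

        public double getTemperature() {
            // Still returns Fahrenheit, as the original interface promised.
            return degreesCelsius * 9.0 / 5.0 + 32.0;
        }

        public void setTemperature(double fahrenheit) {
            this.degreesCelsius = (fahrenheit - 32.0) * 5.0 / 9.0;
        }
    }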
> It's actually extra flexibility meant for two things: being able to override the getter/setter in a subclass, and keeping a consistent interface so users don't need to change how it's called if there was a refactor that adds something to the getter/setter
This always strikes me as any-benefit mentality thinking. I agree there is some small marginal benefit to this pattern, but the cost (in time, decreased readability and lines of code) is massive. The benefit of being able to change your getters and setters later in a public interface almost never actually shows up.
Most getters and setters aren’t even part of a public interface anyway - because either they’re private or they’re in application code. In both of these cases, you can delay replacing a public class field with public getters and setters until you actually need to. When it actually provides value, it’ll take all of 5 minutes to do the refactor. Intellij can probably do it instantly. And, Spoilers: this will almost never come up in practice. Public fields are almost always fine.
It was originally for libraries that were distributed as jar or class files without the original source, that crept into general "best practices". Also IntelliJ didn't even exist in the 90s when this started.
It exists now. And even before IntelliJ and Eclipse had automated refactoring tools, it was like a 5 minute refactor. Just change your public field to be private, add getters and setters, then play whack-a-mole mechanically fixing all the compiler errors that show up.
I can see the argument for putting them in APIs exposed in jar or class files without the source. But the tasteless trend of adding getters and setters everywhere just looks to me like cargo culting. It's sheep programmers leading other sheep. You can tell it's cargo culting because if you questioned anyone about the practice they would always ultimately justify their actions by saying "oh, I just do it because everyone else does it".
I believe it's the responsibility of every engineer to decide for themselves what they think beautiful code should look like. You get some pointless arguments, sure, but the alternative is always a mess.
I agree with the cargo-culting, but the person I originally replied to seemed to think there was never any point, and that's what I was replying to - there was a reason it started.
> these Java conventions started when no such thing existed in the language.
Is there language support for these in the newer Java versions (I'm not up to date with newer features, since I won't be able to use them on Android anyway)? The reason for these getters/setters is as you said: a workaround for the language deficiencies. It's true for quite a few patterns, and it's not unique to Java; you get similar (in nature) patterns emerging in all languages. Greenspun's tenth rule and all that.
What's problematic is porting these workarounds wholesale to languages that don't have the limitations that originally led to their creation. In Kotlin, for example, every property has an implicit getter and setter, by default - you can override either easily with a dedicated syntax. In that case, insisting on writing explicit methods for getters and setters is simply a misuse of the language. Same in Python, as you note, where you can replace direct access to object attribute with a property without changing the user-facing interface of a class. I think JS also developed a feature like this? It's kind of impressive the OO languages managed to get this so wrong for so long, even though Smalltalk got it right in the 70s...
I think it's strange that people complain about the verbosity and boilerplate of getters and setters in Java when this is entirely a non-problem, provided your code is well designed.
If your class has any setter function, you're doing OO wrong. Mutating an object should 1) only happen if you have a very good, inescapable reason; 2) never be exposed directly to code outside the class, including children. If your class must have a mutating function, it should be a high level operation, not "set". If it really is "set" then that implies the field being set isn't a part of that object in any real sense.
A well designed class might have a couple of getters, but the inclusion of getters is a deliberate decision to allow client code to see the internal state.
In other words, blame the IDEs for the idea of auto-generating getters and setters. The language itself did a decent job of protecting class state.
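A made-up illustration of the difference: a raw setter lets any caller put the object into any state, while a high-level operation lets the class enforce its own invariants.

    class Account {
        private long balanceCents;

        // No setBalanceCents(). The mutation is a meaningful operation:
        public void deposit(long amountCents) {
            if (amountCents <= 0) {
                throw new IllegalArgumentException("deposit must be positive");
            }
            balanceCents += amountCents;
        }

        public long getBalanceCents() {
            return balanceCents; // deliberate decision to expose this state
        }
    }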
OOP is routinely used with stateful, mutating objects; it has been hailed as a good way to manage that paradigm. If mutation is bad, so most objects don't mutate, you're talking about a functional niche in OOP, not mainstream OOP.
A class may need a setter function for some boring, pragmatic reason like, say:
- the language doesn't have keyword parameters: constructors have only positional parameters
- the class has a large number of properties.
- most users set only a small subset of the properties, defaulting the rest. (And you can't easily predict which subsets are popular enough to get their own constructor variants.)
In that situation you might want to just construct the object in two steps: default everything and set a few selected properties (and then don't touch them). It's de facto immutable, just not around construction time.
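A concrete JDK example of the two-step pattern, using java.net.Socket's no-arg constructor and option setters:

    import java.net.Socket;
    import java.net.SocketException;

    class TwoStep {
        static Socket configuredSocket() throws SocketException {
            Socket s = new Socket();   // step 1: everything defaulted
            s.setSoTimeout(5_000);     // step 2: set a few selected options
            s.setTcpNoDelay(true);
            return s;                  // treated as frozen from here on
        }
    }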
I almost agree with you, I don't agree with the gatekeeping/nitpicking of saying "you're doing it wrong"
But I do agree that in most cases you don't need to call individual setters. And especially not automatically create one for every variable in your class
Word! I just left a Java-only shop for pythonic pastures and the culture is so much more pragmatic and to-the-point.
Hopefully soon enough ML models can be fed millions of lines of code and produce the functionally equivalent thousands...
They're trained on billions of lines of code - most of them are not very good. I'm using Copilot, and the docstring/docs it suggests are so bad it hurts. If left alone, Copilot would happily generate those thousands of lines instead of helping reduce them to hundreds. It's still useful if given enough direction, but you need to be really careful not to overuse it or risk getting mistaken for a junior straight out of a bootcamp during code review :)
Yeah... I've noticed about half of what it comes up with needs tweaking... I really enjoyed it for SQL schema writing though... especially many-to-many table creation.
The problem with Uncle Bub is that his views only work within very limited situations, such as hobby coding and small projects by a handful of developers who are starting on the same page. A lot of Uncle Bub's teachings and FrAgile isn't practical in most of the real world, besides a minority of outlier stories (and the problem with those stories is they don't track whether the practices continue working long term).
Take his views on "clean code", for instance. Or really any conception of "clean code." Clean code is bullshit. I guarantee you can take anything Uncle Bub or other notable programmers say about clean code, apply it to a tee at your job, and be told that your code isn't clean or "feels icky" by whoever joined the team before you did. No description of clean code that I've read has ever been truly helpful in my career. The only thing you can really do is decide what you think clean code is and for you and your team to reach a level of agreed disagreement so that everyone can get their job done. One person's descriptive variable name is another person's "that's too long, I can't read with all these confusing names", and one person's set of short purposeful functions is another person's "I can't tell what's happening cuz I have to jump between all these functions." At the end of the day, you barely have time to write "clean code", because your boss wants features rolled out ASAP.
Uncle Bub is also one of those guys who thinks good code doesn't need comments because it's self descriptive. This is one of the worst ideas to ever hit the software industry. No one's code is self descriptive. It's all a bunch of gobbledygook because it's meant to be run by a computer and just understandable enough for a human. It wouldn't kill us to write some documenting comments detailing the intention behind code, but sadly most programmers are either too lazy or believe that needing comments necessarily means their code smells. The result is that nobody knows anything about any given software project except those who have been on the project the longest, and even they often don't know because... surprise... nobody wrote anything down! Just like with "clean code", it should be left up to teams how they want to comment their code, and how much you comment shouldn't be influenced by memes from other programmers.
Don't even get me started on FrAgile. It's just a way to dupe programmers into taking on more work and doing the job of middle management for them.
Where are you getting this exclusivity from? Given your use of an equality operator, maybe you're thinking in code a bit too much.
> No, code isn't vague, and [good] code mostly it isn't long.
This isn't even remotely true. There are many dimensions to code that a human might subjectively use to determine whether code is "good." Code that is "clean" can often be a performance nightmare, but code that someone subjectively claims to be "unreadable" can be more performant, more fault resistant, future proof, and so forth. In the context of human interpretation, code can be vague regardless of how "good" that code seems to someone.
Also, the mere fact that anyone can disagree with you that good code "mostly" isn't long discredits the very idea in an objective sense. Plenty of programmers don't care whether code is "long" if it's written procedurally and/or with pure functions. If you haven't heard such opinions before, then you need to meet more programmers of varying disciplines.
> Variable names are human language btw. A programming language is a human language.
A programming language is for the benefit of both the human and the machine, though it's still mostly to the benefit of the machine. If it were solely a human language, then it would be closer if not identical to a language like English. And, if it were, it would be tremendously slow relative to traditional programming languages, and even generate more waste heat.
Mostly agree: the biggest factor in determining "readability" is simply familiarity of the reader with the particular syntax or style of programming. Learning many different languages allowed me to experience the evolution of code from "how do I even read this" to "well, it's clear and obvious what's going on" - without it changing one bit in the meantime. There's nothing you couldn't learn with enough effort, and the differences between learning time needed to get to mastery are in my experience small, save for a few outliers (IME: J, Forth)
At the same time I believe that there are a few objective metrics that seem to correlate with long term maintainability of a codebase. For example, the more contextual, relevant, and correct(!) information it contains, the easier it is to work with the code, especially once the original authors depart (and they will, sooner or later). Capturing that information and putting it in the code - in whatever way, including comments, diagrams (ascii art and graphical), particular tests and doctests, explicit pre and postconditions, other assertions, log calls - lowers the effort needed to maintain said code.
If the additional information threatens to obscure the view of what's actually happening, you can refactor it the same way you'd refactor code. If you can introduce helper functions to kick details out of the way, you can also add footnotes and links to files with additional docs, keeping the level of detail manageable in the most often read code while still providing enough information about it.
What do you think? I came to this conclusion based on my experience learning a very diverse set of programming styles and languages, experience maintaining long-lived projects at work, and the "Programmer's Brain" book. The book has its moments (for the most part it's just boring), but it did provide me with a few puzzle pieces I needed to make some sense out of the whole thing.
Those Kotlin issues are common when a lot of engineers from one language switch to another at the same time.
We used to see the same thing with companies that had moved C engineers over to Java. Lots of weirdly overcomplicated C constructs. Meanwhile the newbs who only knew Java were writing FactoryFactoryFactoryImpls. :facepalm
I think a lot of it comes from the Spring framework. People saw those gigantic stack traces with all the crazy abstractions and huge names, and took it as the norm.
Some genius ported the Spring framework to PHP (and called it Symfony) and we have to live with the BS from Java world in PHP land. Anemic models and lots of indirection. What a sad world we live in.
I did a lot of Java in the past, and ended up in PHP lately. Funny how PHP feels like 'I wanna be Java when I grow up', while Java (Quarkus) says 'I don't wanna be Java anymore'.
Then again, Java itself has Spring, which started out as '4 classes for an EJB is insane architecture overload' and ended up in dynamic injection architecture astronaut land.
The enterprise java programming style is worth avoiding because it’s a productivity killer. This style obscures your business logic, it makes debugging harder through needless indirection, and it creates pointless busywork from the need to write, maintain and document reams of unnecessary boilerplate.
FoundationDB has official bindings in C, Python, Go, Ruby and Java. The real bindings are in C, and all the other languages’ bindings are well written, idiomatic wrappers around the same C library exposing the same functionality. The Java bindings need over twice as many lines of code as the Ruby and Python bindings to achieve the same thing.
Even if this style of Java is only 10% less productive than that of idiomatic Kotlin, Go or Python, you will probably break even on the investment of migrating languages after mere months. I think that undersells it. The productivity difference is probably much higher. Especially for large projects.
Improving your personal and your team's long-term productivity is just about the highest value work you can do.
All subjective. That style has plenty of people who don’t agree with any of your criticisms and could list many of what they consider benefits. It would also be very difficult to prove that extra lines of code have any financial impact, let alone one that could be recovered in a matter of months.
I understand your viewpoint, but I think it's a bit too pessimistic. While we essentially don't know how to consistently write good code and deliver quality products on time (the whole industry, save for some niches, has this problem), it's also improbable that nothing we could try would get us closer to that ideal. Not too close to it, perhaps, and not in a matter of months, and not by simply switching one set of rules of thumb for another, but surely, there's got to be something that can have a noticeable impact. Especially over longer-term and in larger codebases.
Trying to replace X with Y, where both have similar expressive power and similar drawbacks (i.e. FP vs. OOP), won't help much. Not on its own, and not without many other conditions being met, including completely non-technical ones like the personality of a hiring manager. Surely, though, in every paradigm or style, it's possible to write better or worse code, right? So it should also be possible to create an environment where the code quality, on average, would be just a bit better than the norm.
I'm not looking for quick gains for a single project, but rather a medium-term strategy that can save the effort required to develop and maintain products. People who claim to know how to "get good code quick" are mostly swindlers, and it's really hard to confirm causality in practice, but we shouldn't give up on finding ways to get better-than-average results in development.
I disagree. We might understand what I meant by "style"[1] differently. As I think of it, it's comprised of things that have an actual, measurable impact on the effort required to develop the codebase(s) over time. It's not about tabs vs. spaces, snake_case vs. camelCase, or anything even remotely like that. I'm not trying to establish a company-wide set of guidelines for the sake of it - I believe that relatively minor things (in the scope of a single project) can lead to significant savings at the scale of tens of projects and five years of a maintenance window.
As for the guidelines themselves: I don't care what they are, exactly, as long as a) they're there; b) they reduce said effort; and c) they're followed.
I've seen people take Bob Martin's concepts and do some truly awful things with them. Mind-bogglingly awful contortions of concepts into classes, in arrangements that have to be sourced from demonic inspiration.
At the same time, I've seen the best code of my entire life formed from his concepts. Code that will last decades, far outlasting the UIs that feed it data or the databases that will store it.
I think the difference is all in whether the developers who wrote it understood that the "concepts" are not meant to be put into code on a 1 for 1 basis. For example, making an AddToDoUseCasePresenterInteractor class is literally taking the concept and making it 1 to 1 in the code. On the other hand, writing domain appropriate code, minimizing accidental complexity, and recognizing the clean code concepts as emerging from groups of classes and methods and packages in a code base leads to really clean, testable, maintainable, and FAST TO WRITE code.
I think the single biggest improvement for Java programmers is to group all the classes related to a use case together in the same package - which means STOP MAKING "controller", "service", "model", etc. packages where classes related by nothing except their kind are just dumped together. If you're working on a part of the code base you should just have to change classes in one single folder. A new feature should just be a new folder. That change alone speeds up teams by huge factors.
Your last paragraph is really interesting to me. It obviously makes total sense. Yet, the environment I've used most in my career is Rails, which also splits these things into folders by type rather than feature. It's never bothered me. Now I wonder if that's because Ruby isn't Java and folders aren't packages, or because I'm so very used to it.
I've worked in a few codebases that tried to group things by feature. In my experience, it never really worked that well.
Usually there would either be poor isolation between them or they'd be so well isolated that I'd wonder why they were even in the same project. In the latter, they'd often be difficult to maintain because of a web of dependencies pulled in by the little isolated subfeatures.
I prefer the separation by type, tbh. It also has the upside that it naturally encourages the developer to follow the same patterns within that particular package.
I'm not sure what I prefer because I'm most familiar with Rails. I guess I'm used to type-separation. An individual Rails app could be structured using "engines"[0], which could easily allow for this kind of separation. Each engine will have its own app/ directory which contains models/, controllers/, services/, etc.
The point is that the feature is its own "project" which would likely be loaded in the host application as a gem. I don't think this is actually a strong convention either way in Rails, so it would still be compatible with convention-over-configuration to build an app this way.
(Kinda just thinking out loud. FWIW, I have worked on an application with a similar sort of "engine-primary" structure but not what I currently work on.)
There's a certain amount of disciplined duplication that you have to adopt as well. I think this hangs a lot of people up and failure to do it right leads to the complications you're discussing.
Adding a todo note, marking it done, editing it, and deleting it are all different use cases and each has its own package and classes. However, most people want to have some kind of single ToDoNote class which is the "To Do Note". Doing that means they have to decide which package to put it in and then pull it in everywhere else. And then "common" logic starts piling up and features depend on crap from other features and accidental complexity starts creeping in. Now you've got a single ToDoNote class that has different sets of properties null or populated depending on where it was instantiated and what use case it's being fed to, all requiring the programmer to keep this in their head instead of getting the compiler to help.
The reality is that the set of data you need to create a todo note is different from the data you need to edit it, which is different from what you need to delete it. They share common elements, but never at the same time. The solution is to create "NewToDo", "EditedToDo", and "DeleteToDo" models, one for each feature. Sure, some of the models will share a "title" property (new and edit), and some will share an "id" property (edit and delete), but never all at the same time. This offloads the complexity from the programmer to the compiler and speeds up development.
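With Java records (JDK 16+) the models are tiny. A sketch, shown together here for brevity, though each would live in its own feature package:

    // One small model per use case; the compiler now tracks exactly
    // which fields each operation needs.
    record NewToDo(String title) {}              // create: no id yet
    record EditedToDo(long id, String title) {}  // edit: id + new title
    record DeleteToDo(long id) {}                // delete: id only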
You already know you're operating at the lowest of the low levels of excellence if you rely on monkey see monkey do code standards. Having low complexity code that takes minimal programmer brain space to operate in is how you level up to higher levels.
> If you're working on a part of the code base you should just have to change classes in one single folder. A new feature should just be a new folder. That change alone speeds up teams by huge factors.
This is a neat idea. Have you done this in practice, and how does it work over a long time frame?
One of the big advantages of separate packages is purely for references: The model package has no reference to service or database code, so it's not possible to include SQL or other hidden service calls in it -- at least not without adding a new dependency which makes it blatantly obvious you're doing something wrong.
On the other hand, if your features are self-contained, and you have good unit test coverage of all the logic, then I guess it doesn't really matter as much what the structure is. The fact it's unit tested forces it to be loosely coupled, and testability is one of the main reasons to organize code into layers in the first place.
Been doing this for a few years. If I have an `Item` class, it's going into its own package. Along with `ItemService` (business logic), `ItemResource` (endpoint), `ItemDao` (persistence interface), etc. If `Widget` has a dependency on `Item`, then `WidgetService` can either import `ItemClient` or roll its own.
Makes it super easy to split out microservices when the monolith gets big. Just keep from injecting one Service class into another, rely on the Resource or Client instead.
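A sketch of what that layout might look like on disk (paths invented for illustration):

    com/example/app/item/Item.java            // value type
    com/example/app/item/ItemService.java     // business logic
    com/example/app/item/ItemResource.java    // endpoint
    com/example/app/item/ItemDao.java         // persistence interface
    com/example/app/item/ItemClient.java      // what other features import
    com/example/app/widget/WidgetService.java // depends on ItemClient only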
To the extent I feel the need to do this (in very large projects with many teams), I think it's best accomplished with separate modules. My favorite way is a single-repo multimodule project with the core code in one module, all the database implementations in another, and a very small third "main" module that brings them together and has the startup code.
But there are other solutions. You could probably cook up some linters or some other compile time enforcer.
Been doing similar for years... I refer to it as feature oriented structure/organization. To me, that includes tests. I hate that so many code bases are effectively mirrored trees that are hard to break apart.
You can still have effective layers, even classes if you like them. But there's little reason they can't live next to each other on disk in the same project even.
> Guilty as charged! I hate using Java because everything written in java seems to blend into the same indistinguishable swamp of classes with meaningless names, full of methods that constantly find new and interesting ways to obscure what your program is actually trying to do. Debugging very large Java codebases feels like living through Terry Gilliam's 1985 film Brazil.
That describes just about every codebase I've worked with that relies on object-oriented patterns, which is basically every codebase. This is particularly bad with Java, but as you mentioned, this is a cultural problem and not so much a language problem. I like Java the language, but what kept me away from it was every Java codebase I've seen. Layers upon layers of needless abstraction, overly abstract names, and so much code is meant to describe things rather than a sequence of data changing.
It's not that OO is completely wrong, but it's a meme that we as programmers are refusing to shake. It's like people are still getting taught OO in college courses by programmers who haven't worked professionally in decades, and those students are still going into the real world thinking that everything's gotta be object oriented. And usually what OO ends up meaning is having classes and inheritance and "this" and mutability, as opposed to having objects that pass each other information. The latter doesn't need classes, or inheritance, or any of the other similar features in programming languages. But if you've got to write to a file, then you've gotta make a class that wraps the file system functions, right? /s
I’ll never forget working for a tiny startup under an ex Google CTO in 2013 who wrote plain Java code and chose simple libraries. Saying it was a breath of fresh air is an understatement.
> It seems like there's a lot of people in the Java community who still think OO is a great idea
If you're coding in Java, you'd better think OO is a great idea. It's an object oriented language. And despite having loosely bolted on FP paradigms, that's not really going to change.
Although I feel most of the criticism against OO is actually more like a critique of the FactoryBuilderFactoryImpl-style application of design patterns, which is something else and really unfashionable today in Java.
> If you're coding in Java, you've better think OO is a great idea. It's an object oriented language.
There's plenty of room in Java for nice, clean code. All "java is OO" means in practice is that your code needs to be in classes. Some things that work great in java:
- Separate out value types from everything else. Value types should usually be tiny, and have public fields. They should not contain references to any other data, and any methods should be simple. ("getTempInCelcius()" is ok, "updateFromDatabase" is not.)
- Express saving / loading / processing work as (ideally) pure functions which operate on that data. You can use the class keyword + static functions to make a module / namespace in java. Use it.
- Use interfaces sparingly. Try not to use inheritance at all.
- (Controversial): If you find yourself with 18 different classes instantiated at runtime, where all those classes have exactly 1 instance (and probably all hold references to each other), you're doing it wrong. Use fewer, larger "controller" style objects (ideally 1), arranged in a tree (no references should point up the tree). If your classes get big, move utility code out into pure functions or short lived utility classes in adjacent files.
- Don't use factories. If you need to, use value types for configuration data.
Is this still "OO"? Depends what you mean by OO. I've heard this style of programming referred to as "data oriented programming", or something like that. Done well, data and data processing often end up in separate files. (Which is the opposite how OO is usually done). Features can often land in separate folders. Performance usually ends up better, because you have less pointer chasing and indirection. You often get much clearer separation of concerns between classes. And its usually much easier to write unit tests for code like this - since almost all classes can be instantiated and tested directly.
A lot of modern game development uses patterns like this, coded in C++. Most C code looks like this too. I've also read and written plenty of javascript / typescript which follows this pattern. Its very simple in JS/TS because you can use object literals (with interface definitions in TS) for your value types. In rust, your large "controller" style classes with a million methods can have all those methods spread out throughout different files in your project. (Ideally grouped by feature.)
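To make that concrete, here's a minimal Java sketch of the style (invented names; records need JDK 16+): a dumb value type plus a "module" of pure static functions that operate on it.

    // Value type: tiny, no references to other data, trivial methods only.
    record Measurement(double tempCelsius, long timestampMillis) {}

    // "Module": the class keyword used purely as a namespace.
    final class Measurements {
        private Measurements() {} // never instantiated

        static double tempInFahrenheit(Measurement m) {
            return m.tempCelsius() * 9.0 / 5.0 + 32.0;
        }

        static Measurement warmedBy(Measurement m, double deltaCelsius) {
            // Pure: returns a new value instead of mutating.
            return new Measurement(m.tempCelsius() + deltaCelsius,
                                   m.timestampMillis());
        }
    }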
That can be OO. I think in general object inheritance is pretty unfashionable, and while it occasionally does solve problems, it's rarely the first thing to reach for. It can be useful in library code though, where you might otherwise end up with significant code duplication.
Overall the move is toward having data classes (basically records) that are light on logic, as well as logic classes that are light on data. I don't think pure functions are necessary or in many cases even desirable in Java, as it largely lacks the tools required for this not to be crippling; although I will concede that any state changes typically ought to remain local.
Factories can be pretty good for separation of concerns, although in many cases they've been superseded by dependency injection frameworks in modern Java. I still reach for one every once in a while though.
Also fwiw, game development is a bit of a weird case. It usually reaches for design patterns like ECS. This is in part a data locality optimization, since you can just iterate through all components in a "straight line", but mostly it's for the sake of malloc, which generally doesn't deal well with billions of objects being allocated and deallocated over and over randomly with growing fragmentation as a result. There are many types of programs this or other game dev patterns aren't suitable for.
> Anyone who writes class names like FactoryBuilderFactoryImpl and claims to walk in the light is at best a scoundrel, and at worst a follower of a religion so dark we don't name it in polite company.
I have to say I already expected the comments to be good when I read the title but this nugget of pure gold - I would PAY to read comments like this!
I feel like if Java had immutability by default it would be such a better language to work with. It is so hard to determine what actually gets modified where in a large Java codebase.
It was designed when UML and tools like Rational Rose were being pushed hard by the big consulting firms on their clients as the One True Way to build Serious Business Software. JavaBeans was designed as a way to have a conventional interface that allows code components to be snapped into GUI applications. But someone thought it would be a great idea to use it everywhere else, too.
Can we all agree that the bean spec was not a good idea? No matter how old, or what other stupid ideas were fashionable back then. I cannot see how making things "beans" actually solves anything (besides maybe a slight reduction in boilerplate when (de)serializing bean-classes).
Java has been around for a longish time. Around the early 2000s there was at least a perception that you should avoid creating too many objects, as that carried performance overhead for construction and garbage collection.
Immutable objects often require you to construct a new object to store an updated value and garbage collect the now unused previous object. So a lot of early Java code was written with mutable objects to avoid performance issues.
Bean spec is in no way mandatory. Plenty of existing code doesn't follow the bean spec. Immutable Java classes have been in vogue for many, many years.
Java now has sum types (sealed types) with exhaustiveness checking, pattern matching for records, and more general destructuring in the works for classes.
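A minimal sketch of those features working together (JDK 21; the types are invented for illustration):

    sealed interface Shape permits Circle, Rect {}
    record Circle(double radius) implements Shape {}
    record Rect(double w, double h) implements Shape {}

    final class Shapes {
        static double area(Shape s) {
            return switch (s) { // compiler rejects a missing case
                case Circle c -> Math.PI * c.radius() * c.radius();
                case Rect r -> r.w() * r.h();
            };
        }
    }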
And most shops likely aren't using a JVM new enough to use records, and it will be a while before they are able to upgrade. And even then, most 3rd-party libraries out there won't use records, because they want to be compatible with the JVMs their users use. Hell, most popular libraries available on Maven Central are compiled with JDK8, or, at best, JDK11.
Aside from immutable instances, it would be nice if 'final' was the default, as well.
"Anyone who writes class names like FactoryBuilderFactoryImpl and claims to walk in the light is at best a scoundrel, and at worst a follower of a religion so dark we don't name it in polite company."
I've spent most of my programming life working in OOP. I see people critical of it, but I don't know what the alternative is for the kind of stuff I do for my job (not to say I have a choice in changing how we do it). Does anyone know of an open source project that implements a complex GUI app that doesn't use OOP so I can see what that code can look like?
We even use OOP in Javascript today, although encapsulation is still a bit sketchy and there is no support for interfaces and abstract classes. There is support for interfaces in Typescript, which was added because, well, it makes a lot of sense :)
Do you know of any well written articles critiquing OO? I've read a few against Clean Code that actually went over the problems with the actual examples given. Nevertheless, I hardly ever see good critiques of the principles themselves.
I'd be interested in reading about it in more detail.
"Rust highly rewards data-oriented design with simple, understandable ownership semantics, and this is great news because this is a really good fit for game development. I suspect this is also true generally, not just for game development! (but what do I know?)"
In my mind there are two major issues with java/c# style OO.
1) Subclass universality isn't great for cognitive comprehension
2) Subtyping component of subclassing has no compiler enforcement
Okay, for the first point, consider the universal NAND gate. You can build any circuit that uses NOT, OR, AND, etc. logic gates by converting it into only NAND gates. This is great for manufacturing (I guess), where you can 'compile' a straightforward series of logic gates into just NAND gates and then only need to produce one kind of gate in your hardware. However, imagine if you needed to program this way, say with an imaginary nand (!&) boolean operator.
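Something like this rough sketch, with a static nand() function standing in for the operator (Java has no custom operators):

    class NandOnly {
        static boolean nand(boolean a, boolean b) { return !(a && b); }

        // Every familiar operation gets buried in nand() calls:
        static boolean not(boolean a)             { return nand(a, a); }
        static boolean or(boolean a, boolean b)   { return nand(nand(a, a), nand(b, b)); }
        static boolean and(boolean a, boolean b)  { return nand(nand(a, b), nand(a, b)); }

        // "a && (b || !c)" spelled with nothing but nand:
        // nand(nand(a, nand(nand(b, b), nand(nand(c, c), nand(c, c)))),
        //      nand(a, nand(nand(b, b), nand(nand(c, c), nand(c, c)))))
    }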
Now, subclassing (and interfacing) provides an existential (to borrow from logic) property. That is, there exists some class such that the following methods exist (and maybe there's also some default implementations spread out amongst every super class in the inheritance hierarchy, but focus on the simple case).
Existential is also universal. You can use it to simulate a forall property (think generics), an or property, or an existential property. The problem is that you're now converting what would be a straightforward structure in a more fully featured language (unions, ADTs, sum types, discriminated unions) - This thing OR That thing - into something much more complicated. It might be an infinite number of things which delegate logic that should exist right here into some other class someplace else, where you cannot see it without difficulty or maybe cannot see it at all.
Or you can just cast the super class into a sub class ... which isn't great either.
Regardless, you also have to hope that nobody goes off and implements another case that shouldn't exist, because subclassing is open where an OR property in something like discriminated unions is closed. Knowing you have all of the cases considered is nigh impossible.
Now, this is good when you straight up want an existential property. It does happen that you get a thing and then you call a method on that thing and there really are an infinite number of things that it could potentially be both now and in the future such that you want support for that behavior. However, I assert that this requires much more complicated and careful coding and isn't applicable for most of the work that ends up needing to occur.
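To make the contrast concrete, a minimal Java sketch (invented names):

    // Existential: "there exists some class with this method". The set of
    // cases is open-ended, and the logic lives in unseen implementations.
    interface Discount {
        long apply(long priceCents); // could be anything, defined anywhere
    }

    // Closed "This OR That": every case is visible right here, and a
    // switch over DiscountRule can be checked for exhaustiveness.
    sealed interface DiscountRule permits Percent, FlatOff {}
    record Percent(int pct) implements DiscountRule {}
    record FlatOff(long cents) implements DiscountRule {}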
Part two is a bit more simple. When you subclass you're also declaring a subtype. The problem is that there's no compiler support or assistance to ensure that all subclasses are actually valid subtypes of the superclass. But it's a property that you get whether or not it's true.
So at any point in time you can have some object where you aren't supposed to know its actual class, but for which, if you don't know its actual class, you'll end up writing incorrect code. A superficial example can be had with exceptions (a whole other topic that will otherwise not be covered here). Imagine a Storage class which is just a key-value store. The CloudStorage class can throw NetworkExceptions, the InMemoryStorage class can throw OutOfMemoryExceptions, and the FileStorage class can throw FileNotFoundExceptions. Code that handles just Storage doesn't know which exceptions it might have to make sure it catches. The subclass isn't necessarily a subtype. [Of course you can open up a different discussion here about the appropriate way to handle exceptions, but I hope the simplified example makes clear what the issue is. A more complex and realistic example can be constructed to show the same issue in a way that completely bypasses exceptions.]
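Here's that example condensed into code (the exception types are invented and unchecked so the sketch compiles):

    class NetworkException extends RuntimeException {}
    class FileMissingException extends RuntimeException {}

    interface Storage {
        String get(String key);
    }

    class CloudStorage implements Storage {
        public String get(String key) { throw new NetworkException(); }
    }

    class FileStorage implements Storage {
        public String get(String key) { throw new FileMissingException(); }
    }

    // Code holding a plain Storage gets no compiler help deciding what to
    // catch: each subclass is a valid subclass but not a true subtype.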
>"It seems like there's a lot of people in the Java community who still think OO is a great idea."
Java is not the only language in existence, and OO is a great idea, as are many other paradigms when not overused. The problem is with programmers who learn one language / paradigm and will fight tooth and nail to solve all the world's problems with it, no matter how poor a fit it is for the particular case.
Objects are a useful idea. Object Orientation is a terrible, terrible idea.
Functions + data (immutable when practical) is all you need for 90%+ of programming, and you should only reach for objects when necessary to manage some well-encapsulated but tricky state. Your default orientation should certainly not be towards objects.
You should trust your judgement. But good judgement comes from experience. Anyone who only has experience programming in one language, or one style, will have rubbish judgement.
Go learn Haskell, OCaml or Rust. Write some C for embedded platforms. Make a video game using your own hand-spun ECS in C++. If, after all that, you come back and say "yeah let's use a class here", then I'll trust your judgement.
I am paid for designing and implementing products. A language for me is a screwdriver. Some are convenient, some not so much. I have programmed in many languages. I am familiar with the concepts but not going to learn OCaml or Haskell as there is zero ROI in it for me.
>"If, after all that, you come back and say "yeah lets use a class here", then I'll trust your judgement."
Look at it this way. I care what my customers say about my products because this is how I make my money for decades already. You - trusting my judgement - I do not give a flying fuck. Sorry for being direct.
Sorry - I just reread my reply above and I can imagine it reading more personal than I intended. I apologise.
The point I was trying to make is this - I’ve worked with plenty of engineers who want me to trust their engineering opinions. Some are experienced. Many are not. I have no idea where you sit on that spectrum - I don’t know you; obviously.
For what it’s worth, I think it’s the right call to trust your own judgement. I just don’t think enough people actually nurture their judgement by exploring and experimenting with a lot of languages and styles. After all, how would a self proclaimed “Java programmer” know what problems you can solve more easily in Python? I don’t think you can truly understand the strengths and weaknesses of OO (or any philosophy really) unless you spend serious time embracing other approaches.
Again, maybe you’ve done that and maybe you haven’t. I don’t know you.