I understand and I'm well aware that it has become "coder slang" - that's my point. On a thread about complexity worship, the first comment I saw was an unnecessary use of jargon that complicates things without adding any meaning. That, in my mind, results from a culture of complexity worship.
> Never use a foreign phrase, a scientific word, or a jargon word if you can think of an everyday English equivalent.
- Orwell
I think the slang version of "big O" is so common that there's no chance of misunderstanding. If anything, it serves as more of a shibboleth - if you don't get it, you're not part of the group the story is intended for.
RE that Orwell quote, I have mixed feelings. I agree with it insofar as it means "pick the easiest words for your audience at the precision level you need". But in general, words are not equivalents, even if they're listed as synonyms - each word has its own specific connotation. Take "car" and "automobile": technically they refer to the same thing, but they feel different. That subtle emotional difference may not be important in a formal setting, but it's an extra dimension of communication in informal ones (like this comment thread).
I'd say "perpendicular" but that's somehow even more niche, despite being a word everyone learns in high-school geometry.
I guess there just comes a point where you've solved so many optimization problems that it's hard not to think of a bunch of solutions with different attributes as being embedded in an N-dimensional space?
Also, it doesn't communicate the precise meaning. "Orthogonal" fits when concerns may be related but are independent of one another, so you can discuss them separately.
That (Big O as slang) sounds like a horrible situation.
It feels analogous to the widespread use of "exponentially" to mean "a lot" or "quickly" which is a really bad, silly thing. The difference is that few physicists and mathematicians misuse "exponentially" in casual conversation, whereas you are claiming that software people deliberately misuse "Big O". I'm not sure I believe you but either way this seems regrettable.
"Exponentially" isn't all that bad; what people actually mean when they say it is usually "superlinearly", but there's no practical difference between the two when talking about e.g. scaling problems.
YMMV, but outside of technical contexts, the phrase "increased exponentially" is often used by people who don't even know the difference between linear, geometric, and exponential growth. In many cases they don't even understand that the word "exponential" refers to a rate of growth, if they've heard of the concept at all.
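For the record, the distinction being glossed over is roughly this (a sketch, not a formal treatment):

```latex
% Linear: a constant amount is added per unit of time
f(t) = a + b t

% Geometric: a constant factor is applied per discrete step
f(n) = a \cdot r^{n}

% Exponential: the growth rate is proportional to the current value
f(t) = a \cdot e^{k t}, \qquad \frac{\mathrm{d}f}{\mathrm{d}t} = k \, f(t)
```

Geometric growth is just the discrete-time case of exponential growth, which is presumably why the two get conflated in the first place.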
Take the case of a high-profile art magazine, Frieze (I googled "frieze increased exponentially"). This is shooting fish in a barrel, but the most egregious example on the first page of hits is this one:
"Seppie in nero’ – squid in its own ink – is my favourite Venetian delicacy. Although customarily served with polenta, I prefer it on thick spaghetti since pasta exponentially increases the naturally squirmy quality of the creatures’ tentacles, creating a Medusa-like mound of inchoate, salty matter."
So you've got an art critic writing slightly pretentiously about food, and he throws in an "exponentially" which has nothing to do with a rate.
This and similar usages of "exponentially" are extremely widespread now. People talk about exponential increases without any mental model of the rate of growth as a function of time at all—just the woolly idea that something is growing fast.
"The term "exponentially" is often used to convey that a value has taken a big jump in short period of time, but to say that a value has changed exponentially does not necessarily mean that it has grown very much in that particular moment, but rather that the rate at which it grows is described by an exponential function."
It's unfair to dismiss that as sloppy slang; that's exactly what Big O notation exists for: to concisely express such concepts as "the number of classes increases linearly with the number of business requirements", as differentiated from "the number of classes is independent of business requirements" or "the number of classes increases with the square of business requirements".
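To make that concrete, here's a hypothetical sketch (the handler names are invented for illustration): in the first design the number of classes grows as O(n) in the number of operations; in the second it stays O(1).

```python
# Design A - O(n) classes: every new business operation adds a class.
class CreateOrderHandler:
    def handle(self, order): ...

class CancelOrderHandler:
    def handle(self, order): ...

class RefundOrderHandler:
    def handle(self, order): ...
# ...one more class for every new requirement.


# Design B - O(1) classes: a single dispatcher; operations are table entries.
class OrderDispatcher:
    def __init__(self):
        self._handlers = {
            "create": self._create,
            "cancel": self._cancel,
            "refund": self._refund,
        }

    def handle(self, op, order):
        return self._handlers[op](order)

    def _create(self, order): ...
    def _cancel(self, order): ...
    def _refund(self, order): ...
```

Note that Design B keeps the class count constant, but the code inside it still grows with every operation, which is essentially the objection raised further down the thread.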
The Orwell quote is valid at the heuristic level, but when
a) there's an installed base of people who know the jargon, and
b) the everyday English equivalent takes a lot more words to say the same thing, and
c) something coherent is meant by the jargon that could be so translated if necessary,
then that's exactly when you should use the jargon.
Give me "a^2 + b^2 = c^2" over "the sum of the squares of the lengths of a and b is equal to the square of the length of c".
The point there is that Big-O notation also implies that the growth has a given shape above some large-ish n. And metrics like "number of operations" or "number of business requirements" are rarely large enough for that to make sense.
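A small worked example of why the large-ish-n caveat matters (numbers picked purely for illustration):

```latex
% O(n^2) describes the shape only for large n:
f(n) = n^2 + 1000 n = O(n^2)

% but the linear term dominates until n = 1000:
f(10) = 100 + 10\,000 \approx 10^4, \qquad
f(10^4) = 10^8 + 10^7 \approx 1.1 \times 10^8
```

If your "n" is the number of business requirements, you're almost certainly in the regime where the asymptotic shape tells you nothing.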
And in fact, approaches to system design that try to make the code size independent of the number of business requirements and their possible future changes lead to exactly the kind of "complexity worship" discussed in TFA (various ad-hoc Turing-complete VMs that interpret code represented as rows in a bunch of relational tables, and what not).
People who write "x is O(n)" usually actually mean "x ∝ n", but 1. ∝ is hard to type on a keyboard, and 2. even fewer people (including CS majors!) know what ∝ means than know what O(n) means well enough to figure out its non-jargon usage from context.
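For anyone comparing the two, rough statements of the standard definitions:

```latex
% Proportionality: an exact relationship, for some constant k
x \propto n \iff \exists k \neq 0 : \; x = k n

% Big O: only an asymptotic upper bound
x(n) = O(n) \iff \exists C > 0, \; \exists n_0 : \; \forall n \ge n_0, \; |x(n)| \le C n
```

Note that Big O is only an upper bound: strictly speaking, a constant-time quantity is also O(n), which is yet another way the slang usage diverges from the formal one.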
So you think that the imprecision of saying "O(n)" when the "large-ish" behavior is not relevant is worse than the verbosity of saying "scales directly/with the square of/not-at-all with [relevant constraint]"?
FWIW, Big-O itself, even in technical contexts, gets used imprecisely: calling hashtables O(1), for example, is not strictly possible even under an idealized computer model (instant memory access, etc.).
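One concrete way to see that gap (a contrived sketch; the Colliding class and its constant hash are mine, not anything from the thread): force every key into the same bucket, and a dict lookup degrades from expected O(1) to a linear scan.

```python
import time

class Colliding:
    """Key type whose hash always collides (deliberately pathological)."""
    def __init__(self, value):
        self.value = value
    def __hash__(self):
        return 42  # every key lands in the same bucket
    def __eq__(self, other):
        return isinstance(other, Colliding) and self.value == other.value

for n in (500, 1000, 2000):
    table = {Colliding(i): i for i in range(n)}  # building this is O(n^2) here
    start = time.perf_counter()
    table[Colliding(n - 1)]  # one lookup now probes the whole collision chain
    print(f"n={n:>5}  lookup took {time.perf_counter() - start:.6f}s")
```

The "O(1)" people quote for hashtables is the expected case under a good hash; the worst case is linear, and that's before worrying about the cost of hashing the key itself.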
Is there a shorter way of saying "scales proportionally with n" that you would suggest the tech community prefer because of its greater precision?
That's like saying there's no point in using pronouns, since you have to say the antecedent anyway.
It would be wrong in both cases because the context can make clear what a variable or pronoun refers to. If the problem context makes clear what the binding constraint is and you just need to talk about scaling behavior, then it is indeed shorter to say "O(1) rather than O(n)" vs "doesn't depend on the operations rather than being directly proportional".
>> what we ended up with was several classes for each operation.
> The second is shorter, and says no more than what is relevant.
It says less: the O notation is used to indicate that as you add more operations, you will need to add more classes, rather than only needing to add classes when there is logic the operations don't yet implement.