Most functions from the naturals to the naturals are uncomputable, which I would think calls your first definition into question.
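
(Sketch of the counting argument, for anyone who hasn't seen it: programs are finite strings over a finite alphabet, so there are only countably many of them, and hence only countably many computable functions; but by Cantor's diagonal argument the set of all functions on the naturals is uncountable:

    |\{f : \mathbb{N} \to \mathbb{N} \mid f \text{ computable}\}| \le |\Sigma^*| = \aleph_0,
    \quad\text{while}\quad
    |\mathbb{N}^{\mathbb{N}}| = \aleph_0^{\aleph_0} = 2^{\aleph_0} > \aleph_0,

so all but countably many functions have no program that computes them.)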

It's unfortunate that "computer" is the word we ended up with for these things.

Ah well, that's true -- so we can be more specific: discrete, discrete computable, and so on.

But to the overall point, this kind of reply is exactly why I don't think this is a case of L vs. S: your reply forces me to concede and amend my definition, because I was simply wrong about the property I was purporting to capture.

With all the right joint-carving properties to hand, there is a very clear matrix and hierarchy of definitions:

an abstract mathematical hierarchy vs. a physical hierarchy,

with the physical serving as implementations of partial elements of the mathematical.


Word definitions are arbitrary social constructs, so they can't really be correct or incorrect, just popular or unpopular. Your suggested definitions do not reflect the current popular usage of the word "computer" anywhere I'm familiar with, which is roughly "Turing-complete digital device that isn't a cellphone, tablet, video game console, or pocket calculator". That definition has major ontological problems: it groups together things such as automotive engine control units, the UNIVAC 1, the Cray-1, a Commodore PET, and my laptop, which have nothing in common that they don't also share with my cellphone or an Xbox. Nevertheless, that seems to be the common usage.

> Word definitions are arbitrary social constructs, so they can't really be correct or incorrect, just popular or unpopular.

If you mean that classifications are a matter of convention and utility, then that can sometimes be the case, but it isn't always, and it can't be entirely. Classifications based on utility presuppose objective features, and thus the possibility of classification. How else could something be said to be useful?

Where paradigmatic artifacts are concerned, we are dealing with classifications that join human use with objective features. A computer, understood as a physical device used for the purpose of computing, presupposes a human use of that physical thing "computer-wise". That is to say, objectively, no physical device per se is a computer, because nothing inherent in the thing is computing (what Searle called "observer-relative"). But the physical machine is objectively something, which is to say, ultimately, a collection of physical elements of certain kinds operating on one another in a manner that affords a computational use.

We may compare paradigmatic artifacts with natural kinds, which do have an objective identity. For instance, human beings may be classified according to an ontological genus and an ontological specific difference, such as "rational animal".

Now, we may dispute certain definitions, but the point is that if reality is intelligible (something presupposed by science, and by our discussion here, on pain of incoherence), then concepts reflect reality, and since concepts are general, we already have the basis for classification.


No, I don't mean that classifications are a matter of convention and utility, just word definitions. I think that some classifications can be better or worse, precisely because concepts can reflect reality well or poorly. That's why I said that the currently popular definition of "computer" has ontological problems.

I'm not sure that your definition helps capture what people mean by "computer" or helps us approach a more ontologically coherent definition either. If, by words like "computing" and "computation", you mean things like "what computers do", it's almost entirely circular, except for your introduction of observer-relativity. (Which is an interesting question in its own right: perhaps the turbulence at the base of Niagara Falls this morning could be correctly interpreted as finding a proof of the Riemann Hypothesis, if we knew what features to pay attention to.)

But if you mean things like "numerical calculation", then most of the time that people are using computers, they are not using them for numerical calculation or anything similar; they are using them to store, retrieve, transmit, and search data, and if anything the programmers think of as numerical is happening at all, it's entirely subordinate to that higher purpose: things like array indexing. (Which is again observer-relative: you can think of array indexing as integer arithmetic mod 2⁶⁴, but you can also model it purely in terms of propositional logic, as in the sketch below.)
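
To make that last parenthetical concrete, here's a minimal sketch in C (the names and constants are my own, purely illustrative): the same address computation can be read as integer arithmetic mod 2^64, or as nothing but propositional connectives (XOR, AND, OR) applied bit by bit.

    #include <assert.h>
    #include <stdint.h>

    /* 64-bit addition built from propositional connectives alone: each
       sum bit is the XOR of the operand bits and the carry, and the next
       carry is a majority vote. No "+" anywhere -- pure bitwise logic. */
    static uint64_t add_as_logic(uint64_t a, uint64_t b) {
        uint64_t result = 0, carry = 0;
        for (int i = 0; i < 64; i++) {
            uint64_t ai = (a >> i) & 1;
            uint64_t bi = (b >> i) & 1;
            result |= (ai ^ bi ^ carry) << i;                /* sum bit  */
            carry = (ai & bi) | (ai & carry) | (bi & carry); /* majority */
        }
        return result; /* happens to equal (a + b) mod 2^64 */
    }

    int main(void) {
        uint64_t base = 0x1000, index = 42, elem_size = 8;
        /* The same array-indexing step under the two descriptions: */
        uint64_t as_arithmetic = base + index * elem_size;
        uint64_t as_logic = add_as_logic(base, index * elem_size);
        assert(as_arithmetic == as_logic);
        return 0;
    }

Neither description is privileged by the hardware; both fit the same physical state transitions.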

And I think that's one of the biggest pitfalls in the "computer" terminology: it puts the focus on relatively minor applications like accounting, 3-D rendering, and LLM inference, rather than on either the machine's Protean or universal nature or the purposes to which it is normally put. (This is a separate pitfall from random and arbitrary exclusions like cellphones and game consoles.)


> That's why I said that the currently popular definition of "computer" has ontological problems.

Indeed. To elaborate a bit more on this...

Whether a definition is good or bad is at least partly determined by its purpose. Good as what kind of definition?

If the purpose is theoretical, then the common notion of "computer" suffers from epistemic inadequacy. (I'm not sure the common notion rises above mere association and family resemblance to the rank of "definition".)

If the purpose is practical, then under prevailing conditions, what people mean by "computer" in common speech is usually adequate: "this particular form factor of machine used for this extrinsic purpose". Most people would call desktop PCs "computers", but they wouldn't call their mobile phones computers, even though ontologically and even operationally, there is no essential difference. From the perspective of immediate utility as given, there is a difference.

I don't see the relevance of "social construction" here, though. Sure, people could agree on a definition of computer, and that definition may be theoretically correct or merely practically useful or perhaps neither, but this sounds like a distraction.

> I'm not sure that your definition helps capture what people mean by "computer" or helps us approach a more ontologically coherent definition either.

In common speech? No. But the common meaning is not scientific (in the broad sense of that term, which includes ontology) and is inadequate for ontological definition, because it isn't a theoretical term. So while common speech can be a good starting point for analysis, common meanings must be examined, clarified, and refined before they can serve theoretical purposes. Technical terminology exists for a reason.

> If, by words like "computing" and "computation", you mean things like "what computers do", it's almost entirely circular

I don't see how. Computation is something human beings do and have been doing forever; it preexists machines. All a machine does is mechanize the formalizable part of the process, but the computer is never party to the semantic meaning grasped by the observing human being. It merely stands in a relation of correspondence with human formalism, the same way five beads on an abacus or the squiggle "5" on a piece of paper denotes the number 5. The same is true of representations that denote something other than numbers (a denotation that is, btw, entirely conventional).
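
(To illustrate just how conventional that denotation is, a toy sketch in C of my own devising: one and the same bit pattern "denotes" quite different things depending entirely on the reading convention the observer brings to it.)

    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        /* One and the same physical state: eight bits, all set. */
        uint8_t bits = 0xFF;

        /* What it denotes depends entirely on the observer's convention: */
        printf("read as an unsigned integer:        %u\n", (unsigned)bits);    /* 255 */
        printf("read as a two's-complement integer: %d\n", (int)(int8_t)bits); /* -1  */
        printf("read as a character code:           %c\n", bits); /* 'ÿ' in Latin-1 */
        return 0;
    }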

Machines do not possess intrinsic purpose. The parts are accidentally arranged in a manner that merely gives the ensemble certain affordances, which can be parlayed into furthering various desired human ends. This may be difficult for many today to see, because science has, for practical purposes or for philosophical reasons, projected a mechanistic conceptual framework onto reality that recasts things like organisms in mechanistic terms. While this can be practically useful, theoretically this mechanistic mangling of reality has severe ontological problems.



