If there's a default (I don't think there necessarily has to be one), there has to be somebody who decides what the default is. If most people trust them, then either that person is very trustworthy or people just don't care very much.
> there has to be somebody who decides what the default is
Sure. This happens with ad blockers, for example. I imagine Elsevier or Wikipedia would wind up creating these lists. And then you’d have the same incentives as you have now for fooling that authority.
> or people just don't care very much
This is my hypothesis. If you’re an expert, you have your web of trust. If you’re not, it isn’t that hard to start from a source of repute.
"Half the mean width of a
polyhedron P is equal to the expected value of
max θ^T x
subject to x ∈ P,
where θ ∈ S^(d−1) is uniformly random distributed with respect to the Haar measure on the unit
sphere."
The expression max θ^T x is not translation-invariant: if you translate P by ∆x, the maximum becomes (max θ^T x) + θ^T ∆x. But the expectation of θ^T ∆x is 0, so the expectation of the maximum is translation-invariant after all.
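If you want a numerical sanity check, here's a quick Monte Carlo sketch (the function names and the example polytope are made up for illustration): it estimates E[max θ^T x] for a unit square and for a translated copy, using the fact that a linear function attains its maximum over a polytope at a vertex.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_directions(n, d):
    # Normalized standard Gaussians are uniform (Haar-distributed)
    # on the unit sphere S^(d-1).
    g = rng.standard_normal((n, d))
    return g / np.linalg.norm(g, axis=1, keepdims=True)

def expected_max(vertices, n=200_000):
    # Monte Carlo estimate of E[max_{x in P} theta^T x]; the maximum
    # of a linear function over a polytope is attained at a vertex.
    thetas = random_directions(n, vertices.shape[1])
    return (thetas @ vertices.T).max(axis=1).mean()

# Unit square and a translated copy of it.
square = np.array([[0, 0], [1, 0], [0, 1], [1, 1]], dtype=float)
print(expected_max(square))                          # ≈ 2/π ≈ 0.6366
print(expected_max(square + np.array([5.0, -3.0])))  # same, up to noise
```

Both estimates agree up to sampling noise, which is exactly the translation invariance of the expectation.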
Importantly, they're talking about continuous representations, i.e. the output logits. For there to be a loss of information, you'd need two different tokens to produce the exact same logits, which is even less plausible. But as soon as you sample discrete output tokens from the distribution defined by the logits, you do end up losing information. So the practical relevance of this paper is somewhat limited.
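A toy illustration of that last point (tiny made-up vocabulary and numbers): two clearly different logit vectors can yield the exact same sampled token, so the discrete sample carries strictly less information than the continuous logits.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_token(logits):
    # Softmax over the logits, then draw one discrete token id.
    p = np.exp(logits - logits.max())
    p /= p.sum()
    return rng.choice(len(logits), p=p)

# Two distinct logit vectors over a 3-token vocabulary: as continuous
# values they're trivially distinguishable...
logits_a = np.array([3.0, 0.1, -1.0])
logits_b = np.array([2.5, 0.3, -0.8])

# ...but a sampled token id (most likely 0 in both cases) is compatible
# with either vector, so the logits can't be recovered from it.
print(sample_token(logits_a), sample_token(logits_b))
```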
I wouldn't bet that every smart TV Crunchyroll wants to be available on has more processing power than your phone from 2015 (some of those TVs might be older than that). But yes, it's probably less about hardware capabilities than about platform limitations that make the usual solution of compiling libass into a blob and integrating it into the player not so easy to implement.
Or Vercel is just a commercial web host that the person who volunteered to maintain the docs likes to use?
The headline seems to imply that it’s some kind of gotcha that the marketing for an OS is being served by a different OS.
I don’t see what the issue is. Web hosts and Linux distros are a dime a dozen. It sounds impractical to choose a hosting company based on what flavor of Linux it runs rather than on the price and features of the platform you upload the site to.
Undecidable languages are formal languages, too, even though there's no Turing machine that can accurately determine for any string whether it is part of the language or not.
A formal language is a set of finite-length sequences (called "words") of symbols from another set (called the "alphabet"). It's essentially a very crude model of how, in an alphabetic writing system, some strings of letters form words in a natural language while other combinations are just nonsense.
A given formal language doesn't need to have any rules governing its words, though the languages used for writing formal proofs are typically better behaved.
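To make that concrete, here's a tiny sketch (the alphabet and both example languages are arbitrary toy choices):

```python
# Alphabet: a finite set of symbols; a word: any finite sequence over it.
ALPHABET = {"a", "b"}

def is_word(s: str) -> bool:
    return all(ch in ALPHABET for ch in s)

# A formal language is just *some* set of words. It may follow a rule...
def in_rule_based_language(s: str) -> bool:
    return is_word(s) and s.count("a") == s.count("b")

# ...or be a completely arbitrary collection with no rule behind it.
ARBITRARY_LANGUAGE = {"ab", "bbba", "a"}

print(in_rule_based_language("aabb"))  # True
print("bbba" in ARBITRARY_LANGUAGE)    # True
```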
You're talking about formal languages in the context of computer science. Formal languages in the context of logic predate computer science (or could be said to be a direct precursor to it). These logic languages are also trivially decidable in the computer-science sense, i.e. their set of strings is easily decidable. When we talk of decidability for those languages, we usually mean the decidability of whether a statement is provable or not (using the language's inference rules).
While my explanation of "formal" is meant to be introductory and not entirely precise, the fact that some problem tackled by an algorithm is undecidable does not mean that the problem isn't precisely interpretable by the computer. A Python interpreter doesn't terminate for all inputs (and therefore doesn't decide halting), yet it interprets all of its inputs precisely.
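For instance, this trivial program never terminates, yet Python interprets it exactly as specified:

```python
# Precisely interpreted, never halting; no general procedure can
# decide ahead of time which programs behave like this.
while True:
    pass
```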
It does get worse in the sense that there could be languages whose description is incompressible (we can simulate this, assuming hash functions approximate random oracles, by saying "choose a secret key; now the language is 'every string whose HMAC value under that secret key is even'").
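A minimal sketch of that construction, assuming Python's standard hmac module and reading "even" as the parity of the digest taken as a big-endian integer (i.e. its last byte):

```python
import hashlib
import hmac
import secrets

# The secret key fixes the language once and for all; without the key,
# membership looks like a random subset of all strings.
KEY = secrets.token_bytes(32)

def in_language(word: bytes) -> bool:
    # A word is in the language iff its HMAC-SHA256 value under the
    # secret key is even.
    tag = hmac.new(KEY, word, hashlib.sha256).digest()
    return tag[-1] % 2 == 0

print(in_language(b"hello"), in_language(b"world"))
```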
If you accept some axiomatic assumptions about infinite sets (that are common in mathematics; I'm not sure exactly what the weakest required axiom is for this), then you can even believe that there are infinite languages that have no finite description at all, very much akin to the commonplace claim that there are real numbers that have no finite description at all. There are formulations of mathematics in which this is not true, but most mathematicians seemingly work in formulations in which it is true.
I even expect that we can probably prove this directly using the powerset operation and diagonalization, which doesn't require particularly strong assumptions about infinities.
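The counting argument, sketched (standard Cantor, nothing exotic):

```latex
\[
  |\{\text{finite descriptions}\}| \le |\Sigma^*| = \aleph_0
  \;<\; 2^{\aleph_0} = |\mathcal{P}(\Sigma^*)| = |\{\text{languages over }\Sigma\}|
\]
% The strict inequality is Cantor's theorem (diagonalization): no set
% surjects onto its powerset. Hence almost all languages have no
% finite description.
```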
The award was announced, that's what happened. They found out who the winner would be at the same time as everyone else, and then it took them a week to decide that they disagree with the Nobel Peace Prize committee this year. That's actually quite fast as organizational decision-making goes.
"Only one or two" isn't zero. The problem isn't that a small community can only write a small Wikipedia, but that there's a global supply of fools who want to make every small Wikipedia bigger, even if they're completely unqualified to do so.
Wikipedia is built around the basic principle that if you just let everyone contribute, most contributions will be helpful and you can revert the bad ones after the fact. This works for large communities that easily outnumber the global supply of fools, but below a certain size threshold the sign flips, and the average edit makes that version of Wikipedia worse rather than better.
So smaller communities probably need to turn the operating principle of Wikipedia on its head and limit new users to creating drafts, on the assumption that most will be useless, so that an admin can accept the good ones after the fact.
I'm not sure whether Wikipedia already has the software features necessary to operate it in such a closed-by-default manner.
Naturally, graphite has similarly high thermal conductivity along the layer direction (each layer is basically graphene), and one would think there should be some way to put such a thin layer of graphite/graphene on top of (or inside) the chip to achieve similar results.