At the end of the chapter there's some speculation about where the next breakthroughs will come from; it will be interesting to look back in 10-20 years and see if Knuth was correct.
I also liked the Fermatesque "exercise 223 is also currently unsolved, although I've rated it only '40' because I once thought of an answer (which I have since forgotten!)".
I too saw this footnote near the beginning (page 9):
“At the present time very few people believe that P=NP. In other words, almost everybody who has studied the subject thinks that satisfiability cannot be decided in polynomial time. The author of this book, however, suspects that N^O(1)-step algorithms do exist, yet that they’re unknowable. Almost all polynomial time algorithms are so complicated that they lie beyond human comprehension, and could never be programmed for an actual computer in the real world. Existence is different from embodiment.”
But would the existence of such "step algorithms" be fully equivalent to P=NP, or a limited case applicable to only a subset of NP problems (i.e. ones previously thought to be NP)?
I read that not as 'step algorithms' but as 'algorithms requiring N^(O(1)) steps'; in other words - polynomial time algorithms.
One way to think of this is in terms of galactic algorithms: algorithms that improve the asymptotic bound (say, the exponent in a polynomial running time) but carry constant factors so huge that they're effectively useless in practice. He's suggesting that there could exist a polynomial-time algorithm for an NP-complete problem which is similarly useless - but with the additional caveat that we won't even be able to prove theorems about it!
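To make "effectively useless in practice" concrete, here's a toy calculation (the constant and exponent are invented for illustration, not taken from any real algorithm): a polynomial-time method costing 10^100 * n^3 steps doesn't beat plain 2^n brute force until n is already far beyond anything computable.

    # Toy "galactic" comparison -- all constants invented for illustration.
    def galactic_poly(n):
        return 10**100 * n**3   # hypothetical polynomial algorithm
                                # with an absurd constant factor

    def brute_force(n):
        return 2**n             # plain exponential search

    # Find the smallest n where the polynomial algorithm is cheaper.
    n = 1
    while galactic_poly(n) >= brute_force(n):
        n += 1
    print(n)  # 358 -- and 2^358 steps is already hopeless for any
              # real hardware, so the asymptotic win never helps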
> The US seems to have been in a cultural decline for much of the last 100 years.
Can you explain what, exactly, you mean by this? Looking back 100 years, I can't help but see an enormous amount of improvement in culture, at least in terms of the tangible impact of culture on the lives of people living in the USA.
I think he regards the creation of a federal income tax as the beginning of the end of the American experiment, going by his comments in some other threads.
And in fact, if you read the article about Tide, the people who are buying Tide, no questions asked, from anyone who wants to sell it to them are committing felonies and are tracked down in sting operations.
I've worked with a lot of contractors. Invariably, if you bitch about guberment and then have a polite chat with the ones who make these kinds of offers, it turns out they're trying to get around paying taxes. This might be anecdotal, but it's not pure speculation.
You can disagree with the tax system in the US, but that doesn't make tax evasion legal.
Not really. E.g. there's no ordering in function composition that isn't also in high school math. Also, you shouldn't depend on e.g. maps processing elements in a given order.
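To be concrete (a toy sketch, with Python chosen just for illustration), composition order behaves exactly like f(g(x)) from school algebra:

    # (f . g)(x) = f(g(x)), exactly as in high school math.
    def compose(f, g):
        return lambda x: f(g(x))

    double = lambda x: 2 * x
    inc = lambda x: x + 1

    print(compose(double, inc)(3))  # double(inc(3)) = 8
    print(compose(inc, double)(3))  # inc(double(3)) = 7 -- order matters,
                                    # but no differently than in math class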
My impression is that statement ordering is easier for mediocre but competent programmers than it is for excellent programmers. The reason is that writing correct imperative code is non-trivial, but writing correct-ish imperative code is pretty easy. So someone who's carefully thinking things through while learning will have a more difficult time with simple assignments, because -- at least while learning -- they're essentially trying to construct the correctness proof in their head. And that's pretty hard for imperative code.
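To make the "correctness proof in your head" concrete, here's a toy sketch (my example, not the parent's): a swap written in the "looks right" style versus one that respects the ordering obligation.

    # Naive correct-ish swap: each line looks plausible in isolation,
    # but the first assignment destroys the value the second one needs.
    x, y = 1, 2
    x = y  # x is now 2; the old value of x is gone
    y = x  # y is now 2 as well -- the swap silently failed

    # Order-aware version: save the value that would be overwritten.
    x, y = 1, 2
    tmp = x
    x = y
    y = tmp
    assert (x, y) == (2, 1)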
> which incorporates the notion of time and communication of proof.
Certainly constructive logics (typically) incorporate a notion of evidence/proof. But constructive logics don't incorporate notions of time or communication (of proof, or in general).
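To illustrate the evidence part, here's a rough sketch of the usual proofs-as-evidence reading (the names and the Python encoding are mine, not standard): a constructive proof of "A or B" has to say which side holds and hand over a witness, and nothing in that reading mentions time or communication.

    # Rough sketch of constructive disjunction as evidence: a proof of
    # "n is even or n is odd" must pick a side and carry a witness.
    from dataclasses import dataclass
    from typing import Union

    @dataclass
    class Left:          # evidence for the left disjunct
        witness: int

    @dataclass
    class Right:         # evidence for the right disjunct
        witness: int

    def even_or_odd(n: int) -> Union[Left, Right]:
        # No notion of time or communication appears anywhere here --
        # just a tagged value saying which disjunct holds, and why.
        return Left(n) if n % 2 == 0 else Right(n)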
> i didn't get whether they were able to show an actual path of how UA could have occurred
I think the answer is no -- there was never a clear demonstration of an extant buggy execution trace. Just a preponderance of evidence that such a trace could exist.
The lack of ECC memory means that you don't even need that... the effects of a little solar flare, or even noise from a nearby power line, could have caused the issue.
Yes, I'm sure it was a bug in this case. However, this paper shows that RAM errors are so common you really don't want non-ECC in anything safety-critical.
"For example, we observe DRAM error rates that are orders of magnitude higher than previously reported, with 25,000 to 70,000 errors per billion device hours per Mbit and more than 8% of DIMMs affected by errors per year. We provide strong evidence that memory errors are dominated by hard errors, rather than soft errors, which previous work suspects to be the dominant error mode."
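To put those numbers in perspective, here's the back-of-the-envelope arithmetic for a single hypothetical 1 GB DIMM (the DIMM size and round-the-clock duty cycle are my assumptions, not the paper's):

    # 25,000-70,000 errors per billion device-hours per Mbit, applied
    # to one hypothetical 1 GB DIMM running around the clock.
    dimm_mbit = 1024 * 8        # 1 GB = 8192 Mbit
    hours_per_year = 24 * 365

    for fit_per_mbit in (25_000, 70_000):
        errors_per_hour = fit_per_mbit * dimm_mbit / 1e9
        print(round(errors_per_hour * hours_per_year))
    # Prints 1794 and 5023: thousands of (mostly correctable) errors
    # per DIMM per year, which is why ECC matters in safety-critical gear.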
I believe that when the recalls occurred, at least several models of Toyota/Lexus cars were affected.
Ultimately, it's impossible to know without being inside. I would imagine there's a fair bit of reuse within each manufacturer and very little sharing between manufacturers.
Given 1) how many other components are shared between manufacturers, with companies like Delphi Automotive doing a lot of work for many of them, and 2) how often car manufacturers own parts of each other, and how often these ownership structures change (look at Ford and Volvo and Mazda, or Daimler-Chrysler, etc.), I would imagine there are shared platforms and components, software included, between manufacturers.
> Are the police allowed to act on charges and arrest individuals?
Yes. Although the arrest can happen before the charge, typically a judge will issue an arrest warrant as soon as the charges are filed.
And given the >90% conviction success rate in the USA (at least in federal cases), it's a pretty good bet any charges filed will stick if the prosecutor wants them to.