> I only use turn signals when there are other cars around that would need the indication.
That is a very bad habit and you should change it.
You are not only signalling to other cars. You are also signalling to other road users: motorbikes, bicycles, pedestrians.
Your signal is more important to the other road users you are less likely to see.
Always ALWAYS indicate. Even if it's 3AM on an empty road 200 miles from the nearest human that you know of. Do it anyway. You are not doing it for other cars. You are doing it for the world in general.
> AI False Information Rate Nearly Doubles in One Year
> NewsGuard’s audit of the 10 leading generative AI tools and their propensity to repeat false claims on topics in the news reveals the rate of publishing false information nearly doubled — now providing false claims to news prompts more than one third of the time.
Wonderful article showing the uselessness of this technology, IMO.
> I just realised the situation is even worse. If I have 35 sentences of circumstance leading up to a single sentence of conclusion, the LLM mechanism will — simply because of how the attention mechanism works with the volume of those 35 — find the ’35’ less relevant sentences more important than the single key one. So, in a case like that it will actively suppress the key sentence.
> I first tried to let ChatGPT summarise one of my key posts (the one about the role convictions play in humans with an addendum about human ‘wetware’). ChatGPT made a total mess of it. What it said had little to do with the original post, and where it did, it said the opposite of what the post said.
> For fun, I asked Gemini as well. Gemini didn’t make a mistake and actually produced something that is a very short summary of the post, but it is extremely short so it leaves most out. So, I asked Gemini to expand a little, but as soon as I did that, it fabricated something that is not in the original article (quite the opposite), i.e.: “It discusses the importance of advisors having strong convictions and being able to communicate them clearly.” Nope. Not there.
Why, after reading something like this, should I think of this technology as useful for this task? It seems like the exact opposite. And this is what I see with most LLM reviews. The author will mention spending hours trying to get the LLM to do a thing, or "it made xyz, but it was so buggy that I found it difficult to edit it after, and contained lots of redundant parts", or "it incorrectly did xyz". And every time I read stuff like that I think — wow, if a junior dev did that the number of times the AI did, they'd be fired on the spot.
See also something like https://boston.conman.org/2025/12/02.1 where (IIRC) the author comes away with a semi-positive conclusion, but if you look at the list near the end, most of the items are things any person would get fired for, and none of them are positive for industrial software engineering and design. LLMs appear to do a "lot", but they still confabulate and repeat themselves incessantly, which makes them worthless to depend on for practical purposes unless you want to spend hours chasing your own tail over something they hallucinated. I don't see why this isn't the case. I thought we were trying to reduce the error rate in professional software development, not increase it.
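For what it's worth, the dilution effect described in the first quoted paragraph is easy to reproduce in a toy softmax calculation. This is only a sketch: the relevance scores (key_logit, filler_logit) and the single-head, sentence-level framing are made-up assumptions for illustration, not how any particular model actually scores that post.

```python
import numpy as np

# Toy illustration of the dilution effect described in the quote above.
# One "key" sentence gets a higher (hypothetical) relevance logit than
# each of the 35 "circumstance" sentences.
n_filler = 35
key_logit = 2.0       # assumed score for the key sentence
filler_logit = 1.0    # assumed score for each filler sentence

logits = np.array([key_logit] + [filler_logit] * n_filler)
weights = np.exp(logits) / np.exp(logits).sum()   # softmax

print(f"attention on the key sentence: {weights[0]:.3f}")      # ~0.07
print(f"attention on the 35 fillers:   {weights[1:].sum():.3f}")  # ~0.93

# Even though the key sentence scores higher individually, the filler
# sentences collectively soak up most of the attention mass.
```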
Exactly. I rather miss the 15.6" Toshiba Satellite Pro A300 I had when I emigrated, a decade back.
It wasn't very portable, no, but around the house, it was great. Good sized full-travel keyboard, numeric keypad, lots of ports, and a nice big clear comfortable eye-friendly screen. Two SATA bays, so I could have the affordable combination (a dozen years ago) of a small fast SSD for the OS and a huge big cheap HDD for the data. Tiny trackpad, but I used a mouse.
There is a 17" classic Thinkpad before they went to nasty thin fashion-follower keyboards, but they only seem to be available in the USA and even given my fondness for old Thinkpads, I am not willing to pay £1000 for a second-hand decade-old one.
Pairing is a pain, charging is a nuisance, battery life is a constant worry, responsiveness is dodgy... there is nothing good about it. Give me something built-in, cabled, and always-on.
I never met him -- he hated travel, and I never could afford to go to a US convention -- but from all I've read, no, the absolute opposite was the case.
Good correction. This is the important point here. And there is a sub-point which is nearly as important:
The 8086 was out there and selling for years. AT&T ported UNIX™ to it, meaning it was the first ever microprocessor to run Unix.
But even so, DR didn't offer an 8086 OS, although it was the dominant OS vendor and people were calling for one. CP/M-86 was horribly, horribly late -- it shipped after the IBM PC, about 3-4 years after the chip it was intended for.
The thing is, that kind of delay is common now, but late-1970s OSes were tiny, simple things.
Basically the story is that there was already an industry-standard OS. Intel shipped a newer, better, more powerful successor chip, which could run the same assembly-language code although it wasn't binary compatible. And the OS vendor sat on its hands, promising the OS was coming.
IBM comes along, wanting to buy it or license it, but DR won't deal with them. It won't agree to IBM's harsh terms. It thinks it can play hardball with Big Blue. It can't.
After waiting for a couple of years, a kid at a small company selling 8086 processor boards just writes a clone of it, the hard way, directly in assembler (while CP/M was written in PL/M), using the existing filesystem of MS Disk BASIC, and puts it out there. MS snaps up a licence and sells it on to IBM. The deal is a success, so MS buys the product.
IBM ships its machine, with the MS OS on it. DR complains, gets added to the deal, and a year or so later it finally ships an 8086 version of its OS, which costs more and flops.
The deal was very hard on Gary Kildall, who was a brilliant man, but while MS exhibited shark-like behaviour, it was a cut-throat market, and DR needed to respond faster.
This seems strangely parochial to me. It reads a little like an American who knows San Francisco and so knows about trams has tried to imagine what a European city and country is like, and hasn't quite made the pieces fit together.
It has what I guess are American references that are meaningless to me. What is or was The Homer? In what universe are mopeds some sort of unsuccessful trial? Much of Asia has travelled by mopeds for ~75 years now; the Honda C90 is the best-selling motor vehicle of all time, and it's not even close.
As a super-extended metaphor for computing, I don't think the timeline fits together: it has Xerox, Apple, and IBM in the wrong order, but I'd find that hard to nail down. There was overlap, obviously.
It feels to me like the big influences are squeezed in, but not the smaller ones -- possibly because they mostly aren't American and don't show up on American radar. Wirth and Pascal/Modula-2/Oberon, the Lilith and Ceres; Psion; Acorn; other Apple efforts notably the Newton and things it inspired like Palm; Symbolics and InterLisp.
Nice effort. I respect the work that went into it, but it doesn't fix Stephenson's effort -- it over-extends it until it snaps, then tapes the bits together and tries again.
If I mentioned every operating system that Apple was involved in, my original post would be twice as long. Acorn, Psion, Newton, and Palm in particular are historically relevant today[0] but have no bearing on what Neal Stephenson was writing about. He was talking exclusively about desktop operating systems running on personal computers. That's where I drew the line. If you didn't ship something that ran on a normal PC[1], you didn't make the cut.
Ok, I also swapped out Be for NeXT, mainly because NeXT was the one that actually got bought by Apple and ultimately had a lot more influence.
Xerox, Apple, and IBM were all releasing products concurrently to one another, so I kinda just had to pick a (wrong) order and stick with it.
I wasn't trying to make a ding at mopeds; I was trying to make a ding at the classic Mac OS. I guess if you want to fix that metaphor, the classic Mac OS was like a nice moped that had a bunch of shit added onto it until it became a really unstable but nice-looking car, while Microsoft just made a real car that looks like dogwater. If that still feels too American, well, I'm sorry, but Neal started with a car metaphor, and I've already exhausted my permitted number of dings at American car-centric urban design with the Linux bit.
The Homer is a Simpsons reference. The joke is that Homer Simpson designed a car in almost the same way that managers decided what features shipped in Copland.
[0] If this was a mobile OS discussion, I'd be dropping IBM, UNIX, and XEROX from the discussion to make way for Psion, Newton, and Palm. Microsoft would be pared down to "Well around the same time they were shipping real desktop OSes they also shipped Windows CE and Windows Mobile".
But even then, I almost feel like mentioning the actual inventors of the PDA is overindulgence, because absolutely none of those companies survived the iPhone. Microsoft didn't survive iPhone. Nobody survived iPhone, except Android, and that's only because Android had enough Google money backing them to pivot to an iPhone-like design. Even flipphones run Android now (or KaiOS). It's way more stark and bleak a landscape for innovation than desktop was in 1999 when Windows was king.
[1] OK, yes, both early Mac OS and early Windows were built in Pascal, not C. But neither of those languages is an operating system, and normal users would not be able to tell whether their software was written in one or the other unless it crashed.
As a programmer, I can point out all the many, many flaws with its technical architecture. Or how Apple's managerial incompetence let Microsoft leapfrog them technologically. Or even how Microsoft eventually figured out how to give Windows its own visual identity[0].
But at the end of the day, people were buying Macs despite the company making them. Apple had built an OS that made everything else look like a copycat, by worrying about the little details that few others cared about. It's the only reason Apple survived where literally every other non-Wintel PC company died. Atari STs and Amigas may be fondly remembered, but their fanbases all jumped ship for the PC the moment DooM came out, and the companies in question all got sold off for peanuts.
[0] My personal opinion regarding Windows visual design:
- Windows 1.x-3.x (and also OS/2 1.x): Really clunky and piss-poor attempt at cloning the Mac. It has the "programmer art" feel all over it. 3.x is slightly better in that they actually figured out how to pick a good default color scheme, but it still doesn't even have a proper desktop, instead using the root window as minimized window storage.
- Windows 9x/NT/2000: Not only does Windows finally get a real desktop, but it also gets a unique visual design, and a good one. Hell, they actually leapfrogged Apple on this one: Mac OS 8 would take a few more years to ship its Platinum appearance.
- Windows XP: Cheap. Toylike. Microsoft saw OS X's Aqua and realized they needed something for Whistler, but they didn't seem to know what, and this is what we got. Media Center Edition would ship a slightly less toylike Windows visual theme.
- Windows Vista / 7: The absolute pinnacle of Microsoft's visual design chops. Aero is the thing that Liquid Glass wishes it could be. The glass effects were a perfect way to show off the power of GPU compositing, and Microsoft managed to do it without sacrificing readability or usability.
> As a programmer, I can point out all the many, many flaws with its technical architecture.
I think, since we started out on history here, we have to consider that history in its context.
1. Apple does the Lisa: a cheaper Xerox Alto, minus the networking and the programming language. Multitasking, hard-disk-based, a new app paradigm. The Future, but at 1/4 of the price of the original.
It's not cheap enough. It flops, badly.
2. Jobs repurposes the parallel information-appliance project into a cheaper Lisa. Remove the hard disk and the slots and all expansion, seal it up, floppy only, remove the fancy new app format & keep it simple: apps and documents. Smaller screen, but square pixels. Keeps most of the Lisa's good stuff.
It's still expensive but it's cheap enough. It sells. It gets Pagemaker. It changes the course of the industry.
But to get a GUI OS into 128kB of RAM, they had to cut it brutally.
It worked, but the result was significantly crippled, and Apple spent the next decade trying to put much of that stuff back in again.
Remarkably enough, they succeeded.
By MacOS 7.6 it had networking, network-transparent symlinks, TCP/IP, a HiColour GUI, usable multitasking, virtual memory, and more. It was actually a bloody good OS.
Yes, it was very unstable, but then, remember so was DOS, so was Windows 3.
The snag is, by that time it was 1997, and MS had already surpassed both Windows NT 3.x and Windows 95 with NT 4.
NT 4 had no PnP, no power management, and no working 3D except on vastly expensive OpenGL cards, and it lost a lot of NT 3.x's stability because of the frantic, desperate bodge of putting GDI in the kernel. But it was good enough, and it made Apple look bad.
Apple was ploughing its own lonely furrow and it made a remarkably good job of it. It was just too slow.
When Jobs came back, he made a lot of good decisions.
Junk most of the models. Junk all the peripherals. Make a few models of computer and nothing else.
Junk Copland, Pink, Taligent, all that.
Meanwhile, like Win9x + NT, 2 parallel streams:
[a] Win9x parallel: salvage anything good that can be stripped out of Copland, bolt it onto MacOS 7.x, call it 8.x and kill off the clones.
[b] NT parallel: for the new project, just FFS get something out the door ASAP: Rhapsody, then Mac OS X Server. All the weird bits of NeXTstep that were to avoid Apple lawsuits (vertical menus, scrollbars on the left, no desktop icons, columnar file browser, etc.): remove them, switch 'em back to the Apple way.
Meantime, work on a snazzy facelift for the end-user version. Make the hardware colourful and see-through, and do that to the OS too.
I think, looking at the timeline and the context, all the moves make sense.
And I used MacOS 6, 7, 8 and 9. All were great: just such a pleasure to use. I didn't care that NT was more solid: that was a boring, reliable bit of office equipment and it felt as exciting as a stapler. NT 3.51 was fugly, but it worked, and that's what mattered.
Emulators are fun, but not something I would really want to rely on for production use. I don't even use WINE much. I only keep it around for WinWord, and I only keep WinWord around for Outline Mode, which no FOSS (or MacOS) word processor supports any more.
Now I have LogSeq, I don't even need outline mode for note-taking. LogSeq is lovely but it's for notes, not for long-form writing.
The point of indicating is that it's even more important to the people you didn't notice.