CrowdStrike's marketing slogan on their website: "A radical new approach proven to stop breaches". I'll give them that: putting all Windows computers within a company into an endless BSOD loop is a very radical approach to stopping breaches. :)
As a sidenote, there is even an open-source recreation of the TOS operating system for the Atari 16/32 bit computers called EmuTOS that is in active development to this day. It just had a new release a few days ago: https://emutos.sourceforge.io/. And this new release happens to have better support for the Falcon video chip.
I don't get the purpose, at least of his minimal example. The author says he wants to make his code position-independent, i.e., executable from anywhere in memory (without relocation). But that is defeated by the...
They should have put labels in front of and after the string bytes, then most assemblers would evaluate "(labelafter - labelbefore)" to a constant integer giving the length as needed. No need for a runtime sub instruction either, then.
Yeah, the example won't work, but since it's only used for getting the length of the string, it's an easy fix: instead use Pascal/counted strings with a length-prefix byte.
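To illustrate the counted-string idea (a hypothetical Python sketch of the concept, not anyone's actual assembly): the length lives in the first byte, so there's no end label and no runtime subtraction, and the whole thing stays position-independent.

```python
def make_counted_string(text: str) -> bytes:
    """Encode text as a Pascal/counted string: a one-byte
    length prefix followed by the raw bytes (max 255)."""
    data = text.encode("ascii")
    assert len(data) <= 255, "counted strings hold at most 255 bytes"
    return bytes([len(data)]) + data

def counted_length(blob: bytes) -> int:
    # The length is simply the first byte -- no "labelafter -
    # labelbefore" computation and no runtime sub instruction.
    return blob[0]

msg = make_counted_string("HELLO, WORLD")
```

In assembly, the equivalent would be emitting the length byte immediately before the string bytes and loading it directly.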
As other commenters have pointed out, this is a C64 controlling an MP3 player.
People old enough to remember the late 1990s / early 2000s might recall that back then it was popular, too, to hook up an external MP3 decoder, e.g., on the parallel port, so that the PC's CPU would not be at 100% just for playing MP3 music.
Wait, was that popular? Now, I was mostly into the C64 and later the Atari ST, with some insight into other platforms, but I don't think I've ever heard of an external MP3 decoder. On what platform was this available?
The closest I can think of is the external clock chip for the Atari Falcon that allowed you to decode 44.1 kHz MP2.
Save a few dollars, times tens of thousands of vehicles in higher-production models.
I've posted this before in more detail, but the short version is that when I was working at Ford Motor Co in the late '90s I remember seeing internal documents championing how they saved ~$200 off a production Taurus (at the time a ~$20,000 vehicle) via a bunch of $10 and $20 individual cost savings. It was a big deal; it added up to real dollars.
There is more cost savings than you might think in a simplified wiring harness.
Some 3rd party organization builds the harness, then it ships to the manufacturer, where it is usually installed by humans.
Besides the material savings of less actual wire, you most likely have labor savings on the harness build, and possibly on the installation if the new harness is easier to install based on the reduced overall weight and complexity.
The networking methodology side would likely not be overly complex. We already have CANbus device networks and the associated software stacks. Changing to an Ethernet-based approach is a well-understood transition that would not require major changes, at least not beyond the incremental updates and other things the engineers are already likely to be working on.
There are benefits in weight, cost, system complexity, network specifications, etc. There are certain things you just cannot do with CAN bus that manufacturers want to do. Using traditional Ethernet is possible (and I've done it), but the second you bring it up to penny pinchers they get a headache and the conversation is over. Having something like this allows you to make the transition while not just increasing capabilities, but actually decreasing costs.
Dollars? You know that automotive calculates in tenths of a cent. Every cable needs an appropriate connector. Every wire needs its dedicated pin in the connector. Did you know that the connector housing is molded directly into the case because it's cheaper?
Ford alone at one point sold over 6.6 million vehicles a year.
So, you're telling me, the manufacturers would've switched to this new amazing way of doing networking in the car, but it would've cost them an extra dollar and added a few hundred grams to the weight of the car, so they just had to wait for this standard to come along?
10/100BASE-T1 and its relatives are a long time coming. Car manufacturers are actively making these standards, they aren't "waiting for it to come around".
And if it was only in one place, yea, maybe not worthwhile. But what happens if you save $5 each on 300 different systems in a production run of a million cars?
But you don't save $5 on 300 systems; you save less per car. The price and weight of the cables have very little to do with the decision to use this vs. other Ethernet standards.
Automotive manufacturers will spend 6 figures in NRE costs to save $0.003 in unit costs. If they only redesign every few model years, use the same hardware across several models, and sell millions of cars per year, the math works out pretty well.
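As a back-of-the-envelope sketch of that amortization (all numbers illustrative, not from any real program):

```python
# Illustrative numbers only: a one-time engineering (NRE) cost
# amortized against a tiny per-unit saving.
nre_cost = 300_000.0        # six-figure redesign cost (assumed)
saving_per_unit = 0.003     # $0.003 saved per unit

breakeven_units = nre_cost / saving_per_unit   # ~100 million units

# Shared across several models at millions of vehicles per year
# (and sometimes several instances of the part per vehicle),
# that volume is reachable over a multi-year production run.
units_per_year = 20_000_000   # assumed fleet-wide part volume
years_to_breakeven = breakeven_units / units_per_year
```

The point is that the per-unit saving is microscopic, but the denominator of vehicles is enormous, so the one-time cost pays back within a few model years.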
I've said it in another reply, but the automotive industry would've switched to this amazing new standard, except it would've added a couple of dollars to the cost and a few hundred grams to the weight of the car, so they just had to wait for this?
10BASE-T1S is more of an alternative/replacement/upgrade to CAN and CAN FD, which also use a single wire pair as a bus. And automotive 100 Mbit/s and 1 Gbit/s Ethernet are also single-pair. (But they are point-to-point and therefore more expensive, as the article explains.)
Not really. There is no ring, neither physically nor logically. It's based on CSMA/CD, like the "good old" coaxial-cable Ethernet. (Optionally it can use additional methods to avoid collisions, though.)
There is no standard connector for 10BASE-T1S or the other Automotive Ethernet standards such as 100BASE-T1. OEMs (= car manufacturers) will often define their own custom connectors. E.g., the Ethernet signal might just occupy two pins on a much bigger ECU connector.
As for the "there seem to be quite a few different Atari 1040ST board designs" noted in the article: there is actually a huge number of different board layouts. Look at the entries for "STF" or "STFM" in this overview: https://temlib.org/AtariForumWiki/index.php/Atari_ST_motherb....
As for why Atari redesigned the board so many times: no idea.
When I had my "I want to build my own SDR" phase ;) ca. 15 years ago, I also used analog switches as a switching mixer - with a PLL running at 4x the desired frequency so I could utilize dividers to get nice 0°, 90°, 180°, 270° LO signals for IQ mixing. Brings back memories...
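The divide-by-4 trick can be sketched as a two-flip-flop Johnson (twisted-ring) counter clocked by the 4x PLL output: its two outputs are square waves at the LO frequency, 90° apart, and their complements supply the 180° and 270° phases. A small Python model of the counter (illustrative, not the original hardware):

```python
def quadrature_from_4x_clock(n_clocks):
    """Model a 2-flip-flop Johnson counter clocked at 4x the LO
    frequency. It cycles through four states, so each flip-flop
    toggles at 1/4 the clock rate, and Q lags I by one clock
    period = a quarter LO cycle = 90 degrees."""
    i_ff, q_ff = 0, 0
    i_wave, q_wave = [], []
    for _ in range(n_clocks):
        i_ff, q_ff = 1 - q_ff, i_ff   # twisted-ring next state
        i_wave.append(i_ff)
        q_wave.append(q_ff)
    return i_wave, q_wave

i_wave, q_wave = quadrature_from_4x_clock(8)
# 0/180 degrees: i_wave and its complement; 90/270: q_wave and its complement.
```

The nice property is that the four phases come out exactly matched in frequency and duty cycle, which is what the switching mixer needs.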
In the '70s the company I worked for made a receiver for fixed frequencies within 10-13 kHz. The architecture was roughly discrete-time analog, and an analog switch was available that took a digital selection and routed the indicated analog input to the analog output. That block was used to control the sampling for a filter by cycling through the selections. The rest of the design was over my head in those days, since I was the software guy and the software wasn't controlling that.
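What you describe sounds like a commutated (N-path) sampling filter: cycling the switch through N capacitors at N times the center frequency makes the capacitor bank behave as a narrow bandpass at that frequency. A rough Python simulation of the idea (my reconstruction of the technique, not the actual design):

```python
import math

def commutated_filter(signal, n_paths, alpha):
    """Rotate through n_paths 'capacitors', one per input sample.
    Each capacitor is revisited every n_paths samples, so it
    low-pass-filters the input it sees at rate fs/n_paths --
    which, viewed at the output, is a bandpass centered at
    fs/n_paths."""
    caps = [0.0] * n_paths
    out = []
    for i, x in enumerate(signal):
        k = i % n_paths                      # selected path
        caps[k] += alpha * (x - caps[k])     # RC-style update
        out.append(caps[k])
    return out

def rms(xs):
    return math.sqrt(sum(x * x for x in xs) / len(xs))

n = 4000
on_freq  = [math.sin(2 * math.pi * (1 / 4) * i) for i in range(n)]  # at fs/4
off_freq = [math.sin(2 * math.pi * 0.1 * i) for i in range(n)]      # out of band
```

Running both tones through `commutated_filter(..., n_paths=4, alpha=0.02)` passes the tone at fs/4 nearly unattenuated while the out-of-band tone averages toward zero in each capacitor.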