People used to get productive work done on DECstations, which were big and expensive machines in their time. Now we can recreate one for just a few dollars (plus the cost of a screen and keyboard). Today almost everything we do relies on the internet, so a WiFi driver would be useful as well.
Many things we do today require more processing power, but many do not: writing, terminals (well, SSH could be a problem), email, HN. We used to do raytracing on a DECstation, and had to use a remote X window to view the finished image in colour.
You would think that a certain subset of people would quite like a simpler system today to work on, but I guess it's just easier to buy something modern with all the extra layers of complexity.
Maybe this is because today programming largely relies on having access to the accumulated knowledge of the internet, and a very complex web browser.
My PhD was done on a DECstation 3100. The physics lab was a VAX environment (everyone had a VTxxx terminal on their desk), but someone had bought a 3100, not figured out how to use it, and it was sitting in a corner, normally switched off. I managed to persuade them I could put it to use when I joined the group, and about six months later everyone else in the group had Unix workstations too… We named them all after Asterix characters; mine was Getafix.
The article title reminded me of when I was young and used to read Byte Magazine. Byte covered a wide range of topics and could get quite technical, but the big thing that is vastly different from today is that you got a monthly digest of articles selected by the editors, not by yourself. And I used to read it cover to cover. There was a lot I didn't understand, but I feel I gained a wider knowledge than if I had only read what I was interested in, and many times the ideas I was exposed to turned out to be useful much later in life.
Some of them ended up being distractions too, like playing with hardware, or writing a compiler, but it was all very interesting.
Byte magazine was a terrific publication. There's nothing similar in print these days that I'm aware of. Certainly, Byte couldn't be accused of dumbing down the content to reach a wider audience, unlike many of today's supposedly technical magazines. I learned a lot from Byte and experimented frequently with the knowledge and understanding I gained from it.
No, and there was no (easy) way to detect the vertical retrace. For a lot more on that topic, have a look at the Apple II mouse card: they needed to synchronise with the video and did work out a software-based way of doing it, but the final product added hardware to make it possible.
Some of its other hardware features were very good for the time. It gets a lot of heat for the initial reliability issues, but those were eventually solved. They also limited the Apple ][ emulation to ][+ features, so no 80 columns, and that was probably a mistake. On the other hand, the good features were:
- ProFile hard disk (though it would have been better if you could boot from it).
- Movable zero page, so the OS and the application each had their own zero page.
- As mentioned, 80 column text and high resolution graphics.
- Up to 512k addressable RAM, either through indirection or bank switching.
It was probably the most ambitious 6502-based computer until the 65816-based IIgs came along. And SOS was better than ProDOS.
The main constraint appears to be the number of available GPIOs. For 8-bit projects the bus can take a large number of pins, not leaving many for other functions.
Annoyingly, the RP2040 has 30 GPIOs but the official Pico boards only break out 26 of them, which seems like an unforced error given the layout would have enough pins for all of them if it had fewer than its 8 redundant GND pins. Those spare GPIOs connect the WiFi coprocessor on the Pico W, but on the regular Pico one of them just drives the onboard LED and the other three aren't connected to anything.
The redundant GND pins are necessary for signal integrity and low EMI when running at higher speeds; high-speed signals need a return-current path with as little loop area as possible, so you want those signal pins close to a GND pin on the connector.
On the RP2040 this is solved by having a huge ground pad in the middle[1], as it has somewhat modest needs.
On modern CPUs there are tons of "redundant" ground pins sprinkled all over[2]. Just about every high-speed GPIO pin and power pin has its own ground pin right next to it.
Keep in mind that "high speed" mostly relates to having fast edges. You can have EMI issues with a "slow" 1 MHz signal if the rise/fall times are, say, a few ns, which modern microcontrollers can do.
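To put a number on that: a common rule of thumb relates rise time to the equivalent bandwidth of an edge, f ≈ 0.35 / t_rise (10–90% rise time). The 2 ns figure below is an assumed example value, but it shows why a nominally 1 MHz signal can radiate far above 1 MHz:

```python
# Rule of thumb: the spectral content of a digital edge extends up to
# roughly f ≈ 0.35 / t_rise, regardless of how slow the signal toggles.
# So a 1 MHz square wave with 2 ns edges has energy well into the VHF range,
# which is why return-path loop area (and those GND pins) matters.

def edge_bandwidth_hz(rise_time_s):
    """Approximate bandwidth of an edge from its 10-90% rise time."""
    return 0.35 / rise_time_s

f = edge_bandwidth_hz(2e-9)      # assumed 2 ns rise/fall time
print(f"{f / 1e6:.0f} MHz")      # → 175 MHz
```

In other words, the toggle rate sets how often the edges happen, but the edge speed sets how high in frequency the interference reaches.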
At least I think I can tell when I'm reading AI-generated content, and I stop reading and go somewhere else. Eventually, though, it'll get to the point where it's hard to tell, but maybe then it's also good enough to be worth reading?
When something I work on gets cancelled or not released, I try to feel better about it by thinking of the learning involved. Sometimes when I've been on projects that are going nowhere, I try to find ways to learn new things. A very minor but useful example was a project early in my career where we were told to stop work while management re-evaluated the project. I learned to touch-type that month, a useful skill for my whole career. That was my first year contracting; they gave us a month's notice on that project and I started the next contract the following Monday, so I billed 13 months that year.
I feel for architects that see their own buildings torn down in their lifetimes and replaced. I'd find that hard to deal with.
That reminds me of a story about a friend of mine from high school. In fourth form (age 14) he did every exercise in our maths textbook and sent the author a list of corrections to the printed answers. I don't think this was very well received by the author. That friend went on to skip the 7th form year, got preferential entry into a maths degree, and had his degree by the time I finished year 1.
It was big. Compared to textbooks today it was probably very good. Hardcover and probably 500 pages. This was 1984, in New Zealand, but I think the textbook was British.
I tried to help my daughter with some maths a few months ago and I couldn't believe how bad her textbook was. It didn't appear to explain anything; it just used exercises to show results.
That would be Project Builder and Interface Builder. Project Builder was kind of a graphical make tool and Interface Builder was the real star of the show, it was the graphical user interface builder. And there was Edit for source code editing.
They were sold as a separate product, NeXTstep Developer, which had all the development tools, but if you bought the education version of NeXTstep, the developer tools came bundled.
Yep, the tape-to-tape copying days were in full swing around 1984, so the manual printed on either red or green paper was the only way to keep it selling. The sprite library package and Forth's closeness to machine code were its strengths. Not easy to work with, though.