Take this CAD demo from MIT back in 1963 showing features that I commonly use today: https://youtu.be/6orsmFndx_o
Then the '80s and '90s rolled in and computers entered the mainstream. Imagination got too wild, with movies like Electric Dreams (1984).
Videos like this make me think that our predictions of AI superintelligence are probably pretty accurate. But, just like this machine, the reality may end up looking quite different.
FCOL most of us are now happy to have our AI overlords type out software on 80 column displays in plain ASCII because that is what we standardized on with Fortran.
We aren't stuck with the terminal and CLIs. We stick with them because they actually do have value.
Once you've accepted monospace text, 80 columns is a reasonable compromise length that works with human perception, visual scanning of text, etc. But many programmers nowadays don't feel beholden to it; they use any number of different IDEs, set their linters to whatever maximum line length suits their taste, make code windows however many pixels wide makes sense for their monitor (or other configuration details), and pick whatever font size is comfortable, with the corresponding implication for width in columns. (If anything, I'd guess programmers who actually get a significant amount of things done in terminal windows, like myself, are below average on AI-assisted-programming adoption.) Meanwhile, the IDE could trivially display the code in any font installed on the system, yet programmers choose monospace fonts given the option.
As for "plain ASCII", there just isn't a use for other characters in the code most of the time. English is dominant in the programming world for a variety of historical reasons, both internal and external. Really, all of the choices you're talking about flow naturally from the choice to describe computer programs in plain text. And we haven't even confined ourselves to that; it just turns out that trying to do it in other ways is less efficient for people who already understand how to program.
Monitors, keyboards, programming in textual representations, all seem quite unnatural. They were all the result of incremental technical progress, not the result of an ideal thought process. Just look at the QWERTY layout, and the limited number of people actually able to do programming.
If one reads science fiction novels from the 1970s, this is typically not the way people envisioned the 21st century.
I agree that the solutions have value, but I'm certain that we are stuck in a local optimum, and things could have been wildly different.
https://youtu.be/XX53VbgcpQ4?t=793
In the same video the salesman was selling a 75 MHz Pentium machine, so it must have run on a PC of similar specification.
People had seen the tech working in some form on TV for some time. It just wasn't mainstream.
It was the same when I sat in the hills of Griffith Park with a Ricochet modem and a TiBook, wondering how much ssh'ing and CU-SeeMe I'd be able to do before the batteries ran out.
Once these kinds of activities became integrated into a laptop, the magic of all of the past's predictions of the future definitely became atmospheric.
On occasion it was also nice to know when a tech was in the closet, in case I knew their # and could get them to flick a switch or two on my behalf, in lieu of the 1- or 2-hour bike ride (depending on traffic) I'd have had to endure to use my own fingers...
Yeah, that really just never figured into my visions of "the future" as anything significant, I have to say. And even nowadays it doesn't seem like people often make video calls, given the opportunity. They'd rather take advantage of those displays to doomscroll social media, or see amazingly crisp (and emoji-filled!) text in a messaging app.
Microdrives. The Jupiter Ace. Spindle controllers. The TMS9900 processor. Bubble memory. The Transputer. The LS-120. Mattel's Aquarius. …
And while we remember that we had flip-'phones because of communicators in 1960s Star Trek we forget that we do not have the mad user interfaces of Iron Man and that bloke in Minority Report, that the nipple-slapping communicators from later Star Trek did not catch on (quelle surprise!), that dining tables with 3-D displays are not an everyday thing, …
… and that no-one, despite it being easily achievable, has given us the commlock from Space 1999. (-:
https://80sheaven.com/jupiter-ace-computer/
Second Edition Manual: https://jupiter-ace.co.uk/downloads/JA-Manual-Second-Edition...
Unlike the contemporaneous CPUs and many later CPUs (which used buses), the Transputer had 3 main interfaces: a memory interface connecting memory to the internal memory controller, a peripheral interface and a communication interface for other CPUs.
The same is true for the modern server/workstation CPUs, which have a DRAM memory interface, PCIe for peripherals and a proprietary communication interface for the inter-socket links.
Having inherited designers from the DEC Alpha team, AMD adopted this interface organization early (initially using variants of HyperTransport both for peripherals and for inter-CPU communication), while Intel, as usual, was the last to adopt it; they were eventually forced to (in Nehalem, i.e. about a decade after AMD) because their obsolete server CPU interfaces were hurting performance too much.
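Tangentially, Go's channels descend from the same CSP model that occam used to program Transputer links, so a tiny sketch can give the flavour of that point-to-point organization. The node names, the pipeline shape and the "work" below are made up for illustration; the point is only the topology: each link joins exactly two endpoints, with no shared bus.

    package main

    import (
    	"fmt"
    	"sync"
    )

    // Each Transputer-style "node" owns its local memory and talks to its
    // neighbours only over point-to-point links (modelled here as channels)
    // rather than over a shared bus. The node names and the trivial pipeline
    // work are made up for this demo.
    func node(name string, in <-chan int, out chan<- int, wg *sync.WaitGroup) {
    	defer wg.Done()
    	for v := range in {
    		v++ // stand-in for local computation on the node
    		fmt.Printf("%s processed %d\n", name, v)
    		if out != nil {
    			out <- v
    		}
    	}
    	if out != nil {
    		close(out)
    	}
    }

    // feed produces a few values for the head of the pipeline.
    func feed() <-chan int {
    	ch := make(chan int)
    	go func() {
    		defer close(ch)
    		for i := 0; i < 3; i++ {
    			ch <- i
    		}
    	}()
    	return ch
    }

    func main() {
    	linkAB := make(chan int) // point-to-point link between node A and node B
    	linkBC := make(chan int) // point-to-point link between node B and node C

    	var wg sync.WaitGroup
    	wg.Add(3)
    	go node("A", feed(), linkAB, &wg)
    	go node("B", linkAB, linkBC, &wg)
    	go node("C", linkBC, nil, &wg)
    	wg.Wait()
    }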
Vectrex. Jaz drives. MiniDisc. 8-track. CB Radio.
The more I notice, the less I feel there is a discussion to be had over this distinction.
The sci-fi predictions all came true - and many of them have also already come and gone, which is to say that the achievement of turning speculation into reality becomes irrelevant the moment a replacement technology arrives.
Star Trek's communicators did catch on - among the content-creation segment - but on the other hand, we also got the Babel fish-like reality of EarPods...
I think each step in the never-ending march of technology seems fantastic at first, but becomes mundane and banal the moment another fantasy is realised.
His doctoral advisor was Claude Shannon, and his students include the founder of Adobe, the founder of SGI, and the creators of both Phong and Gouraud shading.
He also co-founded and ran the pioneering graphics research firm Evans & Sutherland, starting in the 1960s. They produced things like https://en.wikipedia.org/wiki/Line_Drawing_System-1
He was a key figure during the Utah School of Computing's most influential years - when Newell's famous Teapot came out, for instance.
Saying his predictions are right on is kinda like saying Jony Ive's predictions about what smartphones would look like were accurate.
When they were showing old photos of Ivan's VW Bug, the one they were taking measurements of, there was an obvious pause of grief whenever he was in one.
There are videos of these on YouTube. I sat next to Sutherland in the room, btw.
That's second only to the time I showed up early to an event for the buffet breakfast, when it was just me and Woz in the room, and I talked to him for an hour without realizing who he was.
Actually my bullshit flags went up thinking "this guy sure likes to tell fantastic exaggerated stories!"
Kinda like talking to, say, Harrison Schmitt without knowing who he is, and saying, "You? Landed on the moon? Sure, old man. Stepped foot on the moon... then you were a senator? Alright."
I'm not a big fan of his climate change denialism but yeah, he did walk on the moon.
"""Some years ago, I was lucky enough invited to a gathering of great and good people: artists and scientists, writers and discoverers of things. And I felt that at any moment they would realise that I didn’t qualify to be there, among these people who had really done things.
On my second or third night there, I was standing at the back of the hall, while a musical entertainment happened, and I started talking to a very nice, polite, elderly gentleman about several things, including our shared first name. And then he pointed to the hall of people, and said words to the effect of, “I just look at all these people, and I think, what the heck am I doing here? They’ve made amazing things. I just went where I was sent.”
And I said, “Yes. But you were the first man on the moon. I think that counts for something.”
And I felt a bit better. Because if Neil Armstrong felt like an imposter, maybe everyone did. Maybe there weren’t any grown-ups, only people who had worked hard and also got lucky and were slightly out of their depth, all of us doing the best job we could, which is all we can really hope for. """
from: https://journal.neilgaiman.com/2017/05/the-neil-story-with-a...
I felt space age.
The old Teletype in question was a Baudot machine with a 60 mA current loop, rather than the ASCII and 20 mA loop of the Model 33.
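For anyone curious about the practical difference on the code-set side, here's a rough Go sketch of why Baudot gear needs shift handling that ASCII gear doesn't: ITA2's 5 bits give only 32 code points, so LTRS/FIGS shift characters switch between two pages, while 7-bit ASCII addresses every character directly. The letter/figure table entries below are illustrative placeholders rather than a faithful ITA2 chart.

    package main

    import "fmt"

    // Shift codes for switching between the letters and figures pages,
    // as commonly given for ITA2.
    const (
    	ltrsShift = 0x1F
    	figsShift = 0x1B
    )

    // Illustrative 5-bit code tables. The specific letter/figure values below
    // are placeholders for the demo, not a faithful ITA2 chart.
    var letters = map[byte]rune{0x03: 'A', 0x19: 'B', 0x0E: 'C'}
    var figures = map[byte]rune{0x03: '-', 0x19: '?', 0x0E: ':'}

    // decodeBaudot walks a stream of 5-bit codes, tracking whether the
    // letters page or the figures page is currently selected.
    func decodeBaudot(codes []byte) string {
    	table := letters // assume the machine powers up in letters mode
    	var out []rune
    	for _, c := range codes {
    		switch c & 0x1F {
    		case ltrsShift:
    			table = letters
    		case figsShift:
    			table = figures
    		default:
    			if r, ok := table[c&0x1F]; ok {
    				out = append(out, r)
    			}
    		}
    	}
    	return string(out)
    }

    func main() {
    	// The same 5-bit code means different things depending on shift state...
    	fmt.Println(decodeBaudot([]byte{ltrsShift, 0x03, figsShift, 0x03})) // A-

    	// ...whereas 7-bit ASCII has room to address every character directly.
    	fmt.Println(string([]byte{0x41, 0x2D})) // A-
    }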
One additional example of the technology of that time. In 1968, I was a computer science student, and found myself called upon to arrange a demonstration of remote computing. The university at that time had no timeshared computing facility, so we used IBM's Call/360 service. The terminal was an IBM 1052 (big clunky printing terminal) with an acoustic coupler. To move this across campus, we arranged for a truck with 2 or 3 people to put the thing on a dolly, put it into the truck, and move it into the student union building. Later that day, the truck, and the helpers, came back and we reversed the process.
I really like my ThinkPad!
It's also interesting to note his lack of adeptness at typing (sign of the times, I suppose).
Episode 1 - “It’s Happening Now”: https://www.youtube.com/watch?v=jtMWEiCdsfc
Episode 4 - “It’s on the Computer”: https://www.youtube.com/watch?v=UkXqb1QT_tI
Episode 5 - “The New Media“: https://www.youtube.com/watch?v=GETqUVMXX3I
Episode 10 - “Things to Come”: https://www.youtube.com/watch?v=rLL7HmbcrvQ
By the mid-70s the studio had turned into this:
https://www.thewire.co.uk/audio/tracks/listen_peter-zinovief...
That BBC news report is interesting as it puts about 60 years of tech/computing progress into perspective.
Now extrapolate 60 years hence - today's mind just boggles.
So I was ready when the early processors - the 8080/85, Z80 and 8086 - arrived a few years later. In fact I was running both DR's CP/M and Tim Paterson's SCP DOS on an S-100 system around the time Microsoft bought the product and renamed it MS-DOS.