Agree with the author’s thesis that the folks who “grew up with computers” have an advantage over those approaching them now, in terms of understanding the inner workings. I’m not sure that this matters much in terms of solving actual problems though, which is probably a good thing.
But I somehow find it a little bit sad that this is the case, so I’ll plug my own https://www.endbasic.dev/ because it’s very fitting in this context :) I’ve been building it precisely as a way to understand everything that’s going on by offering that 1980s experience (although it’s still far from fulfilling the full promise).
Also, buried in the article is a reference to the https://10print.org/ book. I recently came across it at HPB and it has been a pretty entertaining read. Couldn’t believe there was so much to write about such a simple little program!
> I think the people that first encountered computers when they were relatively simple and constrained have a huge advantage over the rest of us.
Even if they were simple, it was hard to program them because there was very limited information on how to do it. The few books available didn't cover a lot of things. Learning was mostly done by trying lots of things until you succeeded, and that took a lot of time and patience. The Internet, Stack Overflow, Reddit, YouTube, forums, Udemy, GitHub, and the thousands of tutorials, examples and documentation sites make things a lot easier.
I started to learn programming on an 8-bit Sinclair ZX Spectrum, and the only good things that came out of that are that it taught me to work on very constrained systems and to build up the patience and will to try and fail until I succeeded.
4 or 5 years later, when I experienced IBM PCs at school, it felt like going from horse and carriage to a rocket. Yes, the rocket might be a bit harder to maneuver, but you can do many more things, faster.
The manuals that came with the Commodore VIC-20 and 64 were excellent; they gave a good foundation for learning BASIC, and teased you into getting into lower-level machine code. And once you got to that point, the Reference Manual really got you down to very low-level detail.
https://archive.org/details/Personal_Computing_On_The_VIC-20... https://archive.org/details/VIC-20_Programmers_Reference_Gui...
Easier after you fight through hours of fruitless configuration and version differences.
This is a lovely ode to the period in which I developed my love for computers. Although like many growing up east of the Atlantic Ocean, my computer of choice was the Sinclair ZX Spectrum. Without the budget to afford to buy much in the way of cassette-based professional software, like most I resorted to manually entering programs from magazines like Sinclair User[1] and Personal Computer World[2], which got me into the habit of reading others' code. When they were inevitably incorrectly entered, I was introduced to debugging techniques for the first time. After a while, I figured that I could write a computer program of my own; and not long after that, I realized I could submit my own program to a magazine and even earn money from it[3]! I don't think I would be working at Google now if it were not for this first-hand, somewhat unforgiving education I got in the 1980s.
That said, those who missed out on this era also missed out on quite how limited the available sources of information were. Without online services to consult, the primary source of information being trial and error, and one incorrect machine code instruction leading to loss of all data entered, progress was very slow going. A committed learner could certainly make far faster progress with a more modern environment.
For today's generation, I'm grateful for books like Charles Petzold's Code [4], which constructs a computer architecture from first principles. The joy is still there waiting to be found!
[1]: https://archive.org/details/sinclair-user-magazine-033/page/...
[2]: https://archive.org/details/PersonalComputerWorld1984-01/pag...
[3]: https://archive.org/details/sinclair-user-magazine-044/page/...
[4]: https://www.microsoftpressstore.com/store/code-the-hidden-la...
Fellow Spectrum user here, I bought Your Sinclair and Sinclair User.
I started with BASIC, from the orange manual, moved onto Z80 assembly, and also made submissions to the magazines - in my case POKEs for infinite lives/energy/time, for games.
I didn't really write any new software, but I did hack a lot of games. That was almost more fun than playing them.
But you're not wrong about information being hard to acquire. I had a couple of books from the local library about assembly programming, but they were very basic (pun intended!)
I'm trying to collect more of the older books anew, but it's hard to find them and get hold of them these days.
So if there's anybody reading who has any books on Z80/Sinclair coding, or documentation please feel free to get in touch. (PDFs are nice, but physical books are best.)
If you haven’t seen them, Acorn Books in the UK have put out a lovely set of hardback reprints of some of the classic Melbourne House titles. Well worth a look: https://acornbooks.uk/retro-reprints/
Like many others before me, my occasional weekend hobby has revolved around writing a ZX Spectrum emulator as a way to better understand these primitive but fascinating machines: https://github.com/timsneath/cambridge. Nothing special, but a fun project nevertheless.
Thank-you, I hadn't seen those. I'll be taking a look properly shortly :)
I knew enough about the Commodore 64 Basic that I could leave the mall Radio Shack with it flashing wild colors or streaming a sentence. (Which was a highlight of going to the mall.) I started actually coding when we got a PC clone around 1984. First the Basic that came with it, and occasional forays into Assembly, which always felt like black magic. But soon enough Turbo Pascal was the real deal. And then I got hooked up with Turbo C, and boy was I hooked. I probably learned the most about the internals with C, because of pointers within a context I could actually easily reason about.
I had the unfashionable but actually pretty cool Amstrad CPC464. I wanted to play games but also program. Hacking games was also much fun: figuring out how to give yourself 100 lives and so on.
Many weekends were spent typing lines and lines of code to make simple games. Then we'd save stuff to tape - after we'd spent hours and hours debugging, of course.
Elite occupied me for months. Then Forest at World's End, which I mapped out on a bunch of sheets of A4 taped together.
When I got older I hacked my joystick port and connected it to a water chaos wheel I'd made out of an old bicycle rim and some other bits, then I wrote a BASIC program to visualise the movement of the wheel, monitoring direction via the hacked joystick port.
Oh man. Fun, fun times :-)
For anyone up for this exercise, I might recommend QBASIC from Microsoft instead. Built-in help manual and a "real-ish" IDE. And it was on the Windows 98 CD!
I value "infinite online resources" but having integrated books of documentation in the IDE includes such valuable writing. I miss it so much when going through the hastily-written "getting started" tutorials I end up with nowadays (the scope of problems trying to be solved is way different of course)
Yes, but how I coveted QuickBASIC as a kid - it could compile to an executable!
I learned to program with basic around 1990, using books from the late 70s. What taught me the most were books of computer games… I learned so much because I had to make slight modifications for them to work on the version of QBasic I had, and trying to figure out what to do to make them work was the best teacher I had.
It was amazing what you could accomplish in QBasic. As a child the possibilities were nearly endless and with the built in help it was so approachable.
To be fair, those memory mapped locations of the C-64 were APIs… for accessing the graphics, sound, and i/o capabilities of the system. It was a pretty nice design. One abstraction made the entire system programmable in a flexible, intuitive way and played well with the CPU’s native machine language. You just needed a good memory map. (A knack for memorizing useful memory locations didn’t hurt either)
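For anyone who never used one, here's a minimal sketch of what that looks like from C64 BASIC (addresses from the standard C64 memory map: border and background colour registers at 53280/53281, SID master volume at 54296):

  10 REM the VIC-II and SID are just memory locations: poke them, things happen
  20 POKE 53280,0 : REM $D020 - border colour (0 = black)
  30 POKE 53281,6 : REM $D021 - background colour (6 = blue)
  40 POKE 54296,15 : REM $D418 - SID master volume, full
  50 PRINT PEEK(53280) AND 15 : REM read the border colour register back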
Casey Muratori proposes going back to the memory-mapped-io-as-universal-hardware-api in "The Thirty Million Line Problem" https://www.youtube.com/watch?v=kZRE7HIO3vk
It may have been possible to use them as an API, but it was different in the sense that you were interacting with the hardware directly. There were some nasty consequences to that (it was difficult to maintain compatibility between hardware models, nevermind hardware vendors). There were also some cool consequences to that (you could bend the hardware to do a lot of undocumented stuff).
That said, even firmware calls (the closest thing to a modern software API) were a different beast back then. You could buy documented assembly dumps of the ROM for some computers. Not only was it useful for figuring out how things worked, but you could figure out alternate entry points to use that firmware in undocumented ways.
A memory mapped API isn't inherently more (or less) subject to compatibility issues and versions than any other kind of API. With the C-64 it was hard-wired, but that's not any more necessary than burning the OS onto ROM. (Which is to say: totally necessary at the time, for a machine of that capability and cost, but not at all necessary for systems that have the APIs we're comparing it to.)
In the early 1980s, graduate MBA (business school) students at Univ. of California at Berkeley took a required intro computing course, which was mostly BASIC programming. The computer lab was a room full of Lear Siegler ADM3A terminals connected to DEC VAX running Unix. The login shell was the BASIC interpreter, and Unix was hidden.
That was another way to learn BASIC like it's 1983 that I haven't seen mentioned yet.
I’ve had great fun playing with (an excellent implementation of) Basic on a Colour Maximite. Highly recommended to those who would like to run Basic on the bare metal.
https://micromite.org/product-category/maximites/colour-maxi...
The article delves only very shallowly into programming, so it stays away from this problem, but Commodore 64 BASIC is fundamentally unsuitable for programming anything larger than a few lines of code.
The most widely used version of Basic which Commodore (and other) platforms used does not have functions. This makes Basic programs tend to spaghetti and unreadable code, especially considering the constant memory constraints of those platforms. I grew up on these systems, and every time I think back on it I wish something like Forth would have taken its place – i.e. something with a clean and scalable pattern for abstraction. Basic, on the other hand, doesn't do abstractions. It barely has data types and what it calls “functions” are an even more limited form of Python's style of lambdas; every subset of code which can actually do something looks like “GOSUB 11600” when you call it. No naming, no abstractions, nothing. (No parameters or return values, only global variables.)
(This is in some ways even worse than assembler, which usually has labeled goto’s.)
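To make that concrete, here's a minimal sketch (Commodore-style BASIC, hypothetical line numbers) of what "calling a function" amounted to: arguments and results travel through globals, and the callee is just a line number:

  10 REM pass "arguments" in globals X and Y, get the "return value" back in R
  20 X = 3 : Y = 4
  30 GOSUB 1000
  40 PRINT "HYPOTENUSE:"; R
  50 END
  1000 REM hypotenuse routine: reads X and Y, leaves its result in R
  1010 R = SQR(X*X + Y*Y)
  1020 RETURN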
When I programmed in Basic those many years ago, I was stalled when my programs reached a certain level of complexity. I was then mostly halted in my education and development as a programmer for many years, because the language did not make program composition easy. It was not until I had the opportunity to learn other languages with did have proper functions and other methods of program composition that I could break through the barrier, so to speak.
(Reportedly, BBC Basic on the BBC Micro did have proper named functions, and later versions of Basic like on the Atari ST and Amiga also had them. I believe that those versions of Basic would have been vastly more productive and taught people the usefulness of abstracting things as you go, building ever higher abstractions, etc. But this is never the version of Basic which people talk about, or used by all those listings in magazines, etc. These are, for all intents and purposes, not the “80s style Basic” which everybody remembers with such apparent and baffling fondness.)
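For contrast, a rough sketch of the same thing in the BBC BASIC style mentioned above, with a named, parameterised function (syntax from memory, so treat it as illustrative):

  10 PRINT FNhyp(3,4)
  20 END
  30 DEF FNhyp(x,y) = SQR(x*x + y*y)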
(Mostly a repost of a previous post of mine: https://news.ycombinator.com/item?id=34033513)
the first kids that grew up with computers in the uk (about 1979) learned by running into a wh smith or similar store, where primitive computers (spectrum, vic) were on sale and running demos, and doing reprehensible stuff like this:
10 print "fuck you"
20 goto 10
and then running out.
some may be running major companies now.
>In 1983, though, home computers were unsophisticated enough that a diligent person could learn how a particular computer worked through and through. That person is today probably less mystified than I am by all the abstractions that modern operating systems pile on top of the hardware.
Not entirely true. While I learned as a kid many of the ins and outs of my ZX Spectrum clone, from the limited info I could gather and from tinkering, I tried to learn about more complex systems later, as much as I could.
I learned x86 assembly under MS DOS, I learned writing device drivers in C for Windows, I learned a bit of Linux system programming in University, I learned a bit of OpenGL and shaders, I learned a few bits about hardware, I learned about logical gates like NAND and simple digital circuitry. And those are basic things I've learned long time ago.
Having low-level knowledge is useful, but so is having higher-level knowledge. I think concepts like algorithms, parallel and concurrent programming, formal languages and automata theory, cryptography, statistics, machine learning and other high-level stuff I came across in University were equally useful.
I tackled many areas of programming: desktop software, device drivers, embedded software, video games, mobile apps, web front-end, web backend. Now I am building microservice-based apps with Kubernetes and Azure. I am thinking of brushing up my knowledge of ML.
I liked pretty much everything I did and I approached everything with a learning mentality.
One can't learn everything like in the '80s, but one can learn a lot of things to keep oneself entertained and accomplish great things, while having enough knowledge of how things work under the hood.
I am probably not an expert in any one field of programming, but I know enough to be useful in many areas. I'd rather be a jack of all trades than highly specialized, because more than one thing interests me, I am always curious about different things, and I like to learn. That being said, being an expert in one thing is not a bad place to be, and experts can be paid a lot.
I was a teen in the 80s. I was lucky enough to hang out at the local university and got to use their terminals that had a printer instead of a display.
School had an Apple II but wouldn't let students use it. Instead they forced them to use punch cards, so I didn't take the class. Instead I used a Commodore PET that was on display at a department store.
Eventually owned an Apple II clone and an IBM PC clone. First work computer was a Compaq Luggable with an amazing orange phosphor monochrome display.
Before the Internet was available we used BBSs and CompuServe. BBSs were horrible little fiefdoms run by basement-dwelling trolls.
Networking was still a toss-up between Ethernet and Arcnet. I liked Arcnet. You had to configure interrupts, ports, baud rates, stop bits, and parity for the IDE network cards. It was a pain.
Most business LANs used Novell Netware or Lantastic. I loved Lantastic; it was easy and even had a voice-over-network feature. Still have a t-shirt from them somewhere.
The Internet arrived before Windows was usable and Microsoft wasn't ready. So you had to use a SOCKS client.
I made a lot of money in those days simply by hanging out in the computer section of the big book store. Managers would wander in like Bambi on a highway. When they saw me reading a book on computers they inevitably asked questions. It turned into consulting work.
Fun times but also very frustrating. No real multi user, buggy products and operating systems, Linux was still very much 'assembly required'.
Now we have untyped, high-level, abstracted languages, and agile methodologies, which are possibly a step too far in the other direction.
Shout out to Rodnay Zaks and his Z80 assembly programming book. Books were so important back then to learn things, especially if you were somewhat isolated.
Those of us who owned offbeat PCs in the early 1980s probably were more motivated to learn our machines as the games and whatnot were much more limited. Wish I had never given away my Microbee...
Looks like this is the one? https://en.wikipedia.org/wiki/Programming_the_Z80
OMG - the moment I clicked the link and saw the cover I felt a wash of fond nostalgia crash over me!
Sadly my copy is long since gone...
I started off with a BBC Micro, followed by an Acorn A3000. My first 'PC' was a 486 card for the RISC PC - now there's an interesting architecture: the machine had two processor slots, but didn't require the processors to have the same architecture. You could use the 486 as a very janky floating-point accelerator for the ARM chip, as well as to run DOS and Windows.
An interesting thing is that RISC OS is still available for the Raspberry Pi and it's a direct descendant from the operating system of the BBC Micro - not emulated. It still has the same level of direct hardware access, so if you ever wanted to use peek and poke (well, those are the ! and ? operators in BBC BASIC) on some modern graphics hardware, there's a way to do it. There's a built-in ARM assembler in there too.
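For the curious, a tiny sketch of those indirection operators in BBC BASIC; here the target is just a DIMmed scratch buffer rather than a hardware register, so it's safe to try anywhere:

  10 DIM block% 16 : REM reserve a few bytes; block% holds their address
  20 ?block% = 42 : REM ? is byte indirection: poke a single byte
  30 PRINT ?block% : REM ...and peek it back
  40 block%!4 = &CAFE : REM ! is word indirection, here at offset 4
  50 PRINT ~(block%!4) : REM ~ prints the value in hex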
What I think was really different about the time was the quality of the documentation. Nothing modern has the same sense of empathy for the user or achieves the same combination of conciseness and comprehensiveness. For instance, here's the BBC Micro's Advanced User Guide: https://stardot.org.uk/mirrors/www.bbcdocs.com/filebase/esse... (it's of particular historical note, because today's ARM architecture grew out of this system). You could build the entire computer from parts using just this 500 page manual, and you'll note that it's not actually a huge amount more complicated than Ben Eater's 6502 breadboard computer.
Weird thing: RISC OS actually has backwards compatibility with some of the old APIs so some of the stuff in the advanced user guide still works today on a Raspberry Pi (plus it comes with a BBC Micro emulator which was originally written because Acorn didn't want their new machine to fail due to a lack of software). These days there's also https://bbcmic.ro of course :-)
The Programmers Reference Manual for RISC OS is similarly well written, and surprisingly quite a lot of it is still relevant: most things still work on a Raspberry PI, and even modern operating systems still work pretty much the same way on the architecture. While things like MEMC, IOC and VIDC are long dead, there's a pretty direct lineage for the modern hardware to these older chips too.
My first computer was the Acorn Atom. Wasn't as "cool" as the Vic-20 or the ZX-81, but the Atomic Basic language had one killer feature: the ability to embed assembly directly inside a BASIC program!
Learnt so much about the 6502 that way!
Discussed at the time (of the article):
Learning BASIC Like It's 1983 - https://news.ycombinator.com/item?id=17900494 - Sept 2018 (160 comments)
It is still possible to get an 8088 (or similar) developer board and code like it's 1983. What's missing is how we get from the "raw" experience to the "everything is abstraction/virtualisation" world we have now.
For me it all started with the ZX Spectrum. I liked the fact that all of the commands were printed on the keyboard keys. It made learning BASIC syntax much easier.
If you want to learn how computers work, it's easy now. Learn C and a bit of assembly, read about hardware and try to do low-level stuff. Play with memory, play with system calls.
Learn a bit about GPUs, some OpenGL, DirectX, shading language or Vulkan and tinker with the GPU.
idk about that
I mean, all of that is grand but you remain far, far from the metal in all these instances, and even in assembly you cannot go under several layers of abstraction with modern architectures
in the days of 6502 and Z80 compatible microprocessors, you could go down to the NAND gate level, to the transistor level - it was doable, not that it made much sense to go there for most people, but it was nearly a requirement to know how many cycles each instruction took and lock conditions like bus access when dealing with different parts of the memory, etc
there were essentially no hard black boxes then, and now there are many at the hardware level and even the OS level is too impenetrable; not even the CTOs of major OS companies and initiatives have a complete understanding of the OS they produce, perhaps a few select ones have a functional understanding with only 2 or 3 black boxes involved (specific driver magic, hardware detail that goes under C and assembly optimisation etc - which typically outsourced specialists deal with)
GPUs alone these days are more complex than the entirety of the systems back then, and the implications of this are two-fold:
- the culture on the producer end is that software, machines and accessories are sold as black boxes, both by expectations and by enforcement on the manufacturer's end that gets no incentive whatsoever to expose internals of their product, and many reasons to the contrary as the competition might use it; these dynamics are extending to repair-ability as well
- the culture on the consumer end is that internals are not possible to deal with, are best taken for granted, and the curiosity is just not there anymore; also, if things break you just get new things, which also disincentivizes producing anything meant to last, as the assumption is that it will be obsolete soon anyway, or the system will just stop working, reinforcing that dynamic and cheapening the value of work
In essence, the hardware hasn't changed a lot. There was less focus on efficiency, much more on price - computers were expensive. In the (early) 80s, memory came at a premium and most micros had no permanent storage (except audio cassettes), so CPU, peripherals and OS abstraction weren't thought of as the bottleneck.
Given only kilobytes of memory, the problem basically came down to: how much could you get your computer to do, useful or fun, within those RAM limits? I think that perspective allowed room for (and created the necessity to) learn about and understand all levels of everything inside the piece of hardware you bought. Abstraction was still relevant, but detailed documentation on the actual guts was much more necessary then than it is today.
To exaggerate: why would you need to understand how to bitshift/add/compare to perform multiplication or division if you never need to think beyond python 3/typescript/java syntax?!
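To make that concrete, a toy sketch in period-style BASIC of the shift-and-add multiplication an 8-bit CPU without a hardware multiply effectively performs (the CPU uses real shifts; BASIC has to fake them with *2 and INT(/2)):

  10 REM multiply A*B by shifting and adding, the way a 6502/Z80 routine would
  20 A = 13 : B = 11 : P = 0
  30 FOR I = 1 TO 8
  40 IF (B AND 1) = 1 THEN P = P + A
  50 A = A * 2 : B = INT(B / 2) : REM "shift" A left and B right
  60 NEXT I
  70 PRINT P : REM prints 143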
there have been many developments that have been strongly anti-user
partly because of the limitations of those days, there was a strong incentive to expose every detail of products to consumers; nothing was abstracted away from professionals or developers at any level
the abstractions of the day were solidly grounded by the nature of the underlying hardware, whereas nowadays we've used the freedom and leeway of more powerful hardware to make intermediate abstractions that are more transient and throw-away
for instance, everybody doing amateur-level computing in the mid 80s had a strong understanding of the concept of interpretation vs compiling, understood binary coding, base conversion, some decent amount of information theory and boolean logic, addressing modes, fundamental data structures... those were all basic requirements and a lot of that is knowledge that won't fully go obsolete per se
whereas now, nothing of that being a requirement, people are often just given some explicit indications to do something and it just "automagically" happens, and this comes pretty much by definition with an increasingly "smart" interface and "dumb" user who is usually defeated if he or she tries to think outside the box (and thus doesn't)
Nah, the proper way is SICP with Chicken Scheme. Much better to learn CS theory without having to deal with C and memory. If you want to play with C, UNIX and virtual memory, get The C Programming Language (2nd ed.) and Advanced Programming in the UNIX Environment.
Now, on Scheme:
  sudo chicken-install srfi-203
  sudo chicken-install srfi-216
  cat ~/.csirc
  (import scheme)
  (import (srfi 203))
  (import (srfi 216))
It has a game engine written in Scheme called Hypergiant, bound to OpenGL and epoxy, which is really fun to tinker with: sudo chicken-install hypergiant
To compile stuff: csc -L -lGL example.scm
On assembler, get some NES emulator and run 6502 code on it, with Pasmo and some helper libraries to initialize the most common routines. 6502 is much easier than the Intel clusterfuck.
Either that or RISC-V.
> In the 8-bit computer era, many games were available only as printed BASIC listings in computer magazines and books.
Actually the best games and software were programmed in assembler and you had to load them from cassette tapes.
You couldn't do much in Basic.
Often they were programmed in machine code. The BASIC listing was just a tiny loader, with the actual program in hexadecimal strings. The loader decoded the strings, poked them into memory and then called the start address.
You could LOAD a massive array of numbers and POKE them into memory ...
Such things were printed in magazines as BASIC listings and they were, in fact, "programmed in assembler" and typed in by hand.
Nothing like typing in a 2 page program full of DATA statements and number values only to find out you made a mistake SOMEWHERE and they didn’t have any checksum routines.
I don’t know how many times I typed in that damn MAD magazine program and never got it to work. It took 30+ years to find out it had an error and would never have run anyway; seeing someone else fix the bug and watching the program execute after all those years was finally closure.
Compute! Magazine for the Commodore 64 had a special input program that did checksums on their input listings, allowing you to type in long lists of numbers accurately and eventually end up with the excellent SpeedScript word processor, among other programs.
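For those who never typed one in, here's a minimal sketch of that loader-plus-checksum pattern in C64 BASIC. The DATA bytes are a tiny hand-assembled routine (set border colour, set background colour, return), and the checksum constant matches these particular bytes:

  10 REM read machine code from DATA, POKE it into memory, verify, then SYS
  20 AD = 49152 : CK = 0 : REM $C000, a free block on the C64
  30 FOR I = 0 TO 10
  40 READ B : POKE AD + I, B : CK = CK + B
  50 NEXT I
  60 IF CK <> 1198 THEN PRINT "CHECKSUM ERROR - CHECK YOUR TYPING" : END
  70 SYS AD : REM jump to the machine code
  80 REM bytes below = LDA #$01 / STA $D020 / LDA #$00 / STA $D021 / RTS
  90 DATA 169,1,141,32,208
  100 DATA 169,0,141,33,208,96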
I understand the point the author is trying to make: computers were simpler in 1983 than today, so you could actually understand how they worked, if you studied the technical specs. There was not some big operating system in the way. Operating systems were literally referred to as "disk operating systems," because that's pretty much what they did: mediate loading and saving data to disks.
And that's all a good story, except that it mostly didn't happen.
Not everybody was typing in BASIC listings, then spending hours debugging where the typo was. There was more than enough premade software, including games, out there for the major computers on the market (C64, Apple II, TI-99) that you could have a lot of fun without ever seeing a BASIC prompt.
And, while the manuals were much lower level then than they would be just a few short years later, and you could certainly learn a lot about how to control the machine by PEEKing and POKEing memory locations, the fact that if you POKE some address and the screen changes color doesn't tell you anything about how it all happened. It's just as mysterious as how moving the mouse on a modern computer moves a pointer on the screen.
BASIC is too high level to teach you anything meaningful about what's going on under the hood. But, fortunately, the machine languages of the home computers of this era were generally pretty well documented. However, even then, while you'd know there's this thing called a CPU and that it has things called registers and that you can give it instructions to read and write memory locations, those things are still pretty big abstractions if you want to claim to understand "how it all works."
> the fact that if you POKE some address and the screen changes color doesn't tell you anything about how it all happened.
Simply typing in a PEEK or a POKE may not have told you much about how it happened, but peel back that one layer and you are learning about how memory is organized, hardware registers, and other low level fun. Peel that back another layer and you are starting to learn about EE.
Contrast that to today. Chances are that you are going to have to peel back several layers before you even bump into the OS kernel. Depending upon the language and the libraries, few of those layers will say anything meaningful about how the computer works. Things progress somewhat more rapidly after that, in the sense that each layer of abstraction will be more meaningful, but you will still have to peel back multiple layers before you touch hardware.
I don't think the article was saying that you automatically learned more about computers through the simplicity of older systems. I think they were saying that it was much more accessible to those who decided to do so.
FWIW, I literally used to type in programs from magazines (starting in 1984), and spent many hours poring over a bootleg commented disassembly of Microsoft's BASIC implementation for the RadioShack/Tandy Color Computer II.
hmm.. the Philips MSX manuals were pretty good. Explained basic, guided you through creating a small game.
Couldn't find it online, but found the Sony one at https://hansotten.file-hunter.com/uploads/files/sonymsx2basi... - fun to see the difference
Going back to your experience with programming, it's inspiring to hear how you progressed from BASIC to Assembly, Turbo Pascal, and eventually Turbo C. It's great that you found C to be a language that allowed you to reason about pointers and understand the internals of how programs work.
No, the author didn't miss out by not having been born in the late 70s or 80s and not experiencing this form of development.
8-bit micro BASIC development was based on the idea that it's the programmer's job to produce a complete program, and to understand all that it does from the time you type RUN until the program ends or is interrupted (by pressing Ctrl-C or RUN STOP, resetting the computer, etc.).
Today, most software developers develop program fragments that are plugged into a framework. The framework takes care of most of the details and only calls into your code for specialized logic. If you grew up programming BASIC (or Forth or Logo or Turbo Pascal), it can be confusing and frustrating to work this way because your intuitive sense of the program flow is completely disrupted.[0] I've found that younger programmers have fewer issues writing framework code. When their brains were still pliable, they learned that this is what programming is, so they adjusted to it. Even game programming, long the purview of hardcore bit diddlers, is high-level and framework-oriented thanks to engines like Unreal and Unity. Older programmers like me, sometimes their instincts and intuitions got in the way. The ones who thrived are the ones who adapted, who stopped worrying and learned to love the framework.
The entire discipline of programming is going to be disrupted again -- by AI. So today's programmers are going to be confused and frustrated when their jobs switch from writing code to prompt-engineering a model into writing code. But Gen Z will be right at home with it.
[0] I've found that working in Spring is for me an aggravating process because it involves guessing how to make ill-specified framework magic do what I want instead of, you know, writing a program that does what I want.
> (Your sister protested this decision, but it was 1983 and computers weren’t for her.)
oh dearest, save us from false revisionist shit political takes on gender/race/politics in every modern journalistic piece. There was no stigma or bias for women going into CS/programming in 1983.
To be fair to the author, they are using that sentence to link to the (admittedly fairly lightweight, but somewhat interesting) NPR article.
https://www.npr.org/sections/money/2014/10/21/357629765/when...
Mine was the Atari 800XL. Atari BASIC didn't have PEEK and POKE like the C64 (though it had something similar which I can't remember right now), but it had some more possibilities to switch graphics modes and plot things. But I never learned how the machine worked from that; it was still magic.
I tried typing in listings, but you only knew that there was a typo somewhere after you finished hundreds of lines. Finding the typo was out of the question for me. Also, it was obvious that the result of the listing, usually a game, was of much lower quality than the games I already had or could copy from friends.
The reason I learned Basic was that I wanted to know how games work. They always fascinated me since I saw Space Invaders somewhere. I quickly understood, mostly with the help of more advanced friends, that you couldn't make games with Basic, it was too slow. You had to learn machine language.
So that's what I eventually did, and that's how I really understood the machine, down to the point where I could tell what almost every single one of the roughly 40000 available bytes did. It took a long time to get there; those 8-bit machines were already quite layered in hindsight when you think about it: how 6502 instructions, assembler, disk I/O, joystick input and graphical output were tucked together with what would today probably be called the Atari API was not immediately obvious, and was the result of 20 years of technical development, but nowhere explained for a 12 year old!
My enlightenment moment was this dialog with a friend. Me: "Why does this assembler program crash? Why do I have to reset? Why can't the computer handle it?" Friend: "Because deep down, executing your program is also a program. If your program crashes, that program crashes." I think that was the most profound lesson ever to me. It's programs all the way down!
So, yes, I know that CPUs have their own instructions and that every programming language ultimately compiles to that. But that knowledge helped me little with what I consider the next large learning steps over the decades: learning C on x86, learning how Unix/Linux works, learning what the internet is fundamentally built upon, learning Javascript+HTML5, learning how fundamentally different asynchronous programming is when you can't assume that I/O responds immediately, or at all.
My favorite language today is vanilla javascript. I love the simplicity, no compiler insisting on type safety, a great UI, almost platform independent, lots of cool APIs. I think JS is as remote from Assembler as you can get.
Bottom line, I think it really doesn't matter to know about machine instructions, same as it didn't matter at the time how CPUs worked at the hardware level. That still mystifies me: the 6502 equivalent of an if-clause was branch-not-equal (BNE), but how did that work in reality? What's happening on the silicon then? How can a lifeless thing make a decision? Never really understood what's beneath the turtles.
Atari BASIC does have POKE and PEEK but unlike C64 BASIC, it also has some graphic and sound commands to change the screen mode and draw stuff and play some notes. Atari BASIC also checks for syntax errors when you enter a line of code. C64 BASIC doesn't check until you run it.
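A small sketch of what that looks like in Atari BASIC (the exact mode and pitch values here are from memory, so treat them as approximate):

  10 GRAPHICS 7 : REM switch to a four-colour bitmap mode
  20 COLOR 1
  30 PLOT 10,10 : DRAWTO 80,40 : REM draw a line, no POKEing required
  40 SOUND 0,121,10,8 : REM voice 0, pitch roughly middle C, pure tone, volume 8
  50 FOR I = 1 TO 300 : NEXT I : REM crude delay
  60 SOUND 0,0,0,0 : REM silence the voice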
> I’m not sure that this matters much in terms of solving actual problems though ...
That is certainly correct. But The Industry hasn't successfully abstracted things such that pure problem solving is all there is to software creation. Current abstractions create a reliance on (probably inefficient, likely costly) infrastructure. One needs to understand how so much of a system works so that changes can be made to improve performance and/or reduce costs.
I took a look at EndBASIC and I find it rather awesome. But I also think it's limited to folks who have experienced a perspective similar to my own. Looking at this from my 7yo's POV, she would have no idea where to start. I could guide her, but it would be nothing like how I learned, alone, on an Apple //c. I'm certainly biased in thinking "how I learned was a pretty good process," and I wouldn't want to assume anyone else learns like that, but exposure to a simple (but complete) system and the ability to program it seems like a phenomenal way to learn.
----
I was thinking recently that it would be beneficial to teach the last century of technological advancement at a slow-ish pace. Kids spend a few months getting to know/understand how early telephone systems routed calls by using the same equipment from the day. Advance up through rotary dial, touch tone dialing, cordless phones, mobile phones ... over the course of a couple years. Parallel to that, the same kind of thing with computers and programming - switches, punch cards, tape, disk; mainframe, mini-, micro-computers; ...
It'd be pretty expensive though... :(
Why start at rotary phones? Why not start them out on Morse code? Semaphore? Naval flags moved data faster than horses for centuries. Should kids learn how to cut pens out of quills?
We should absolutely include the telegraph, wired and wireless. Which leads into radio…
I just picked an arbitrary starting point to illustrate the hands-on approach I would like to see.
It’s been a few years but the book Code by Charles Petzold covers a number of those topics which I found interesting to learn about (but not study).
https://www.codehiddenlanguage.com/
We made and used quills in primary school, in a sort of arts and crafts/history crossover, I suppose. Didn't hold us back any, hopefully.
While I agree with the thesis, I also believe it is possible to over emphasize it. A couple of examples from someone who is a product of that generation:
Given my age at the time, programming was accessible but modifying hardware was hard. Part of the reason is that few sensible adults would hand a soldering iron to a six year old. Part of the reason is that software is easier to learn than electronics. Part of the reason is the hardware was the product back then, something you did stuff with rather than something you did stuff to (contrast the 1977 Apple ][ to the 1982 Commodore 64, and you'll get an idea of how quickly the mindset changed). It wasn't until my early 20s that I learned that people were not only modifying the hardware, but making their own peripheral cards. Someone ten years older than me likely had a different perspective simply because electronics was more relevant ... since software hadn't started eating the world.
Even those people were likely in the same boat as I if they took a moment to think about the generation prior to them. I remember reading something by one of the grandfathers of computing. They blew my mind when they started talking about hard drives, not in terms of the logical layout of the disk nor in terms of the electronics but in terms of the physics. There was a time when people had to think about computers in those terms because they were in the process of developing those lowest layers of abstraction.
I guess what I'm saying is that each generation will have a different perspective on what computers are and how computers work. They will also have different perspectives on what "low level" means. While it is sad to see a lot of the old knowledge and old ways fade into obscurity, we shouldn't pity the younger generation for not having what we had. First of all, the old knowledge hasn't really disappeared for those who choose to pursue it. Second, they are going to be building a new layer on the technological stack anyhow. What they need to understand is the layer directly below them, not what's twenty layers down.
We still have that 80s experience. Lest you forget, the TI-83/84 is the only College Board approved graphing calculator for the SAT and is thus the public school standard through to today. Well into the 2000s, and probably still now, bored teens have had TI-BASIC with them.
In the US.
That was never a thing in many European countries.
For example I was always a Casio user.
FX-4500P, FX-880P and CFX-9850
My high school in Norway used TI-84+ in the math classes and exams.
Me and a friend in my math class spent a lot of time writing programs in TI-BASIC.
One of my “proudest” creations was an implementation of Snake. My version of Snake has a particularity to it though. The features I was using to keep track of segments of the snake were slow to access, and the slowness increased with more data.
So whereas the real snake goes faster and faster over time to make it harder and harder, my snake went slower and slower. And when the snake in my game reached a certain length, the game crashed :p
But that’s ok, I had a lot of fun anyways. My version of snake did not need to be perfect. I liked to play it anyways.
I also transferred my Snake game to the calculators of some other people in my classes so that they could play it too. I don’t remember if that was a case of other people asking for a copy of the Snake game I’d made, or if it was more like me convincing others to allow me to put a copy of the game on their calculator. I like to think it was the former, but it is just as likely that it was the latter :p
For me it was making an implementation of tic tac toe that never lost, just so that I could put 0 players as an interface option
https://www.youtube.com/watch?v=s93KC4AGKnY
Endbasic: marvelous execution, bravo!
let's not overly romanticise it though
the vast majority of people with access to computers in the 80s didn't harness the opportunity to be able to understand a lot of what was happening
that was always going to be the case, and similarly people nowadays more often than not don't harness the vast opportunities that they have at their fingertips
It's true that most kids used their computers just to play games (plus ça change ..). However for me as one of those kids who did love programming my ZX Spectrum as well as the odd game, there were a couple of fun crossovers:
- Working out the 'infinite lives' POKE for a game; this typically involved loading the game's machine code, searching for the 'dec a' and 'dec (hl)' instructions and poking a 'nop' into each location in turn and trying the game (a rough sketch follows after this list). This crude approach was surprisingly effective
- When the microdrive (a fast tape device) came out, getting your games stored in that. This was a challenge as the 'drive required extra ram but there was none spare if the game was 48k. I worked out an elegant hack that involved multiple loads of the game into different areas of memory, including one that wrapped around the top of memory, 'wrote' most of the remainder into ROM and then the last few K into the screen buffer, which is how you got the extra space needed. Then a short routine to reorder everything tacked on :)
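Here's the rough sketch mentioned above, in Sinclair BASIC. The opcode values are the standard Z80 encodings (DEC A = 61, DEC (HL) = 53, NOP = 0); the address range is just a guess at where game code tends to sit in a 48K machine:

  10 REM list addresses holding a DEC A or DEC (HL) opcode
  20 FOR a=24000 TO 60000
  30 IF PEEK (a)=61 OR PEEK (a)=53 THEN PRINT a
  40 NEXT a
  50 REM then POKE one candidate address at a time with 0 (NOP) and try the game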
I don’t think that growing up with a c64 made kids into great programmers, but it did ingrain the notion that a computer is a programmable machine at its core. That complex things are just collections of simpler things.
It’s very hard to have the same realization growing up with current technology. It’s just too easy to see the computer (/phone/tablet) as a complete appliance whose operation you never really alter. The same way you don’t mess with your washing machine outside of its designed functions.
Yeah, sort of the same today.
The C64 was incredible but I thought skipping Basic and going straight to 6510 assembler was the way to go. Got all those extra sprites by interrupting the scan lines, the huge performance advantage, etc
More than once I wiped my source code while zeroing memory.
I enjoyed Byte magazine but Compute! was my favorite:
https://www.commodore.ca/commodore-gallery/compute-magazines...
In my opinion, it’s even more exciting learning machine learning today than 6510 assembler on the C64.
I’d skip the early 1980s nostalgia and begin the next quest. Kaggle is a good place to start:
https://www.kaggle.com/mmellinger66
And this book:
https://www.oreilly.com/library/view/hands-on-machine-learni...
I always wonder what could have been if the Commodore 64 and similar machines had come with Forth instead of Basic.
I bought one of these second hand in the early nineties, after being introduced to the RPN way of doing things with a Hewlett Packard 28S calculator. Had fun with this machine, although it was similar in many respects to the ZX81 (and supposedly the Spectrum) I grew up with.
The soft- and hardware had similar issues. For meaningful programs, in both cases you would quickly need to descend to machine code, which on both systems was a very enjoyable experience. (E.g. the BBC Micro had a far more mature software stack.)
But you are probably right in implying that Forth was probably too alien to ever gain mass adoption, even if the soft- and hardware had been superior to the more common BASIC alternative.
the C64 didn't ship with it, but there were Forth systems available, notably "64 Forth" which was a superset of Fig-Forth
I'd add that right around 1983 is sort of a special date. It was just around that time that it was becoming reasonable for a middle-class person to buy a home computer. Go back a few years from that time and you could potentially program in Basic or Fortran but you were doing so over a terminal or with punch cards and didn't really have any direct interaction with the computer except in relatively specialized situations.
1983 was when I got my first computer, a Timex/Sinclair 1000, followed up about a year later by a Commodore 64. Unlike many who had their first computing experiences then, I was already an adult, married and almost 30 years old. My experience and enthusiasm for small computers led directly to the IT career I started in 1989.
Agreed. I'd guess that for every hobbyist programmer, there were probably at least 10 people who just wanted to play Ultima or something.
I was literally the person that just wanted to play Ultima (Underworld). But when things didn't work because the sound card didn't have the right jumper settings, or there wasn't enough free ram, or I just didn't understand how to start the game up in the most optimized way, I had to figure that stuff out, without the internet. Ultima jump started my IT career.
Most definitely did not, but for those that took some interest, it seems like it was much easier than it is these days.