There's a scene in "That '70s Show" where Kelso and Red bond over Pong and decide to mod the game to make it harder. A few hours later with a soldering iron: smaller paddles!
The first time I saw that episode was at a friend's house. I felt so smart telling him that was impossible because you can't mod software with a soldering iron. Then his dad poked his head out from the kitchen and told me Pong didn't have software.
Turns out the only impossible part of that episode is the idea of it taking a few hours. Changing the paddle size was a mod already supported by the hardware, and the manual gave details on how to do it. It wasn't necessarily intended as a difficulty setting, though; it was intended to support different sizes of TVs. IIRC, all you need to do is solder one jumper.
I imagine this is how some sci-fi technology works. Certainly looks like a more low-tech version of some of it: Star Trek has both isolinear chips [0] in Federation technology and isolinear rods [1] in Cardassian technology that can be rearranged to change ship systems and Stargate has control crystals of various designs used similarly [2][3][4].
Mind-blowing, both ways I guess. I'd like a TV show where someone building this gets to interview a time traveller from the 2020s, and the latter knows nothing useful about hardware but can provide many unhelpful details about having GBs of storage, or things like DRM, pentalobe screws, or the Electron framework.
There was a downplayed running joke where Kelso was actually extremely skilled at engineering. In another episode, he repairs a car with Red and explains what he’s doing in high technical detail while Red just pretends to know what he’s talking about because he doesn’t want to admit that Kelso is smarter than him at something.
As a Computer Science graduate without an Electrical Engineering background, I was trying to wrap my head around how that was even possible.
I wish more CS curricula would start with digital logic and stay there for a little longer before going into the full stored-program computer. That used to be the norm.
> I wish more CS curricula would start with digital logic and stay there for a little longer before going into the full stored-program computer. That used to be the norm.
I remember sneaking[1] into a few of those courses during university. Loved them.
The problem is that most computer science students viewed them as useless. They wanted to take what they thought of as useful courses. If it wasn't C++, it wasn't useful. Even the computer science students who were interested in pure math (these were computer science courses after all, and the university offered a separate software engineering degree) didn't see them as useful.
A slight tangent: one of the courses I had the most fun wasn't even a computer science course. It was a philosophy course that spent a lot of time on computability. Cramming a bunch of genuinely interested social sciences students into the same room as a bunch of genuinely interested computer science students made for an absolutely amazing course. In contrast, the computer science department version of the course was populated by a majority of students who were only interested in the mandatory credit. It made for an astoundingly terrible course.
(These thoughts are from the perspective of a mid-1990's physics graduate.)
[1] Sneaking may not be the right term here. People in the department knew that I was doing this, to the point where I managed to get access to department computers and even managed to take a course for credit, with special permission from the professor and without the prerequisites, but the administration certainly didn't know about it since I didn't even bother with the official "auditing a course" route.
This right here is what a university is for, and why it's useful to cram a bunch of people with widely-varied specialities into a shared physical location. It's about exploration of knowledge through cross-pollination of ideas. Social science and computer science students in those classes learned things from each other that they wouldn't have otherwise.
We see over and over again through history (Renaissance Italy, Bell Labs, Romantic-period Berlin, early-Google's cafeterias) how cross-disciplinary conversation inspires innovation.
The "useful" classes are necessary, but if institutions (or students) stop there then they have missed the whole point. They're treating class-work as office-work, and making that the summit of their ambition.
Indeed, back in the 90's our software engineering degree had lots of shared lectures with electrical engineering.
Digital logic would go all the way from boolean logic to designing our own toy CPU, with implementation using discrete logic components left as optional for those that felt like going for the top score on the assignment.
Additionally we also had stuff like EEPROM programming as an optional selection toward the total credits.
Interestingly enough, stuff like Prolog and logic programming was also a required lecture for both engineering degrees.
In Sweden, rather than a "computer science" degree, there's a "computer engineering" degree that also retains a lot of the electrical engineering stuff, such as building a computer starting with a diode, building a flip-flop, a shift register, etc. Karnaugh maps and everything.
It's no joke - my Computer Engineering degree overseas was literally that. The average completion time was 6 years, I speed ran it and completed in 5 1/2. Nobody AFAIK could ever do it in less than 5.
In Portugal, before Bologna changes, a degree would take 5 years by law, but on average would be around 7.
After Bologna, no one really takes the plain three-year version: the old degree was upgraded to include an MSc, everyone with the old degree also got equivalence to the new one with an MSc, and no one wants to search for a job having only the lowest-level degree.
I always have problems, after all these years, understanding the US concept of CS being mostly theory, because in Portugal that isn't something we have as such; it is rather a specialisation of a Maths degree (Maths Applied to Computation, loosely translated), which only math nerds take, not those into computing.
Even if we translate our degree names to CS, the contents of those 3 to 5 years are very much hands on, with many lectures requiring successful delivery of project assignments before attending respective exams.
Yeah, other than students who get interested in all the hardware stuff (for whom there would likely be dedicated courses), I don't see how general EE courses help computer science study or career. If anyone really needs that later in the career, it won't be hard to pick it up.
That’s not what I meant at all, but then this involves a general conflict between a university and a vocational school (the second should aim to be useful even at the expense of comprehensiveness; the first comprehensive even at the expense of usefulness), or a CS and an SE degree.
I’m just wondering… did none of your universities offer Computer Engineering? I did it for one year before switching to CS because I just couldn’t grok how a bloody OpAmp worked back then and didn’t want to spend the next 50 years doing that lol.
My CE course was essentially an Electrical Engineering course with power distribution subjects replaced with programming subjects.
We had a course like that and I absolutely loved it. It's the most fun I've ever had in an educational setting.
It followed such a lovely flow, starting with the absolute lowest level of computing (binary maths, diodes, transistors, and building logic gates with these) and kept on combining these building blocks until we arrived at modern computers and software.
Even if you're just a "modern age" developer that only ever uses modern programming languages, just understanding how everything is built makes you make better decisions all around.
Go through the Ben Eater channel on YouTube. It'll get you up to speed with early 80s computing. Computers quickly stop feeling like magic after you grasp the basics. Watching him build a VGA card should be enough to intuit how Pong was made.
I get that there's a timeline, and that following the timeline is a good way of building a base but...
But... you could also add it to the middle or end of a course. I feel students would be more likely to appreciate the material. Would also be good to have "easy" courses that aren't fluff.
Pong doesn't have an ALU. The image and scoring emerge directly from a forest of logic gates: mostly timers and counters that scan the display, read the input controllers, and trigger logic events at various points in the scan cycle.
Memory used to be ridiculously expensive, and it was cheaper to build a board full of dedicated logic than a simple CPU with a full-screen frame buffer.
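For intuition, here's a minimal software sketch of that idea (dimensions and object sizes are made up for the example): two nested loops stand in for the horizontal and vertical counters, and a handful of comparisons stand in for the gate forest. Note there's no frame buffer anywhere — each "pixel" is decided combinationally at the moment the beam passes it.

```python
# Hypothetical screen dimensions for the sketch.
W, H = 64, 48  # "pixels" per line / lines per frame

def pixel(h, v, ball_x, ball_y, paddle_y):
    """Combinational logic: is the beam position (h, v) lit right now?"""
    ball = ball_x <= h < ball_x + 2 and ball_y <= v < ball_y + 2
    paddle = h < 2 and paddle_y <= v < paddle_y + 8
    border = v == 0 or v == H - 1
    return ball or paddle or border

# "Scan" one frame: the nested loops stand in for the H and V counters.
frame = [
    "".join("#" if pixel(h, v, 30, 20, 18) else "." for h in range(W))
    for v in range(H)
]
```

The hardware never stores this frame; it re-derives every dot from the counter values on every scan.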
I once wrote about this using the example of Computer Space (1971), the very first coin-op video game and also the first one using this technology, and how this relates to the Atari VCS / 2600 and its TIA chip.
How were pinball machines built without a programmable computer?
https://www.youtube.com/watch?v=ue-1JoJQaEg
I think the arcade industry was already comfortable dealing with complexity to make the mechanical games. The Rube Goldberg nature of the early video games probably wasn't that much of a jump in effort/engineering.
Apparently Dave Nutting's (https://www.youtube.com/watch?v=UzkGNL2AxP0) reimplementation of Taito's 1975 Western Gun / Gun Fight for Midway was the first arcade video game to use a microprocessor, though I don't know whether the original implementation was as 'hard-wired' as Pong or whether it had evolved more towards being a full computer in TTL. https://en.wikipedia.org/wiki/Gun_Fight
Basically by doing in hardware the same kind of logic decisions as we do in software.
Naturally this doesn't scale beyond basic games, due to the hardware requirements.
I have, somewhere in my parents' home, a book from the 70's, from my father, dedicated to this kind of game: the precursor of the BASIC games books from the 80's.
If you want to experiment with this today, there are companies that still sell such kits.
NTSC composite video isn't all that hard: you have voltages for VSync, HSync, VBlank, HBlank, black, and white. Generate the correct voltages at the correct times and you have a TV picture.
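A rough sketch of what "the correct voltages at the correct times" means for one scanline. The microsecond figures are approximate NTSC values; real NTSC adds color burst and other details omitted here:

```python
# Approximate NTSC levels (volts) and timings (microseconds).
SYNC, BLANK, BLACK, WHITE = 0.0, 0.3, 0.3, 1.0
LINE_US = 63.5  # total duration of one scanline, for reference

def scanline_voltage(t_us, luma):
    """Voltage at time t_us into the line; luma(t) gives 0..1 brightness
    for the visible portion."""
    if t_us < 1.5:            # front porch (blanking level)
        return BLANK
    if t_us < 1.5 + 4.7:      # horizontal sync tip
        return SYNC
    if t_us < 10.9:           # back porch (blanking level)
        return BLANK
    # Active video: scale brightness between black and white levels.
    return BLACK + (WHITE - BLACK) * luma(t_us)
```

In Pong-style hardware the `luma` part is exactly the comparator output: high when the beam is over the ball, a paddle, or the score.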
But TVs then didn't have composite video inputs, so you also needed an RF modulator.
Don't forget you need a balun to convert the RF from the coax output to the parallel-pair antenna input. Of course, the antenna input was already in use by the rabbit ears so you have to double up.
Olimex just released a 1 euro game ‘computer’ based around a tiny RISC-V microcontroller. It drives a monochrome VGA display by bit-banging a couple of GPIO lines in software. Obviously the use of a microcontroller is cheating a little, but likely makes the system cheaper than anything which could be built with discrete logic or FPGA.
It looks complicated but it's really not if you break it down into small bits and think of it like you would a piece of software, i.e. with abstractions.
It covers some things that are rather counterintuitive, especially if you come from a modern programming background.
Now is it complicated? No not really, I read the answer and immediately understood what was going on.
But no modern programmer would ever come up with the solution of addressing x and y positions by setting timers to wake at the times when the point in the scan-line or the scan-line in the frame was reached (although sleep-sort does exist).
If anything, the point of the post is the fact that it's very easy to understand, despite how counterintuitive it may be.
Ehm, now I feel really old. I just made a comment about this in some other thread. It's one of those timing things where it is hard to use a debugger. The scanlines scan at whatever frequency they are set to. You can influence the frequency and what color it should hit, but you don't have exact control of the speed and have to set a timer or interrupt to fire when the electron beam is on the location you want to paint red. There are many programmers still alive who know how this stuff works, and now I feel like a dinosaur. :-)
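A toy version of that timer arithmetic — the constants below are made-up, PAL-ish round numbers, not from any real machine. To "paint" pixel (x, y) you arm a timer for the moment the beam gets there:

```python
# Hypothetical timing constants for the sketch.
LINE_US   = 64.0   # one scanline
ACTIVE_US = 52.0   # visible part of a line
H_BLANK   = 12.0   # sync + porches before the visible part
WIDTH     = 320    # horizontal "pixels" we pretend to have

def beam_delay_us(x, y):
    """Microseconds after the start of the frame's first visible line
    at which the beam crosses pixel (x, y)."""
    return y * LINE_US + H_BLANK + x * (ACTIVE_US / WIDTH)
```

Arm a timer/interrupt for `beam_delay_us(x, y)` and toggle the output when it fires — that's the whole trick, repeated every frame.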
> But no modern programmer would ever come up with the solution of addressing x and y positions by setting timers to wake at the times when the point in the scan-line or the scan-line in the frame was reached
I would say it's only fairly recently we stopped needing to do this: when we moved to graphics-mode operating systems (oh god, I say fairly recently, but thinking now it's probably close to 30 years ago, yikes). I'm thinking Garmin app developers may still need to do it.
If you're generating a PAL/NTSC/composite signal it's pretty intuitive to think of lighting up point X as turning on the signal (and thus the electron gun) at a specific time, because that's the way the "protocol" works. There's only one "wire" and the data is purely serial and synchronized to a specific scan speed.
This is also how the video hardware on CPU based consoles and home computers worked. They had counters and used them to either index into a frame buffer or look up hardware sprites, or both. Some machines did it more or less entirely in software (e.g. the ZX-80).
I've seen a schematic of a Pong like game made with digital circuits like counters and comparators.
Each player has a Y position controller, so a Y1 and a Y2 register. Then the ball has an X and Y position. The game logic is controlled by comparators that detect events like reaching the borders or being within the paddle's range. So, for example, if the ball reaches the left border and is within the left paddle's range, then bounce it back right; otherwise the left player loses.
In terms of drawing to the screen, again compare the screen pixel position to each of the two paddle positions and the ball position. Drawing the score was a little more complex, but you have a counter for each player's score, and that determines which lines to draw. So you OR together the outputs from many comparators to determine whether the pixel should be lit or not.
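The border/paddle comparator logic described above, transliterated into software (the paddle height and positions are invented for the sketch):

```python
PADDLE_H = 8  # hypothetical paddle height in scanlines

def left_edge_event(ball_y, left_paddle_y):
    """When the ball's X comparator fires at the left border, a second
    pair of comparators checks whether ball_y lies within the paddle."""
    if left_paddle_y <= ball_y < left_paddle_y + PADDLE_H:
        return "bounce_right"
    return "left_player_loses"
```

In the real circuit both branches are just comparator outputs gated together; there's no sequential "program" evaluating them.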
Truly a precursor to games like Pong. It's just that relays turned into transistors and so were able to operate at speeds that made manipulation of a TV signal possible.
Have a look at Tiny Tapeout, which is running a demoscene competition: essentially building hardware just like this (hardware counting scan lines and pixels to make VGA).
IMO you can only be confused by this if you think computers are magic. They aren't magic! Build your own computer on a breadboard! I did this and it finally got rid of the last bit of magic in computers for me. Now I can appreciate the beauty fully.
A friend's dad has Wozniak's old VCR. We used to watch movies on that all the time as kids. Interestingly, this person had also worked on Pong, specifically on the ball device that used to move the paddle around.
Some early video games were from Commodore; they had a box that generated a TV signal. The box had knobs or perhaps a joystick. You'd assign a TV channel on the box, then change the TV channel (with a knob on the TV) to that channel, and the video game would be playing.
The original PlayStation didn’t come with an RF modulator. Neither did the Xbox or the N64.
The Super Nintendo was the last console I remember having one. The Genesis must have too.
But by '95 (PS in the US) they were no longer the default. They may have still been available, I don't know. Kind of doubt it, but maybe I just didn't notice.
I don't know about other regions, but in the US console-specific RF adapters were available separately up to the Dreamcast/PS2/GameCube/Xbox generation, i.e. until HD over HDMI became the prevailing standard (Wii notwithstanding).
For the RF switch part, Nintendo actually recycled the NES design for all of them. It's kind of funny seeing that chunky gray box next to a GameCube logo.
I think it depends on market, at least some European / PAL PlayStations did come with an RF modulator[1]. The Dreamcast also did (UK at least), but that was an outlier, other consoles of that generation had RCA composite cables and an RCA to SCART adaptor.
You (nearly) always had to buy the RGB SCART cable you actually wanted for a good picture separately.
[1] random eBay listing, with RFU Adaptor pictured / listed in contents: https://www.ebay.co.uk/itm/364907339156 (Note Europe uses Belling-Lee connectors for TV antennas, so the connector is probably different to North American style RF boxes).
Today it means: computer.
But long, long ago there were things called computers (something that computed) that weren't programmable, e.g. the computers that aimed battleship guns in WW1.
It is more like a pinball machine than a video game.
Now that I think of it, it is really a rather unexplored field. Much more should be possible.
Imagine a car printed in the center of some transparent foil; then you rotate the foil to turn. For the background you could have a giant map by projecting a tiny part of a rolled-up slide. You could put the road logic on a large drum or hurdy-gurdy punch-card rolls.
This is exactly what EM (electro-mechanical) arcade games were, which were eventually replaced by video games. There were driving games, flight simulators of sorts, bombing missions, shooters, etc. Some of the earlier video games were directly copying/recreating earlier EM games. Video games eventually won, because they were lower maintenance. Also, video games allowed for what was hardly possible with EM technology, namely player to player competition, like in Pong. (This was probably the true revolution of video games: a competitive game, where players could match on equal ground, regardless of age, sex, size, physical strength, etc.)
Searching for "EM arcade game" on a video platform like YouTube may be worth it…
On paper. I remember my cousin, who is 50 now, writing me letters, handwritten, with entire programs in them. All I had to do was type them in and compile them. They often worked out of the box and were never longer than a few pages. I don't know how he did it.
That was a miracle. I used to type in programs from magazines and books and nothing ever worked. That did however teach me how to fix things and was very productive in the long run!
You still use paper. Instead of writing the game in a general purpose programming language you write it using logic gates. You get abstraction and modularity by designing larger components (adders, flip-flops, timers, shift registers) on separate pieces of paper and then including them as named black boxes in a higher-level diagram.
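Here's a sketch of that black-box layering in software terms: a NAND "gate" at the bottom, an XOR and a full adder drawn from it, and a 4-bit ripple-carry adder that reuses the full adder as a named box. The constructions are the textbook ones, not anything from the Pong schematics:

```python
def nand(a, b):
    """The single primitive everything else is built from."""
    return 1 - (a & b)

def xor(a, b):
    # The classic 4-NAND XOR construction.
    n1 = nand(a, b)
    return nand(nand(a, n1), nand(b, n1))

def full_adder(a, b, cin):
    """One-bit full adder; reused as a named black box by wider adders."""
    s1 = xor(a, b)
    total = xor(s1, cin)
    carry = nand(nand(a, b), nand(s1, cin))  # (a&b) | (s1&cin)
    return total, carry

def ripple_add(x, y, bits=4):
    """A 4-bit ripple-carry adder made of four full-adder boxes
    (final carry out is dropped, so results wrap modulo 2**bits)."""
    carry, out = 0, 0
    for i in range(bits):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        out |= s << i
    return out
```

Each function is one "sheet of paper"; the one above it only sees the box's name and its pins.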
The good news for the Pong developers is that most of those larger components were already available off the shelf. Common families of these chips, such as the venerable 7400-series TTL and 4000-series CMOS logic families, began to appear on the market in the mid-to-late 1960s.
Edit just to add another bit of nuance. If it still seems like an extremely difficult task without much precedent, I think the lineage of these early arcade games can be traced back through their older arcade siblings: pinball machines. People had been building more and more sophisticated pinball machines over the decades since their inception in the early 1930s. For a look into pinball machines, some of their history, and an amazingly deep dive into the workings of a 1970's model, check out Alec's pinball series on Technology Connections [1].
Many old systems stored their software in a diode matrix, which could be modded with a soldering iron: https://www.cca.org/blog/20120222-Diode-Matrix.shtml
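A toy model of how a diode matrix stores bits (the contents below are invented for the example): a diode at a row/column crossing makes that output bit read as 1 when the row is selected, so soldering a diode in or out literally edits the stored word.

```python
# Each tuple is a (row, column) crossing where a diode is soldered in.
diodes = {(0, 0), (0, 2), (1, 1), (2, 0), (2, 3)}  # made-up contents

def read_word(row, width=4):
    """Select one row line; each column with a diode reads as a 1 bit."""
    return sum(1 << col for col in range(width) if (row, col) in diodes)
```

"Reprogramming" is just updating the `diodes` set — in hardware, a few minutes with a soldering iron.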
[0] https://memory-alpha.fandom.com/wiki/The_Naked_Now_(episode)...
[1] https://memory-alpha.fandom.com/wiki/Isolinear_rod?file=Isol...
[2] https://stargate.fandom.com/wiki/Control_crystal?file=Contro...
[3] https://stargate.fandom.com/wiki/Control_crystal?file=Door_c...
[4] https://stargate.fandom.com/wiki/Control_crystal?file=Contro...
My favorite page remains the one about his computer:
https://cca.org/dave/tech/machine.html
I think I had thought it was possible. But in my mind there was no way Red or Kelso could possibly know how to do it.
The fact it was in the manual helps make that more possible. Don’t think the episode showed/implied that though.
One of the first Pong ICs, the AY-3-8500, is reverse-engineered in this series of articles: https://nerdstuffbycole.blogspot.com/2018/01/reverse-enginee...
https://catalog.utdallas.edu/2024/undergraduate/programs/ecs...
https://www.nand2tetris.org/
Go to the computer spiele museum https://www.computerspielemuseum.de/
https://www.tnmoc.org
And of course Tennis For Two was also purely analog pre-dating ICs completely.
http://www1.cs.columbia.edu/~sedwards/papers/edwards2012reco...
https://www.masswerk.at/rc2017/04/02.html#basics
Simmilarly, when the IBM 1401 https://www.ibm.com/history/1401 launched in 1959 it replaced large numbers of plugboard-based tabulating systems http://www.columbia.edu/cu/computinghistory/plugboard.html in offices.
It's about an arcade game from the 70's called Sega JET ROCKET:
https://www.youtube.com/watch?v=D0qlfEuzj6U
https://www.ic0nstrux.com/products/gaming-systems/game-conso...
Anything but the simplest programs will require way too much logic to implement with discrete components.
One beat tool are the shift registers -- https://bobek.cz/traffic-light/
https://www.olimex.com/Products/Retro-Computers/RVPC/open-so...
>Now is it complicated? No not really, I read the answer and immediately understood what was going on.
Seems like you agree? Not sure why you asked if the parent poster RTFA.
Many come here just for the comment section and the discussions, not the article per se.
Much like he seemed to have missed the point of the post, you seem to have missed my point about him seeming to miss the point of the post.
http://blog.tynemouthsoftware.co.uk/2023/10/how-the-zx80-gen...
Of course there are modern programmers who still do this today, bit-banging VGA on little microcontrollers and the like.
Somewhat related: pinball machines were way more complex than I'd ever imagined.
BLIP video game by Tomy commercial 1979:
https://www.youtube.com/watch?v=lPA7SQbwDOQ
Blip - 1977 Mechanical Pong:
https://www.youtube.com/watch?v=BSvZbcwqlTw
https://eater.net/8bit
While the PlayStation 2 shipped with composite cables, even it had a coaxial adapter available for tuning to channel 2 or 3.
https://news.ycombinator.com/item?id=41740978
Lots of possibilities.
Anyway, one typo and it borked the whole deal. Lol, it is truly difficult sometimes.
[1] https://www.youtube.com/watch?v=ue-1JoJQaEg&list=PLv0jwu7G_D...
A very simplistic and non-general purpose computer, but a computer nonetheless?