ZeroByte

Getting the younger generations more involved


I'm sure most of us are already doing what we can to share our passion for this stuff with our own kids / grandkids, but I've been thinking lately that there will never be a better time than now to "infect" as many people with the retrocomputing bug as possible. Right now, retro computing is at an apex: pixel art is popular, and indie games tend to use that style. It's never been easier to resurrect old boxes with things like SD replacements for IDE/SCSI/PCI/ISA/etc. This hobby is to Gen X what tinkering around with muscle cars was to the boomers. I have been thinking about what sort of things would be required to start a successful retrocomputing club here in my hometown, meeting at one of the libraries. What kinds of activities would be the most accessible and interesting to younger people, especially teenagers?

I mean, having a "seminar" / lecture / presentation on how to do BASIC programming or assembly or whatever would be useful, but such topics aren't exactly great starter material for getting new blood interested in this stuff. Should it be integrated with Raspberry Pi / Arduino type stuff? Clearly, it would suck to have to lug a bunch of this kit down to the library every fourth Saturday, yet I think it would be really important to have real hardware for people to touch and interact with. It's a lot cooler to see than just some blue window on a Lenovo laptop.

I've got a nephew who has a mind capable of gobbling up content like Ben Eater's, but he's just interested in making stuff in Minecraft or watching TikTok videos. If he lived in the same city as I do, I'd bring him over to help build the 6502-on-a-breadboard project.

I know this post has been kinda rambly, but it's been on my mind lately, and I really am itching to share this stuff with people who would love it if only they knew it was a thing, and were able to get away from the brain candy for long enough to find out how fun and fulfilling it is to actually make something instead of just watching others do it on YouTube, or worse, just watching people floss dance with Snapchat filters over their faces on TikTok.


I was thinking about similar things myself, but I could not think of any plan that would work. Youngsters don't have the nostalgia that we have. They grew up with totally different technology around them (cellphones, tablets, etc.). Getting them interested might be a difficult task, maybe close to impossible.

I thought they could be drawn in through historical interest. So maybe start with some sort of interactive museum where you can learn computer history and try using some machines yourself as a bonus. It wouldn't have to be a real museum; it could be virtual, or partially virtual. Maybe somebody would like that very much and get dragged in.


That's a tough question. I have a nephew who is into Minecraft, Roblox and Fortnite. He makes YouTube videos about these games. When you come down to it, it's pretty cool and creative. It is just completely alien to us Gen Xers. He's also starting to code in Python. Why would he bother programming on old machines? Or "old new" ones like the X16?

The thing an "old new" machine has over modern machines is coding to the metal. But it's a tough sell. It's not like, say, old music or movies, which you can appreciate immediately. Coding to the metal is very rewarding but requires some serious commitment.

I've seen young kids/teenagers appreciate old music and movies (by old I mean the 70s, 80s, and 90s). There is an immediate appeal. Not so much with old machines. Like Cyber, I don't see how you could interest them in old machines. That's too bad, because I really think the closeness to the hardware is a very valuable learning experience.

 

 

 

Posted (edited)

But it's not a modern pop culture experience; even in the 80s, kids played on the C64 because it had current graphics and sound and a vast current library, not because it was otherwise special.

People are the same today.  They play with computers that have current graphics and sound and a vast current library.

The niche of engineers is smaller but extant. Some of them appreciate the 6502, but they have to like digital logic and assembly language to begin with...

 

Edited by rje

I think Arduino would make a good gateway drug. A friend used a pair of WiFi-capable units to make her own wireless NES controller adapters - very few lines of code were necessary, actually.
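For anyone curious what "very few lines" can look like, here is a rough sketch of the wired half of that idea: polling a stock NES pad (it's just a 4021 shift register inside) from an Arduino-compatible board. The pin numbers are arbitrary and the WiFi hop back to the console is left out entirely; this illustrates the controller protocol, it is not my friend's actual code.

// Poll an NES controller from an Arduino-compatible board.
// Pin choices are placeholders; a real adapter would send the byte
// over WiFi to a receiver wired to the console instead of Serial.
const int PIN_LATCH = 2;  // to the pad's LATCH line
const int PIN_CLOCK = 3;  // to the pad's CLOCK line
const int PIN_DATA  = 4;  // from the pad's DATA line

void setup() {
  pinMode(PIN_LATCH, OUTPUT);
  pinMode(PIN_CLOCK, OUTPUT);
  pinMode(PIN_DATA, INPUT_PULLUP);
  Serial.begin(115200);
}

uint8_t readController() {
  uint8_t state = 0;
  // Pulse LATCH to snapshot all eight buttons into the 4021 shift register.
  digitalWrite(PIN_LATCH, HIGH);
  delayMicroseconds(12);
  digitalWrite(PIN_LATCH, LOW);
  // Clock out A, B, Select, Start, Up, Down, Left, Right (active low).
  for (int i = 0; i < 8; i++) {
    if (digitalRead(PIN_DATA) == LOW) {
      state |= (1 << i);
    }
    digitalWrite(PIN_CLOCK, HIGH);
    delayMicroseconds(6);
    digitalWrite(PIN_CLOCK, LOW);
    delayMicroseconds(6);
  }
  return state;
}

void loop() {
  Serial.println(readController(), BIN);
  delay(16);  // roughly once per frame
}

Not much more than that, plus whatever the WiFi library of choice needs - which is kind of the point: it's a satisfying afternoon project, not a semester course.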


Not sure if the rambling below is very helpful, but well...

I think the others have already pointed out some possible pitfalls in the idea. There won't be the same sense of nostalgia (although, as I have mentioned in other posts, the same goes for me and that is not necessarily a barrier to getting into retrocomputing). I think you could draw in people with the idea that they can "learn to program their own games", but I expect they'd rather learn to use the more recently developed tools for that. I expect that the young of today would want to learn how to code something like the next PUBG, rather than something like Boulder Dash. Yet maybe you could make use of (1) the fact that things like pixel art indeed seem to have grown in popularity and (2) the fact that you can actually feasibly create a retro-style game by yourself (even though that point is of course flawed, because you can equally well develop simple games with more recent tools).

Another thought: Maybe the focus should not be on the 'retro-ness' of computers but on their relative simplicity. Because of their relative simplicity and their limitations, it is actually more feasible than with more recent machines to get to a point where you get close to "understanding everything about your machine". Coding "to the metal" is part of that, and with simpler, more limited computers it is much more of a necessity (and therefore more inviting) than with today's highly overpowered machines and compilers that probably do a much better job than an enthusiast could ever do. I don't have hands-on experience with Raspberry Pi or Arduino, but I imagine that on that front they have an appeal that is similar to that of retro-computers, while having a more 'modern edge'? Affordability is also a nice benefit of something like Raspberry Pi. 

 

Posted (edited)

Well, as someone who is probably younger than a lot of people here (I was a kid of the 90s), I can say it is possible to get into this sort of stuff as a kid way after its prime. My circumstances were different from those of most 90s kids, though, who were playing Doom or Commander Keen on their 90s computers. My parents were always very late/hesitant adopters of technology when I was a kid (not so much now), and so we never even had a computer at my house until probably the late 90s. However, my uncle was a Commodore buff, and my earliest memories of computers are of playing games on his C64. I was also introduced to BASIC through him, and I found it fascinating. And then, for my 10th birthday (which would have been in '97), he gave me one of his C64s (which he took back and replaced with a C128 a year later). I spent many hours playing games and typing in programs from books on the C64. My parents got a Windows 98 computer at some point, but I never really had an interest in using it much outside of school work. Then in 2002 my family finally got the internet, and from there my interest finally went over to "modern" computers. I never really used the Commodore much after that, and now it's in a box in the basement.

I want to say, then, that it's possible to get a young kid interested in programming with BASIC like I was, but things are different now.  I'm honestly not sure I would have taken as much of an interest in the C64 back then if my family had had the internet from the start 😅.  I'd like to think I would though, since I remember thinking BASIC was really cool, and I just found the act of typing a thing into a computer and then seeing it do the thing you wanted amazing.

Edited by Ender

I think the most lasting feature of retro computers is the fact that they're simple enough that you can have a good understanding of everything in there. There are only a small number of chips and subsystems to learn about when it comes to programming, and if you're more into the hardware side of things, the bus architecture is fairly straightforward and accessible too. What's cool is that even though there's only a small number of things to grasp (comparatively), any one of them can become this rabbit hole of ever-deepening knowledge and skill. Speaking of the C64, there are people who keep finding ways to push the SID - it hasn't been terribly long since somebody worked out the technique (I forget what the sceners call it) where you can use the triangle waveform to output PCM in much higher quality than the old 4-bit master volume technique. Then there are others who go down the VIC-II rabbit hole and create techniques like FSP(?) for fast horizontal scrolling of bitmaps, etc.

But I wouldn't say you have to operate at those levels to understand the computer very well.

12 hours ago, ZeroByte said:

... Speaking of the C64, there are people who keep finding ways to push the SID - it hasn't been terribly long since somebody worked out the technique (I forget what the sceners call it) where you can use the triangle waveform to output PCM in much higher quality than the old 4-bit master volume technique. ...

I think it's the pulse waveform and PWM audio.  https://gist.github.com/munshkr/30f35e39905e63876ff7


I tried to get my 4 kids into technology when they were young, trying to pass on my experience and knowledge, and it sort of worked, with my two boys. :P

The two girls didn't get into it much. My youngest (19) is a "gamer", but that's where her interest stops. She just wants it all to work, and that's all left up to me.

The two boys (30 & 28) I started on Amiga 500 games early on, then moved to DOS games, and on up from there, including consoles from the NES onward. They loved it, and both think it's cool, but neither of them really messes with it these days. They both play games, but only on modern hardware; they are not really into retro. Though one of my sons did spend a few years in the Air Force working in cyber security, so he is still very much into tech, just not retro tech.

I agree, I think it's difficult to get people to really dive into the older tech because they don't have the nostalgia we have. They didn't grow up with it. Back then it was all we had, and we were blown away by it. These days, you can't help but compare it to modern hardware, and I think that's where we lose a lot of people.

Still, I think the Commander X16 has a chance to bring more people to the retro side. The fact that it's not an emulator like just about everything else out there, that it's real "modern" hardware, makes it a portal to the 80s that people may be more willing to go through. Especially since original 80s hardware is getting so hard to find and hard to maintain unless you know what you're doing and actually want to do it. Not to mention the cost. I know a lot of people who look at the prices for some of these older systems like the C64, Amigas, Tandy 1000s, etc. and think people are crazy for spending that much on old hardware they see as obsolete.


I think one of the things that might make people dive into it is learning and understanding. One of the X16's goals is to be "simple enough that a single person can understand all of its components".

You can't understand all components of most modern tech.

Posted (edited)

Agreeing with @Strider here,

The problem today is computers have become commonplace to the generations from the 2000s forward. They don't see them as magic voodoo which can be conquered. They view them like we see the refrigerator or microwave: something needed but not that interesting. They tried this idea with the Raspberry Pi when it came out in the UK, building a whole school curriculum around it, but in trying to make it accessible to everyone they dumbed it down so much that it became nothing more than learning Python and hacking together pre-made hardware kits. The industry doesn't help either: universities teach the most clicky computer language and promise a six-figure salary on departure; businesses fall all over themselves catering to no-talent hacks who have no qualms about lying about their perceived qualifications. Hardware does not come with any documentation anymore, on purpose. Science fiction, in both film and book format, has succumbed to anti-science ideas, where whatever you throw up and sticks is what they go with. And the new war on math has pretty much sealed the fate of our future generations, unless the trend is reversed. I have been homeschooling my sons for 10 years and, considering their personalities, only one is interested in technology. I have a pretty good lab at home, and I have been including them in the various stages of building projects - design, prototyping, soldering, program interfacing, etc. - and I can barely pique their interest.

For kids it all comes down to what their friends or buddies are into, and for most kids today that is little more than playing lifelike video games. When I was into the C64 as a kid, all my friends were into the same thing or similar. Games were only one part of computers in the 80s, and today gaming systems are not computers. To be honest, laptops are uninspired, desktops are nothing more than Frankenstein's-monster gaming systems, and phones have pretty much dominated them both in use.

Unless you can convince a generation of school children that electronics and computing is cool, it simply will not happen. I would love to see a retro revolution, but unless we rescue our schools from the death spiral they are in, I honestly think this ship has sailed.

Sorry about the pessimism, but I have devoted many years of thought and action to trying to do exactly what this topic calls for.  Parents have to understand there is a problem before it can be addressed, and this simply is not a problem for most people in the U.S. today.

Have a great Sunday!

Edited by evlthecat

Posted (edited)
19 hours ago, evlthecat said:

The problem today is computers have become commonplace to the generations from the 2000s forward. They don't see them as magic voodoo which can be conquered. They view them like we see the refrigerator or microwave: something needed but not that interesting. ...

I think that some of the things you describe do not just apply to younger generations. When I tell my parents about how I mess around with computers, they often respond that for them things really need to be 'plug and play'. I think that, generally speaking, the more advanced a technology gets, the more 'black boxing' of that technology occurs, and the broader the audience that is willing and able to use it. To some extent I think it is true that the number of people interested in opening the black box shrinks, because understanding the technology takes more effort and at some point it becomes unfeasible to fit the amount of time required into your hobby time. However, I also think that to some extent it only seems like this group is shrinking, because it quickly becomes a minority amid a much larger group of people who only became active once the technology was sufficiently black boxed that they no longer had to care about 'what goes on under the hood' to use it.

In a book on social practices I have, the authors make the argument that when cars were still relatively new, you had to be somewhat of a mechanic to drive one. You had to know how to do at least basic upkeep and repair, because otherwise you wouldn't get much use out of cars. This also meant that you had to know more about how cars worked than you need to know today. Actually, that was probably part of the fun of it for the first groups of people that drove them (the authors of the book actually explicitly make the argument that driving a car was considered a bit adventurous at the time, while today's motivations for driving a car are overwhelmingly pragmatic). Nowadays, the vast majority of people driving cars don't understand exactly how they work. Also, they have become much more complex, so you really need to have a particular kind of interest in these things to still care enough to try to understand them. 

I am probably oversimplifying some things here, but I have the impression that computers follow a similar trajectory. In their early stages you had to know more about them to understand how they worked and how you could fix something if they stopped working. Over time, computers have become increasingly black boxed (and therefore more attractive to a broader audience) and overall more difficult to wrap your head around. Is this something to be cynical about? I think to some extent yes, because I feel that the more we lose understanding of the technologies we work with, the more we lose our independence and become passive consumers. 

I should stop rambling. What were we talking about again? 😉

Edited by wahsp
Posted (edited)
4 hours ago, wahsp said:

I am probably oversimplifying some things here, but I have the impression that computers follow a similar trajectory. In their early stages you had to know more about them to understand how they worked and how you could fix something if they stopped working. Over time, computers have become increasingly black boxed (and therefore more attractive to a broader audience) and overall more difficult to wrap your head around. Is this something to be cynical about? I think to some extent yes, because I feel that the more we lose understanding of the technologies we work with, the more we lose our independence and become passive consumers. 

I should stop rambling. What were we talking about again? 😉

@wahsp,

I don't think you are rambling or oversimplifying. I think black box is a very apt description, and your insight is keen. I believe there is a concerted effort to keep people from understanding the technology they purchase. All one has to do is look at the stiff resistance against the right-to-repair movement going on right now. It is a double-edged sword: manufacturers don't want to give up their trade secrets, and they glean huge profits, but how do we wrestle with the societal problem of being a disposable world? A perfect example: it has been proven beyond doubt that these same manufacturers are engineering their technology to die just so consumers have to dispose of it and buy a new product. This is deceptive and manipulative. That is why I will never buy an Apple product again. Where do we start drawing lines as consumers?

I know it seems crazy, however, this is why I have purchased what others would deem antiques. Not for the aesthetics, but for the functionality, durability and ease of maintenance. I remember a lot of my curiosity for electronics came from reading the documentation that came with products. Back in the day, turning the pages of a user's manual was an adventure, especially the appendix; today it is a jungle of legalese and warnings to Darwin Award winners. And if it comes from China, the documentation is barely intelligible.

I think just having a place to go to exchange ideas, like we are doing here, is the best resource for those kids out there who continue to explore and push their limits. They are out there, looking through the Internet for support. I know I have mentored several young people myself, and the things they all appreciated were adults who listened when they had questions and didn't make them feel inferior.

Want to change the world? Lead by example.

Have a great day all!

P.S. I ramble..🤣

Edited by evlthecat
Posted (edited)
On 4/23/2021 at 7:22 AM, wahsp said:

Not sure if the rambling below is very helpful, but well...

... I think you could draw in people with the idea that they can "learn to program their own games"

... Another thought: ... Because of their relative simplicity and their limitations, it is actually more feasible than with more recent machines to get to a point where you get close to "understanding everything about your machine".

Sounds right to me.

And these are selling points of retro and, potentially, the X16 - I think they strike near the intent of the X16.

 

Yes, you have to like programming, but GRAPHICS and SOUND are both very accessible on platforms like the C64, in a way that they're NOT on modern platforms.

 

I was happy to move off of the C64 and onto the 386... BUT I COULD do sprite graphics and sound on the C64. I COULD NOT on the 386. So all of that capability was locked away from me, buried in an architecture that didn't give me the tools to do it. I ended up writing a "sprite" editor for the x86 platform, but frankly I did not want to write a toolchain, and so my efforts fizzled.

 

The C64 on the other hand had all the tools there.  The API was generic, built-in, and well-documented.  A few POKEs and I could write envelope-driven sound effects.  Another set of POKEs and some data and I had animation.    This is a FANTASTIC thing for someone like me.  

 

Edited by rje
1 hour ago, rje said:

Sounds right to me.

And these are selling points of retro and, potentially, the X16...

... The C64 on the other hand had all the tools there. The API was generic, built-in, and well-documented. A few POKEs and I could write envelope-driven sound effects. Another set of POKEs and some data and I had animation. ...

 

Yeah, actually I think that Cyber more or less boiled down what I tried to say in a couple of paragraphs to a few sentences:

 

On 4/25/2021 at 6:23 AM, Cyber said:

I think one of the things that might make people dive into it is learning and understanding. One of the X16's goals is to be "simple enough that a single person can understand all of its components".

You can't understand all components of most modern tech.

And you are now illustrating that point very nicely as well. This idea, that you can actually feasibly take control of the entire thing and of the entire development process alone, is what makes it incredibly appealing to me. I noticed a while ago that I have a kind of obsession with "doing it myself", though only with software so far. 🙂

Posted (edited)
6 hours ago, wahsp said:

Nowadays, the vast majority of people driving cars don't understand exactly how they work. Also, they have become much more complex, so you really need to have a particular kind of interest in these things to still care enough to try to understand them. 

Though modern cars are more complex, you can understand how they work by splitting the car into major generic components. It is not awfully complex when you try to decompose it. The main parts that make a car a car didn't change that much. But the real problem is (and this pisses me off the most) that the manufacturer is not interested in you understanding how it works, even if you want to! Quite the opposite. Finding detailed documentation on modern cars is very difficult, sometimes close to impossible. Many cars have security measures to keep you from messing with them. And also, despite the construction being modular, it is somewhat monolithic at the same time, because the car might not work if you remove some components which in reality are not essential (not needed for a simple ride). Cars have sensors which detect the absence of some components and will not work without them. Messing with these sensors is even more of a nightmare, because you need to dig into the firmware, which controls practically everything in the car. All of this is made this way to force you to use a car service instead of doing anything yourself.

What's next? Will they forbid me to fill up gas or screenwash without an authorised service? Or will I not be able to change a flat tire myself in the desert?

Edited by Cyber
fixed few typos
2 hours ago, evlthecat said:

That is why I will never buy an Apple product again.  Where do we start drawing lines as consumers?

I know it seems crazy, however, this is why I have purchased what others would deem antiques.  Not for the aesthetics, but for the functionality, durability and ease of maintenance. 

I got disappointed with Apple products for the very same reasons. Please share your "antiques"; I would like to hear about them.

On 4/26/2021 at 4:33 AM, wahsp said:

I feel that the more we lose understanding of the technologies we work with, the more we lose our independence and become passive consumers. 

This is a corollary of a broader truth: Independence requires effort and comes with risk.


An important point to bear in mind is that the key goal is not getting connected to the small share of the population who may be mildly interested, but getting connected with the even smaller share of the population who may become really interested. A thousand people of high school / college age engaged in various types of development would make for a very healthy ecosystem, and with a common online meeting point available, they could be scattered all around the globe. And that's a tiny, tiny percentage of all of the younger people there are around the world.

And really, that is not about contacting each of those thousand, but about getting a few in; they share it in their social media networks, and it spreads from there. What the project needs to do is precisely what it is doing: get the system finished, have good documentation, have a lot of freely available programs, pretty much all with source code, and a supply of people perfectly happy to answer questions about how they did this or that.

The process would be young people coming across it, being intrigued, looking into it, getting interested, starting to dabble, getting more interested, and then starting to share their experience with like minded people in their social media networks.

It's a pretty organic process, so it can't be forced, but the project can set up the conditions that lower the barriers to it happening. And then, if it happens, it happens.


My 12-year-old son is learning Arduino and Python. A little bit of electronics, but mostly modules in simple form, without a deep understanding of how they work.

He doesn't share my interest in retrocomputing. He doesn't feel that nostalgia for old games (which are really primitive to him, even compared with Minecraft). I had an NES before I got my first 486-based PC, so at least I'm curious how it worked. And I'm also interested in the deepest internals of a working computer, and the X16 is perfect because it can be totally understood. But I'm a pretty old boy. For my son, Python is a much better and more powerful replacement for BASIC, and assembly language is too complicated for most near-IT kids.

So the X16 will be quite a niche product, unluckily.

Posted (edited)
On 4/23/2021 at 11:26 AM, ZeroByte said:

I think the most lasting feature of retro computers is the fact that they're simple enough that you can have a good understanding of everything in there.

Quoted For Truth.

However, STEM notwithstanding, I think only a small percentage of all people can and want to do engineering as a hobby.

Edited by rje

Posted (edited)

Greetings!


Apologies ahead of time, this will be long.  

I've divided it into sections; hope that helps the digestion.


PART 1:  Making a new "platform" is *extremely* hard these days, but...

The "OP" mentioned that the "time is now" for a retro platform.  I would also add the "time is now" also because of political factors - there is a desire to find alternatives to FB, Apple, and BigTech platforms.   Recently, I was banned on YouTube, despite having never posted a comment on that account and just having a single video of my daughter playing an 1886 piano that I had restored (is Fur Elise inappropriate?).  A co-worker suggested I was banned because of my search history - perhaps, although I don't think I search unusual things (certainly not political things).  I e-mailed YouTube to ask "why", and a day later they unbanned me with a "sorry, my bad" response -- they literally stated that upon review, there was no violation afterall, it was a complete mistake.   No further explanation.   So as BigTech is becoming highly politicized to the extent that people are randomly getting banned, I think this adds general interest in finding "other platforms."

It's interesting to me that as hardware capability expands, we suddenly take an interest in certain legal things. Having written a game for a 32K machine, I realize there was simply no spare capacity for "fluff" like virus scanning, copy protection, cookies, etc. (and no spare cycles for error checking either - modifying the code becomes like walking through a minefield; I literally had my icons turn into half crocodile/dragon hybrids because of an obscure 1-byte memory overwrite - it was a hilarious unexpected result, but it took more time than it should have to track down that bug). My point is, a retro platform is immune from any BigBrother effect - the system is literally too simple to hide the impacts of such meddling (like tracking; although on the flip side, true, they do need to be capable enough to support secure online banking).

And it's not that I specifically have anything to hide. My point is: I want my hardware fully used to solve algorithm problems (stacking astronomy photos, in my case), not to feed Bots that interpret information and think they know me. I once searched for "horse brushes", to look up the proper name of a tool (a "curry", not to be confused with "cury") during a discussion I was having about something that happened two decades ago, and I'm still getting "horse brush" ads on all my devices (that the BigData centers of the world are wasting any energy on that is tragic! Yet, here we are).

NOTE: Already in elementary school, my daughter has been introduced to "copyright" and I.P. rules. She's been indoctrinated to be paranoid about "borrowing" any content. In my game, I named one of my creatures "CROCS" and she (non-jokingly) commented "that's trademarked by a shoe company! we're gonna get sued!" This made me realize the legal dilemma that the young generation must face, the mindset that EVERYTHING, every aspect of your work, must be completely original. Otherwise it's just too risky to even get started, as one glance from a lawyer shuts down any motivation to "put yourself out there" - lawyers are constantly on the prowl to "bash skulls" and keep folks in line. I absolutely do RESPECT other people's work, and there does need to be legal fairness for content creators. But wasn't it Truman who said "There is nothing new in the world except the history you do not know."? Ironically, I once read a WW1 book, and the author made a comment that surprised me - he implied that this similar "legal-impasse-to-do-anything" was part of the pent-up frustration that led up to the cause of The Great War (yes, yes, there was the Ferdinand motorcade and all that - but this author posed an interesting perspective). I need to read that again and find the specific passages. [ the book was To The Last Man, 2004 by Jeff Shaara ]

PART 2: Historical Perspective

I reflected recently on what motivated me to write a game for a Commodore PET here in 2021.  Sure, part of it was the fallout of COVID and just being bored enough to bother with it.  But on further reflection, I wrote in my game manual the following:

"I consider the development of Personal Computing fascinating, since it represented the birth of an entirely new form of media (that being Software).  Akin to the development of writing itself:  The use of binary mathematics and logical operators to express ideas and information (content)!   Those of us born in the 1970s are the last human generation to know the world BEFORE mass-computers.  We stood during a final moment that, centuries later, will be considered like the hand-paintings in caves from centuries before: very humble and primitive beginnings, but yet an important juncture in the journey of humankind."

This is a "we were there" moment.  In the day-to-day rat race of paying bills and finding food, we tend to forget this.  We lived in that "origin story" moment of Wozniak, Gates, Jobs, Peddle, etc. - despite what you may think of these people presently, these were my super heroes.  These are the Achilles, Hercules, Odysseus of our time!  And even these were the 2nd generation crew, who didn't pioneer the work itself, but brought it to the masses.   And it's amazing to me that we've somewhat come full circle: mainframe titans ruled the world in the beginning, and now we are migrating back to that model of "dumb terminals" connected to servers (except our dumb terminals have radical graphics cards, and host a bunch of a Bots...). i.e. that whole concept of streaming your real-time action game, wow!  Even where I work, virtualization of computing assets has essentially become the "norm" over the past few years.  And it makes sense: they use less power, and (generally) more secure and updated - BUT, "they" do hold The Control.  That's an important thing to remember, "they" can turn it off or deny it.    So has IBM won afterall! ?  Were they right, the world only needs 4 or 5 "big" computers? ("data-centers")

Anyhow: yes, binary math quickly evolved in the 1930s-1950s, and ultimately "big machines" (PDP-1, etc.) were used to help pioneer the creation of the "small machines". And the 1960s to 1974 were this transition from digital calculators, to purpose-built arcade machines (Pong), and finally to a "KIM-1" (Keyboard Input Monitor) integrated system. We witnessed first-hand an entirely new form of media being created (Software). In Ghostbusters, it's not just that "Print is dead." Old media died. RIP hand-animated Disney cartoons. Yes, my daughter still draws on paper a little bit, and traditional artists certainly still exist - but mostly it's all about that digital drawing tablet, digital music synth. It's like "if it's not digital, it didn't happen." I imagine someday "BC" will end up meaning "Before Computers".

NOTE: My wife asked "you have a 3090, why would you play an old green and white game?" My answer was along the lines that a computer game represents a "math model", and creating something extensive that works in extremely limited resources ("coding to the metal") is like solving a (math) puzzle, which, yes, takes an Engineer mind to appreciate. It's more about the context of creating that simple-looking game than the game itself, and how all the aspects of modern games were there even in the beginning (RAM limitations, disk storage of content, compression of content, "AI budget", rate groups/threading), ultimately creating an experience that simply never could have existed "BC". Anyone who has read HACKERS can probably appreciate what I mean. I made a 1MHz 8-bit machine draw flicker-free animated fire-breathing dragons, using my own algorithm from scratch, POG!

NOTE: I'm still on the fence on whether Blockchain is really something "new" or not. To me it boils down to a fascinating and clever application of CRC checksums across a peer-to-peer protocol. So while not a new form of media, it is a way to somehow say "these bits are mine, and mine only"?  Well, that's interesting.    I recall a time (in the 1980s) when there was a bit of a crackdown on encryption.  You could PGP your e-mail, and OH MY, you could communicate something that the government couldn't hack?  Mind Blown, alarms went off.  Do certain countries still ban certain levels of encryption?  Don't hear much about it these days.  But I suspect the next generation of superheroes might end up being these Crypto-Folks.  I tell my daughter "learn this blockchain stuff" - I hope it's like my father telling me decades ago "learn this Programming stuff, it'll be big money a decade from now".  TBD.

 

PART 3: Virtual Reality Programming   (aka  Programming Inside Cyberspace)

EDIT CLARIFICATION: When I say "VR" here, I'm not talking Goggles-on-Head and acting like a mime.  I'm talking about any 3-Space means of representation, like 3D monitors, or 3D gaming, and a way to observe attributes of data - it's time to move past "dumb" text files for source code, and to have "layers" to your code.

For the past decade, the following concept has been in the back of my mind: using virtual reality to do software development. My managers have collected "software metrics" for decades and I think it's all completely meaningless data. They have no insight into the quality/efficiency of the software. When I walk down a factory floor, I can see obvious defects -- if wiring looks sloppy, or a component is bent the wrong way, etc. Software doesn't have this. Managers are just happy that "the software does what it is supposed to do" -- yes, it's a business, schedule/budget and all that. But to me, I need to know the "energy cost" of a library or function -- how efficient is it? If I use that, how much resource is left over to incorporate other things? If using your function spams ports, writes files (across the network), or blocks databases -- these are resource drains I need to be aware of in order to integrate that function into a larger system. But I don't have the time to read the equivalent of your Lord of the Rings novel (in lines of code) to determine that efficiency.

Long ago, my mentor sternly stated "software is written, not built!" (Bryan Slatner, cheers if you're out there!) and I've taken that to heart ever since. Software is creative work like writing a book, NOT like building a house. However - with virtual reality, I'm now starting to doubt this. What if in virtual reality we COULD build software? We COULD have visualized constructs for AND, OR, ternary (?:) operators -- and lines showing what (address) resources your functions are consuming. In the (virtual) far background, you have a depiction of your available hardware resources (another virtual construct, dialed to however much resource you want to have available). And at a glance, one can say "oh, that heap of software uses about 50% of the resources". At a glance, if I adjust this function here -- follow these (virtual) lines -- they jingle these resources, which are also connected to these functions. I hated UML; I'm not talking about UML. I'm talking about a way to visualize source code, so that at a glance management can see if it's a tangled hopeless crap-pile of a mess (and then jump in to help clean things up where needed), vs "this is well organized" (you have 50 key data structures, you connect to 3 databases -- wait, why is another network port being used today?). That's what irks me the most about software development -- on a factory line, the foreman can walk the line, and when they see a mistake, they can do "on the job training" right there: "no, Joe, cross the wires like this.... There ya go!" We don't really do that in software - we integrate the system, then say "well, this runs too slow" and "blame" (attack) whatever the last change was. We take for granted that the Universities are teaching good things -- but it's really that "on the job training" that helps, IMO (immediately where/when needed, as each developer is different -- my grandfather was a good welder and became a weld inspector in retirement; you don't have to be mean when people make mistakes or do sub-par work, you show them the better way, nod, and move on; whereas my managers couldn't write diddly-squat, and just ask "so, is it ready yet? update WAG on that CTC?", plug that into their metrics database, and... repeat). That's the difference: we don't have a low-friction equivalent of "walking the line" (aside from reading the equivalent of the Torah for every system, learning the main plots and actors all over again), and we have mediocre software managers who think software is just a bunch of IF-statements and FOR-loops (never "in the trenches" long enough to appreciate how complex deadlock prevention made that algorithm smooth).

I've mulled this over with co-workers, and they believe no computer could handle this virtual-reality-to-build-software concept. "You'd need a more powerful computer than the computer itself" -- oh, like a mainframe? Right, so we start small: we use "big" (physical) computers to create "small" virtual programs. Also, I'm not saying your program runs in this "virtual reality" (it might...), but I'm saying virtual reality is just a means to still create traditional software that is compiled, linked, and run. Just with the virtual free-space floating constructs, you can now annotate any section of code with relationship details (not just free-space text comments -- but "THIS variable you passed through 80 function calls -- call/declare them whatever you want -- is ultimately the same bits that jingle in this data structure way over here"...). At the very least, I think it would help prevent deadlocks 🙂 (which is now more relevant when trying to take advantage of all these wonderful Multi-Cores we have!). My other term for this concept is a "3D IDE", the next evolution past IntelliSense - to provide meaningful insight into your declarations and their impacts (by having a "layer of insight" behind the IDE text window).

I'm not a huge fan of those "write software by connecting these puzzle pieces" types of IDEs that my daughter uses in school. I'd say "throw them to the fire!" - maybe start with a Pascal compiler. Though there was one IDE where you could tab over between C#, Python, and the "code-blocks" style -- that one was clever. Anyway, in these steps from machine language, to assembly, to high-level languages... 20 years ago, there was a thought that "UML" was that next evolution, but that didn't pan out (UML was a very expensive experiment that I think set the industry back 20 years, somewhat like Java in its own way). I don't need yet another way to show class relationships; I can already get to that information. I need immediate insight into the resources my software is using, and perhaps over-using, to get a better grasp on the practical re-usability across platforms. Though, true, there is a difference between casual desktop-type software and more specialized micro-controller-type software. Still, am I close to blowing the stack? And no, I don't have 80,000 hours lying around to do code reviews all day.

Maybe this "virtual reality to build software" could be the X16 "killer app" (being a relatively "simple system" to represent in VR) - the Commodore PET "killer app" was BASIC, this would be something new.   I have ideas on what it could look like -- you still have a standard "virtual IDE" somewhere, but imagine if you could "gesture" on your code, zoom in, and literally see what (virtual) registers you're using at that moment!  But I don't have the talent to really even prototype such a thing.  I've thought about maybe trying a crowdsource.  This approach blows away concerns about "well, do I indent 2 or 4 spaces?"  No, you just place a "FOR-construct" out there in space, and attach your logic blocks.  I've always found it curious that young developers stress so much about syntax-sugar - maybe we should just program directly in the Abstract Syntax Tree itself?

I'm too old now, but I hope this idea inspires "the next generation"  (a subtle Star Trek TNG reference, where I imagine they used VR to program the holodeck....)  Or maybe some folks know if this has already been tried?  I know VS now shows memory and CPU profiling during your runtime -- but I'm wondering if such a thing could be pre-profiled at development time (auto-generated sample inputs, find the min/max boundaries in virtual snippet executions?), is the machine powerful enough to "Just-In-Time Link" your software? 

NOTE: I get especially miffed when people don't pass std::string by const&. (Yes, yes, there are exceptions when it is appropriate to copy a string -- but typically, no.) These Java new-hires are always doing that! I've got a whole middleware baseline full of that, so we're copying filenames again, again, and again through thousands of nested calls. If only one could just casually walk by and say "Oh, hey Joe, here's a little adjustment you might want to think about". The damage is done at a core level, and the political cost to justify fixing it ("OMG, he's gone through and changed interfaces to the entire core library, we're doomed!") is too high (since if anything else happens to burst at the same time, well, last-commit-done-did-it), so we just settle for "gosh, our program takes forever to do this thing." On the flip side, it all pays the same - I get that aspect also. But the industry is absolutely right in that software is inherently "sloppy" relative to the potential of the hardware. I'm not a huge fan of just throwing more hardware in as the solution -- although I acknowledge that is often just the most prudent thing to do ("hardware saves the day, again"). But it comes back to the "energy cost" of your software, or indeed of the whole system. "Good enough is the enemy of great", right? (Turns out that saying can be interpreted either way, actually.) Think about why cars need radiators or why CPUs need fans. The hardware is operating on a fine-line "operating envelope." And your ability to function within that envelope is fundamentally what being competitive is all about -- your company, your country thrives or dies depending on how efficient you can be, since there are physical limits to computing (my PhD professor wrote a paper about this back in the day). Specifically, I'm thinking of Space Applications, where resources will once again be very limited in those extremes.
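To make that gripe concrete, here is the tiny difference being described - the function and file names are made up purely for illustration, not pulled from any real codebase:

#include <string>
#include <iostream>

// Pass-by-value: every call copies the whole string (allocation + copy);
// through thousands of nested calls, those copies add up.
bool looksLikeHeaderByValue(std::string filename) {
  return filename.size() > 2 && filename.compare(filename.size() - 2, 2, ".h") == 0;
}

// Pass-by-const-reference: the caller's string is only referenced, no copy.
// The call syntax is identical for the caller.
bool looksLikeHeaderByRef(const std::string& filename) {
  return filename.size() > 2 && filename.compare(filename.size() - 2, 2, ".h") == 0;
}

int main() {
  const std::string path = "middleware/core_library.h";
  std::cout << looksLikeHeaderByValue(path) << ' '
            << looksLikeHeaderByRef(path) << '\n';  // same answer, one copy fewer
}

(And yes, the legitimate exception stands: if the callee needs its own copy anyway - say, to store the name in a member - taking by value and moving from it is perfectly fine.)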

NOTE: Another example - external interface demands Elevation HAE instead of MSL.  In 80 modules, we're forever wasting time converting MSL to HAE to accommodate this external interface - because this ONE module over there is the only thing that needs it in HAE.  A virtual visualization would immediately show this.  I can't do a 9 month study to show the runtime impacts of changing this interface, or answer how much performance we would gain.  All I can say is this inefficiency is increasing the "energy cost" of this approach.  We can knock it out by inverting the policy at THESE locations.  Done, now go find the next inefficiency.

NOTE: Another example - imagine the visual alarms that go off when you do the virtual equivalent of "using namespace std;". Some statements have huge impacts, pollution indeed. But in the virtual space, it can just be lines of color. Mark this GREEN line MODULE A. Now you can use all the stuff in GREEN - oops, lookie there, stuff in BLUE from MODULE B has trickled in and we've got a conflict here. How bad is it, give up and run away? No, it's two things. Ok, maybe we can do a polite re-arrangement here.... That's literally how I visualize my source code: there are colors in the background that report the context of what I can do (e.g. "I'm mutex-blocked on THESE things - this SCOPE here, I'm holding up everyone else for this duration, let's visualize this scope as PURPLE, and I can touch only these declarations that are also PURPLE"), etc. And in the virtual space, I can annotate my bits with meta-information like this. But this requires advanced detection algorithms that have 1/1000th of a second to finish their job; we're way past casual calculation of compound interest, or sprite x/y positions.
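A throwaway example of the kind of collision those colored lines would be flagging - generic C++, nothing from a real baseline:

#include <algorithm>
#include <iostream>
using namespace std;   // the statement the "visual alarm" above would trip on

int count = 0;         // a perfectly reasonable-looking global in "MODULE A"

int main() {
  int data[] = {1, 2, 3, 2};
  // Unqualified "count" is now ambiguous: it could be the global above,
  // or std::count dragged into scope by the using-directive.
  // cout << count(data, data + 4, 2) << '\n';    // does not compile: ambiguous
  cout << ::count << '\n';                        // must disambiguate explicitly
  cout << std::count(data, data + 4, 2) << '\n';  // or qualify the algorithm
  return 0;
}

In plain text you only find out when the compiler complains; the idea is that the GREEN/BLUE overlap would be visible the moment the using-directive drags the other module's names into your space.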

 

PART 4: Involving the younger generation.... think "solar powered microcontrollers"

There is casual Personal Computing (Office stuff), more intense image processing and media rendering stuff, and mobile-app stuff, but there is also the whole world of microcontrollers. The Arduino is a decent platform, and maybe the X16 niche is already filled. But being (legally) blessed to incorporate the Commodore instruction set? That'd be pretty cool. I want a WiFi-enabled motor for my chicken coop - the Arduino could do it, but it's not quite at the point where my 10-year-old daughter could handle it herself (it can't muscle a high-friction door reliably, needs a motor-controller assist, etc.). But like the 6502 itself when first released, it comes down to the cost. Something I can stick into a $100 KIM chassis to program, then also toss into a weatherproof box to micro-control some 3D-printed doodad? That'd be cool. I need 5V, 6V, and 12V options though. But please don't make me solder (I really need to master that skill - some people just have a knack for things, whereas I just blob metal all over the place). If you're going to do microcontrollers, maybe that's the barrier to entry - soldering! (In my defense, it's probably more about having good tools and not those $5 irons.)
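For what it's worth, the coop-door idea is roughly this much code on the Arduino side once a motor driver board is doing the muscling - the pin numbers, speed, and run time here are invented placeholders, and the WiFi (or clock) trigger and limit switches a real door needs are left out:

// Sketch of a motorized coop door: an Arduino driving a small DC gearmotor
// through an H-bridge driver module. All values below are placeholders.
const int PIN_IN1 = 7;              // H-bridge direction input 1
const int PIN_IN2 = 8;              // H-bridge direction input 2
const int PIN_EN  = 9;              // H-bridge enable (PWM sets motor speed)
const unsigned long RUN_MS = 4000;  // guess at door travel time; limit switches would be better

void setup() {
  pinMode(PIN_IN1, OUTPUT);
  pinMode(PIN_IN2, OUTPUT);
  pinMode(PIN_EN, OUTPUT);
}

void runDoor(bool opening) {
  digitalWrite(PIN_IN1, opening ? HIGH : LOW);
  digitalWrite(PIN_IN2, opening ? LOW : HIGH);
  analogWrite(PIN_EN, 200);   // partial duty cycle; tune for the door's friction
  delay(RUN_MS);
  analogWrite(PIN_EN, 0);     // stop
}

void loop() {
  runDoor(true);              // open
  delay(10000);
  runDoor(false);             // close
  delay(10000);               // demo cycle; replace with a schedule or WiFi command
}

The hard parts aren't the code - they're the power, the weatherproofing, and (yes) the soldering.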

NOTE: While I'm at it, I want to add that I think a good "metric" to monitor about a Programmer is "Compiler Hours." Pilots have a thing called "Flight Hours" - their number of hours in the cockpit doing their craft (pun intended!). For Programmers, I don't care about your "Years of Experience"; I'm more interested in your "Debugging Hours." How many all-nighters have you pulled stepping through your code, squashing bugs, and resolving intricate problems? I sling code "on the clock" and "off the clock", while some peers are more of the "well, I wrote my line of code today, time for a movie" type. (Double-tragic, since lines-of-code is a horrible metric - but that's another discussion.) The industry isn't mature enough to measure this metric (your hours "in" the IDE or debug sessions), and it is a hard thing to track, but I propose that it is something to start thinking about. No metric is perfect - obviously I could just sit idle at the same line of code forever, watching that cursor blinking away. But you know what I mean: someone with 30,000 hours actually coding, finding and fixing bugs, had better have gained more experience than someone with 200 hours.

NOTE: My other idea is FORGET SCRUM... Instead, what if coding were like sitting on the Bridge of the Enterprise? You have the "Master Coder" in the captain's chair, and your smart support crew helping to navigate. "Uhura, what's that syntax for passing function pointers again? Ah, thank you!" "Scotty, I need a faster ADC converter function in the upper ports, can you get it to 5 cycles!?" Some people are NEVER going to be good coders, and some people aren't good coders YET - and neither of them should be in the Captain's Chair of any sizeable software project. But any good coder is enhanced by a support team - I've found "paired programming" to be wonderful, with that extra set of eyes being like a Good Nurse offering advice while you work: "forgot that semicolon; I think you meant the inverse logic there; you're not checking that pointer". I've wanted an experiment on that setup - a huge 120" screen with a room of 5 badass programmers, on a closed Bridge focused on the code, with a clear mission of adding functionality, or the Lower Decks job of just refactoring.

 


Cheers!

voidstar

Edited by voidstar
Posted (edited)
1 hour ago, voidstar said:

Greetings!

[..]

Cheers!

voidstar

tl;dr

 

Regarding the topic at hand, an interesting angle one might pursue when promoting 8-bit machines is their relatively low power consumption, which is an important topic nowadays. I have, e.g., an ATmega running off only a 9V block here.
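(For the curious, a big part of that is being able to put the whole chip to sleep between events. A minimal sketch along those lines, assuming an ATmega-based Arduino board with a wake signal on pin 2 - the pin and the LED blink are just placeholders:)

#include <avr/sleep.h>

const int PIN_LED  = 13;
const int PIN_WAKE = 2;   // external-interrupt-capable pin on ATmega328 boards

void wakeUp() {
  // Empty handler: its only job is to wake the CPU from sleep.
}

void setup() {
  pinMode(PIN_LED, OUTPUT);
  pinMode(PIN_WAKE, INPUT_PULLUP);
}

void loop() {
  // Do a tiny bit of work...
  digitalWrite(PIN_LED, HIGH);
  delay(50);
  digitalWrite(PIN_LED, LOW);

  // ...then sleep as deeply as possible until PIN_WAKE is pulled low.
  // In power-down mode an ATmega draws on the order of microamps,
  // which is why a small battery can last so long.
  attachInterrupt(digitalPinToInterrupt(PIN_WAKE), wakeUp, LOW);
  set_sleep_mode(SLEEP_MODE_PWR_DOWN);
  sleep_enable();
  sleep_cpu();       // execution pauses here until the interrupt fires
  sleep_disable();
  detachInterrupt(digitalPinToInterrupt(PIN_WAKE));
}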

The simplicity of their design also means less susceptibility to cyber-security threats for, e.g., power distribution systems. A system a few KB in size can be better understood than a 90 GB operating system.

Those are angles that might add some appeal and have real-world implications, but I might simply be projecting my own interests there.

Edit: Space exploration also needs low-power/high-reliability systems. Just a thought.

Edited by Falken

Interesting thoughts.

I first attended university between 1986 and 1989 as a computer science student. I didn't have quality study skills or discipline. I did great in classes I found interesting or that came naturally to me (programming, pre-calculus math, freshman-level science). I just didn't bother attending the rest of the classes. Oh, I had great intentions at the beginning of each semester, but a few weeks in they were gone. I withdrew to work full time as a computer programmer, which I've now been doing for over 30 years.

A few years ago, after my children were grown and moved out to live their lives, I decided I was almost 50 and if I was ever going to finish a degree, now was the time. I didn't necessarily *need* it as I had plenty of work experience on my resume, but I wanted to prove to myself if nothing else that I was a bit more mature than I was in my late teens / early 20s. Also, you never know when you might need a degree to get in the door (as I am sure I've lost out on a few opportunities over the years due to the lack of a degree).

The point to sharing the last couple of paragraphs is to reflect on the difference in university education in the 80s vs today (I only finished my Software Engineering degree last December; I wanted a CS or CE degree, but this fit better into my schedule).

The classes I took in the 80s dug deep into theory and best practices. The classes I took over the last few years focused on application and getting people trained to write web software for the most part.

That's not to say that there aren't still good and valuable things to learn in a quality program. I made a choice to do what was an easier degree just in the interest of getting it done ASAP. But the reality is that most programs are pushing the techniques, languages, and platforms that most employers are hiring for, and that's web development.

There is nothing *wrong* with web development. It's an area that provides a lot of value. We're all gathered together today thanks to it. It's not what I'm interested in, but I get a lot of contacts from recruiters through my LinkedIn profile (primarily) asking me if I'd like some entry- or mid-level web development position. It's not what I do, but it's what schools are churning out. Fine with me; I don't want to do it much myself, and it keeps the competition for my job lower, since most new developers, it seems, do not know what to do without a huge library with lots of functionality in an interpreted or JIT language full of garbage collection to protect one from oneself, where the real solution to problems is to throw more hardware at it until it works acceptably. Hardware and electricity are a lot cheaper than engineering hours.

Now, my preferred language is C++, and others could make similar arguments about my language. I'm really not trying to denigrate web development, just using it to compare and contrast. Most languages and platforms have their place, and we specialize.

Anyway ... I don't see things changing much. The web / cloud based approach to software engineering isn't going to go anywhere anytime soon, and in as much as it is easier to write software in a "secure" "garbage collected" language that protects programmers from themselves, than to hire more experienced / disciplined engineers, we will continue to go down that path.
