Posts posted by rje

  1. I know "usability" is hard to argue for in an 8 bit system, but I think Michael and Frank honored the system while also improving it.

     

    Here are two subjective examples.

    1. Michael added the ability for BASIC to translate hexadecimal values.

    I have used this a lot when writing sprite and PSG code in BASIC.  In other words, it improved my experience and as a result I wrote more interesting BASIC code, which exercised the system more.

    2. Frank added the PSG to VERA.

    As a result, I wrote, in both BASIC and C, code that generates sounds on the X16.  I wrote a thing that plays "Invention No. 13", and then did some sound effects.  I'm more likely to try to finish some of my games because I like the PSG.  I started babbling about an envelope manager, even though interrupts scare me the way Alien scares me. Thus the X16 benefits at least from me writing code that exercises the system as a whole.

    Their additions didn't break the hardware.  They didn't break the software.  And they didn't have essential X16 tasks to do (that I know of).  They added things that made it more fun for us to exercise the system as a whole, and by bringing usability up, they encouraged me to exercise the system more than I would have.

     

    Testing is always a problem. 

    But the more you make testing part of the creative process, the better tested the X16 will be.

  2. We could do a Geek Bits episode on this.

    This is me thinking about what Feature Creep is.  It's related to:

    Parkinson's Law of Triviality: the amount of time given to a task is inversely proportional to its overall importance. 

    *** 

    In other words, two signs of "creep" are that:

    #1 a feature is not essential, and

    #2 it is sucking resources from getting the essential things done.

    ***

    With respect, the features added are not really feature creep.  I say this because #2 is not in effect... especially in an effort driven by volunteer hours and the particular areas of expertise involved.

    It's even possible that, in adding these things, usability has gone up, and as a result, the system as a whole is getting exercised more than it would otherwise.  It's not a paradox: craftsmanship drives usability.

    Now, if Frank had replaced VERA with a new FPGA with different requirements, yeah that would be feature creep.

    If Michael had completely replaced his CBM-DOS with the MEGA65 DOS, then yeah that would (probably?) be feature creep.

    If 8BG decided to add a USB hub, a UART, dual SID chips, and an IEEE-488 parallel bus just because they're cool, then yeah that's feature creep.

    ***

    I wouldn't even say that using a more capable microcontroller in order to grab keyboard input is feature creep -- the idea is getting a keyboard to work; therefore, do what you have to do.

    The thing is, Michael and Frank are essentially done with the bits they wanted to do.  Thus any non-breaking things they squeeze into existing resources are icing, and not creeping.  It's a sign that they enjoy the ecosystem enough to improve on their work, and craftsmanship is a very strong signal of interest and high quality.

     

    I could be wrong.

  3. On 3/16/2022 at 11:49 AM, Scott Robison said:

    I wish I was further down the path. Stupid jobs.

    If you want to pursue it I don't object. Not that I should, just saying so.

    Ideally I would like to collaborate.  I can do some things well and some things badly.  If your skill set is complementary, then theoretically we should have a better chance of getting something interesting done.

     

    And Tom has two good ideas.

    On 3/16/2022 at 1:25 PM, TomXP411 said:

    I’m going to suggest downloading the SD2IEC source code first. Modifying that may be the smart route, since it already works on AVR processors.

    ...


    Looking at those figures, I’m thinking we should suggest an IEEE interface, similar to the PET. The SD2PET is already a thing, and it works very well. I know David has one, since his Mini PET review inspired me to get one of my own.

  4. In this thread I want to discuss writing a hunk of C code for the Raspberry Pi Zero that can talk the IEC protocol.

    ***

    My biggest question is HOW CAN I TEST SOMETHING LIKE THAT?  I have no clue.

    ***

    I want to start with Debian (Raspbian) -- in other words, start simpler.  Since I can already write applications that talk over GPIO there, it's actually possible I could produce a proof of concept: a prototype that kind-of works.

    ***

    I suggest the Zero, just because I want to start with the least-capable Pi.
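    On the testing question: the bit-level framing can be prototyped completely off the hardware.  Here's a minimal sketch, assuming only that IEC transfers each data byte LSB-first, one bit per clock pulse; the "bus" is a plain array standing in for GPIO, and every name is made up:

```c
#include <stdint.h>

/* Illustrative sketch only: IEC sends each data byte LSB-first, one bit
 * per clock pulse.  The "bus" here is just an array of sampled bit
 * values, so the framing logic can be unit-tested on a desktop before
 * any real GPIO code exists.  None of these names come from a real
 * library. */

/* Serialize one byte into 8 bus samples, LSB first. */
void iec_send_byte(uint8_t byte, uint8_t bus[8]) {
    int i;
    for (i = 0; i < 8; i++) {
        bus[i] = (uint8_t)((byte >> i) & 1);   /* bit i out on pulse i */
    }
}

/* Reassemble a byte from 8 bus samples. */
uint8_t iec_recv_byte(const uint8_t bus[8]) {
    uint8_t byte = 0;
    int i;
    for (i = 0; i < 8; i++) {
        byte |= (uint8_t)((bus[i] & 1) << i);
    }
    return byte;
}
```

    The point is just that the serialize/deserialize logic can be round-tripped in a desktop unit test long before any Pi wiring exists; swapping the array for real line reads comes later.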

  5. Michael Steil did a video on C64-to-1541 optimizations. 

    He puts an absolute speed threshold between the C64 and 1541 at 7.5 Kb per second.  This assumes using a real 1541 -- which has multiple bottleneck problems -- and old undocumented 6502 opcodes, which I think are removed from modern 65C02s.

     

    1. THE X16 IS NOT A C64

    The nice thing about the X16 is that we can to some degree dictate the flavor of IEC that we support.

     

    2. THE IEC PROTOCOL IS "DOMINATED" BY SD2IEC

    The less-nice thing about the IEC protocol is that THE go-to device is the SD2IEC, so we are tied to its limitations if we expect any sort of plug-and-play accessibility for the X16.

    Put another way: custom solutions suffer from availability bottlenecks.

     

    3. POTENTIAL SOLUTION BASED ON PI1541

    That said, the Pi1541 is open-source and cycle-exact.  A smart fellow could potentially gut the cycle-exact bits and produce a "fast" Pi IEC device.

    It would have to be something like a Pi because of pre-COVID availability.  Granted, today everything is hard to get, but that will abate.

     

  6. On 3/16/2022 at 4:10 AM, AndyMt said:

    That's kind of a bummer, but I understand it. It means we need another device to connect to the board - and it will be a lot slower I assume?

    Note that since the X16 is not a C64, it doesn't have many of the performance bottlenecks of the C64.  

    It has to be able to talk to an IEC device, yes, but it can send a fastloader to said device.  Binaries are not C64 binaries and so a fastloader won't have the software compatibility problems that C64 fastloaders had.  

    Unless I am mistaken of course.

    Here's one place to start talking about that.  


  7. On 3/11/2022 at 10:05 AM, svenvandevelde said:

    Just to confirm, it's the X16 keyboard we talk about here eh 😉

     

    On 3/11/2022 at 1:13 PM, Fenner Machine said:

    What switches did you choose?

     

    My X16 WASD has clear switches.   They do make some noise, but they're not obnoxious.

  8. Good morning retro programmers.

    I'm writing in C with CC65 on the X16.

    In at least four projects, I've wanted to process user input.  In two cases, I went with a full tokenizer.  In the other two cases, there's only a limited set of interactions, but they have enough complexity that I'm doing sscanf and chains of if/thens on string comparisons.

    I'm wondering if there's something in between that can process typed user input, but isn't a "full" tokenizer?

    Maybe a little tokenizer can just be "directly" (?) connected to execution.  I dunno.  Thoughts?
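    One possible middle ground, sketched in C (the verbs and handlers are invented): split off the first word, look it up in a verb table, and hand the rest of the line to a handler function.  It's "directly connected to execution" in the sense that the table maps words straight to function pointers.

```c
#include <stddef.h>
#include <string.h>

/* A middle ground between if/then chains and a full tokenizer: a table
 * mapping the first word of input to a handler function, which receives
 * the rest of the line as its argument string.  All verbs and handlers
 * here are illustrative. */

typedef int (*cmd_fn)(const char *args);

static int do_look(const char *args) { (void)args; return 1; }
static int do_take(const char *args) { return args[0] != '\0'; }  /* needs an object */

static const struct { const char *verb; cmd_fn fn; } commands[] = {
    { "look", do_look },
    { "take", do_take },
};

/* Splits off the first word, dispatches on it.
 * Returns the handler's result, or -1 for an unknown verb.
 * Note: modifies the input buffer (strtok). */
int dispatch(char *line) {
    char *verb = strtok(line, " ");
    char *args = strtok(NULL, "");      /* rest of the line, may be NULL */
    size_t i;
    if (!verb) return -1;
    for (i = 0; i < sizeof commands / sizeof commands[0]; i++) {
        if (strcmp(verb, commands[i].verb) == 0)
            return commands[i].fn(args ? args : "");
    }
    return -1;
}
```

    The table costs almost nothing in main RAM, and each handler can do its own sscanf on the argument tail if it needs more structure.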

  9. I think the best networking thingie for the X16 is the one that gets built.

     

    That might be RS-232.  It might be a wi-fi enabled SD card.  Could be the neato TI calculator link protocol. 

     

    I think a protocol gets points for 

    (a) getting there first
    (b) ease of construction
    (c) parts price

     

    I don't think ease of use gets points, even though that's what I would want.  A messy protocol probably still wins, if people who want it can build it or get it.

     

    It seems to me that the more popular protocols (RS-232) have a leg up, in that it's easier to test an RS-232 networking thingie against existing tech.

     

  10. On 1/28/2022 at 10:10 PM, ZeroByte said:

    I forget exactly every step I did in Flappy Bird, but it does IRQ handling in C.

    Basically, I made a #define that makes the IRQ vector in RAM ($030-something)  so it looks like a variable, and then I did an inline assembly to SEI, then IRQ_VECTOR = &irq_handler;

    Something akin to that.

    THAT might work.  Better than waitvsync() anyhow.
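    The pattern can be modeled in portable C, with a plain function pointer standing in for the RAM vector (all names invented; on the real X16 the vector lives in RAM and the swap would go between SEI and CLI):

```c
/* Portable model of the IRQ-vector trick described above: treat the
 * vector as a function-pointer variable, save the old value, install a
 * handler that does per-frame work and then chains to the saved one.
 * On real hardware irq_vector would be the RAM IRQ vector and the swap
 * would happen with interrupts disabled; all names are illustrative. */

typedef void (*irq_fn)(void);

irq_fn irq_vector;              /* stand-in for the RAM IRQ vector */
irq_fn old_handler;             /* whatever was installed before us */
int frames;

void game_handler(void) {
    frames++;                   /* per-frame game work goes here */
    if (old_handler)
        old_handler();          /* chain to the original handler */
}

void install_handler(void) {
    old_handler = irq_vector;   /* save the previous vector first */
    irq_vector = game_handler;  /* on hardware: do this under SEI */
}
```

    Chaining to the old handler matters: the KERNAL's own interrupt work still has to run, or the machine wedges.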

  11. On 1/26/2022 at 9:36 AM, SlithyMatt said:

    Mainly, you need to make sure you are doing the scrolling during VBLANK by putting it at the beginning of your VSYNC interrupt routine. That should clean it up a lot.

    It doesn't work.  By that I mean it DOES work, but C is so slow it reduces the game to a crawl.

    The function I tried using is called waitvsync(), strangely enough.  Standard with cc65's <cbm.h>.

  12. On 1/29/2022 at 4:09 AM, svenvandevelde said:

    Very nice demo. Curious how you created the dynamic tile algorithm.

    I overlapped the sprites significantly -- 8 pixels on every edge I think.  

    The result was absolutely horrible refresh.

     

    Today I got rid of the overlap and reverted to plain square sprites instead of ones with contoured shorelines.  It looks uglier but moves MUCH MUCH smoother.

    The current view is only a bit more than 5 x 6 "squares".  I also added a regional map view -- typing 'm' draws a 50 x 50 PETSCII map.  That helps get my bearings, so to speak.  I might instead just have a 20 x 20 mini-map on a corner of the normal view.

  13. On 2/24/2022 at 4:01 PM, TomXP411 said:

    I wrote a little PSG tester in BASIC, and I'm pretty pleased with the PSG. Obviously, you have to manually control the volume level, pitch, and square wave duty cycle for the ADSR envelope, but it's pretty straightforward once you have a timer operating. 

    I, too, wrote PSG code in BASIC -- it plays Invention No. 13 of course.

     

    Hmm maybe you're right about the timer.

  14. So I started writing BASIC on the X16 in Nov. 2019.  Then I started writing C via cc65 in Nov. 2020.  I've done only a tiny amount of assembly (I'd like to do more).

    Here are my impressions of coding on the X16.

     

    THE 6502

    I understand (and feel) the pull to use a 65816, but I'm fine with the 6502.

     

    THE MEMORY MAP

    The memory map is fine.  VERA access feels a little strangulated, but I abstract that with functions, and we can load data "directly" into VERA, which helps.  I no longer fret over accessing VERA for anything except the PSG.

    As I mentioned, the single-byte ports feel tight, and I've written abstraction functions to manage it. A larger window might be nice, maybe.  4 bytes could point to the X,Y registers of a sprite, or a single PSG voice.  8 bytes gets a full sprite definition.
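    For what it's worth, the window idea falls out naturally in C by overlaying a struct on the window bytes.  A sketch, with an ordinary buffer standing in for the hardware window, and assuming sprite X/Y are stored low byte first (as in VERA's sprite attributes):

```c
#include <stdint.h>

/* Sketch of the "4-byte window onto a sprite's X,Y registers" idea:
 * overlay a struct on a run of window bytes.  A plain buffer stands in
 * for the hardware window here; all single-byte fields, so no padding
 * in practice.  Assumes X/Y are little-endian, per VERA's sprite
 * attribute layout.  Names are illustrative. */

typedef struct {
    uint8_t x_lo, x_hi;   /* sprite X, low byte first */
    uint8_t y_lo, y_hi;   /* sprite Y, low byte first */
} sprite_xy;

uint8_t window[4];                    /* stand-in for the mapped window */

void set_sprite_pos(uint16_t x, uint16_t y) {
    sprite_xy *s = (sprite_xy *)window;
    s->x_lo = (uint8_t)(x & 0xFF);
    s->x_hi = (uint8_t)(x >> 8);
    s->y_lo = (uint8_t)(y & 0xFF);
    s->y_hi = (uint8_t)(y >> 8);
}
```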

     

    THE RAM BANKS

    RAM banks are perfectly fine and extremely useful for storing piles of data.  They're especially effective with C, where I can cast structure pointers to hunks of memory in a bank.

    The 8K bank size seems fine.  8K is plenty for storing chunks of string data (like instructions or lists).  Multibank maps require simple math hidden in a macro or short function to swap through banks.  The one place that a bank memory manager would be useful would be if I'm storing a heap or hashtable in banked RAM, and if I'm tokenizing and parsing input into banks for temporary storage.  In those cases, I think fiddling with banks is kind of bothersome.
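    The "simple math hidden in a macro" can look like this -- a sketch assuming 8K banks windowed at $A000, and a record size that divides 8K evenly so no record straddles a bank boundary:

```c
#include <stdint.h>

/* Illustrative bank math for multibank record storage: treat banked RAM
 * as one big linear array of fixed-size records and compute which 8K
 * bank record i lands in, and where it sits inside the $A000-$BFFF
 * window.  Assumes the record size divides 8192 evenly. */

#define BANK_SIZE   8192u
#define BANK_WINDOW 0xA000u

/* Which bank holds record i. */
#define REC_BANK(i, size)  (uint8_t)(((uint32_t)(i) * (size)) / BANK_SIZE)

/* Address of record i inside the banked window. */
#define REC_ADDR(i, size)  (uint16_t)(BANK_WINDOW + \
                             ((uint32_t)(i) * (size)) % BANK_SIZE)
```

    On the X16 you'd select REC_BANK(i, size) in the RAM bank register, then cast REC_ADDR(i, size) to a structure pointer.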

     

    SPRITES

    Sprites are reasonably easy to handle -- especially with a light library that wraps the registers used.  The tricky part (for me) is creating the sprites which fit the palette and bit depth.  One person helpfully wrote a Python script which converts a PNG to a C-style array of bytes.  I've written a Perl script which uses that script and actually generates the loadable binary.
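    The packing step those scripts perform is small enough to sketch in C.  This assumes 4 bpp with the leftmost pixel in the high nibble -- worth double-checking against the VERA reference:

```c
#include <stdint.h>
#include <stddef.h>

/* Sketch of the packing step a PNG-to-sprite converter performs for a
 * 4 bpp sprite: two palette indices per output byte.  Assumes the
 * leftmost pixel goes in the high nibble (verify against the VERA
 * docs) and an even pixel count. */

void pack_4bpp(const uint8_t *indices, size_t npixels, uint8_t *out) {
    size_t i;
    for (i = 0; i + 1 < npixels; i += 2) {
        out[i / 2] = (uint8_t)((indices[i] & 0x0F) << 4)   /* left pixel  */
                   | (indices[i + 1] & 0x0F);              /* right pixel */
    }
}
```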

     

    PSG

    The PSG is almost beyond my ability right now.  I think the lack of ADSR envelopes is (mostly) the reason.  I will have to adapt assembly code, such as Dusan's simple sound players, to add sound to anything I write.  And I'm not ready to attempt that.

     

    BASIC

    BASIC 2.0 is good as a "batch" language, for orchestrating more complex tools.  Its extensions (e.g. supporting hex notation) make it even more useful for this purpose.

    BASIC is fast, for Commodore equipment.  It's still too slow for arcade-style games, but its speed and power are fine for rogue-like games.  I think simple versions of Ultima IV-like games *could* be done with BASIC, a few assembly routines, and some patience and care.

    BASIC 2.0 is most effective for programs that are 8K or smaller.  As your code grows past 8K, the mental load of its limited variable namespace and near-nonexistent structure makes it not worth your time.  I find it significant that BASIC thrived in the 1980s on machines limited to 4K-16K of RAM.

    BASIC 2.0 is not very fun to program in.  Very early on I wrote a Perl script that lets me write in a slightly better "BASIC" on my Mac, and transpiles down to BASIC 2.0.  Essentially I stopped programming in actual BASIC 2.0 within a couple of months.

     

    C

    C is effective for writing code that fills up "main RAM".  It's expressive enough and seems parsimonious enough to allow sane usage of memory.  Pointers to structures are really a great way to use banked RAM.

     

  15. Oh, I really like the automatic speed adjustment.  That would be super-useful for our relatively higher-tech world, where an extra ACK is a worthwhile tradeoff for high speeds.

    I also like "elegant".   And symmetric.  And cheap.  And easy.

    So two signal lines, like ATN and DAT kinda.  When one wants to send to the other, it pulls the ATN low.  The other calculator acknowledges by pulling DAT low.  That's the handshake?!

    Packets... one byte for the protocol ID (calculator version), one byte for packet type, two bytes for message length, the message, and then a checksum.  Elegant.
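    That layout is easy to sketch in C.  I believe the checksum is the 16-bit sum of the payload bytes, sent low byte first -- treat that as an assumption to verify against the TI link docs:

```c
#include <stdint.h>
#include <stddef.h>

/* Sketch of the packet layout described above: one ID byte, one type
 * byte, a two-byte little-endian length, the payload, then a two-byte
 * checksum.  Checksum is assumed to be the 16-bit sum of payload bytes,
 * low byte first -- verify against the protocol docs.
 * Returns the total packet size written to out. */

size_t build_packet(uint8_t id, uint8_t type,
                    const uint8_t *data, uint16_t len, uint8_t *out) {
    uint16_t sum = 0;
    uint16_t i;
    out[0] = id;                          /* protocol / machine ID */
    out[1] = type;                        /* packet type */
    out[2] = (uint8_t)(len & 0xFF);       /* length, little-endian */
    out[3] = (uint8_t)(len >> 8);
    for (i = 0; i < len; i++) {
        out[4 + i] = data[i];
        sum += data[i];                   /* 16-bit running checksum */
    }
    out[4 + len] = (uint8_t)(sum & 0xFF); /* checksum, low byte first */
    out[5 + len] = (uint8_t)(sum >> 8);
    return (size_t)len + 6;
}
```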

    I even like the connector... I mean you could use an RCA jack for the I/O port for goodness' sake.

    ***

    So let's see... how does that work...

    I suppose I have to assume that delays are not in DETECTING SIGNALS, but rather in marshaling the data.  That removes one variable: the time needed to present a bit to the receiver. 

    (If I'm wrong, then I don't know what to do.)

    I'll go look up two-line protocols now...

     

     

  16. On 1/27/2022 at 1:29 AM, kliepatsch said:

    Well, rje has been talking about an ADSR manager for a while now, and I was interested in his wishes specifically. My impression was that he doesn't want a second Concerto, but rather address a simple problem: currently you cannot make a sound with a single line of code (or two). I may be wrong ...

    I have code that can set up a voice -- that's the easy part.  What I lack is the thing that "curates" the sound through an envelope in a "fire and forget" manner.

    The ADSR manager is the piece that requires working off of an interrupt to "curate" a played note, so to speak.  In my mind, it would be handled by assembly code, since as an interrupt process it should be as efficient as possible.

    Envelope_Manager:     ; voice is in X?

        load status, indexed by X
       ...dispatch based on state...

    Attack: 

       increase volume by a_ratio,x
       increment status if it's at max_vol and fall through
       else rts

    Decay:

       decrement volume by d_ratio,x
       increment status if it's at decay_vol and fall through
       else rts

    Sustain:

       increment status if sustain is done and fall through
       else rts

    Release:

       decrement volume by r_ratio,x
       turn off voice and mark done if it's at zero_volume

       rts
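    The same sketch as a C state machine, ticked once per interrupt per voice.  Volumes use VERA PSG's 0-63 range; the field names and rates are illustrative, not from any existing library:

```c
/* C version of the envelope sketch above: one state machine per voice,
 * ticked once per vsync interrupt.  Volumes are 0..63 (the VERA PSG
 * range); rates and names are illustrative.  Sustain here just counts
 * ticks rather than waiting for a note-off. */

enum env_state { ENV_ATTACK, ENV_DECAY, ENV_SUSTAIN, ENV_RELEASE, ENV_DONE };

typedef struct {
    enum env_state state;
    int vol;            /* current volume, 0..63 */
    int max_vol;        /* peak reached at end of attack */
    int sus_vol;        /* level held during sustain */
    int sus_ticks;      /* how many ticks to hold sustain */
    int a_rate, d_rate, r_rate;
} envelope;

/* One interrupt's worth of "curating" for one voice. */
void env_tick(envelope *e) {
    switch (e->state) {
    case ENV_ATTACK:
        e->vol += e->a_rate;
        if (e->vol >= e->max_vol) { e->vol = e->max_vol; e->state = ENV_DECAY; }
        break;
    case ENV_DECAY:
        e->vol -= e->d_rate;
        if (e->vol <= e->sus_vol) { e->vol = e->sus_vol; e->state = ENV_SUSTAIN; }
        break;
    case ENV_SUSTAIN:
        if (--e->sus_ticks <= 0) e->state = ENV_RELEASE;
        break;
    case ENV_RELEASE:
        e->vol -= e->r_rate;
        if (e->vol <= 0) { e->vol = 0; e->state = ENV_DONE; }  /* voice off */
        break;
    case ENV_DONE:
        break;
    }
}
```

    Fire-and-forget then just means: set up the voice, initialize its envelope struct, and let the interrupt handler call env_tick on every active voice, copying e->vol into the PSG volume register.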
       
