Posts posted by rje

  1. Good morning, retro programmers.

    I'm writing in C with CC65 on the X16.

    In at least four projects, I've wanted to process user input.  In two cases, I went with a full tokenizer.  In the other two cases, there's only a limited set of interactions.  But they've got enough complexity that I'm doing sscanf and if/then's on string comparison.

    I'm wondering if there's something in between that can process typed user input, but isn't a "full" tokenizer?

    Maybe a little tokenizer can just be "directly" (?) connected to execution.  I dunno.  Thoughts?
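    One possible middle ground is a verb table: split off the first word, look it up in a small array of command entries, and hand the rest of the line straight to a handler function. No real tokenizer, just dispatch. A minimal sketch, with all names invented for illustration:

```c
#include <assert.h>
#include <string.h>

/* A hypothetical middle ground: no tokenizer, just a verb table.
   Split off the first word, look it up, and pass the rest of the
   line to a handler. All names here are made up for illustration. */

typedef void (*Handler)(const char *args);

static char last_cmd[16];   /* records which handler ran, for demo */

static void do_look(const char *args) { strcpy(last_cmd, "look"); (void)args; }
static void do_take(const char *args) { strcpy(last_cmd, "take"); (void)args; }

static const struct { const char *verb; Handler fn; } commands[] = {
    { "look", do_look },
    { "take", do_take },
};

/* Returns 1 if a verb matched, 0 otherwise. Modifies `line`. */
int dispatch(char *line)
{
    char *verb = strtok(line, " ");
    char *args = strtok(NULL, "");   /* rest of the line; may be NULL */
    unsigned i;
    if (verb == NULL) return 0;
    for (i = 0; i < sizeof commands / sizeof commands[0]; ++i) {
        if (strcmp(verb, commands[i].verb) == 0) {
            commands[i].fn(args ? args : "");
            return 1;
        }
    }
    return 0;
}
```

    The table is the whole "grammar" -- adding a command is one line -- which keeps it well short of a real tokenizer while still avoiding the sscanf/if-then pile.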



  2. I think the best networking thingie for the X16 is the one that gets built.


    That might be RS-232.  It might be a wi-fi enabled SD card.  Could be the neato TI calculator link protocol. 


    I think a protocol gets points for 

    (a) getting there first
    (b) ease of construction
    (c) parts price


    I don't think ease of use gets points, even though that's what I would want.  A messy protocol probably still wins, if people who want it can build it or get it.


    It seems to me that the more popular protocols (RS-232) have a leg up: an RS-232 networking thingie can be tested against existing tech.


  3. On 1/28/2022 at 10:10 PM, ZeroByte said:

    I forget exactly every step I did in Flappy Bird, but it does IRQ handling in C.

    Basically, I made a #define for the IRQ vector in RAM ($030-something) so it looks like a variable, then I did some inline assembly to SEI, then IRQ_VECTOR = &irq_handler;

    Something akin to that.

    THAT might work.  Better than waitvsync() anyhow.
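    The trick ZeroByte describes can be sketched as casting the vector's address into a function-pointer lvalue. Below, a simulated RAM array stands in for low memory so the pattern runs on a host; on a real CBM-family machine the RAM IRQ vector sits at a fixed address (commonly $0314), and the assignment would be bracketed by SEI/CLI inline assembly:

```c
#include <assert.h>

/* Sketch of the "IRQ vector as a variable" trick. A fake RAM array
   stands in for low memory so this runs and tests on a host; on
   hardware the macro would point at the real vector address. */

typedef void (*IrqFn)(void);

static unsigned char fake_ram[0x400];           /* stand-in for low RAM */
#define IRQ_VECTOR (*(IrqFn *)(fake_ram + 0x0314))

static int fired = 0;
static void irq_handler(void) { fired = 1; }

void install_handler(void)
{
    /* On hardware, wrap this in SEI/CLI inline assembly so the
       vector is never half-written when an interrupt arrives. */
    IRQ_VECTOR = irq_handler;
}
```

    After install_handler(), anything that jumps through the vector lands in irq_handler -- which is exactly what the hardware IRQ would do.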

  4. On 1/26/2022 at 9:36 AM, SlithyMatt said:

    Mainly, you need to make sure you are doing the scrolling during VBLANK by putting it at the beginning of your VSYNC interrupt routine. That should clean it up a lot.

    It doesn't work.  By that I mean it DOES work, but C is so slow it reduces the game to a crawl.

    The function I tried using is called waitvsync(), strangely enough.  Standard with cc65's <cbm.h>.

  5. On 1/29/2022 at 4:09 AM, svenvandevelde said:

    Very nice demo. Curious how you created the dynamic tile algorithm.

    I overlapped the sprites significantly -- 8 pixels on every edge I think.  

    The result was absolutely horrible refresh.


    Today I got rid of the overlap and reverted to plain square sprites instead of ones with contoured shorelines.  It looks uglier but moves MUCH MUCH smoother.

    The current view is only a bit more than 5 x 6 "squares".  I also added a regional map view -- typing 'm' draws a 50 x 50 PETSCII map.  That helps get my bearings, so to speak.  I might instead just have a 20 x 20 mini-map on a corner of the normal view.

    • Like 1
  6. On 2/24/2022 at 4:01 PM, TomXP411 said:

    I wrote a little PSG tester in BASIC, and I'm pretty pleased with the PSG. Obviously, you have to manually control the volume level, pitch, and square wave duty cycle for the ADSR envelope, but it's pretty straightforward once you have a timer operating. 

    I, too, wrote PSG code in BASIC -- it plays Invention No. 13 of course.


    Hmm maybe you're right about the timer.




    • Like 1
  7. So I started writing BASIC on the X16 in Nov. 2019.  Then I started writing C via cc65 in Nov. 2020.  I've done only a tiny amount of assembly (I'd like to do more).

    Here are my impressions of coding on the X16.


    THE 6502

    I understand (and feel) the pull to use a 65816, but I'm fine with the 6502.



    The memory map is fine.  VERA access feels a little strangulated, but I abstract that with functions, and we can load data "directly" into VERA, which helps.  I no longer fret over accessing VERA for anything except the PSG.

    As I mentioned, the single-byte ports feel tight, and I've written abstraction functions to manage it. A larger window might be nice, maybe.  4 bytes could point to the X,Y registers of a sprite, or a single PSG voice.  8 bytes gets a full sprite definition.
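    The kind of abstraction function I mean can be sketched like this. The register layout is from the VERA reference (ADDR_L=$9F20, ADDR_M=$9F21, ADDR_H=$9F22, where bit 0 is address bit 16 and the top nibble selects the auto-increment); here a small array simulates the registers so the sketch runs on a host:

```c
#include <assert.h>

/* Sketch of a VERA address helper. Layout per the VERA reference:
   ADDR_L=$9F20, ADDR_M=$9F21, ADDR_H=$9F22 (bit 0 = address bit 16,
   top nibble = auto-increment). A simulated register file replaces
   the real volatile pointers to $9F20-$9F22 so this runs anywhere. */

static unsigned char vera_reg[3];   /* [0]=ADDR_L [1]=ADDR_M [2]=ADDR_H */

void vera_set_addr(unsigned long addr, unsigned char inc_nibble)
{
    vera_reg[0] = (unsigned char)(addr & 0xFF);
    vera_reg[1] = (unsigned char)((addr >> 8) & 0xFF);
    vera_reg[2] = (unsigned char)(((addr >> 16) & 0x01) | (inc_nibble << 4));
}
```

    With a helper like that, "point VERA at the PSG and auto-increment by one" becomes a single call instead of three register writes scattered through the code.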



    RAM banks are perfectly fine and extremely useful for storing piles of data.  They're especially effective with C, where I can cast structure pointers to hunks of memory in a bank.

    The 8K bank size seems fine.  8K is plenty for storing chunks of string data (like instructions or lists).  Multibank maps need only simple math, hidden in a macro or short function, to swap through banks.  One place a bank memory manager would be useful is when I'm storing a heap or hashtable in banked RAM, or tokenizing and parsing input into banks for temporary storage.  In those cases, fiddling with banks by hand gets bothersome.
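    The "simple math hidden in a macro" amounts to splitting a linear index into a bank number and an offset within the 8K window. A minimal sketch (the bank-select write is shown only as a comment, since the X16 addresses don't exist on a host):

```c
#include <assert.h>

/* Sketch of the bank math: a large data set is addressed by one
   linear index; each RAM bank holds 8K, visible at the $A000
   window. The actual bank-select write is commented out so the
   math itself runs and tests on a host. */

#define BANK_SIZE 8192UL

#define BANK_OF(i)   ((unsigned char)((i) / BANK_SIZE))
#define OFFSET_OF(i) ((unsigned)((i) % BANK_SIZE))

/* On the X16 this would be followed by something like:
       select_ram_bank(BANK_OF(i));                  // bank register
       p = (unsigned char *)0xA000 + OFFSET_OF(i);   // into the window
*/
```

    Because 8192 is a power of two, cc65 compiles the divide and modulo down to shifts and masks, so the "manager" really is just two macros.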



    Sprites are reasonably easy to handle -- especially with a light library that wraps the registers used.  The tricky part (for me) is creating the sprites which fit the palette and bit depth.  One person helpfully wrote a Python script which converts a PNG to a C-style array of bytes.  I've written a Perl script which uses that script and actually generates the loadable binary.



    The PSG is almost beyond my ability right now.  I think the lack of ADSR envelopes is (mostly) the reason.  I will have to adapt assembly code, such as Dusan's simple sound players, to add sound to anything I write.  And I'm not ready to attempt that.



    BASIC 2.0 is good as a "batch" language, for orchestrating more complex tools.  Its extensions (e.g. supporting hex notation) make it even more useful for this purpose. 

    BASIC is fast, for Commodore equipment.  It's still too slow for arcade-style games, but its speed and power are fine for rogue-like games.  I think simple versions of Ultima IV-like games *could* be done with BASIC and a few assembly routines.  And some patience and carefulness.

    BASIC 2.0 is most effective for programs that are 8K or smaller.  As your code grows past 8K, the mental load of its limited variable namespace and near-nonexistent structure makes it not worth your time.  I find it significant that BASIC thrived in the 1980s on machines limited to 4K-16K of RAM.

    BASIC 2.0 is not very fun to program in.  Very early on I wrote a Perl script that lets me write in a slightly better "BASIC" on my Mac, and transpiles down to BASIC 2.0.  Essentially I stopped programming in actual BASIC 2.0 within a couple of months.



    C is effective for writing code that fills up "main RAM".  It's expressive enough and seems parsimonious enough to allow sane usage of memory.  Pointers to structures are really a great way to use banked RAM.
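    The struct-pointer idiom I mean looks like this. Here an 8K buffer stands in for the $A000 banked window so the sketch runs on a host; on the X16, `window` would be `(unsigned char *)0xA000` and you'd select the bank first. The struct itself is hypothetical:

```c
#include <assert.h>

/* Sketch of the "cast a struct pointer into a bank" idiom. An 8K
   buffer simulates the banked window for host testing; on the X16
   it would be the real $A000 window after a bank select. */

typedef struct { unsigned char terrain; unsigned char elevation; } MapCell;

static unsigned char window[8192];      /* stand-in for banked RAM */

MapCell *cell_at(unsigned index)
{
    return (MapCell *)window + index;   /* pointer math in MapCell units */
}
```

    An 8K bank holds 4096 of these two-byte cells, and the compiler does all the offset arithmetic -- no peeks, no pokes, just `cell_at(i)->terrain`.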


  8. Oh, I really like the automatic speed adjustment.  That would be super-useful for our relatively higher-tech world, where an extra ACK is a worthwhile tradeoff for high speeds.

    I also like "elegant".   And symmetric.  And cheap.  And easy.

    So two signal lines, like ATN and DAT kinda.  When one wants to send to the other, it pulls the ATN low.  The other calculator acknowledges by pulling DAT low.  That's the handshake?!

    Packets... one byte for the protocol ID (calculator version), one byte for packet type, two bytes for message length, the message, and then a checksum.  Elegant.
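    That layout serializes naturally. A sketch of a packet builder -- note the one-byte sum checksum here is purely illustrative; the real link protocol's checksum may well differ:

```c
#include <assert.h>
#include <stddef.h>

/* Sketch of the packet layout described above: protocol ID, packet
   type, 16-bit little-endian length, payload, checksum. A one-byte
   sum is used for illustration only; the real protocol's checksum
   may differ. Returns the total bytes written. */

size_t build_packet(unsigned char *out, unsigned char proto,
                    unsigned char type, const unsigned char *msg,
                    unsigned len)
{
    unsigned i;
    unsigned char sum = 0;
    out[0] = proto;
    out[1] = type;
    out[2] = (unsigned char)(len & 0xFF);        /* length, low byte  */
    out[3] = (unsigned char)(len >> 8);          /* length, high byte */
    for (i = 0; i < len; ++i) {
        out[4 + i] = msg[i];
        sum = (unsigned char)(sum + msg[i]);
    }
    out[4 + len] = sum;
    return (size_t)(5 + len);
}
```

    Five bytes of overhead per message -- hard to get much more elegant than that.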

    I even like the connector... I mean you could use an RCA jack for the I/O port for goodness' sake.


    So let's see... how does that work...

    I suppose I have to assume that delays are not in DETECTING SIGNALS, but rather in marshaling the data.  That removes one variable: the time needed to present a bit to the receiver. 

    (If I'm wrong, then I don't know what to do.)

    I'll go look up two-line protocols now...



  9. On 1/27/2022 at 1:29 AM, kliepatsch said:

    Well, rje has been talking about an ADSR manager for a while now, and I was interested in his wishes specifically. My impression was that he doesn't want a second Concerto, but rather to address a simple problem: currently you cannot make a sound with a single line of code (or two). I may be wrong ...

    I have code that can set up a voice -- that's the easy part.  What I lack is the thing that "curates" the sound through an envelope in a "fire and forget" manner.

    The ADSR manager is the piece that requires working off of an interrupt to "curate" a played note, so to speak.  In my mind, it would be handled by assembly code, since as an interrupt process it should be as efficient as possible.

    Envelope_Manager:     ; voice is in X?

        load status, indexed by X
        ...dispatch based on state...

    attack:
        increase volume by a_ratio,x
        increment status if it's at max_vol and fall through
        else rts

    decay:
        decrement volume by b_ratio,x
        increment status if it's at decay_vol and fall through
        else rts

    sustain:
        increment status if sustain is done and fall through
        else rts

    release:
        decrement volume by r_ratio,x
        turn off voice and mark done when it hits zero volume

  10. OK, replaced the BASIC version with the C version.  It's more responsive, but the scrolling is terrible.  Also I don't like the way I did the map.  I have to rethink things and use fewer sprites if possible.


    For example, I use sprites to tile the ocean.  I shouldn't have to do that -- surely I can just use characters to represent the ocean.  Like a reverse period, or something. 

    Then, the land sprites themselves are memory hungry.  Each one is 64 x 64 and 8 bit pixels -- 4K!  Oink!  

    I think I need to go back to using "coastline" sprites for the edges.  8 x 64 and 64 x 8 sprites.  We'll see.


    And even after all that, there does appear to be an obvious redraw going on when the ship moves: the sprites appear to stagger.  In other words, C is not fast enough.

  11. On 1/26/2022 at 3:01 AM, kliepatsch said:

    Dropping the decay phase makes sense IMO, since in most cases, you either want a short sound (decay = release), or a sustained sound or beep (no decay phase, since you directly enter sustain phase).

    I can see that.  And that's a good simplification for an interrupt-driven envelope manager.  Thanks.

  12. On 1/26/2022 at 2:31 AM, kliepatsch said:

    @rje What kind of functionality would you think would be good for such a C library?


    I am thinking that the easiest would be to have a single function to which you pass all required parameters, like frequency, ADSR parameters, waveform. So you can call the function and then simply forget about it.

    Here are the functions I coded up in my proof-of-concept.

    void runVoice( unsigned voiceNumber, Voice* voice );
    void runVoiceWithEnvelope( unsigned voiceNumber, Voice* voice );
    int getTunedNote( unsigned index );
    void bang(unsigned frequency);

    Each voice has a dedicated envelope, so runVoiceWithEnvelope() can figure out the envelope's address.


    • Thanks 1
  13. Using a PSG noise register thingy might be a good idea. 


    I don't think the emulator has a true source of randomness.

    That said, my "Blinkenlights" demo displays several memory locations that change state while the program runs.

    They include a chunk of the CPU stack ($01D5 - $01E4), which is of course NOT random; the time registers (TM_SC, TM_MI, etc.), which are also NOT random; and the addresses from $9F64 through $9F69 and $9FB8 through $9FBB ("external devices"), which ... well, they surely can't be random either.


    • Like 1
  14. On 12/28/2020 at 4:30 PM, rje said:

    That said, I think some of the VERA bitfields required to use its sprite engine make those lovely sprites Harder To Use, which bothers me.  I haven't played with the sound generator in VERA yet, so I don't know how it compares with the SID.  I didn't really have complaints about the SID -- especially when Compute! magazine's SuperBASIC came along and packaged many sprite and sound commands into convenience statements in a BASIC wedge.

    I've worked a bit more with VERA Sprites and the PSG both, and I have an update.

    (1) The sprite bitfields are not as big a problem as I originally thought.  They are PAINFUL when programming in BASIC, but I suggest a thoughtful extension command would resolve that.  When using C, the bitfields are no problem at all.

    (2) The PSG has one major drawback, and that is the lack of ADSR envelopes.  The solution would have to be something like an interrupt-driven software ADSR system in assembly language, with a small bit of RAM set aside to manage state.  The 16 bit ABI can be used to pass parameters.

    Now I haven't used the synthesizer chip on the X16.  Maybe it's easier to use. 


  15. I wrestled with the code a bit, and am feeling tired over it.

    I feel like it's all going in wrong directions.  But that's no reason to throw it out.  That just requires some careful UX thought.


    That's what it needs:  an old-school but effective common user interface.


    It has a lot of menus.  And I'm sorry to say they don't all work the same way.  This implies that I need a common menu method that the various drivers call on to draw the user's input properly.


    • Like 1
  16. So far, my Kernal Test only tests these:

    1. CHROUT
    2. MEMTOP (read and write)
    3. MEMBOT (read and write)
    4. SETNAM, SETLFS, LOAD (implicitly though...)
    5. IOBASE
    6. SETTIM and RDTIM

    I'm looking for more easy KERNAL tests... I'm thinking the IEC Bus is not a trivial thing to test, so maybe I can test:

    1. Channel I/O calls?


    Any suggestions?

  17. Yep, I was messing with those opcodes, grouping them one way and another, thinking "surely a little decode can reduce size".  I'm sure Woz didn't decode because 300 bytes was the golden compromise for him.

