PS/2 Direction for the Commander X16



I'm glad there are people to step up and take managing this community on, and there is time to figure out a way forward. 

To me the site IS the product at the moment. While we patiently wait for a shipping X16, there is so much here to enjoy:

  • the forum with all its users, whose posts range from the technical to the social (with their own wit, insight and opinions),
  • the library of software available, and
  • the ability to run a web emulator.

I believe the majority here will gladly become consumers because they are part of the community. Once the hardware ships, I hope there is an explosion of more free software here, rather than it being spread far and wide across the interweb.

Finally, this site has the best forum implementation I've ever seen. Kudos to those who built it, those who will manage this community, and those who contribute content.


On 12/31/2021 at 6:26 PM, BruceMcF said:

That would seem to be the direct solution ... pull the PS/2 clock line low while leaving the data line high before starting to send data to the master over I2C, and release it when done.

I still think using I2C will make matters more complicated, as evidenced by the fact that we do not yet have a functional keyboard.

The Veronica keyboard that I mentioned above does not need to disable the PS/2 line during operation. After receiving the 11th bit of a PS/2 frame, the keyboard controller puts the received byte directly onto the shift register, where it is then available to the 6522. There is enough time to do this before the next PS/2 start bit arrives. The 6502 may then read the byte from the 6522 with a simple LDA instruction. In other words, the 6502 and 6522 are used as designed. We shouldn't fight that.

I don't know, but it feels like PS/2 wasn't designed to be disabled after every scan code. There is, for instance, no standard saying how quickly the PS/2 device must start sending again when enabled; it only says that it cannot start before 50 µs have passed.


On 1/1/2022 at 3:01 AM, Stefan said:

I still think using I2C will make matters more complicated, as evidenced by the fact that we do not yet have a functional keyboard.

The Veronica keyboard that I mentioned above does not need to disable the PS/2 line during operation. After receiving the 11th bit of a PS/2 frame, the keyboard controller puts the received byte directly onto the shift register, where it is then available to the 6522. There is enough time to do this before the next PS/2 start bit arrives. The 6502 may then read the byte from the 6522 with a simple LDA instruction. In other words, the 6502 and 6522 are used as designed. We shouldn't fight that.

I don't know, but it feels like PS/2 wasn't designed to be disabled after every scan code. There is, for instance, no standard saying how quickly the PS/2 device must start sending again when enabled; it only says that it cannot start before 50 µs have passed.

I keep looping back to the same issue… a dedicated keyboard controller just makes more sense. IBM did it for a reason. I don’t care whether the interface between the keyboard controller and the computer is serial or parallel… but letting an AVR handle the low level PS/2 implementation really does make sense. 


On 1/1/2022 at 2:01 AM, Perifractic said:

Off-topic: Just to clarify a little further, yes, I am still hosting the site. The site's creation came about with the endorsement of the whole X16 dev team on Slack, and indeed everyone seemed quite pleased with it initially. It even allowed David to shut down Murray2, the forum he ran with his brother, and move its archives here. The team were all encouraged to post official updates here first and copy the forum URL to Facebook. However, I think people just find Facebook quicker, and ultimately I couldn't force the team to keep updates going here without starting to sound like I was nagging them each time 😅

Still, I think it's a wonderful site, and @MattGrandis and I are very pleased with it, particularly the software library with its built-in emulator. You don't see that very often elsewhere!

That is why, when there were no other offers from the team to keep the site alive, I recently financed another year of it even though I had stepped back from the X16 project itself. There's a ton of time and work here that would've been lost, and hundreds of people disappointed.

We're open to passing the baton to the community as long as the site is properly maintained and moderated. Feel free to DM @MattGrandis and me in a group message if you are serious about taking it on. I'm already in discussion with one member as well, but we have about 8 months before we have to make the switch, so no rush.

Thanks for your belief in the site as a cool place to be and a useful ecosystem. Happy new year! 

Thanks @Perifractic for all the fantastic work you have put into the site.

I think this is one of the most beautiful and functional forums I have ever used. It is a true role model for a great community site. Love the design of it!


Getting back to the topic of this thread, why is supporting PS/2 so difficult? Is it that the rest of the architecture of the original X16 does not play well with PS/2?

It seems peculiar that others have implemented PS/2 keyboards and mice with zero fanfare or difficulty, while the X16 is awaiting the return [to the project] of one person (I don't know the name of the person; somebody who wrote the KERNAL, I believe).


On 1/2/2022 at 12:13 PM, EMwhite said:

Getting back to the topic of this thread, why is supporting PS/2 so difficult? Is it that the rest of the architecture of the original X16 does not play well with PS/2?

It seems peculiar that others have implemented PS/2 keyboards and mice with zero fanfare or difficulty, while the X16 is awaiting the return [to the project] of one person (I don't know the name of the person; somebody who wrote the KERNAL, I believe).

One thing to be careful about is that somebody successfully interfacing with the particular PS/2 keyboard they have doesn't guarantee that the same code will work with every keyboard that obeys the PS/2 spec.

One possibility, which is waiting on Michael Steil having time for the X16 project to open up again, is that the timeout in the 65C02/6522 code is simply too short when the code runs at 8 MHz, and adjusting the timeout will fix the issue.

Edited by BruceMcF

On 1/2/2022 at 9:13 AM, EMwhite said:

Getting back to the topic of this thread, why is supporting PS/2 so difficult? Is it that the rest of the architecture of the original X16 does not play well with PS/2?

It seems peculiar that others have implemented PS/2 keyboards and mice with zero fanfare or difficulty, while the X16 is awaiting the return [to the project] of one person (I don't know the name of the person; somebody who wrote the KERNAL, I believe).

I'm mostly interested in the hardware on this project, but your post prompted me to look into the PS/2 code. I think we are seeing the result of two factors in the way the X16 handles PS/2:

  1. KERNAL inhibits PS/2 communication except when servicing the keyboard.
  2. KERNAL polls for a fixed time after resuming PS/2 communication.

The code for this is in kernal/drivers/x16/ps2.s:

[code]
ps2_receive_byte:
; set input, bus idle
    lda port_ddr,x ; set CLK and DATA as input
    and #$ff-bit_clk-bit_data
    sta port_ddr,x ; -> bus is idle, keyboard can start sending

    lda #bit_clk+bit_data
    ldy #10 * mhz
:    dey
    beq lc08c
    bit port_data,x
    bne :- ; wait for CLK=0 and DATA=0 (start bit)
[/code]

That "ldy #10 * mhz" is the delay. In this case it is 10 * 8 = 80 loops. The last 4 lines are executed 80 times waiting to see the PS/2 start condition (both CLK and DATA going low). If there is no start condition, the code re-inhibits communication and returns. I do not believe that the PS/2 spec provides a guarantee for how long after the host resumes communication before the device must send a start a bit if it has data ready. The device itself probably has an always running PS/2 clock so it probably won't send the start bit until the clock after it detects that the host has de-inhibited traffic.

I suspect that this is the problem hitting the X16. At the higher clock frequency, 80 loops is not a long enough delay for some keyboards. After seeing nothing for 80 loops, the KERNAL re-inhibits the device before it has had a chance to send a start bit. At 2 MHz, the delay is 4x longer and is adequate, again, for some keyboards. This inhibit/poll approach is not what the PS/2 interface was designed for. Inhibiting communication is expected to be a rare event (e.g. during BIOS/UEFI initialization at system boot). The keyboard needs to be attached to a controller that doesn't keep communication inhibited and is ready to consume data when it is available. In other words, it needs a dedicated controller or a host keyboard handler that is interrupt driven.

The issue might be fixed for now by changing the loop count from 10 * mhz to 40 * mhz.
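
As a rough back-of-the-envelope estimate (approximate cycle counts, so treat these as ballpark figures): each pass through that polling loop (DEY, BEQ, BIT abs,X, BNE) is on the order of 11 clock cycles, so 80 passes is only about 880 cycles, or roughly 110 µs at 8 MHz. Changing the constant to 40 * mhz would stretch that window to roughly 440 µs.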


I made two different changes to that code that were compiled and published in this thread on August 25. You need a real board to test them. I don't know if that was ever done by Kevin.

The real problem is, as @Wavicle pointed out, that the PS/2 protocol wasn't designed to be used like this. There seems to be no standard on when a PS/2 device must become active after being disabled. The standard just says that it may not happen sooner than 50 microseconds after the host has released the clock. It could be 50 microseconds, 100 microseconds, or any other duration. The standard doesn't prevent it from differing from one time to the next, even on the same device (not very likely, though). And different keyboards could have different delays, and so on.

Edited by Stefan

They (somebody) funded a few thousand keyboards of a specific model, so why not just get working code for THAT and call it a day?

I bought the WASD Commander keyboard (natively USB), which had to be flashed to work as a PS/2 keyboard; it works on my Foenix C256 U+ but not on my vintage DEC VT510 terminal. Meanwhile, I have a Cherry small-form keyboard which is USB, also supports PS/2, and works without fuss in both.

No doubt there are challenges with PS/2, but if the platform/code can be written to support the WASD Commander and the cheap-as-chips (crisps, not ICs) Periboard, isn't that easier?


On 1/3/2022 at 7:20 AM, EMwhite said:

They (somebody) funded a few thousand keyboards of a specific model, so why not just get working code for THAT and call it a day?

I bought the WASD Commander keyboard (natively USB), which had to be flashed to work as a PS/2 keyboard; it works on my Foenix C256 U+ but not on my vintage DEC VT510 terminal. Meanwhile, I have a Cherry small-form keyboard which is USB, also supports PS/2, and works without fuss in both.

No doubt there are challenges with PS/2, but if the platform/code can be written to support the WASD Commander and the cheap-as-chips (crisps, not ICs) Periboard, isn't that easier?

I think the issue is a little bigger than just the inhibit/poll cycle. Once the start bit is seen, the X16 code is committed to the transfer completing, even though that is out of its control. It appears that a poorly timed glitch from the keyboard could send the interrupt service routine into an infinite loop, effectively locking up the computer. E.g.:

[code]
lc04a:    bit port_data,x
    bne lc04a ; wait for CLK=0 (ready)
[/code]

The PS/2 bus wires are supposed to be open collector, reading as logic high when not actively driven low. If for any reason the keyboard microcontroller doesn't pull the line low when it is expected to, that loop will never exit.
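
Just as a concept sketch (a hypothetical Arduino-style helper for a microcontroller host, not the actual KERNAL fix), the usual defence is to put a retry limit on every blocking wait, so a glitched frame gets abandoned instead of hanging the machine:

[code]
// Hypothetical helper, Arduino-style C: wait for the PS/2 clock to go low,
// but give up after a bounded number of polls instead of spinning forever.
const uint8_t PS2_CLK = 2;  // hypothetical clock pin

bool waitClockLow(uint16_t maxPolls) {
  while (digitalRead(PS2_CLK) == HIGH) {
    if (maxPolls-- == 0) return false;  // timed out: caller aborts the frame
  }
  return true;
}
[/code]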


On 1/2/2022 at 11:40 PM, Stefan said:

There seems to be no standard on when a PS/2 device must become active after being disabled. The standard just says that it may not happen sooner than 50 microseconds after the host has released the clock. It could be 50 microseconds, 100 microseconds, or any other duration. The standard doesn't prevent it from differing from one time to the next, even on the same device (not very likely, though).

I've bit-banged the PS/2 protocol by inhibiting the clock and checking it periodically (15 times per second). I tested against a wide range of keyboards and found it can take up to 4000 µs for some keyboards to respond when the clock is released. And I would say it is often the case that a keyboard will respond fast (within 100 µs) one moment and then take a while (>2000 µs) the next, on a fairly random basis.

I have the Perixx keyboard (the same model planned for this project) and it tends to be on the temperamental side. I think that's why bit-banging with the 6502 has been abandoned and the plan is now to use the AVR processor for the keyboard interface.
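
For what it's worth, here is a rough Arduino-style sketch of how such a release-to-start-bit delay can be measured (hypothetical pin number and timings, not production firmware; the reading is only meaningful when the keyboard has a scan code pending, e.g. a key was pressed while the bus was inhibited):

[code]
// Measuring how long a keyboard takes to start clocking after the host
// releases the PS/2 clock line. Hypothetical pin number; Arduino-style C.
const uint8_t PS2_CLK = 2;

void inhibit() { digitalWrite(PS2_CLK, LOW); pinMode(PS2_CLK, OUTPUT); } // hold CLK low
void release() { pinMode(PS2_CLK, INPUT_PULLUP); }                      // let CLK float high

// Returns µs from releasing CLK until the device pulls CLK low (start of a frame),
// or 0 if nothing arrived within the timeout.
unsigned long measureResponse(unsigned long timeoutUs) {
  release();
  unsigned long t0 = micros();
  while (digitalRead(PS2_CLK) == HIGH) {
    if (micros() - t0 > timeoutUs) { inhibit(); return 0; }
  }
  unsigned long dt = micros() - t0;
  // ...clock in the 11-bit frame here, then inhibit() again...
  return dt;
}

void setup() { Serial.begin(9600); inhibit(); }

void loop() {
  delay(66);                                   // roughly 15 polls per second
  unsigned long dt = measureResponse(5000UL);  // allow up to 5 ms
  if (dt) Serial.println(dt);                  // delay in microseconds
}
[/code]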


That is really interesting info, @SolidState.

As to using I2C as the transport layer between the ATTINY and the 65C02:

I tried to measure the time it takes to run the Kernal function i2c_read_byte. Using the clock counter at $9fb8-9fbb I came to about 1,200 clock cycles. Manual counting of the Kernal code gave a similar result, but I didn't count every code path.

1,200 clock cycles is 150 µs @ 8 MHz.

It's clear that the ATTINY cannot be listening for incoming PS/2 data at the same time it makes an I2C transfer taking that long. The data valid period in the PS/2 protocol is much less than 150 µs, as I understand it.

This means that if you are trying to use I2C to transfer scan codes to the processor, you must inhibit the PS/2 line while doing so.

It feels wrong to do this, but it might work anyway. Even if the time it takes for the keyboard to come alive again after being disabled is 5,000 µs, there is room for about 200 scan codes per second (1,000,000 µs / 5,000 µs = 200).

Edited by Stefan

On 1/4/2022 at 11:48 AM, Stefan said:

It's clear that the ATTINY cannot be listening for incoming PS/2 data at the same time it makes an I2C transfer taking that long. The data valid period in the PS/2 protocol is much less than 150 µs, as I understand it.

This means that if you are trying to use I2C to transfer scan codes to the processor, you must inhibit the PS/2 line while doing so.

It feels wrong to do this, but it might work anyway. Even if the time it takes for the keyboard to come alive again after being disabled is 5,000 µs, there is room for about 200 scan codes per second (1,000,000 µs / 5,000 µs = 200).

Why can't the ATTiny listen for incoming PS/2 data while handling an I2C transfer? I took a quick look at my handler: I attached an interrupt to the falling edge of the PS/2 clock pin, and I can receive and ACK a transfer entirely in the ISR. I didn't spend any time stress-testing the setup, but I can't think of a reason why servicing one interface would block the other.
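
As a minimal sketch of that approach (hypothetical pin numbers and buffer size, parity checking and host-to-device traffic omitted; a concept illustration rather than the actual firmware):

[code]
// PS/2 receive driven entirely by a falling-edge interrupt on the clock pin.
const uint8_t PS2_CLK  = 2;   // must be an interrupt-capable pin
const uint8_t PS2_DATA = 3;

volatile uint16_t shiftReg = 0;
volatile uint8_t  bitCount = 0;
volatile uint8_t  scanBuf[16];          // ring buffer of received scan codes
volatile uint8_t  head = 0, tail = 0;   // overflow handling omitted

void ps2ClockFalling() {
  shiftReg |= (uint16_t)digitalRead(PS2_DATA) << bitCount;
  if (++bitCount == 11) {               // start + 8 data (LSB first) + parity + stop
    scanBuf[head] = (shiftReg >> 1) & 0xFF;  // keep just the 8 data bits
    head = (head + 1) & 0x0F;
    bitCount = 0;
    shiftReg = 0;
  }
}

void setup() {
  pinMode(PS2_CLK, INPUT_PULLUP);
  pinMode(PS2_DATA, INPUT_PULLUP);
  attachInterrupt(digitalPinToInterrupt(PS2_CLK), ps2ClockFalling, FALLING);
}

void loop() {
  // Main code runs here; scan codes are drained from scanBuf by the I2C side.
}
[/code]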


As an I2C slave, the ATTINY cannot make too many assumptions about the data valid period on the I2C bus. I guess that is why, in my head, I ruled out the ATTINY serving both the PS/2 and I2C lines simultaneously.

But you have tested this in hardware, and I see that it might work as you describe.

I did some manual clock cycle counting on the Kernal I2C send_bit function to calculate how long the clock line is held high.

  • The clock transition from low to high happens at i2c.s line 223
  • The clock transition from high to low happens at line 210
  • Between those lines there are about 24 clock cycles = 3 µs @ 8 MHz

I don't know, but is it correct to say that the handlers for both the I2C and the PS/2 must run within that time to guarantee that you don't lose data?

EDIT: By the way, I see that the ATTINY861 has hardware support for I2C (USI). Did you use this in your test, or was the I2C bit-banged? I was assuming the latter, but maybe that wasn't right. I would need to read more about USI.

Edited by Stefan

On 1/4/2022 at 9:01 PM, Stefan said:

As an I2C slave, the ATTINY cannot make too many assumptions about the data valid period on the I2C bus. I guess that is why, in my head, I ruled out the ATTINY serving both the PS/2 and I2C lines simultaneously.

But you have tested this in hardware, and I see that it might work as you describe.

I did some manual clock cycle counting on the Kernal I2C send_bit function to calculate how long the clock line is held high.

  • The clock transition from low to high happens at i2c.s line 223
  • The clock transition from high to low happens at line 210
  • Between those lines there are about 24 clock cycles = 3 µs @ 8 MHz

I don't know, but is it correct to say that the handlers for both the I2C and the PS/2 must run within that time to guarantee that you don't lose data?

The ATTiny contains a two-wire interface hardware block that can be used for I2C. That hardware block can handle an I2C clock significantly faster than what the physical interface can likely support with typical pullups (on a long I2C bus with average pullup strength, the clock signal looks like a line of shark fins instead of a regular square wave). The ATTiny doesn't have to run code on every bit received if the firmware is enabling and using the I2C hardware.

The way I had implemented the PS/2 interface, the ATTiny did need to run code on every bit received at the PS/2 interface, but that is a relatively slow interface. It wasn't very much code until the last bit (#11) was received and the code checked whether it should ACK or NAK the transfer. The ATTiny has an essentially RISC architecture, and using the internal oscillator the bootloader can select the CPU clock to run at a number of values between 31.25 kHz and 16 MHz, with 8 MHz and 16 MHz being the most common (16 MHz is the default in the Arduino ATTiny library, I think). The clock system is described in chapter 6 of the datasheet I have and has this diagram:
[Attached image: clock distribution diagram from the ATtiny datasheet]

In any case, please take whatever I say with an appropriately sized grain of salt. I only know what I did when I built a prototype in a short amount of time (I think it was most of a day on a weekend; I honestly do not remember, but my sketch was called "blink", so you can probably deduce which example sketch I started from, and also that I didn't take the time to rename it). I am not aware of what the dev team has decided.
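
To make the division of labour concrete, here is a minimal sketch of the I2C slave side. I am assuming a USI-based slave library such as TinyWireS purely for illustration; that is an assumption about the toolchain, not what the dev team is actually using. The request handler only runs when the 65C02 clocks a read, while PS/2 bits keep arriving via their own interrupt (as in the sketch above), so neither interface has to poll the other.

[code]
#include <TinyWireS.h>            // assumed USI-based I2C slave library

const uint8_t I2C_ADDR = 0x42;    // hypothetical slave address

// Ring buffer filled by the PS/2 clock ISR (see the earlier sketch)
volatile uint8_t scanBuf[16];
volatile uint8_t head = 0, tail = 0;

// Called when the master (the 65C02) reads a byte from us.
void onI2CRequest() {
  if (head == tail) {
    TinyWireS.send(0x00);         // nothing pending
  } else {
    TinyWireS.send(scanBuf[tail]);
    tail = (tail + 1) & 0x0F;
  }
}

void setup() {
  TinyWireS.begin(I2C_ADDR);
  TinyWireS.onRequest(onI2CRequest);
}

void loop() {
  // Nothing needed here for this sketch; both interfaces are serviced by hardware/interrupts.
}
[/code]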


On 1/2/2022 at 5:13 PM, EMwhite said:

Getting back to the topic of this thread, why is supporting PS/2 so difficult? Is it that the rest of the architecture of the original X16 does not play well with PS/2?

It seems peculiar that others have implemented PS/2 keyboards and mice with zero fanfare or difficulty, while the X16 is awaiting the return [to the project] of one person (I don't know the name of the person; somebody who wrote the KERNAL, I believe).

PS/2 devices can behave very differently, and many are put through PS/2-to-USB converters. Generally it seems that cheap and basic keyboards are more reliable than those with lots of gadgets, Bluetooth and so on.

There are three solutions to this.

  1. Connect the clock to an interrupt. You could maybe connect it to NMI, except you'd want to be able to gate it for some hardware, making it effectively another IRQ.
  2. Have some sort of clocked shift register design, or a CPLD/FPGA connection that does the work for you.
  3. Have a microcontroller do it similarly.

Polling is a *bad* idea. There have been microterminals built around Atmel AVRs for years - I built a couple of 1802-based clones around the 8515 - but they used matrix keyboards like the C64/Spectrum, which you can address how you like, when you like.

Nobody managed to get display and PS/2 working together - they'd each been working separately for ages - until the AVR1284, I think, which has some clockable register that does serial input or something; I can't recall. You then started getting projects like this https://www.instructables.com/Single-Chip-AVR-BASIC-Computer/ which were delayed by this limitation, unless you wanted something that worked like a ZX80 and blinked.

 


I don't mean to sound like Paul Scott Robson, so I'll try to dance around the issue a bit.

 

At work, we periodically have to assess whether what we're doing is worth the effort we're putting into it.
I understand that sometimes, business is driving the work, and there's nothing we can do about it.
But it never hurts to let business know the issues and how much trouble they will cause.

 

I hear that USB is more expensive than PS/2, which is cheap. But... I'm starting to wonder if PS/2 is not really cheap, but rather pushes the time, expense, and complexity back onto the board.

Do we really want home-brew complexity to drive a modern keyboard -- a solved problem?  

 

I would much rather that your efforts went into the KERNAL and toys for the banked ROM.

 

"I have this awesome retro machine, but 50% of the build time and effort goes into the keyboard interface, which has nothing to do with the retro nature of the machine."

 

 

And what's the QA effort around debugging keyboard routines?  

 

Or is this the kind of fun everyone signed up for?

 

Edited by rje

My guess is that the community is not interested in the keyboard solution per se. We just want it to become functional so that we can get on with what really interests us: eventually owning and using an X16.

I agree that the team should choose a proven design that is easy to implement.

@Wavicle, do you see any problem supporting both keyboard and mouse with the ATTINY + I2C solution?


On 1/6/2022 at 11:29 AM, Stefan said:

My guess is that the community is not interested in the keyboard solution per se. We just want it to become functional so that we can get on with what really interests us: eventually owning and using an X16.

Yes, no doubt about that.  And the group has enough energy to talk about technical details.

 


On 1/6/2022 at 9:29 AM, Stefan said:

My guess is that the community is not interested in the keyboard solution per se. We just want it to become functional so that we can get on with what really interests us: eventually owning and using an X16.

I agree that the team should choose a proven design that is easy to implement.

@Wavicle, do you see any problem supporting both keyboard and mouse with the ATTINY + I2C solution?

Yeah, I think most of us want it to "just work" and are less concerned with the specifics. More to the point, microcontroller devices are hardly new, and the first examples date back to 1971 with the i4004 microcontroller.

 

 


On 1/6/2022 at 10:46 AM, rje said:

Yes, no doubt about that.  And the group has enough energy to talk about technical details.

 

Fair enough. Given that we have at least another year to wait for hardware (thanks, COVID), there's not much else to talk about. 😃


On 1/6/2022 at 1:46 PM, TomXP411 said:

Fair enough. Given that we have at least another year to wait for hardware (thanks, COVID), there's not much else to talk about. 😃

LOL, well that's absolutely true.  People gravitate towards what they like to do, and there's a wide spectrum of stuff here.

 

