PS/2 Direction for the Commander X16

Stefan
Posts: 456
Joined: Thu Aug 20, 2020 8:59 am

PS/2 Direction for the Commander X16

Post by Stefan »


I made two different changes to that code, which were compiled and published in this thread on August 25. You need a real board to test them; I don't know if that was ever done by Kevin.

The real problem is, as @Wavicle pointed out, that the PS/2 protocol wasn't designed to be used like this. There seems to be no standard for when a PS/2 device must become active again after being disabled. The standard just says that it may not happen earlier than 50 microseconds after the host has released the clock. It could be 50 microseconds, 100 microseconds, or any other duration. The standard doesn't even rule out the delay varying from one transmission to the next with the same device (not very likely, though). And different keyboards could have different delays, and so on.

EMwhite
Posts: 220
Joined: Mon Sep 07, 2020 1:02 pm

PS/2 Direction for the Commander X16

Post by EMwhite »


They (somebody) funded a few thousand keyboards of a specific model so why not just get working code for THAT and call it a day?

I bought the WASD Commander kbd (natively USB), which had to be flashed to work as a PS/2 kbd; it works on my Foenix C256 U+ but not on my vintage DEC VT510 terminal. Meanwhile, I have a Cherry small-form kbd that is USB but also supports PS/2, and it works without fuss in both.

No doubt there are challenges with PS/2, but if the platform/code can be written to support the WASD Commander and the cheap-as-chips (crisps, not ICs) Periboard, isn’t that easier?

Wavicle
Posts: 284
Joined: Sun Feb 21, 2021 2:40 am

PS/2 Direction for the Commander X16

Post by Wavicle »



On 1/3/2022 at 7:20 AM, EMwhite said:




They (somebody) funded a few thousand keyboards of a specific model so why not just get working code for THAT and call it a day?



I bought the WASD Commander kbd (natively USB), which had to be flashed to work as a PS/2 kbd; it works on my Foenix C256 U+ but not on my vintage DEC VT510 terminal. Meanwhile, I have a Cherry small-form kbd that is USB but also supports PS/2, and it works without fuss in both.



No doubt there are challenges with PS/2, but if the platform/code can be written to support the WASD Commander and the cheap-as-chips (crisps, not ICs) Periboard, isn’t that easier?



I think the issue is a little bigger than just the inhibit-poll cycle. Once the start bit is seen, the X16 code is committed to waiting for the transfer to complete, even though completion is out of its control. It appears that a poorly timed glitch from the keyboard could send the interrupt service routine into an infinite loop, effectively locking up the computer. E.g.:

[code]
lc04a:  bit port_data,x
        bne lc04a       ; wait for CLK=0 (ready)
[/code]

The PS/2 bus wires are supposed to be open-collector lines, which read as logic high when not being actively driven low. If for any reason the keyboard microcontroller doesn't pull the line low when expected, that loop will never exit.
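
One way to harden that would be to bound the wait and bail out on a timeout. Below is a minimal sketch of the idea in C (the real kernal routine is 6502 assembly); PORT_DATA, CLK_MASK and the budget value are illustrative assumptions, not actual X16 symbols.

[code]
/*
 * Sketch of a bounded version of that wait.  PORT_DATA is a stand-in for
 * the memory-mapped VIA port; CLK_MASK and the timeout are assumptions.
 */
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

#define CLK_MASK 0x01u                         /* assumed bit position of PS/2 CLK */

static volatile uint8_t PORT_DATA = CLK_MASK;  /* stand-in for the VIA port        */

/* Returns true if CLK went low before the budget ran out, false on timeout. */
static bool wait_clk_low(uint16_t budget)
{
    while (PORT_DATA & CLK_MASK) {
        if (budget-- == 0)
            return false;                      /* give up instead of spinning forever */
    }
    return true;
}

int main(void)
{
    /* With nothing pulling CLK low, the wait times out instead of hanging. */
    printf("clock went low: %s\n", wait_clk_low(10000) ? "yes" : "no");
    return 0;
}
[/code]

On a timeout the host could abort the byte, re-inhibit the bus and let the keyboard retransmit, instead of locking up the machine.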

SolidState
Posts: 13
Joined: Sun Aug 22, 2021 11:53 pm

PS/2 Direction for the Commander X16

Post by SolidState »



On 1/2/2022 at 11:40 PM, Stefan said:




There seems to be no standard for when a PS/2 device must become active again after being disabled. The standard just says that it may not happen earlier than 50 microseconds after the host has released the clock. It could be 50 microseconds, 100 microseconds, or any other duration. The standard doesn't even rule out the delay varying from one transmission to the next with the same device (not very likely, though).



I've bit-banged the PS/2 protocol by inhibiting the clock and checking it periodically (15 times per second). I tested against a wide range of keyboards and found that it can take up to 4,000 us for some keyboards to respond once the clock is released. It is also quite common for a keyboard to respond fast (within 100 us) one time and then take much longer (>2,000 us) the next, on a fairly random basis.
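
For reference, one poll cycle looks roughly like the sketch below. The GPIO helpers are placeholders to be wired to whatever port the host actually uses, and the 4,000 us budget is just the worst response time I saw; none of this is code from the project.

[code]
/* Rough shape of the inhibit-poll cycle; helpers and constants are placeholders. */
#include <stdbool.h>
#include <stdint.h>

#define RESPONSE_BUDGET_US 4000            /* worst keyboard response time observed */

/* Placeholder GPIO helpers; wire these to the real port. */
static void drive_clk_low(void)   { /* pull CLK low (inhibit)          */ }
static void release_clk(void)     { /* let the open-collector line float */ }
static bool clk_is_low(void)      { return false; /* no device in this stub */ }
static void delay_us(uint16_t us) { (void)us; }

/* Called roughly 15 times per second.  Returns true if the keyboard started
 * clocking out a frame before the budget ran out. */
static bool poll_keyboard(void)
{
    release_clk();                          /* allow the device to transmit */
    for (uint16_t waited = 0; waited < RESPONSE_BUDGET_US; waited += 10) {
        if (clk_is_low())
            return true;                    /* device has started a frame   */
        delay_us(10);
    }
    drive_clk_low();                        /* nothing pending: inhibit again */
    return false;
}

int main(void)
{
    return poll_keyboard() ? 0 : 1;
}
[/code]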

I have the Perixx keyboard (the same model planned for this project) and it tends to be on the more temperamental side. I think that's why bit-banging with the 6502 has been abandoned and the plan is now to use the AVR processor for the keyboard interface.

Stefan
Posts: 456
Joined: Thu Aug 20, 2020 8:59 am

PS/2 Direction for the Commander X16

Post by Stefan »


That is really interesting info, @SolidState.

As to using I2C as the transport layer between the ATTINY and the 65C02:

I tried to measure the time it takes to run the Kernal function i2c_read_byte. Using the clock counter at $9FB8-$9FBB, I came to about 1,200 clock cycles. Manual counting of the Kernal code gave a similar result, but I didn't count every code path.

1,200 clock cycles is 150 us @ 8 MHz.

It's clear that the ATTINY cannot be listening for incoming PS/2 data while it is busy with an I2C transfer of that length. The data valid period in the PS/2 protocol is much less than 150 us, as I understand it.

This means that if you are trying to use I2C to transfer scan codes to the processor, you must inhibit the PS/2 line while doing so.

It feels wrong to do this, but it might work anyway. Even if the time it takes for the keyboard to come alive again after being disabled is 5,000 us, there is room for about 200 scan codes per second.
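
Just to show the arithmetic behind that estimate (the numbers are the ones quoted above, nothing new):

[code]
/* Back-of-the-envelope check of the figures above. */
#include <stdio.h>

int main(void)
{
    const double i2c_read_us = 1200.0 / 8.0;  /* 1,200 cycles @ 8 MHz = 150 us */
    const double reenable_us = 5000.0;        /* pessimistic keyboard wake-up  */
    const double per_code_us = i2c_read_us + reenable_us;

    printf("max ~%.0f scan codes per second\n", 1e6 / per_code_us);  /* ~194 */
    return 0;
}
[/code]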

Wavicle
Posts: 284
Joined: Sun Feb 21, 2021 2:40 am

PS/2 Direction for the Commander X16

Post by Wavicle »



On 1/4/2022 at 11:48 AM, Stefan said:




It's clear that the ATTINY cannot be listening for incoming PS/2 data while it is busy with an I2C transfer of that length. The data valid period in the PS/2 protocol is much less than 150 us, as I understand it.



This means that if you are trying to use I2C to transfer scan codes to the processor, you must inhibit the PS/2 line while doing so.



It feels wrong to do this, but it might work anyway. Even if the time it takes for the keyboard to come alive again after being disabled is 5,000 us, there is room for about 200 scan codes per second.



Why can't the ATTiny listen for incoming PS/2 data while handling an I2C transfer? I took a quick look at my handler and I attached an interrupt to the falling edge of the PS/2 clock pin. I can receive and ACK a transfer entirely in the ISR. I didn't spend any time stress testing the setup, but I can't think of a reason why servicing one interface would block the other.
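
Roughly the shape of it is below. This is an illustrative reconstruction rather than my actual sketch, so take the pin numbers and names as assumptions; parity and stop-bit checking are omitted.

[code]
/* Illustrative falling-edge PS/2 receiver for an ATTiny running the Arduino core. */
#include <Arduino.h>

const uint8_t PS2_CLK  = 2;          /* assumed pin with interrupt support */
const uint8_t PS2_DATA = 3;          /* assumed data pin                   */

volatile uint16_t frame     = 0;     /* start + 8 data + parity + stop     */
volatile uint8_t  bitCount  = 0;
volatile uint8_t  scanCode  = 0;
volatile bool     codeReady = false;

/* One interrupt per falling clock edge; the keyboard drives the clock. */
void ps2ClockFalling()
{
    if (digitalRead(PS2_DATA)) {
        frame |= (uint16_t)1 << bitCount;
    }
    if (++bitCount == 11) {                  /* full 11-bit frame received  */
        scanCode  = (frame >> 1) & 0xFF;     /* drop start bit, keep data   */
        codeReady = true;                    /* parity/stop checks omitted  */
        frame     = 0;
        bitCount  = 0;
    }
}

void setup()
{
    pinMode(PS2_CLK,  INPUT_PULLUP);
    pinMode(PS2_DATA, INPUT_PULLUP);
    attachInterrupt(digitalPinToInterrupt(PS2_CLK), ps2ClockFalling, FALLING);
}

void loop()
{
    /* The I2C side (or main loop) consumes scanCode whenever codeReady is set. */
}
[/code]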

Stefan
Posts: 456
Joined: Thu Aug 20, 2020 8:59 am

PS/2 Direction for the Commander X16

Post by Stefan »


As an I2C slave, the ATTINY cannot make many assumptions about the data valid period on the I2C bus. I guess that is why, in my head, I had ruled out the ATTINY serving both the PS/2 and I2C lines simultaneously.

But you have tested this in hardware, and I see that it might work as you describe.

I did some manual clock cycle counting on the Kernal I2C send_bit function to calculate how long the clock line is held high.


  • The clock transition from low to high happens at i2c.s line 223


  • The clock transition from high to low happens at line 210


  • Between those lines there are about 24 clock cycles = 3 us @ 8 MHz


I don't know, but is it correct to say that the handlers for both the I2C and the PS/2 must run within that time to guarantee that you don't lose data?

EDIT: By the way, I see that the ATTINY861 has hardware support for I2C (USI). Did you use this in your test, or was the I2C bit-banged? I was assuming the latter, but maybe that wasn't right. I would need to read more about USI.

Wavicle
Posts: 284
Joined: Sun Feb 21, 2021 2:40 am

PS/2 Direction for the Commander X16

Post by Wavicle »



On 1/4/2022 at 9:01 PM, Stefan said:




As an I2C slave, the ATTINY cannot make many assumptions about the data valid period on the I2C bus. I guess that is why, in my head, I had ruled out the ATTINY serving both the PS/2 and I2C lines simultaneously.



But you have tested this in hardware, and I see that it might work as you describe.



I did some manual clock cycle counting on the Kernal I2C send_bit function to calculate how long the clock line is held high.




  • The clock transition from low to high happens at i2c.s line 223


  • The clock transition from high to low happens at line 210


  • Between those lines there are about 24 clock cycles = 3 us @ 8 MHz




I don't know, but is it correct to say that the handlers for both the I2C and the PS/2 must run within that time to guarantee that you don't lose data?



The ATTiny contains a two-wire interface hardware block that can be used for I2C. That hardware block can handle an I2C clock significantly faster than what the physical interface can likely support with typical pullups (on a long I2C bus with average pullup strength, the clock signal looks like a line of shark fins instead of a regular square wave). The ATTiny doesn't have to run code on every bit received if the firmware is enabling and using the I2C hardware.

The way I had implemented the PS/2 interface, the ATTiny did need to run code on every bit received at the PS/2 interface, but that is a relatively slow interface. It wasn't very much code until the last bit (#11) was received, at which point the code checks whether it should ACK or NAK the transfer. The ATTiny has an essentially RISC architecture, and using the internal oscillator the bootloader can select the CPU clock from a number of values between 31.25 kHz and 16 MHz, with 8 MHz and 16 MHz being the most common (16 MHz is the default in the Arduino ATTiny library, I think). The clock system is described in chapter 6 of the datasheet I have and has this diagram:

[Attachment: block diagram of the ATTiny clock system, from chapter 6 of the datasheet]

In any case, please take whatever I say with an appropriately-sized grain of salt. I only know what I did when I built a prototype in a short amount of time (I think it was most of a day on a weekend, I honestly do not remember, but my sketch was called "blink" so you can probably deduce which example sketch I started from and also that I didn't take the time to rename it). I am not aware of what the Dev team has decided.

paulscottrobson
Posts: 305
Joined: Tue Sep 22, 2020 6:43 pm

PS/2 Direction for the Commander X16

Post by paulscottrobson »



On 1/2/2022 at 5:13 PM, EMwhite said:




Getting back on-topic for this thread, why is supporting PS/2 so difficult? Is it that the rest of the architecture of the orig. X16 does not play well with PS/2?



It seems peculiar that others have implemented PS/2 kbd. and mouse with zero fanfare or difficulty, while the X16 is awaiting the return [to the project] of one person (I don't know the name of the person; somebody who wrote the kernel, I believe).



PS/2 devices can behave very differently. And many are put through PS/2->USB converters. Generally it seems that cheap and basic keyboards are more reliable than those with lots of gadgets, Bluetooth and so on.

There are three solutions to this.


  1. Connect the clock to the interrupt. You could maybe connect it to NMI, except you'd want to be able to gate it for some hardware, making it effectively another IRQ.


  2. Have some sort of clocked shift register design, or a CPLD/FPGA connection that does the work for you.


  3. Have a microcontroller do it similarly.


Polling is a *bad* idea. There've been microterminals built around Atmel AVRs for years - I built a couple of 1802-based clones around the 8515 - but those used matrix keyboards like the C64/Spectrum, which you can address how you like, when you like.

Nobody managed to get display and PS/2 working together - they'd each been working separately for ages - until the AVR1284, I think, which has some clockable register that does serial input or something, I can't recall. You then started getting projects like this https://www.instructables.com/Single-Chip-AVR-BASIC-Computer/ which had been delayed by this limitation, unless you wanted something that worked like a ZX80 and blinked.

 

rje
Posts: 1263
Joined: Mon Apr 27, 2020 10:00 pm
Location: Dallas Area

PS/2 Direction for the Commander X16

Post by rje »


I don't mean to sound like Paul Scott Robson, so I'll try to dance around the issue a bit.

 

At work, we periodically have to assess whether what we're doing is worth the effort we're putting into it.

I understand that sometimes, business is driving the work, and there's nothing we can do about it.

But it never hurts to let business know the issues and how much trouble they will cause.

 

I hear that USB is more expensive than PS/2, which is cheap.  But... I'm starting to wonder if PS/2 is not really cheap, but rather pushes time and expense and complexity back to the board.

Do we really want home-brew complexity to drive a modern keyboard -- a solved problem?  

 

I would much rather that your efforts went into the KERNAL and toys for the banked ROM.

 

"I have this awesome retro machine, but 50% of the build time and effort goes into the keyboard interface, which has nothing to do with the retro nature of the machine."

 

 

And what's the QA effort around debugging keyboard routines?  

 

Or is this the kind of fun everyone signed up for?

 
