Wavicle last won the day on September 29 2021

Wavicle had the most liked content!

Recent Profile Visitors

The recent visitors block is disabled and is not being shown to other users.

Wavicle's Achievements


Explorer (4/14)

Reacting Well Rare Dedicated Rare First Post Rare Collaborator Rare Week One Done

Recent Badges



  1. The render of the Rev 3 board shows this 6 pin header labeled "I2C" adjacent to VIA #1. Interpret that as you may. I'm not sure why the header is 6 pins.
  2. The spec for 640x480 60Hz VGA is exactly 800 clock cycles at 25.175 MHz per scanline. The math from there is pretty much as @TomXP411 says: there are about 39.72 nanoseconds per clock, ~31.78 microseconds per line, and ~254.2 cycles of an 8 MHz clock per line, which works out to ~50.84 "average instructions" (assuming an average of ~5 cycles per instruction).
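A quick sanity check of those numbers (the 5-cycles-per-"average instruction" figure is an assumption, not something from the VGA spec):

```python
# VGA 640x480@60Hz scanline timing versus an 8 MHz 6502.
PIXEL_CLOCK_HZ = 25_175_000   # 25.175 MHz dot clock
CYCLES_PER_LINE = 800         # total dot clocks per scanline, incl. blanking
CPU_CLOCK_HZ = 8_000_000      # 8 MHz 6502

ns_per_clock = 1e9 / PIXEL_CLOCK_HZ                     # ~39.72 ns per dot clock
us_per_line = CYCLES_PER_LINE / PIXEL_CLOCK_HZ * 1e6    # ~31.78 us per scanline
cpu_cycles_per_line = CPU_CLOCK_HZ * us_per_line / 1e6  # ~254.2 CPU cycles/line
avg_instructions = cpu_cycles_per_line / 5              # assuming ~5 cycles/instr

print(round(ns_per_clock, 2), round(us_per_line, 2),
      round(cpu_cycles_per_line, 1), round(avg_instructions, 2))
# -> 39.72 31.78 254.2 50.84
```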
  3. Is there a place other than here and the FB group where official information may be announced? I try to follow both and didn't think the FB group was any better.
  4. Grabbed a better still and updated my best guesses for VERA components:
  5. I like that they based it around the Upduino and Icebreaker, which are relatively easy to get. That said, this looks more like a VERA-inspired project and not a clone. It has a few features that VERA does not, but if I'm reading the RTL correctly, there is no space available for sprites. The VRAM data path is 16 bits wide, clocked at the dot clock frequency, and playfields have priority. Since the effective memory bandwidth is 2 bytes per dot clock, there are no spare cycles for anything else (e.g. sprites, host VRAM access) with two playfields active. Host reads and writes to VRAM may therefore only take place during blanking periods. It looks like software needs to check a VRAM read/write address register to make sure that VRAM access goes to the intended location. While only allowing access during HBLANK or VBLANK was a common limitation for several early computers and consoles, it was not for the C64 and is not for VERA. I know widening the VRAM data path is listed as a TODO item, but the Upduino+VGA version's FPGA has used 100% of the BRAM and 83% of the LCs. Sprites are a pretty big deal on retro hardware. I think the design would be vastly improved by completely dropping the second playfield in favor of sprites.
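The bandwidth argument above can be sketched in a few lines (the 1-byte-per-pixel fetch rate per playfield is my assumption about the design, not something stated in the RTL):

```python
# Back-of-envelope VRAM bandwidth for the Upduino-based design, assuming
# each active playfield fetches 1 byte per pixel per dot clock (my assumption).
DOT_CLOCK_HZ = 25_175_000
BUS_BYTES_PER_CLOCK = 2            # 16-bit VRAM data path, one access per dot clock

total_bw = DOT_CLOCK_HZ * BUS_BYTES_PER_CLOCK   # ~50.35 MB/s available
playfield_bw = DOT_CLOCK_HZ * 1                 # bytes/s per playfield
spare = total_bw - 2 * playfield_bw             # with two playfields active

print(spare)  # 0 -> no cycles left for sprites or host access during active display
```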
  6. Hmm, off the top of my head, I can only think of the ATTiny and what looks like an STM32 on VERA. The ATTiny handles the ATX power sequencing and potentially the PS/2 keyboard and mouse. I'm not certain about the STM32 on VERA, but I think it is used for programming the FPGA's SPI flash so it isn't strictly necessary. Is there anything else?
  7. I think the biggest reason is aesthetic: the ESP32 looks nothing like an 80's era computer component. I have no idea if additional FCC testing beyond what the ESP32 already has would be needed. I think everyone would love the ability to transfer files to the X16 via WiFi (I prototyped such a card-based solution over the summer on a breadboard 6502; can't recall if I posted it here or not).
  8. I have no idea what the report rate of the mouse is, but my gut tells me that handling two serial interfaces with a peak clock of 20kHz should not be a problem even if both interfaces were saturated. The I2C clock runs substantially faster, but is mostly handled in hardware so missing a bit there is not a concern. It seems pretty solid.
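A rough worst-case estimate backs up that gut feeling, assuming the 20 kHz peak clock mentioned above and standard 11-bit PS/2 frames:

```python
# Worst-case PS/2 traffic if both keyboard and mouse saturate their links.
PEAK_CLOCK_HZ = 20_000   # upper end of the PS/2 clock range
BITS_PER_FRAME = 11      # start + 8 data + parity + stop

frames_per_sec = PEAK_CLOCK_HZ / BITS_PER_FRAME   # ~1818 bytes/s per device
total = 2 * frames_per_sec                        # both interfaces saturated

print(round(total))  # ~3636 bytes/s -- trivial for even a modest microcontroller
```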
  9. Fail fast is a good engineering principle. It takes a bit of discipline to let go of the urge to chase a sunk cost. We do this where I work also; I've watched features that I was very fond of and had poured weeks into get dropped. A bitter pill, yes, but I understood the business justification. I'm less good at it with my own hobby projects because I do not have firm dates I'm managing work toward; I suspect it's something similar here. The first X16 video is coming up on 3 years old, I think? I think the problem may be that it was originally envisioned that the 6502 keyboard handler would be interrupt driven and for some reason that plan was abandoned. I don't know why, and I'm not a 6522/VIA expert (not even a novice, for that matter). It looks to me from the datasheet that there is no easy way to configure a negative-edge-sensitive interrupt trigger on a single IO wire. There are some peripheral interrupt/handshake pins that appear to be for this purpose (CA1, CA2, CB1, and CB2), and maybe things could be improved by moving the PS/2 clock to one of those pins and having the whole thing interrupt driven, but we'd still need something that would allow the 6502 to pull the clock low manually to inhibit traffic (I think that would just mean adding a transistor). I don't know how the interrupt cause is detected and routed to the appropriate handler routine, so I'm not sure what the overhead of such an interrupt would be, since we'd be expecting several of them to come in over a few milliseconds. I doubt going to a USB solution would make things less complex. The keyboard and mouse on the X8 are USB and have a simple controller implemented in the FPGA. In the X16 case, it would be one more thing offloaded to VERA. I think moving the PS/2 keyboard handling to a microcontroller is the right thing to do in this case.
  10. The ATTiny contains a two-wire interface hardware block that can be used for I2C. That hardware block can handle an I2C clock significantly faster than what the physical interface can likely support with typical pullups (on a long I2C bus with average pullup strength, the clock signal looks like a line of shark fins instead of a regular square wave). The ATTiny doesn't have to run code on every bit received if the firmware is enabling and using the I2C hardware. The way I had implemented the PS/2 interface, the ATTiny did need to run code on every bit received at the PS/2 interface, but that is a relatively slow interface. It wasn't very much code until the last bit (#11) was received and the code checks whether it should ACK or NAK the transfer. The ATTiny has an essentially RISC architecture, and using the internal oscillator the bootloader can select the CPU clock to run at a number of values between 31.25 kHz and 16 MHz, with 8 MHz and 16 MHz being the most common (16 MHz is the default in the Arduino ATTiny library, I think). The clock system is described in chapter 6 of the datasheet I have, which includes a diagram of it. In any case, please take whatever I say with an appropriately-sized grain of salt. I only know what I did when I built a prototype in a short amount of time (I think it was most of a day on a weekend; I honestly do not remember, but my sketch was called "blink" so you can probably deduce which example sketch I started from, and also that I didn't take the time to rename it). I am not aware of what the Dev team has decided.
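For illustration, here is the receive-side frame check described above (accumulate 11 bits, then decide ACK or NAK), modeled in Python rather than AVR firmware. The function name and structure are mine, not the actual prototype code:

```python
# Sketch of per-bit PS/2 receive logic, host side. Frame layout:
# start(0), 8 data bits LSB-first, odd parity, stop(1).
# Only ACK if the start/stop framing and the odd-parity check hold.
def decode_ps2_frame(bits):
    """bits: 11 samples taken on falling clock edges. Returns (byte, ack)."""
    assert len(bits) == 11
    start, data, parity, stop = bits[0], bits[1:9], bits[9], bits[10]
    byte = sum(b << i for i, b in enumerate(data))   # reassemble LSB-first
    odd_parity_ok = (sum(data) + parity) % 2 == 1    # odd parity over data+parity
    ok = (start == 0) and (stop == 1) and odd_parity_ok
    return byte, ok

# Example: 0x5A has four 1-bits, so the odd-parity bit must be 1.
byte, ack = decode_ps2_frame([0, 0, 1, 0, 1, 1, 0, 1, 0, 1, 1])
# -> byte == 0x5A, ack == True
```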
  11. Why can't the ATTiny listen for incoming PS/2 data while handling an I2C transfer? I took a quick look at my handler and I attached an interrupt to the falling edge of the PS/2 clock pin. I can receive and ACK a transfer entirely in the ISR. I didn't spend any time stress testing the setup, but I can't think of a reason why servicing one interface would block the other.
  12. I think that the issue is a little bigger than just the inhibit-poll cycle. Once the start bit is seen, the X16 code is committed to the transfer completing even though that is out of its control. It appears that a poorly timed glitch from the keyboard could send the interrupt service routine into an infinite loop, effectively locking up the computer. E.g.:

[code]
lc04a:  bit port_data,x
        bne lc04a       ; wait for CLK=0 (ready)
[/code]

The PS/2 bus wires are supposed to be open collectors which read as logic high when not being actively driven low. If for any reason the keyboard microcontroller doesn't pull the line low when it is expected, that loop will never exit.
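One defensive fix is to bound the wait. A sketch of the idea in Python (on the 6502 this would be a decrementing counter inside the loop; `read_clock` and the poll limit here are hypothetical):

```python
# Bounded version of the "wait for CLK=0" loop: give up after a fixed number
# of polls instead of spinning forever when the keyboard never drives CLK low.
def wait_for_clock_low(read_clock, max_polls=10_000):
    """Poll CLK until it reads low; return False on timeout so the caller
    can abort the transfer and release the bus instead of hanging."""
    for _ in range(max_polls):
        if read_clock() == 0:
            return True      # keyboard is driving the line as expected
    return False             # timed out: treat the transfer as failed

# Example with a fake clock line that goes low on the third sample:
samples = iter([1, 1, 0])
assert wait_for_clock_low(lambda: next(samples)) is True
```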
  13. Fair point, I hadn't accounted for the register being 8 bits.