Leaderboard

Popular Content

Showing content with the highest reputation on 08/25/21 in all areas

  1. Hello everyone, I wanted to discuss an issue we have been struggling with, and that is PS/2 keyboard and mouse support. This is probably not a secret to anyone watching the videos on the X16, as most of the time we are running at 4MHz or 2MHz. This is entirely due to PS/2 timing issues. I've spoken with Michael Steil about this issue many times, and he has taken several approaches to get it working stably. After adding the microcontroller to control the ATX power supply, I proposed the idea of adding a few more pins to also control the PS/2 ports. Like many in the community, he would prefer to get it working as it stands, so I proposed adding jumpers to select between the microcontroller and the 65c22. So, I did just that on the third prototype. With tiny revisions, this will become our development board, so it will give folks the option to try either approach.

I initially thought about writing the code for the Arduino myself. I've messed around with Atmel microcontrollers for going on 20 years now, but the truth is, I'm probably not the best one for this task. The code I wrote for the ATX power control is working, but I can't seem to get my reboot function to work for some reason... Anyway, I digress. I guess the real reason I'm making this post is to ask for some help from the community! We are not necessarily committed to one approach or the other. That is, we are still planning to work on the 65c22 'bit-bang' approach (heck, we could maybe use some help here too), but alternatively I wanted to start working on the microcontroller code to at least evaluate it before the final version of the board.

If you're interested, here are some details on how I have the hardware set up on the current prototype. The attached pic shows U18, which is the ATTINY861 microcontroller. I really wanted to stick with no more than a 20-pin IC if possible, to make it look like it matches the board.
This chip can run at 8 or 16MHz without an external crystal, but I gather the 16MHz mode may not be as stable. Not sure if this will be an issue later, but I wanted to mention it. I sort of just guessed the chip would be fast enough to handle both the mouse and the keyboard. There are certainly plenty of keyboard libraries out there, and at least one mouse library I'm aware of, but I'm not sure if they have ever been combined.

I am running the microcontroller as an I2C device, with the 65c22 acting as the host to this IC. Right now, it is only listening for commands for power-off, reboot, reset, NMI, and HDD LED control. I also hooked a line up to the 65C02 IRQ so it could be polled. (Just as an FYI, the real-time clock is U10, which is read via I2C. J16 is the external I2C header to drive other devices if desired. J16 will also work as a 6-pin ISP header to program the ATTINY861. J6-J9 toggle the clock/data line for each PS/2 port between the 65c22 and the ATTINY861.)

In a nutshell, I'd like it to maintain the existing functions, which should take next to nothing in terms of CPU time, except maybe the I2C library; integrate IRQ-driven PS/2 mouse and keyboard routines; and keep the code as clean as possible for easy editing. If we do wind up going this route, your work could be part of the open-source library at the end of the rainbow for the system. Clearly, there will need to be some interaction on the kernal side to process the data from these interrupts. This will take some coordination too, but I suspect the Arduino side could be tested with another Arduino reading the data for the time being. Keep in mind, we could also wind up not using it at all, just fair warning. Keep the responses on this thread for the time being. I know this is a big ask, but I really want to make sure we have a nice solid library, so I decided it would be better to reach out for a rock-star developer out there. Assembly is fine too, if you want to do it the hard way, but I suspect C will be fine.
I also attached the code which is running on the proto for now. Please let me know if you have any questions, I'm looking forward to testing this out! Thanks! -Kevin X16_Power-current.zip
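Since the post above describes the microcontroller as an I2C peripheral listening for power-off, reboot, reset, NMI, and HDD LED commands, here is a minimal sketch of that command dispatch, modeled in Python rather than AVR C purely for readability. The command byte values are invented for illustration; the real opcodes are defined in the attached X16_Power-current.zip.

```python
# Hypothetical model of the ATTINY861's I2C command dispatch.
# All opcode values below are invented for illustration only.

CMD_POWER_OFF = 0x01   # assumed opcode: release ATX PS_ON
CMD_REBOOT    = 0x02   # assumed opcode: pulse the reset line
CMD_RESET     = 0x03
CMD_NMI       = 0x04
CMD_HDD_LED   = 0x05   # payload byte: 0 = off, 1 = on

class PowerController:
    """Simulated state of the microcontroller's output pins."""
    def __init__(self):
        self.powered = True
        self.hdd_led = False
        self.log = []

    def on_i2c_receive(self, data: bytes):
        """Handle one I2C write transaction from the 65c22 (the host)."""
        cmd, payload = data[0], data[1:]
        if cmd == CMD_POWER_OFF:
            self.powered = False
            self.log.append("PS_ON released")
        elif cmd in (CMD_REBOOT, CMD_RESET):
            self.log.append("RESET pulsed")
        elif cmd == CMD_NMI:
            self.log.append("NMI pulsed")
        elif cmd == CMD_HDD_LED:
            self.hdd_led = bool(payload[0])
```

The point of the sketch is only that the power-control half of the job is a tiny byte-dispatch loop; the PS/2 work would sit alongside it in the same firmware.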
    6 points
  2. I am on vacation from my full-time job here in Brazil until next Wednesday, so I have time to help you. Although I have not worked with that MCU in particular (nor do I have one in hand), I am used to working with the ATtiny and ATmega series as a hobbyist (not as a professional). The MCU's datasheet seems pretty familiar compared to other models I have used. I can develop in C with Arduino, and when it's finished I can also try to make it work without the Arduino bootloader to save more memory. I could do it in assembly too, but there is no need to go that way. I do not have experience working with keyboard+mouse (PS/2) interfaces on microcontrollers, but there is a good library on GitHub for that. Even so, I will have to study it more deeply. Until my vacation's end, I can help work on this task. But fair warning: I have no professional experience doing this job!
    5 points
  3. I'm going to start an artisanal fab in my basement that produces organic, GMO-free ICs. Come visit my table at the Farmer's Market.
    5 points
  4. The ATTINY861 is just like any other IC in the series. I switched from the ATTINY84 mostly because the 861 has the leg count I want. I'm using the ATTINY (Arduino) core with the small bootloader. I have a really old original STK500 I'm using to program it, although any old Atmel ISP programmer will work fine. This IC is focused on automotive applications, as I recall, but I'm just using it as a general-purpose controller, with I2C being the only weirdness. It also has a way to generate a 16MHz clock internally, but it may be less stable. I figure 8MHz may well be enough, but I thought it was nice to have the option. A programmer like this will work fine: USB AVR Programmer w/ 6-Pin 10-Pin IDC ISP Connector For USBASP | eBay. I also saw these: ATtiny 861 or 167 Development Board (assembled) from Azduino by Spence Konde on Tindie. I can't vouch for them, and I would just buy the DIP IC and use it on a breadboard myself, but it could make things easy if you want a quick and pretty cheap dev board for it.

So there really should be two forks here. The first is the 65c22 approach, where honestly the only real input I can offer at this point is testing on the real HW. The code is here: commanderx16/x16-rom (github.com). I believe it's more up to date than the current emulator build, including some code I've tested which did sort of work at 8MHz. This means the emulator needs to be updated at some point to match the new code base as well. Michael Steil is really the guy to ask here; I'll see what I can do to loop him in. Honestly, I don't know if the mouse code even exists yet in the kernal.

As far as the microcontroller goes, we should be able to test this for now with a second Arduino running as an I2C master and reading the fake IRQ. Not ideal, but getting the kernal to process this data is also a whole other matter; I'd like to be able to validate the idea externally first. I know David is reading through the thread right now. I'm going to chat with him a little later today and get back to you guys. Thanks so much for the offers of assistance! It is greatly appreciated. -Kevin
    4 points
  5. @Kevin Williams I made a quick test introducing a wait of about 15 us after the clock-low transition before the data line is sampled. At least that was my intention, but I have no means of testing it on real hardware; it still works in the emulator. Enclosed are the changed source file (kernal/drivers/x16/ps2.s) and the compiled ROM image. The changes in the source file are on line 69 and on lines 105 and forward. ps2.s rom.bin
    3 points
  6. I can do the embedded implementation, and I have an oscilloscope and logic analyzer on my bench to verify timing correctness. I haven't worked with an ATTINY861 before and I only have ATTINY85s in my bins here so I'd have to order an 861 from Mouser and hope that the 861 has a pretty similar toolchain to the 85. I'm not sure what the 861 needs for programming, but I assume if nothing else, my TL866II can do so. It would help, but isn't completely necessary, to have the X16 schematic page (or portion of a page) that details the 861 connection so that I can replicate a similar circuit on my benchtop. I'm competent with most assembly languages, however I'm probably all thumbs when it comes to 65xx assembly specifically. I can read it well enough, but have limited hobbyist experience writing it and zero professional experience with it. I'd be most useful for the embedded/861 implementation. I'm in the Seattle area (I work at the tech giant in Redmond) so it would have to be all remote work. Send me a DM if interested and I'll order the 861 parts from Mouser and we can take this to email/zoom.
    3 points
  7. Not only that, but I use Faraday cage-free electrons for all testing, and cruelty-free ceramic packages.
    3 points
  8. I looked around a bit for trustworthy information on PS/2 timing. I found this 1991 IBM manual titled "Keyboard and Auxiliary Device Controller", page 16 (page 230 in the file). https://archive.org/details/bitsavers_ibmpcps284erfaceTechnicalReferenceCommonInterfaces_39004874/page/n229/mode/2up?q=keyboard

Possible timing problems:

First, the time it takes after the PS/2 lines are released by the host before the keyboard starts sending data. The current X16 Kernal waits about 108 us @ 8 MHz according to my manual clock counting. I have found no documentation stating when the keyboard must start sending data, only that it cannot begin before 50 us after the PS/2 lines were released. It would be interesting to see what happens if the time the Kernal waits for the keyboard to start sending data is made longer.

Second, the current Kernal samples the data line immediately after the clock goes low. This should be fine according to the specification, if the keyboard in question follows the specification. It would, however, be interesting to see what happens if the data line is sampled in the middle of the clock-low period, for example about 15-20 us after the clock transition.
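For reference, the device-to-host PS/2 frame being sampled here is 11 bits: a start bit (0), eight data bits LSB first, an odd parity bit, and a stop bit (1). The sketch below is an illustrative Python model of that framing, not the Kernal code; the timing question above is only about *when* each sample is taken, right at the falling clock edge or 15-20 us into the clock-low period.

```python
# Decode one device-to-host PS/2 frame from the levels sampled at each
# falling clock edge: start(0), 8 data bits LSB first, odd parity, stop(1).

def decode_ps2_frame(samples):
    """samples: 11 line levels (0/1), one per falling clock edge."""
    if len(samples) != 11 or samples[0] != 0 or samples[10] != 1:
        return None                          # bad framing
    data_bits = samples[1:9]
    byte = 0
    for i, bit in enumerate(data_bits):      # LSB first
        byte |= bit << i
    parity = samples[9]
    if (sum(data_bits) + parity) % 2 != 1:   # odd parity check
        return None
    return byte

def encode_ps2_frame(byte):
    """Inverse helper: build the 11 samples a device would send."""
    bits = [(byte >> i) & 1 for i in range(8)]
    parity = (sum(bits) + 1) % 2             # odd parity bit
    return [0] + bits + [parity, 1]
```

A frame that fails the parity or framing checks (e.g. because a sample was taken too close to the edge and caught the line mid-transition) comes back as `None`, which is exactly the corruption the delayed-sampling patch is trying to avoid.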
    3 points
  9. Someone get Ben Eater on the phone!
    3 points
  10. I disagree. I've watched a bazillion repair videos, many of them arcade machine repairs. Many times, the failure mode tells you which of the chips is faulty; for example, if the even pixel rows of tiles are garbage but the odd rows are correct, that probably means the LSB of some logic chip has failed or is stuck high or low. You can see how the computer does its machinations when the traces between the CPU and the RAM lead through decode logic. Sure, on the INSIDE, this makes no difference whatsoever. STA $370C is going to make whatever's in the accumulator go to that address, regardless of whether the entire transaction happens beneath an epoxy blob or heads out across copper traces and gets manipulated by a bunch of logic chips, splitting the bits up between a bunch of 1-bit DRAMs. You can capture these traces with an o-scope or logic analyzer or whatever and see what's actually taking place.

From the inside, the CPU says "STA $9F41" and you don't hear the FM playback change according to the value you just wrote. What now? What's broken? Not your code. Not the architecture. A real chip failed to place a value into another real chip, or that other real chip failed to produce the expected output on a pin somewhere, or maybe it did and that signal didn't make it to the speaker jack... That's the difference. One's a "cyberspace" view of the architecture, and one's a "realspace" view of the architecture.

You can say that you know the memory map of a computer system; that you know the function of each and every memory-mapped register. You can know exactly what the procedures are for communicating with devices at those addresses (or behind those addresses, for things like VIA-attached stuff)... but that doesn't mean you know HOW the computer works. It just means you know how the computer WORKS. See the difference?
    3 points
  11. I don't mind the use of FPGAs and CPLDs. I see them as ASICs.
    2 points
  12. In reality, my understanding of either is pretty superficial. I just figure there's always a more primitive technology that someone can pull out of their posterior because technology X is just too modern and difficult to understand.
    2 points
  13. This is why I want to create a dream computer made with vacuum tubes. Transistors are just too dang magical.
    2 points
  14. The idea there is to boost the views of the originals by making it easier to pull people in. So it would not be up until the views of the originals taper off.
    2 points
  15. You know... his question got me to thinking about the relative value of 7400s versus CPLDs, and how different digital logic is today from back-when. I guess there's still value in doing it with 7400s.
    2 points
  16. I would like to help you too, but I don't have a keyboard or mouse with a PS/2 connector. At least I have a basic knowledge of microcontrollers and know how to approach both that method and the 6522 bit-banging method. So hit me up if you're interested. For background: I once used an Arduino to display Bad Apple on 240 7-segment displays.
    2 points
  17. 2 points
  18. Yes, the use of a CPLD in the X16c would be a compromise, because in the cost-cutting version of a board, you have to compromise. But if it functions identically to the X16p, I can understand it at the level of the circuit it is simulating, even if I may never understand it at the level of its VHDL specification and how the circuit optimizer turns that into a pattern of NANDs and NORs and latched bits. What "students"? Sheesh, I'm a college Econ professor; how could I spend time on that in a class about the Great Depression, looking at the first two decades' worth of GDP numbers? My grandkids, of course! In many cases, the "students" would be exactly the kind of interested beginners whose questions about the best 8-bit computer to buy to explore 8-bit computing were part of the original inspiration for this whole project.
    2 points
  19. Fun fact: the Konami Code detection routine exits "true" when the A is detected. I forget the YouTuber's name, but he does videos in a series called "Behind the Code" where he really digs deep into the source routines for things like Ninja Gaiden's jump, etc. He did one on the Konami Code.
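As a toy illustration of why a routine can exit "true" exactly on the final A: a sequence detector only needs a single progress counter. This Python sketch is a simplified stand-in, not the actual NES routine, and it glosses over repeated-prefix handling (e.g. three "up"s in a row).

```python
# Simplified Konami Code detector: count how many inputs of the sequence
# have matched so far; fire exactly when the final "a" arrives.

KONAMI = ["up", "up", "down", "down",
          "left", "right", "left", "right", "b", "a"]

def make_detector():
    progress = 0
    def feed(button):
        nonlocal progress
        if button == KONAMI[progress]:
            progress += 1
        elif button == KONAMI[0]:
            progress = 1        # a stray "up" restarts the match
        else:
            progress = 0        # anything else resets entirely
        if progress == len(KONAMI):
            progress = 0
            return True         # exits "true" when the A is detected
        return False
    return feed
```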
    2 points
  20. Hello Everyone! I received the parts for the third prototype on Tuesday evening and spent a good chunk of that night and a bit of yesterday morning getting it worked out. I had to get my code moved over to the ATTiny861 before the board would even power on. This turned out to be pretty easy now that the I2C header also works as an Atmel ISP programming header. Of course, I'm pretty much guaranteed to make a mistake somewhere on a board of this size, and this time is no exception! Fortunately, they were easy to spot, and the actual logic is working as it should, or at least as I designed it. Two of my mistakes are visible in the pic; see if you can find them. One is just cosmetic, and the other is a bodge job on a chip which, I will admit, is next to impossible to see in this photo.

The less easy-to-spot issue is that I used the standby power to power the microcontroller, but I tied the pull-up resistors on the I2C lines (and SPI programming lines) to the system voltage and not to VSB. I did this on purpose, as I didn't want to pull these lines high while programming the microcontroller, but they need to be pulled high for I2C. The net result was that leakage was happening backwards through these resistors when the data/clock lines were high: enough to power the LED on the motherboard with no other ICs plugged in. Took me a minute to figure that one out, but I think I will just throw a few more diodes in to protect against this. I suspect issues like this are sometimes why you may see a lone crusty resistor on an old PCB after years of use. Easy fixes all around!

One last issue is that the parts-sourcing scourge which has been affecting the world is also affecting TexElec! Yes, we can't get parts in for some of our products, and as time goes on, it seems like it may get worse before it gets better. And now it has hit the X16 project! I am unable to get the FPGA and the DAC for the new Version 4 VERA, so we're still running V3. The main difference has to do with hardware deadlocks on the SD card, so functionally, it's fine. However, the lead times are a bit concerning. I'm looking into some other suppliers now and hoping for the best. For now, here's a pic of the new machine, with no wires all over it! Take care! -Kevin
    1 point
  21. The hard part of duplicating any chip, be it in FPGA or ASIC, is knowing the exact internal structure of the original. That's what takes time. If you have the masks that define the original and still have access to the original tech, duplication is relatively easy. It would be "simple" (I think) to create a VIC compatible chip in FPGA if all one wanted to do is create something that matched the datasheet. Or the VIC II or SID or whatever. But these chips all have weird undocumented / unintended features that people have learned to exploit. Getting all of those 100% correct is what is hard.
    1 point
  22. Speaking of ASICs...probably way off topic, but what's the possibility of just burning some that duplicate the VIC? Is there some weird licensing thing going on for a 40+ year old chip no one uses anymore? The intent here being you could completely duplicate a VIC-20 with off-the-shelf parts.
    1 point
  23. I re-read the above a bit closer; sorry I missed it. You do bring up a good point: the challenge of highlighting the beauty within while properly identifying on the outside is not to be overlooked. It is a challenge I respect and look forward to.
    1 point
  24. If the first attempt is not successful, you may try this version, which prolongs the initial wait for the keyboard from about 108 us to over 400 us. This also works in the emulator. ps2.s rom.bin
    1 point
  25. Because they are ... little ASICs with an eraser so that their fabric can be repaired as needed (from one perspective, anyway). I realize that not all FPGA are "reprogrammable" ... some are like a PROM, write it once and throw it away if it doesn't work. But reprogrammable ones are more and more common.
    1 point
  26. The funny thing is I understand transistors at a physical level better than I understand vacuum tubes. In the 70s, there were not various vacuum tube electronics magazines explaining them to me ... it was assumed anyone who cared already knew.
    1 point
  27. It's perfect! Now we can work on a Doom port for it.
    1 point
  28. Or added a bunch of DATA statements and replaced the main loop with READ X : READ Y : PSET X,Y,.
    1 point
  29. A quick Google search suggests that ASIC becomes more affordable than FPGA when one reaches about 400,000 units. This of course assumes that the ASIC is rock solid and doesn't require replacement. For some ASICs it isn't an issue: any "defects" become "features" at some point. If trying to reproduce old designs that becomes a bigger issue, as a defect between old and new means that it is no longer compatible. FPGA has the potential benefit that it is "always" upgradeable, so defect resolution is a simple download / reflash away.
    1 point
  30. Fortunately there isn't a really "negative" reaction (as I interpret "negative" from the available options) in this forum.
    1 point
  31. Good point. Though then you risk fragmenting your video views, and sadly, the way YouTube works, one video with high views is better than catering to multiple audiences. It's the YouTube Kobayashi Maru.
    1 point
  32. Oooh, the "Pros", and the "Antis". Not the first political rift here!
    1 point
  33. Hi @Kevin Williams, @Michael Steil, I might be able to offer some help with the 65c22 bit-bang approach. It's unusual that this works fine at lower clock rates and runs into problems at higher ones. Just reach out. Cheers /Philip
    1 point
  34. I would gladly help too, but I don't think I'm a wizard-level developer with Arduinos (I'm a hobby programmer on Arduino and ESP32). Anyway, if any help is needed, I'll try as hard as I can. I'm also on vacation for the next 2.5 weeks. __________ By the way, I found a library for PS/2 mouse handling to go with the keyboard library. Not sure if it will help, but I'm posting it anyway.
    1 point
  35. Though now that you have those, a 10-15min round up video that links back to those with time-stamped cards and links but focuses on the Battle Royal itself would be good for sharing on Twitter, Facebook, etc.
    1 point
  36. He's been messing with PS/2 and USB in his last 3 or 4 videos. Pretty sure he will be able to solve this in 30 minutes or recommend a different, yet elegant, approach. I wasn't joking when I suggested somebody get the guy who did the "C=key" on the line. It maps PS/2 to traditional Commodore-style row/column scanning; in a worst-case scenario, just get it over with using something that works, even if it costs an additional $1.29 per unit.
    1 point
  37. But OR'ing the two lines and putting the result through a free pin uses up a pin. Instead, implement the lines so that b00 does not deselect both CS lines, but simply sends the state through the pins. Then an external active-low decoder can generate three CS lines from the 01, 10 and 00 outputs; 11 is not attached, so that is all devices deselected. If there is no free pin, then use it to select an I/O expander; that's the effective User port. If there is a free pin, there's an option for a four-"slot" external SPI bus through a block pin header. The decoder would select a serial latch. You write a byte to the serial latch. It could even be a serial latch with a carry, so you get back the previous setting of the serial latch on the MISO line. Then the free pin selects the output enable of the serial latch, so you write xxxx1110, xxxx1101, xxxx1011, or xxxx0111 to select one of four SPI devices that can be placed on the SPI bus. Put the SPI bus and the select latch outputs on a block pin header with power and ground and the select lines pulled high by a resistor block, and there's your hat connector for SPI-based expansion boards.

EDIT: After looking more closely at the MAX3701 datasheet, if there is a free pin, one might still use a MAX3701 20-pin or 28-pin GPIO I/O expander and use the free pin for the interrupt line, which can be set up for a subset of the ports when they are in transition-detect mode.

If DOS is boot-loaded from the SD card, the simplest implementation would be to just load BASIC and DOS as a single block, and have a magic address the bootloader jumps to that executes the code to turn off the bootloader and start the regular reset process. Then the space for DOS would be completely flexible. In that implementation, the hardwired part would be the base load address for the BASIC/DOS block. The boundary between BASIC and DOS would be irrelevant to the bootloader.
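The one-cold select bytes described above (xxxx1110 through xxxx0111) can be sketched in a couple of lines. This is just an illustration of the encoding, with the don't-care upper nibble left as 1s:

```python
# One-cold device select for the proposed 4-slot SPI latch: exactly one of
# the four active-low select lines is pulled low per byte written.

def select_byte(device: int) -> int:
    """Latch byte asserting (pulling low) one of devices 0-3."""
    if not 0 <= device <= 3:
        raise ValueError("device must be 0-3")
    return 0xFF & ~(1 << device)   # clear one bit in the low nibble

def deselect_all() -> int:
    return 0xFF                    # all select lines high = nothing selected
```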
    1 point
  38. Hrrrng... if I didn't have a day job that I loved dearly, I would be all over both this and offering to help out with the kernal. I mean, I'm continuing to work on the emulator, albeit in my own fork that I haven't formally announced here, but which has been name-dropped here by others. Suddenly, so many things I want to add to my precious hobby time. And I've previously worked with 2-way communications in microcontrollers for various sensors, so adding a PS/2 communication routine here sounds like fun to me. I hope someone pops up who can help out with this.
    1 point
  39. I'll try to be more touchy and obnoxious.
    1 point
  40. I always wanted it to be a DIY kit and will absolutely prefer to consume it that way. To me this is simply good news. I’m definitely more interested in the original X16 than the described X8. I mostly work with Zilog line CPUs and I’d prefer a single, full-featured X16 to serve as my single reference system for this CPU.
    1 point
  41. Apologies if this has already been discussed, but I couldn't find the final spec for the board design (I'm sure it's not set yet anyway). I'm talking about the connector you usually find on ATX motherboards that connects to the power and reset buttons, LEDs, etc. @Kevin Williams I assume the X16 will have this connector in the standard layout, but I have a suggestion that I'd love to see. Can you add an additional two pins on the end of the header with +5VSB and GND? This would mean you could use a single header connector to power a "fancy" front panel via the ATX permanent 5V line. This could enable things like capacitive power/reset buttons on the case (which would make case production cheaper and give you more design options).
    1 point
  42. Yes, that's how it's set up. I also have a separate 2-pin header for NMI. The way I have the code in the microcontroller at the moment uses the reset pins on the header for both reset and NMI: a short push for NMI and a long push for reset. It works great, though it takes a little getting used to, but I feel like it would be worse if you were trying to hit NMI and triggered a reset by accident. The screen blanks once you've pushed long enough for the reset to happen, so it feels somewhat natural this way.
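The short-push/long-push behavior described above boils down to classifying a completed press by hold time. Here is a minimal sketch, assuming a hypothetical 500 ms threshold (the actual firmware's threshold value isn't stated in the post):

```python
# Classify a completed button press: short push = NMI, long push = reset.
# The 500 ms threshold is an assumption for illustration only.

HOLD_THRESHOLD_MS = 500   # assumed: holds at or past this become a reset

def classify_press(held_ms: int) -> str:
    """Decide what a completed press of the shared button means."""
    return "reset" if held_ms >= HOLD_THRESHOLD_MS else "nmi"
```

In the real firmware the reset line would be asserted (and the screen blanked) as soon as the threshold elapses, rather than waiting for release; this sketch only shows the classification logic.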
    1 point
  43. I believe this to be very true. Switching RAM banks on the X16 takes the time it takes to write a value to a zero page address, i.e. 3 clock cycles = 375 ns @ 8MHz. Virtual RAM bank switching would require you to first write the current bank's contents to disk (8 kB) and then read the new bank's contents from disk (also 8 kB). In this thread, @Michael Steil commented on the theoretical max throughput of the file system: about 13 kB/s if using the KERNAL's byte-by-byte operations, or 140 kB/s using the DOS routine macptr (I haven't looked closely at that, but it sounds interesting, as the programs I've made have a throughput close to 13 kB/s). https://www.commanderx16.com/forum/index.php?/topic/346-how-fast-will-sd-card-access-be-on-hardware/#comment-2223

Let's assume you could actually achieve both a read and a write speed of 140 kB/s. First writing, and then reading, 8 kB would take about 0.11 seconds. At 13 kB/s it would take about 1.23 seconds, by the way. 0.11 seconds is quick, but compared to X16 bank switching it's very slow. In fact, you could make about 293,000 X16 bank switches in the time it takes to do one virtual disk-based bank switch (assuming a read/write speed of 140 kB/s).

This doesn't mean that the X8 is useless. It means that the X8 and X16 require fundamentally different thinking when you make programs. And some programs that need to use banked RAM a lot will be virtually impossible to port from the X16 to the X8. I would certainly miss the real banked RAM of the X16 if this project ended up being the X8 (only). The banked RAM is what opens so many opportunities for interesting 8-bit programming.
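The arithmetic above can be checked in a few lines. Note the exact ratio depends on whether "kB" means 1000 or 1024 bytes and on rounding (the post's ~293,000 comes from rounding the transfer time to 0.11 s); with 1024-byte kB it lands near 305,000, the same order of magnitude either way.

```python
# Real X16 bank switch (3 cycles at 8 MHz) vs. a hypothetical disk-backed
# "virtual" bank switch (write 8 kB out, read 8 kB in at ~140 kB/s).

CPU_HZ = 8_000_000
bank_switch_s = 3 / CPU_HZ                 # 3 clock cycles = 375 ns

BANK_BYTES = 8 * 1024                      # one 8 kB RAM bank
THROUGHPUT = 140 * 1024                    # assumed MACPTR speed, kB = 1024

virtual_switch_s = 2 * BANK_BYTES / THROUGHPUT   # write out, then read in

ratio = virtual_switch_s / bank_switch_s
print(f"real switch:    {bank_switch_s * 1e9:.0f} ns")
print(f"virtual switch: {virtual_switch_s:.3f} s")
print(f"~{ratio:,.0f} real bank switches per virtual one")
```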
    1 point
  44. You know what I really want? Just the VERA. I already made mention that we're already talking about a "VERA ecosystem", especially if both X8 and X16 are released, and @Birk reminded me: I already have fully working, fully license-problem free versions of 65xx systems sitting on the table behind me. They're called the VIC 20 and the C64C. I'll probably buy a 3rd 65xx system: Ben Eater's kit. And, at this point, I may or may not buy a Commander Xxx. But I would buy VERA modules for ALL of those (that didn't have it already). Especially if there's more than one Xxx platform anyway.
    1 point
  45. Five with me; my poor soldering skills will probably make a fried potato instead of a computer!
    1 point
  46. Wait, how did you know that I ... ... OH! I get it ... ... make that two.
    1 point
  47. I’d go with a phase 1 kit if buying the prebuilt one would add significant cost. Personally I can’t see the point in phases 2 or 3. The original goal was a computer where you could understand every part of it. Phase 1 fits that scope much better. Also, it seems to me that planning subsequent phases just makes the project bigger. I’d suggest concentrating on phase 1 and shipping it before making further plans. Being realistic, very few people will buy this if they’re not interested in retro computer architectures or programming already. It’s not going to have a software ecosystem to compete with today’s consoles. I’d take that clarity of purpose and super optimise for it. I’m more conflicted about the X8, but would probably lean towards not releasing it for the same reason. The difference in accessing VRAM is my main concern — that would make X8 software incompatible with the X16.
    1 point
  48. I figured the thread was due for a post-script of sorts... call it something of a denouement to our little adventure. (Also, I forgot to mention: if anyone sees any typos, errors, or unclear/confusing things in any of the above posts, please message me so I can fix 'em.)

On reflection, it seemed only right to take all those optimizations we made for our X16 conversion of this program and backport them to the original Plus/4 platform, to see what they accomplish. How much can we improve on the Plus/4's original 2 hour and 42 minute plotting time?! Well, here's the listing and outcome, as reflected in a couple of screenshots from 'plus4emu': The result on the Plus/4 was actually really good... it went from a 2 hour 42 minute plotting time down to 1 hour and 37 minutes! Not too shabby. You might notice I added one more optimization on the Plus/4 version, which I've sort of called out in purple in the listing above. Bonus points for anyone who can both (a) figure out what the heck I'm doing there, and (b) hazard a guess as to why the tweak provides a tremendous speed improvement on the Commodore Plus/4 but actually TAKES LONGER when added to the fastest X16 version posted in this thread above.

Moving on: my last post mentioned that my college-aged daughter threw down the gauntlet and challenged me to try to convert this routine to Python, which is her preferred language for all her scientific/lab/analysis stuff at school. Well, she kept pestering and wasn't about to let me off the hook. So I downloaded a fairly all-inclusive Python setup she suggested (Anaconda or some such, with an IDE called Spyder) and gave it a shot. Only problem: I have no genuine expertise in Python. I played with it a bit in 2006 or so, just because it was included to run some bloatware on a computer I bought. To be sure, I can look at a Python program and know pretty much what it's doing... but I don't 'know' Python by any stretch.

With the help of Google and the excellent reference guides from the official Python site and a few of the libraries I used, I was ultimately able to cobble something together over the past few evenings. The result was that I got Python to spit out what appears to me to be pixel-for-pixel correct output (comparing with the X16 and Plus/4 versions). There aren't any Python-specific optimizations, because I frankly would not even know where to begin. And besides, generating the image below took only seconds on the OLD (circa 2012) Core i3 machine I used for the project anyway!

For what it's worth, when I emailed a copy to my daughter to say "OK, I did it, quit bugging me about it," she responded by saying (of my Python code): "um... that's not really how you are supposed to do Python..." She also accused me of not having any 'class' or something! Yes, I know she wrote 'classes', but I tend to dislike the 'abstraction for abstraction's sake' object-oriented mindset. Also, this really didn't need it and, in my view, probably would not have benefitted. And anyway, 'get the hell off my lawn!' For now, all I wanted was to get Python to produce a 'proper' output, so I'm calling this good enough. Cheers!
    1 point
  49. Yes indeed. Rygar is correct. Something about this game just captured my imagination. I’ve always wondered if there’s any significance to his turning the statues to face east at the end of each round. Did the devs have something in mind story-wise? Was it put in for technical reasons? I’ve never found any info about it.
    1 point