voidstar


  1. My question was just how to use cc65 inline assembly to replicate the query for the depth of the keyboard buffer. And now I see that the cc65 develop branch (with the libsrc folder) is old relative to the current release branch (which contains a prebuilt cx16.lib) -- i.e. #include <conio.h> and calling kbhit() works with the prebuilt release, but the code in libsrc is not what actually gets linked in. I disassembled what kbhit becomes in my .PRG, and now I see it's like this (the following does work, insofar as it returns the current keyboard buffer depth and matches the kbhit return):

```c
unsigned char check_kbd()
{
    static unsigned char keys = 42;
    __asm__("jsr $febd");
    __asm__("txa");
    __asm__("sta %v", keys);   // %o (local) or %v (global)
    return keys;
}
```

And I see now why %o wasn't working for me: I had the following pragmas in the sequence of includes (my test was mingled in with another project)...

```c
#pragma static-locals (on)
#pragma register-vars (on)
```

Removing those lets me use %o in the __asm__ (at least, no type mismatch is flagged), however it still doesn't actually work -- i.e. "sta %o", keys doesn't seem to actually store reg A into the local var "keys". When I use %o (and remove "static" from "keys"), it compiles and links -- but the result always stays 42. And I'm not finding many examples of how to use %o vs %v (but I see %o is described as a "local index", so perhaps the argument isn't what I think it is). I'm just experimenting to see how the inline assembly works -- I wouldn't actually use the check_kbd call as presented, but I'm trying to see what it allows.

So... just curious, in the cx16 rom.bin, where does this $febd map to? Offset $febd in that file is an empty no-man's-land of 0xAA's. Is an annotated disassembly of rom.bin publicly available?
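For reference, here's what I later gathered from the cc65 docs about %o (treat this as my reading of them, not gospel): %o expands to the numeric stack offset of a local, not its address, so you have to index through cc65's zero-page software stack pointer (sp) yourself rather than using it as an operand to sta directly. A sketch, cc65-target only (it won't build with a host compiler):

```c
/* cc65-target sketch: %o is a stack offset, so index via (sp),y */
unsigned char check_kbd(void)
{
    unsigned char keys = 42;     /* plain auto local on cc65's software stack */
    __asm__("jsr $febd");        /* keyboard-queue query; depth comes back in X */
    __asm__("txa");
    __asm__("ldy #%o", keys);    /* Y = stack offset of "keys", not an address */
    __asm__("sta (sp),y");       /* store A into the local through the sp pointer */
    return keys;
}
```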
  2. Any idea why the cc65 equivalent doesn't seem to be working?

```c
unsigned char check_kbd()
{
    unsigned char keys = 42;
    __asm__("ldy $9f61");      // save current RAM bank in Y
    __asm__("stz $9f61");      // store 0 to VIA1::PRA (RAM bank 0)
    __asm__("lda $a00a");      // KEY_COUNT as defined in cx16.inc
    __asm__("sta %v", keys);   // %o didn't work
    __asm__("sty $9f61");      // restore the RAM bank from Y
    return keys;
}
```

A disassembly of the assembled PRG seems correct (#$2A aka 42 was just a test -- the function always returns 0, so I think the STA is working ok -- $2cdf is just whatever address was chosen for the local "keys"). I couldn't (yet) recreate a .o of cc65's kbhit.s for the cx16. I figured that code should be somewhere in cx16.lib (if anything, the $9f61 and $a00a addresses should be somewhere such that the code looks similar to the above), but it's taking a while to disassemble that (ran out of time for today). The regular call to kbhit() is working, so I know the emulator supports this. I was just trying to exercise the cc65 inline assembly (it gives you a little more control over inlining, which cc65 doesn't do very well on its own).
  3. I was experimenting with the WiModem232 on the CoCo1... Without the RS232 Pak, the built-in serial connection could only do 1200 baud. That was surprising, since the PET could manage 2400 baud. Adding the RS232 Pak to the CoCo (which connects through the cartridge port), it could then handle 9600 baud. With the 8MHz X16, I'm not sure what the difference will be (between some onboard emulation versus some solution as a cartridge).
  4. The Sega Dreamcast had a built-in modem. Wish I still had that system -- it had a keyboard too, and web browser software was available for it eventually.

I read about the 1982 game COMMBAT, published by Scott Adams (I emailed him recently and confirmed it actually existed and works). It let TRS-80 and Apple II users play a simple game against each other over a modem connection (or between same-type systems also). So it had some simple protocol to communicate plays and game-state between these two systems. Another example is SOPWITH2 around 1984 (it was IBM PC only, but worked across a serial connection). No ethernet. I just thought it would be interesting to replicate that "heterogeneous system" connectivity across a modem. I still have lots of physical systems and don't use emulators all that much.

Also: these days you have projects with "weird size" LED screens -- like, say, 32x4. Sure, someone could do a terminal emulation that supports ANSI on that. Then the Datapoint 2200 from 1970 was 80x16, then the TRS-80s/IBM 5100/Wang 2200 with 64x16. Not that many people are powering up pre-1977 equipment (I do) -- not much selection of ANSI terminals on those anyway (since even if a terminal program were written, not many have a working tape or 8" disk to even load up any software). Followed by 40x24 on the early Apples (the PET was 40x25), and 80x25 for the PC. Then finally modern GUIs let us virtualize up to any size console we want (like 200x160, or... whatever suits you). If a modernized "smart BBS" asked what console size to use during that connection-session, then it could render ANSI stuff accordingly, at that resolution.
Maybe like a "RetroTube" or "ASCIITube": you connect using the WiModem232 via a command like "ATDTsomewhere.com:1234", enter your console resolution, and it pushes content appropriate to that specified console size (similar to how YouTube is forever transcoding videos to be more appropriate to the viewer's bandwidth and screen resolution) -- whether that connected client is ancient or modern. If anything, it could be a "screen saver" to play during vintage computing gatherings. No animation per se, but a kind of "ANSI codec" to find the least amount of escape sequences to transition from one image/scene to another could be computed on the fly (by this kind of "smart BBS" server).

Revised... not a network between these systems, but connections managed by a "smart BBS" (which handles interaction/collaboration between the connections, w.r.t. the specified console resolution of both systems involved).
  5. As for floating point -- no, I don't think the "float" type has been added to cc65 yet. Unless there is some branch or add-on that I'm not aware of, I just get "Floating point type is currently unsupported." And it's a difficult thing to implement on a processor that doesn't natively support single or double precision floats. I haven't checked out vbcc in a while, but perhaps they added both float support and a 6502 target output. Often you just have to re-think the approach to the problem, so floating point might not be truly necessary. In theory, we should be able to invoke and use the built-in BASIC ROM float support -- but I think you'd have to POKE your way into tricking BASIC into thinking some variables had been set to the float values you want, and then interpret the results (encode/decode BASIC's internal float representation). So it may not really save much code space to do all that, relative to just re-implementing float ourselves.
  6. I was dusting off cc65 for this same purpose, to see the state of X16 support. Here are my quick notes (to me a quick list of steps helps more than a video):

- Download cc65 (clone or zip) from github: cc65/cc65: cc65 - a freeware C compiler for 6502 based systems (github.com)
- Unzip to a folder, and set up your project folder.
- In the source folder, I keep two batch scripts like this:

#1, compile script...

```
set CC65PATHBINROOT=F:\X16\cc65-snapshot-win32\bin
set CC65PATHINCROOT=F:\X16\cc65-snapshot-win32\include
set OPTIMIZE=-O -Oi -Or -Os

REM COMPILE, ASSEMBLE, LINK...
%CC65PATHBINROOT%\cc65 %OPTIMIZE% --target cx16 --include-dir ..\include --include-dir %CC65PATHINCROOT% main.c
%CC65PATHBINROOT%\ca65 --target cx16 main.s
%CC65PATHBINROOT%\ld65 --target cx16 --obj main.o --lib ..\lib\cx16.lib -o main_cx16.out
```

HINT: It's neat to compare the new main.s with a prior main.s (say, using BeyondCompare) and see how much code your project changes are adding during each build. Remember, the more "code space" you use, the less RAM your program has to work with. Try linking with other optimization options; also try to stick to just one output function -- e.g. change to all puts or all printf, not a mix of both.

#2, update and run

The resulting .out file is a PRG. I don't output directly to the PRG, just as an extra precaution (like if I'm doing some experiment and don't want to mess up the PRG). It's not necessary, just an old habit. I keep the emulator in another folder, so I use a script to copy the .out to where I keep the emulator, renamed to a .PRG extension. Then in the emulator folder, I run it like this:

```
F:\X16\x16emu_win-r41>x16emu.exe -prg main_cx16.prg
```

The -prg automatically does the mount and load, so once the emulator is up, just type RUN. And here is some sample starter code...
EDIT: I say "starter" code because the C library lets you do a lot of things and get up and going quickly -- but those routines are going to be relatively slow and eat up a lot of code space. It's not that the C-standard stuff is "bad" -- printf just has a lot more capability (% translations) than most people use, but all of its code is linked in, and it's still a lot of IF checks to get past all those things you're not using. Or kbhit could be some inlined assembly instead, since there is overhead in just calling a function.

```c
#include <stdarg.h>   //< if needed: va_list, va_start, va_arg
#include <string.h>   //< if needed: memcpy, memset, strlen
#include <stdlib.h>   //< used for srand
#include <stdio.h>    //< printf - consumes a lot of code space
#include <conio.h>    // starter - to be replaced with VPOKE stuff (see cx16.h)

// THIS STILL WORKS! Same as used on the original Commodore PET
#define CLRSCR \
  __asm__("lda #$93"); \
  __asm__("jsr $ffd2");

// Not used in the example below, but you can also use the following
// to switch between character sets.
#define ENABLE_MODE_1 \
  __asm__("lda #$8E"); \
  __asm__("jsr $ffd2");

#define ENABLE_MODE_2 \
  __asm__("lda #$0E"); \
  __asm__("jsr $ffd2");

#define TRUE 1

#define X16_COLOR_BLACK  0x00U
#define X16_COLOR_WHITE  0x01U
#define X16_COLOR_RED    0x02U
#define X16_COLOR_CYAN   0x03U
#define X16_COLOR_PURPLE 0x04U
#define X16_COLOR_GREEN  0x05U
#define X16_COLOR_BLUE   0x06U
#define X16_COLOR_YELLOW 0x07U
#define X16_COLOR_ORANGE 0x08U
#define X16_COLOR_BROWN  0x09U
#define X16_COLOR_LRED   0x0AU
#define X16_COLOR_DGREY  0x0BU
#define X16_COLOR_MGREY  0x0CU
#define X16_COLOR_LGREEN 0x0DU
#define X16_COLOR_LBLUE  0x0EU
#define X16_COLOR_LGREY  0x0FU

void main(void)
{
  unsigned char x;
  unsigned char y;
  char buffer[128];
  char ch;
  int example_int;

  buffer[0] = '\0';  // init to null string

  textcolor(X16_COLOR_YELLOW);
  bgcolor(X16_COLOR_BLACK);
  //bordercolor(X16_COLOR_RED);  // < doesn't seem to work
  CLRSCR;

  cputs("Hello World\n");  // just showing that cputs does work
  printf("press a key");

  while (TRUE)  // example of busy-waiting for a keystroke
  {
    ch = kbhit();
    if (ch == TRUE)
    {
      break;
    }
  }
  ch = cgetc();  // get the actual keystroke

  if (ch == 0)  // just in case... it would be hard to actually type this
  {
    cputs("ZERO");
  }
  else if (ch == 65)  // example of some logic response... could do switch instead
  {
    cputs("A");
  }

  sprintf(buffer, "%u", ch);  // convert the character to an ASCII string
  gotox(40);                  // sets column only, does not change y
  cputs("pressed: ");
  cputs(buffer);

  // show an "ASCII symbol chart" for this system
  ch = 0;
  for (x = 0; x < 7; ++x)
  {
    for (y = 1; y < 40; ++y)
    {
      gotox(x*10);
      gotoy(y+2);  // 0 is top row
      sprintf(buffer, "%3d %3X %c", ch, ch, ch);
      if ( ((ch & 0x0FU) != X16_COLOR_BLACK)
        && (((x & y) ^ ch) > 0) )  // just for fun, example of using logic operators
      {
        textcolor(ch);
      }
      else
      {
        textcolor(X16_COLOR_YELLOW);
      }
      cputs(buffer);
      ++ch;
    }
  }

  textcolor(X16_COLOR_WHITE);
  printf("\nInput test: ");
  scanf("%d", &example_int);  // don't forget the & (address-of)
  if (example_int < 0)
  {
    printf("\nNEGATIVE [%d]\n", example_int);
  }
  else
  {
    printf("\nPOSITIVE [%d]\n", example_int);
  }
  printf("That's all!\n");
}
```

EDIT: For performance, you'd transition from printf to some form of VPOKE wrapper -- but in doing so, remember you then have to work in the "screen character set" rather than the ASCII character set. This is also why printf is slow: it does the work of translating your ASCII request into the system's native screen character set for you. If you output directly to the screen, you avoid the overhead of that translation.
  7. No (float) solution yet for cc65, but for reference, here are assembly examples of implementing some floating-point support (from 1976 and the 6502): www.6502.org/source/floats/wozfp1.txt

In the discussion below, we pondered invoking the built-in BASIC to perform floating point functions using the available ROM software. You'd have to contrive setting up variables in BASIC (POKE), call the right routine, then extract the answer back out. Possible, but not very efficient (and very coupled to that ROM). But also in that discussion, someone ended up adding support for the "float" type in C, in something called vbcc: calling ROM BASIC functions from cc65 - Commodore 64 (C64) Forum (lemon64.com)

A person made a build of my C code compiled for the Commodore PET (6502 CPU), and the C and resulting binary are here (using a modified C compiler that includes support for the "float" type): voidstar78/SolarSystemCalculator: Solar System Calculator (github.com)

Note: if you just need a MOD operator (which would otherwise seem to require float support to determine that remainder), you might try a table lookup instead of the actual math. When determining some screen-coordinate stuff in a game I made, I ended up using a MOD_4_TABLE and a DIV_4_TABLE, avoiding having to actually do the floating point operations.
  8. Connecting classic systems to the modern internet won't be worthwhile -- there's no performance to absorb all the extra pushed traffic, nor (performant) capability to interact with the graphics. But forget all that, and forget the idea of a web browser. I suppose what I'm thinking is something like what is shown below -- which requires an "all new" kind of BBS host, and an "all new" kind of terminal software for the various vintage systems. That sounds bad (i.e. it won't ever get built), but humor me for a bit, because there may be a compromise between what already exists and what needs to be built...

The WiModem232 does the work of relaying the local serial IO over to TCP/IP -- so you just type "ATDTimts.com:1234" (or something like that). Most of the vintage systems have some kind of VT100-compatible terminal available (i.e. one that can interpret those ANSI escape sequences). But the "experience" is going to be inconsistent.

First, consider baud rate. The baud is a function of several aspects: the serial IO hardware available, the MHz of the system, AND how efficiently the terminal software is written. Collectively, all that translates into a max baud rate that that terminal software can support. If you bump up the baud rate just a little bit, it might still kind-of work, but it will start to drop characters here and there -- the dropped characters are (generally) the terminal software not being able to keep up (increasing the MHz of the system might be enough to compensate, but of course at the risk of causing other issues). At least, these were my observations on the PET 4016 and the IBM PC 5150 with the original serial card. If the characters stop entirely, that's probably past the hardware limits (i.e. wrong baud rate).

Next, VT100: not all the terminal software interprets all the escape sequences consistently.
I don't have a specific example to refer to, other than in the PETTERM documentation I recall the author saying something to the effect of "not all VT100 commands are interpreted" (a similar statement is in the Brutman mTCP telnet documentation -- it may depend on the version, as the software may incrementally add more support over time: mTCP Telnet client for DOS (brutman.com) -- the Netris thing is neat).

Then there's the BBS software itself -- most of it generally has to assume some kind of "text screen" resolution. Some avoid "moving the cursor" and drawing things, being sensitive to connections that don't have 80x25 or 40x25, or just don't have the ability to draw on a screen (so we're going back to line printers...). And some clients might not even support ASCII, or don't support lower case characters (think about that -- RIP password systems that require at least one lower case letter! haha).

Around 1988 or so, there was an alternative to ANSI called AVATAR. As an example: with ANSI, you need about 7 full bytes to express a cursor movement -- ESC BRACKET ROW SEMICOLON COL CODE (e.g. ESC [ 34 ; 54 H). AVATAR used a shorter expression, getting that same information down to about 3 or 4 bytes. In addition, AVATAR supports codes for "scrolling windows" -- so you could have state info in one portion of the screen, and a "chat window" or "scrolling status window" in another portion (so in one short binary code, you could say "make a window this tall and wide starting at this x,y with foreground A and background B"). But supporting that windowing is asking a lot of whoever is implementing AVATAR -- especially on a 64K or 32K RAM system. In general, though, an AVATAR-supporting BBS was roughly 2x as fast as ANSI (well, if color was involved).

When I wrote DestinyHunter for the 1MHz 6502-based PET, it was a struggle to get flicker-free animation while using PETSCII -- parts of it had to be done in assembly.
So I can understand the challenges of writing a terminal program that buffers inputs, interprets escape sequences, and draws updates while polling for user inputs -- to then also add some game-state processing on top of that is rough (for a 1MHz 64KB system). Asking people to write NEW terminal software for a 1MHz system isn't an easy ask (e.g. more clients that support something like AVATAR -- I think QModem on the PC did). You'd have to dust off a lot of old code that's probably in assembler. Take PETTERM itself: the code (just handling the serial IO traffic and interpreting a few escape sequences) consumes about 4K RAM (it's on github). From that baseline, one could add support for new escape/code sequences -- but not everyone is assembler savvy. Having the "serial IO code" in C might help (or the groundwork of using inline assembly to do the serial IO stuff) -- but the more "features" the terminal handles (like elaborate interpretation of escape/code sequences), the more it effectively slows down your baud rate (if you spend too much CPU time drawing stuff, you potentially start dropping characters in the serial IO buffers).

Around 1994-1996, we had some "graphical" BBSs using new protocols. The concept was similar to ANSI or AVATAR, in that some sequence of binary was interpreted as a command to enter a certain graphics mode, draw some lines, fill this area with that color, etc. On the initial connection, you just had to answer a question about whether you supported whatever that protocol was (I forget the names -- it was fairly short-lived after Internet access became affordable, and all the technical talent moved to web browsers rather than any more fancy-terminal development -- but instead of using TheDraw to create the scenes of your BBS, you used a graphical paint-like editor and specified interactive objects like menus and such). I ran a "graphical BBS" for only a few months, before accepting that the Internet was here to stay.
Anyhow, if we stick with existing VT100 terminals, it seems a kind of "IMTS" could be made -- a new kind of BBS host. Most existing BBS host software (written for those old vintage systems) maintained a single connection; some were advanced enough to handle multiple connections. But this modern one would accept multiple connections, and ask "What resolution is your screen?" I suppose a couple more questions could be asked, such as:

ABCDEFG abcdefg -- Do you see upper and lower case characters? (I don't think there was ever a system that had lower-case only???)

Then clear the screen and write to its center: "Is this centered?" If the answer is Yes, then they support VT100 and the screen size is correct. From there, the "IMTS" would offer "experiences" based on the specified screen resolution and the connected baud rate. One experience could be coordinating a game of backgammon across two different types of systems. And to me, that's the main new thing this offers beyond a telnet connection and the "classic BBS experience": "IMTS" would focus on offering multi-player, heterogeneous-systems experiences -- and be another way for folks to make interesting use of vintage systems (or modern remakes).

The modern telnet BBSs that are networked together to have multi-user chat rooms -- those are neat. But I think it would be fun to have a dedicated host that acts as a kind of lobby and coordinates connections, to play some simple VT100 games (so it's not just resurrecting Lynx). At some point, there is a threshold where a 1MHz 2400 baud client can't handle the traffic -- so full MMO-style gameplay won't be practical. It would need some "ground rules" where "this lobby requires 9600+ and accepts up to 4 clients" vs a lobby that is "1200+, 2 clients" (which the EBGS would determine and manage), and it would need to be "agile" such that clients could come and go casually (i.e. your game opponent might drop out for whatever reason -- but wait a few minutes, and maybe another suitable connection resumes it).
But I also realize this is just terminal-experience stuff, and doesn't really exercise the capability of the connected 8-bit system (graphics or sound, unless a terminal program was built that did something like play audio in the background). It would be an excuse to fire up the old vintage systems, it would be "something they are good for" but can also equally be used by a modern Pi or ESP32.
  9. Fair enough, no driving a 3D Printer or CNC! But yes to some LED blinky lights? from BASIC, via the expansion port(s) ? Maybe
  10. Exactly -- with "this and that" wired up, it's a homebrew concoction that nobody else will replicate. We're defining a system here. Make a standardized ROM with some programming control, and standardize parts of the system for some hobby-grade applications... so all I have to do is share my BASIC program, not any h/w schematics.

This sort of relates to why the case is important: it sets the identity. The box that looks like that has X, Y, Z fixed capabilities (like when you look at a PS3 -- yea, it could be modified, but chances are it's the standard formula as Sony defined it for that system). [ About that -- at least define the "standard" or "default" case to establish that identity, and let others replicate/3D print it eventually. Up front, just sell maybe 20? 100? cases, sold at a premium, kind of as collectors items but also to kickstart production -- $500+? Yea, it's just plastic, but it's partially a charity drive of sorts too? IIRC, the concern was that some minimum order was necessary to make that production worthwhile, and then you have the logistics of storing all that inventory for a while -- which, yes, applies to the system as a whole, not just the case. ]

Arduino is a nice modular thing -- but you have to tether a laptop to upload some compiled C code. That misses the "vintage experience" -- where you plop down a machine, attach a screen, and can immediately start programming it to do stuff (with the default vintage capabilities being: make some beeps, poll a joystick, draw stuff on screen -- and now add: make a couple of network connections, and interact with some above-5v stuff).

I'm not sure how power management is envisioned on the X16. For example, is it a power brick that does the AC/DC conversion, so the system itself is all DC power? Could one of the expansions be reserved as a "power distributor" of sorts, for maybe the 6V-18V range (with fuses or whatever appropriate circuit protection)? Or can the expansion bus be arranged to at least support such an add-on card?
The only reason I lobby for making it a more integral part of the system is so that it can be accessible (in a standardized fashion) from a hypothetical BASIC ROM -- so it is accessible as an "out of the box" capability, which could equally be reached from compiled C code. I guess while I'm dreaming: also standardize some LED blinky lights. Capture a bit of the Altair experience, but they can also be status indicators while the screen is off (like status on whether a 12v line is active?). Yep, more costs -- but that's what building a system is all about: finding that sweet spot between being affordable and having interesting standard features that make people want that system (like an "easy built-in language" that can interact with all this capability, in like a 10 page writeup). I've got a lot of systems that can play Sonic-type games, but a networked Sonic game that blinks lights by day and closes chicken-coop doors by night? That's new.
  11. Oh yes, my wife's chickens are very special! And they don't like possum visitors at night. And yep, it's overkill for that one application. But what about a charge controller? I've got several processors out there; it seems if I had a little more control over the programming, I could combine all that into one device -- but I don't need an Intel i7 out there (well, unless I wanted onboard facial recognition on the cameras, but nah). Those Intel ITX boards are nice, but that's definitely overkill. Anyhow, 5v is just a little too sissy -- it can handle these wimpy plastic servos to slew a little camera, and that's about it. I think some "charge up" expansion thing could be added to the Arduino -- so sure, some "standard" expansion device could be made for the X16 (with some fuses and such). But extra cables really are a drag -- even the IBM PC understood that, with the extra cable from the P/S to the monitor (so one switch powers up everything). Nowadays we've got that induction stuff (for charging), trying to "cut the last cable" out of devices.

BTW: Tandy removed the 12v pin from their cartridges after the CoCo1 (not sure why, but just noting it). The PCjr expansion boards (the ones on the side, not the cartridge slots) had a 12v line. Yea, there were some "horror stories" of abusing the 12v line of the Apple II.

I think the "killer app" here is making stuff like this easily accessible without the need for an elaborate operating system -- and BASIC was kind of good for that: the simplest control language embeddable into a ROM? And once you have PEEK/POKE, you can then basically inline any assembly stuff that's urgent (though you waste/consume a lot of code space expressing those POKEs, since they all have to be interpreted).
  12. Probably so -- not that anyone would actually use a tape deck, but: (a) the tape interface could be emulated with a different device that had storage but didn't literally use rolling tape media, while still using the simple "header" and tape-data format (maybe not as easy as it sounds, since such a device would need to support at least 44kHz encoding of an audio signal??), and (b) depending on how similar it is to the original Commodore ROM, there is space (addresses) reserved for tape buffers (I can't recall the size -- maybe about 300-600 bytes for the two tapes combined?). Maybe this portion of the code could be extracted and replaced with some networking code -- however, the whole point of "cold storage" is to be separated from a network.
  13. If we did this "stupid easy networking from BASIC" -- imagine using an X16 as a base controller for some home automation stuff. I'm not sure if X10 is a thing anymore. Yes, you can get a lot of that on a smartphone now with IOT-enabled power ports. But maybe I don't trust my smartphone and want a more standalone system. Or I just don't want to need a data plan. And maybe you hardware folks could make some low-powered "X16 compatible" control devices. Like, a single-shot camera (remote property monitoring)? Remote power toggle (for lights)? Cover openings (for solar panels, telescopes)? So from BASIC, something like...

```
10 B$="123.123.123.123:84"
15 POKE(0x5556, 5): REM set connection 1 query timeout (seconds)
20 OPEN IO,5,1,A$,B$: REM device 5 is NIC, A$ for input, B$ for output
30 X = PEEK(0x5555): REM NIC status of connection 1
40 IF X&0x01 = 0x01 GOTO 50: REM check bit for "data content ready in input buffer"
45 GOTO 30
50 ...parse A$ (up to however long A$ can be, 255 chars?)
60 PRINT #-5,1,"TAKE PHOTO REZ 3": REM send to device 5 (NIC) connection 1, a string command to take a photo image
70 ...wait for a response, parse and wait for data stream mark begin...
80 ...write the received stream to a file...
90 CLOSE #-5,1: REM politely close the connection
```

[maybe limit it to 3 connections? each connection consumes h/w resources] [more likely the command would be like "08 34", some hex code]

That kind of modification to the ROM is a big ask, plus the NIC buffer has to go somewhere (re-use the Commodore tape buffers??). But the point is: if you can make it "stupid easy" and a "standard way" -- well, maybe that inspires folks to tinker with it (with this kind of ROM, make "X16"-commanded sprinklers? a thermostat?). I'd use a couple of X16's to open up the chicken coop door (which needs 12v -- outside slide doors have a lot of friction to deal with).
  14. Anyone got thoughts on vintage PCs for crypto private keys? They're not (usually) networked [irony, in contrast to my other thread!], so that's a plus. Floppy media isn't reliable (well, IMO). Did I read correctly that the X16 will support a tape? (basically a Commodore-ish ROM and user-port?) The ROM itself is enough to have some sort of load/save capability, without any kind of OS. Parts and reliability of actual vintage machines is one issue -- you really don't want to lose those keys. Paper obviously can burn. But anyway, couldn't basically any vintage system be a Ledger Nano S type device? (cold storage) A "simple" non-FPGA device, with documented and obtainable components -- maybe that's (another) good reason for a "modern" vintage system?
  15. I thought "RS" was RadioShack. Haha, actually I never even really thought about what it meant. Learned something, thanks. Originally when I said "from a programming perspective" I meant a "standard way" to access this kind of connectivity. Same idea for the audio/sound. A good h/w NIC helps offload some of the duty from the CPU, yes? And then there are choices about the MTU size and such? Could/should the BASIC be adjusted to support that?

```
10 CONNECTION X="123.123.123.123:84"
20 NPRINT X,"HELLO"
30 NINPUT A$,2000: REM 2000 is ms to wait for input? (default to 500?)
```

(just making stuff up)

Someone made a comment about having the h/w design "frozen in time" -- yea, I like that idea. Kind of neat that CAT5 hasn't really changed in -- 30 years? I guess that's like 120V power outlets -- over 100 years standard now? (I know, "standard"... 115v, 120v, something like that. My old professor once said "one thing you'll learn about digital circuits is that everything is actually analog.")

Another thing might be a 12V power line somewhere. I'd like to control some motors or more extensive equipment (gate openers and TEC coolers), and 12V helps for that. Maybe there is a way off the expansion port?