Everything posted by ZeroByte

  1. I think the biggest hurdle is the tool chain. Probably the thing to do would be to update the zsm2zfx tool to read the current ZSM format as input, and then use Furnace as the platform, so at least you can create and export directly in ZSM. (Deflemask wouldn't allow for very creative use of the VERA PSG, as it has no such chip, and VERA doesn't exist in the VGM standard.)
  2. I've done a few different things over time. For Flappy Bird, I would use Deflemask to get my YM patches the way I needed for the effect, export them as instruments, and convert them into my "YMP" (YM patch) format. Then I'd manually generate the sequence data in the source code. Rinse, repeat.

For things that are basically algorithmic-style sounds (pitch sweeps up and down, volume fades, etc.), I would hammer out a quick BASIC program, and when I got it sounding the way I wanted, I'd mod the code to also print the registers and values as they were being written into VERA. This of course borks the timing, but I'd already determined they were what I wanted. Then I'd use the emulator's -echo mode to copy/paste the results and write them into source for use with my engine.

Lastly, I've made a couple of effects as Deflemask "tunes" (using voice 0 of the YM2151) and exported them as VGM. I had a tool that would convert this VGM to ZSM (a previous ZSM standard that no longer exists).

I have the core of a sound effect format for Zsound (ZFX), but only a bare-bones playback routine is available; there are no supporting toolchain components for the SFX yet. Whenever I get back around to this side of the house, I'd like to make an on-system tool for SFX creation, using a dopesheet-style UI where you draw lines that describe a value over time (e.g. a line sloping up, tied to frequency, would make a rising pitch sweep). If anyone would like to collaborate with me on the SFX side of Zsound, I'd be more than happy to oblige.
  3. They seem pretty comparable to the emulator using SD card.
  4. MACPTR is essentially a frontend for the DOS module's fat32.s functions. It does a lot under the hood, but the meat and potatoes occurs in the routine fat32_read: in fat32.s. MACPTR calls file_read, which calls fat32_read. fat32_read in turn calls other, even lower-level functions to do the actual reading from SD, tracking block numbers, etc., but I've never been far enough down the rabbit hole to tell you what those do - they're all in fat32.s, I would imagine.
  5. By default, it expects you to call it at 60 Hz, and it will resample time to 60 Hz automatically when you start a song. If the song's play rate in the header is 120 Hz, the player will tick the music twice per call to playmusic. In other words, you do nothing: you call it once per frame, and it converts the song speed automatically.

There is a routine to set music speed (.XY = Hz) - when you call this, it adjusts the tick rate to whatever you specified instead of what the song specifies. So let's say your 120 Hz tune is playing and you call setmusicspeed = 240. You continue calling playmusic once per frame, but now the song will tick 4 times per call, which means the music will play at 2x normal speed. And yes, you can ask Zsound what the normal speed is at any time using the get_music_speed API call, and use the reply to set_music_speed back to that value.

What I was describing above is for cases where you don't want Zsound to automatically resample time for an assumed 60 Hz call rate. The procedure is a workaround: if you set_music_speed = 60 Hz, then the player will only advance the music by one tick per call. The player doesn't have any idea how fast you're actually calling it - so if you have, say, a 700 Hz tick-rate song, you can call playmusic at that rate and the music will play at normal speed.

The API call I would present would be either a parameter on the initialization (enable/disable time scaling) or else an API call that disables/enables time scaling. Essentially, set_music_speed(60) disables time scaling. When time scaling is disabled, you can ask what the tune's rate is (or just read the header from memory if you prefer) and drive the player at that frequency yourself. The main reason to do so would be timing-sensitive effects in the music (e.g. if the Concerto tunes came out sounding wrong at 60 Hz, you could opt to drive the ZSM at 128 Hz directly, which would eliminate such errors).
  6. By default, the Zsound player resamples time to 60 Hz regardless of the ZSM tick rate, but you can play back files at the native tick rate if you use a time source like VIA timer IRQs to drive the playback function. To do this, you'd need to either call set_music_speed = 60 Hz and then call the playback function at the song's actual rate, or else call stepmusic directly on each tick (stepmusic doesn't preserve things like the bank or the VERA data port state, so plan accordingly if calling it). I've got an internal variable in the code that specifies whether the player should scale to 60 Hz or not - there's just no way to manipulate it in the API yet. Would this make more sense as an argument to the Zsound init routine, or as an API call to enable/disable it? I'm thinking an init() argument makes more sense, really....
  7. This is some fine work, @kliepatsch, and thanks for the shout-out on Calliope. After playing the song in Calliope, I see what you mean. It's neat how the software chooses random channels for stuff in a way that a human wouldn't - definitely makes an interesting show on the PSG LEDs. Whenever I get around to adding a "tap" feature to Zsound's ZSM player, I'll be able to make the FM lights do the same thing instead of just being static "channel used/unused" indicators.
  8. Actually, Flappy Bird specifies its own VRAM map so the official R39+ changes to screen memory didn't affect it. I did do some direct dips into the Kernal's BRAM vars (which is explicitly discouraged for exactly this reason) and I guess I should go fix this once and for all....
  9. waitvsync() comes with cbm.h. And yes, it's typically used as a quick and easy way to time software with the 60 Hz screen refresh. For instance, for some animation that should only update once per frame, your loop can have waitvsync() in it to ensure it only executes once per 60 Hz frame. It's less useful for things like avoiding tearing, because your code won't get to execute until after the VBLANK interrupt has finished, which may or may not be after the VBLANK period is over (i.e. into visible raster time).
  10. Sprites only come in 4bpp and 8bpp color depths. Palette offset is essentially meaningless in 8bpp mode (although it does affect the colors, and I'm sure there's some clever scheme of arranging the palette and sprite assets to get some kind of effect from it, but that's going into mad science / demoscene territory). The palette offset value is designed such that you should think of the master palette as being divided into 16 different 16-color palettes. Palette offset selects which of these 16 palettes a 4bpp asset uses - i.e. palette offset = row, pixel value = column.
  11. My most recent retro collection purchase was the beat-em-up collection from Capcom. It's fun to pop that in with the kids and start trashing bozos. My most recent "HD remake" title was Secret of Mana II. It was interesting how they made it play a lot more like the action-RPG titles of today (like Xenoblade)
  12. My $0.02's worth - I understand the appeal of an intermediate 48px size, but I honestly can't say which of the other sizes should be discarded in favor of 48px.

8 and 16 - no way. Those are tile-sized aspects and there are far too many uses for them; furthermore, there are far too many "little" things that would become bloated in VRAM if 8x8 in particular were dropped as a sprite size. I've already done many programs where I used free slots in the tilemap as sprite slots, or else used letters/numbers from the tilemap as digits, e.g. the score counter in Flappy Bird.

64 - useful for bosses and for making panels for things like radar / HUD overlays / popup dialogs, etc. (For instance, in games like StarFox where a character's portrait pops up along with their dialog/voice clip, a 64x64 sprite is perfect for this.)

32 is probably the only one I'd say is "up for discussion" as an atomic size that could be considered expendable in favor of 48. Both are "middle" sizes, and maybe 40 or 48 offer more utility than 32, but the extremes are both quite valuable and, dare I say, indispensable?
  13. So you're not animating by having all frames in VRAM and updating indexes, but by giving each object a VRAM allocation and each one updates its pixel data directly? That definitely would consume lots of CPU at scale w/o a DMA chip. That's how Sega did Sonic's animation, but the Genesis has DMA....
  14. FWIW, using multiple sprites is pretty low overhead in the way I did it for Sonic Demo:

    .struct SPRITEREG
        addr    .word 1
        xpos    .word 1
        ypos    .word 1
        orient  .byte 1
        attr    .byte 1
    .endstruct

    sonic_spriteregs: ; shadow registers for Sonic sprites (initial values)
        ;sonic's body
        .byte $10, $08, <sonic_x, >sonic_x, <(sonic_y+8), >(sonic_y+8), $0c, $a0
        ;sonic's ears
        .byte $00, $08, <sonic_x, >sonic_x, <sonic_y, >sonic_y, $0c, $20

    sonic_frames: ; VRAM locations for frames of Sonic's animation (SPREG format)
        .word $0810, $0820, $0830, $0840
        .word $0800, $0804, $0800, $0804

    animate_sonic:
        lda sonicframe
        inc
        and #3                  ; sonic frame = 0..3
        sta sonicframe
        asl                     ; use frame as X index (*2 because data stored as
                                ; words, not separate HiByte / LoByte tables)
        tax
        lda sonic_frames,x      ; sonic body address LoByte
        sta sonic_spriteregs + SPRITEREG::addr
        lda sonic_frames + 8,x  ; sonic ears address LoByte
        sta sonic_spriteregs + 8 + SPRITEREG::addr
        lda sonic_frames + 1,x  ; sonic body address HiByte
        sta sonic_spriteregs + 1 + SPRITEREG::addr
        lda sonic_frames + 9,x  ; sonic ears address HiByte
        sta sonic_spriteregs + 9 + SPRITEREG::addr
        lda dirty
        ora #DIRTY_SPRITE
        sta dirty               ; flag sprite address as dirty so VBLANK IRQ
                                ; will update VERA
        rts

A similar approach in C would use something akin to this:

    uint16_t sonic_frames[2][4] = {
        {0x810, 0x820, 0x830, 0x840},
        {0x800, 0x804, 0x800, 0x804}
    };

Again, not saying "do this, n00b" - just sharing what I've done in case anyone else finds it useful or informative.
  15. I think the main issue at hand is that the engine uses a heap management approach to allocating VRAM which comes with benefits and drawbacks, as with all design decisions. That's what makes this an engaging hobby.
  16. Honestly, the correct solution is to use multiple sprites for one object. This is how it was done on classic systems. Take this sprite of Samus from Super Metroid: it requires 36px to cover the full width of the character on screen. However, most of that extra 4px of width is blank, with only about 10px of height used for the back foot. This can be contained in an 8x16 sprite placed correctly in relation to the main 32x32 sprite.

If I were to make a demo/game using artwork of this sort, I would simply make the actor/object control these sprites and define animation frames for each one. In cases where the "extra bits" (like Samus's foot) move around relative to the main sprite, the engine would also need an x/y offsets table. In the Samus animation, the back foot is near the bottom of the sprite for one frame, and closer to the top on the next; this "foot sprite" would just be drawn at a different relative location to the main sprite during each frame of the animation cycle.

My Sonic demo faced a similar issue with the Sonic sprite, which is 40px tall. I used one 32x32 sprite for the main part of Sonic, and a separate sprite for the ears. Interestingly, the ears only require 2 frames of animation (frames 3 and 4 are duplicates of 1 and 2). I didn't bother deleting frames 3 and 4 from the resource, as I've got plenty of VRAM for this simple demo, but the engine never uses those frames. I set up an animation frames table with the VRAM locations of the frames, and the engine cycles through these tables. So the main body's data is the VRAM address equivalents of 0, 1, 2, 3, and the ears' data is 0, 1, 0, 1. This is also useful for "pingpong" animations where you only need pixel data for 3 frames of a walk/run cycle and you just display them as 0,1,2,1,0,1,2,1... no need to have frame 1 in VRAM twice.

This is part of the challenge of programming retro. It's nice when HW just does the work for you, but when it doesn't, that doesn't mean you're stuck. You just have to code a solution. This approach actually saves memory too, as you only need pixel data for exactly the parts that go out of bounds. The lazy alternative would've been to use a 64x64 sprite, but yes, that would gobble up VRAM needlessly. Hope this makes sense.
  17. Late November is when I officially released Zsound. That was at the absolute nadir of the Commander X16 community's enthusiasm. Things had stalled out. R38 was a year and a half old and had some seriously broken aspects, and the community was having to build everything for R38 and for the potential R39 that never seemed to come to pass.

Then in March, @Michael Steil started merging pull requests to the repo, and we suddenly shot up to version r41 in a matter of mere weeks. Furthermore, @Wavicle got his Breadboard16 up and running and was very involved with the community members who wanted to see their projects run on real hardware. This led to the discovery and correction of a few bugs in the VERA code, read/write timing fixes for the YM2151 (quite a finicky chip to deal with on the bus!) and, most importantly of all, spearheading the efforts to resolve the long-standing problems of PS/2 and SD card stability at 4MHz / 8MHz. For once, we started making real headway! You can see how much stuff I started working on in the spring.

This new synergy led to renewed excitement and progress on the parts of Kevin and Dave, and now we really are able to feel excited - like there really will be a Commander X16 on people's desktops in the not-too-distant future! Many others have been making amazing contributions to these efforts over the summer, especially @Stefan and @Jeffrey, who worked tirelessly with Wavicle on getting the new PS/2 functionality ironed out in the Kernal and in the ATTiny SMC code.

The code in September was me helping David get Zsound working in PETSCII Robots for a great upgrade to the X16 version's audio, and then on to my latest project, Calliope. I probably would not have ended up making Calliope were it not for the reinvigoration that's taken place this year. Thanks, everyone!
  18. I gotcha fam: https://github.com/tildearrow/furnace

My code is mostly contained in engine/zsm.* and zsmOps.* - I had to make a couple of tweaks in engine.h etc. to make the project as a whole know the code is there, and then there's the spaghetti bowl that is gui.cpp, but that has little to do with my code - it's only putting the menu option and file dialog box components for ZSM files into the logic.

Of course, Zsound is found at https://github.com/ZeroByteOrg/zsound - this repo contains not only the X16 library, but a set of other tools for creating and manipulating the data files as well. For instance, it contains a python script which is a frontend to sox for creating "ZCM" files (they're basically WAV files for Zsound).

I presume you mean in ZSM music, not Zsound as a whole? Zsound already has a PCM playback engine that lets you easily trigger digi samples as SFX in programs. The challenges I'm facing for getting PCM into ZSM playback are:

- There are two basic "paradigms" for this: triggering one-shot samples, or manipulating sample stream playback on the fly. Each one's got a challenge for me. One-shot is easy, and I already have "hooks" in the ZSM standard to allow for it, but the challenge is how to get such things from various VGMs - many of the PCM sound chips favor the streaming method and aren't just simple one-shot triggers. How to convert this (intelligently) into a collection of the various sample clips is challenging to me. For stream playback, I kind of have the opposite problem: encoding is easy, but architecting the player library to facilitate near-realtime fidelity to what the original playback source did is much harder.

- Architecture challenges in the library - I want to make sure the implementation stays as lean and mean as possible, not requiring a lot of code to be linked in if it isn't needed. For instance, if the PCM engine is not included, ZSM files should play back everything but the PCM track. Etc.
  19. At long last, my PR adding Zsound's music format ZSM has been accepted and merged! This means there is now an official version of a current tracker that you can use to make music for the Commander X16 and play it on-system with Zsound. The exciting thing is that Furnace supports the native VERA channels, so no more having to work in Sega Genesis mode in Furnace or Deflemask and export via VGM! For now, the ZSM export module does not support the PCM channel of VERA, so don't expect to use that channel and have it in your music on the Commander X16, but other than that, it's fully functional as a music tracker.
  20. Just in case anyone's been looking for information on how to write code for the YM2151: the official documentation now has a register reference in the style of the other documentation sections, and also includes a short "getting started" tutorial on communicating with the chip. It doesn't go into FM instrument design theory, as that's beyond the scope of such a document. https://github.com/commanderx16/x16-docs/blob/master/X16 Reference - 09 - Sound Programming.md
  21. Basically, it's the definitive answer to the question "has the previous sound finished playing?" or "am I getting a buffer underflow?" (depending on whether your routine is currently in the "playing" state or not). It was essentially a "free" feature that doesn't break anything, but allows more options if you want to use it - so why not, eh?
  22. Yeah, the latest work from the HW enthusiasts has made SD card reads stable.
  23. Having installed some stuff I found on CSDB (if I remember the right acronym), I don't mind the cracktros themselves so much as I mind the fact that most stuff on there also pulls up a trainer menu before starting the game. I really really really don't like cheat codes like this - and having it drop into a menu first before even bringing up the main screen is annoying.
  24. Probably the biggest deal with the '816, after the 24-bit bus, is the segment pointers (forgive me, folks, if my terminology is off - I have -zero- experience with '816 assembly). I think of it as "portable zeropage," but even more than that is relocatable: the stack is relocatable, and there is a data segment as well, so you can make it point to any arbitrary table and then walk through it relative to the segment rather than its actual address. These sorts of things speed up execution and make multitasking and relocatable code easier.