I think I have figured most of it out with a bit of trial and error:
Offset 0: 4 bytes 'M', 'Q', 0, 0 (some sort of version here?)
Offset 4: Title, 48 bytes zero padded
Offset 52: Developer, 24 bytes zero padded
Offset 76: Publisher, 24 bytes zero padded
Offset 100: Year, 4 bytes zero padded (if necessary)
Offset 104: Some sort of flag bit field, 16-bit value
Offset 106: Load address, 32-bit value
Offset 110: Exec address, 32-bit value
Offset 114: Box art image, 88x124 pixels as 16-bit values, packed as RRRRRBBBBBGGGGGG
Offset 21938: Screenshot image, 88x56 pixels, as above
Which gets you to 31794 bytes. All multi-byte values are interpreted as big endian.
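To sanity-check the layout, here's a minimal parse sketch in Python based on the offsets above. Field names and the `parse`/`unpack_pixel` helpers are mine, not from the tool, and the pixel unpacker takes the RRRRRBBBBBGGGGGG packing at face value:

```python
import struct

# 114-byte fixed header, big endian, per the offsets above.
HEADER = struct.Struct(">2s2B48s24s24s4sHII")

def cstr(b: bytes) -> str:
    """Decode a zero-padded ASCII field."""
    return b.split(b"\0", 1)[0].decode("ascii", "replace")

def parse(data: bytes) -> dict:
    magic, v_hi, v_lo, title, dev, pub, year, flags, load, exec_ = HEADER.unpack_from(data, 0)
    assert magic == b"MQ"
    box  = data[114:114 + 88 * 124 * 2]      # 21824 bytes, ends at 21938
    shot = data[21938:21938 + 88 * 56 * 2]   # 9856 bytes, ends at 31794
    return {
        "title": cstr(title), "developer": cstr(dev), "publisher": cstr(pub),
        "year": cstr(year), "flags": flags, "load": load, "exec": exec_,
        "box_art": box, "screenshot": shot,
    }

def unpack_pixel(v: int) -> tuple:
    # 5 bits red, 5 bits blue, 6 bits green, as described above.
    return (v >> 11) & 0x1F, (v >> 6) & 0x1F, v & 0x3F
```

The struct format works out to exactly 114 bytes, and the two image blobs land on the 21938 and 31794 boundaries mentioned above.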
The only part I'm unsure about is the flag bit field at offset 104. Setting a custom load/exec address toggles 0x1, which makes sense. Setting the EEPROM to 128 bytes (0) doesn't change anything, but setting it to 256/512 bytes (1) toggles 0x2, and 1024/2048 bytes (2) toggles 0x4. I can also pass the undocumented value 3, which toggles 0x6 (i.e. both bits at once). Larger values wrap around as if the value is AND'd with 0x3 and then shifted left by one bit.
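If I'm reading the behavior right, the flag field would be built something like this (my guess at the rule, not confirmed from the tool's source):

```python
def flags_for(custom_address: bool, eeprom_value: int) -> int:
    # Bit 0: set when a custom load/exec address is supplied.
    # Bits 1-2: the tool's EEPROM argument, AND'd with 0x3, shifted left one bit.
    return (1 if custom_address else 0) | ((eeprom_value & 0x3) << 1)
```

This reproduces the wrap-around: passing 4 behaves the same as passing 0.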
Some of the home-brew titles have this 0x6 value (HMS Raptor was one I found), so either the EEPROM values are actually 0 = no EEPROM and 1, 2, 3 are the three sizes, or they are 0, 1, 2 for the three sizes (and every game implicitly has at least a 128-byte EEPROM), in which case what does passing 3 signify, given it's not mentioned in the tool help?