
classic battle atari 8bit vs commodore 64



Gee, that's odd. My Commodore 64 Programmer's Reference Guide from 1982 has no mention of "Howard Sams" publishing anywhere within its pages or on the covers. In fact, the very first page of the guide states "Published by Commodore Business Machines, Inc."

 

So it wasn't published by a third party ("Howard Sams") until much later, and it does not fit into the category of the Compute!-published books.

Yes, carmel_andrews is way off base. He's arguing against well-documented history. The Commodore machines were all very well documented right from the time of release, and the information came directly from Commodore.

 

He's wrong about the Amiga too... I've got the Amiga Hardware Reference Manual right here. Sure, it was published by Addison-Wesley, but right on the back cover it says "Written by the technical experts at Commodore-Amiga, Inc..." and the inside clearly states Copyright Commodore-Electronics, Ltd.


He's wrong about the Amiga too... I've got the Amiga Hardware Reference Manual right here.

 

According to some conversations I had with Toni Wilen (the WinUAE maintainer), there are lots of undocumented issues in the Amiga. Probably many more than in other cases like the A8.

 

True, that's partly because it is a complex computer. And it's also true that there is a big difference between the "documentation level and accuracy" required for emulation and that required for development.


According to some conversations I had with Toni Wilen (the WinUAE maintainer), there are lots of undocumented issues in the Amiga. Probably many more than in other cases like the A8.

 

True, that's partly because it is a complex computer. And it's also true that there is a big difference between the "documentation level and accuracy" required for emulation and that required for development.

I don't doubt that at all, but it has nothing to do with carmel_andrews' claim that the Amiga (and C64) documentation was 3rd party. It was in fact produced by Commodore.

Edited by MacbthPSW

The Amiga docs were all produced by CBM and the text inside clearly shows it. I actually have one of the first AmigaDOS programmer's manuals, and the copyright date is before the release of the machine. And guess who owns the copyrights on the ROM Kernel Manuals? Commodore-Amiga. That means a lot more than who published it.

 

I'm not sure what the IBM PC has to do with this, but PC clones were available long before the Atari ST came out. The PC came out in '81 and major-brand clones like Columbia were introduced in '82... though I'm pretty sure some import clone parts were already out in '81. A former business partner of mine started selling clones long before the ST was introduced.

Given the decision in Apple v. Franklin, licensing to other companies was a non-issue as long as they developed their own compatible BIOS. Ever hear of a Phoenix BIOS? Developer docs for the PC/MS-DOS were available when the PC was released, but I'm not sure at what cost.

 

The reason Apple clones disappeared from the market was that they copied the ROMs instead of developing their own, and Apple pursued them vigorously. Franklin made a compatible ROM through reverse engineering, and those were ruled legal in court after they did so. VTech Laser machines also had their own ROM. One of the two even licensed Applesoft BASIC... Apple hadn't made the agreement for it exclusive. I think it was VTech. The Lasers were a pretty decent machine, and later versions could run at 4MHz... much faster than the Apple. That prompted the introduction of the IIc Plus with its high-speed mode. I have a couple of those and a Laser 128. They were nice machines for the time... at least as far as 8-bits go. Apple actually licensed a third-party product to do the high-speed circuit, if I remember right.

 

The Apple II included a disassembler/monitor in ROM, schematics of the machine, and a lot more. Apple published a lot of ROM calls as well, and Woz even publicly released the source to the Sweet 16 interpreter in the Apple II ROM. I'm not sure how that qualifies as withholding information, since most of that was in the manuals that shipped with the machines. You have to remember that the founders of Apple came from the hobby computer community of the '70s, where everything was given away. They protected their copyrights, but their systems were open as far as developer info.

Edited by JamesD

I don't doubt that at all, but it has nothing to do with carmel_andrews' claim that the Amiga (and C64) documentation was 3rd party. It was in fact produced by Commodore.

 

I know. I didn't mean to argue with you, sorry about that.

 

The point I was trying to make is that early documentation doesn't necessarily mean good or comprehensive. I don't know anything about the Amiga docs (except Toni's comments) or the C64. But if I had to compare the docs for the ones I know, I'd say the A8 documentation is superb. True, it was late and that was a big mistake, but the quality, accuracy and detail are excellent. In comparison, the ST documentation was very early (since day one, if I'm not mistaken) but it stinks.

 

OTOH it is not "fair" to compare the documentation of an 8-bit computer with a later 16-bit one. Not only is the latter usually much more complex and therefore more difficult to document; the 8-bit documentation is also expected to be more detailed, or at least to cover a lower level.


The point I was trying to make is that early documentation doesn't necessarily mean good or comprehensive.

I never had a C64, but I had a VIC-20, and although the documentation that came with it was good, it was a far cry from Compute!'s Mapping the VIC. A user manual to help the average user program in BASIC is one thing (and important in its own right), but a comprehensive memory map to help serious users exploit the computer's full potential is another thing. :)

 

Michael


The point I was trying to make is that early documentation doesn't necessarily mean good or comprehensive.

 

A couple points worth noting:

 

-1- A manufacturer may not want to commit early on to an exact ROM version. A person can be pretty sure today that every VIC-20 will have the same entry points in its BASIC interpreter, but there's no way anybody could have known that when the machine was released. If Commodore had published all the various entry points and a major bug had then been discovered, it could have been difficult to fix without breaking code that relied upon those entry points (see the sketch after these two points).

 

-2- Sometimes machines will have undocumented features even the designer doesn't know about. Stuff like cycle-74 HMOVEs was probably completely unknown to Jay Miner et al.
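
A quick sketch of the risk in point -1-, in C as it might look with a 6502 cross-compiler like cc65 targeting a CBM machine. The raw BASIC entry address is made up for illustration; 0xFFD2 is the real KERNAL CHROUT jump-table slot, the kind of published vector a vendor can keep stable across ROM revisions:

/* Hedged sketch: raw ROM entry point vs. published jump vector. */
typedef void (*rom_fn)(void);

#define RAW_ENTRY ((rom_fn)0xC3A5u)  /* hypothetical address inside the BASIC
                                        interpreter; moves if the ROM is patched */
#define CHROUT    ((rom_fn)0xFFD2u)  /* real CBM KERNAL vector: a fixed slot
                                        holding a JMP to wherever the routine
                                        lives in this ROM revision */

void fragile_call(void) { RAW_ENTRY(); }  /* tied to one exact ROM version */
void robust_call(void)  { CHROUT(); }     /* the vendor keeps this slot stable */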


The point I was trying to make is that early documentation doesn't necessarily mean good or comprehensive.

Yeah... about that... have you ever even seen the Amiga ROM Kernel Manuals? Comprehensive would be an understatement. Good... they were more for a professional developer than a beginner so if you are a pro they are great. If you are a newb... you wouldn't even know where to begin. I spent a lot of hours searching for stuff in those books and invaluable would be the word I'd use. But then I was part owner in a software company and majored in CS so no big surprise there.

 

For beginners, the AmigaBASIC manual was pretty good but it was just a reference manual for the most part if I remember right.

 

 

I don't know anything about the Amiga docs (except Toni's comments) or the C64. But if I had to compare the docs for the ones I know, I'd say the A8 documentation is superb. True, it was late and that was a big mistake, but the quality, accuracy and detail are excellent. In comparison, the ST documentation was very early (since day one, if I'm not mistaken) but it stinks.

I don't know anything about the C64 docs so... whatever.

As far as Amiga... the OS takes up more ROM than there is total memory in a 130XE and the hardware is more complex.

The volume of docs required to understand it has to be far greater.

BTW, there were a lot of books that came later for the Amiga as well and good luck equaling the volume of source code that was released on Fred Fish disks or Aminet.

 

 

OTOH it is not "fair" to compare the documentation of an 8-bit computer with a later 16-bit one. Not only is the latter usually much more complex and therefore more difficult to document; the 8-bit documentation is also expected to be more detailed, or at least to cover a lower level.

Not fair to compare one machine that was out over 5 years before the other one was introduced?

Gee... I think the 8 bit had a hell of a head start... that should have been more than fair.

 

I think what you are really saying is that it's not fair to compare the Atari in any way that would show it to be inferior to another machine.

Edited by JamesD

OTOH it is not "fair" to compare the documentation of an 8-bit computer with a later 16-bit one. Not only is the latter usually much more complex and therefore more difficult to document; the 8-bit documentation is also expected to be more detailed, or at least to cover a lower level.

Not fair to compare one machine that was out over 5 years before the other one was introduced?

Gee... I think the 8 bit had a hell of a head start... that should have been more than fair.

 

I think what you are really saying is that it's not fair to compare the Atari in any way that would show it to be inferior to another machine.

 

Huh? Did you read what I wrote, or are you the kind of person who likes to fight and loves these "battles"?

 

Your comment is ridiculous. In the first place, if you'd care to read my posts in this and other similar threads, you'd see that I never take sides in them. I find them completely childish and senseless. Like many others, I only engage in the technical issues, from a completely neutral position. I love the Ataris because they are my computers. I don't care, and I never did, whether they are better or worse than other machines. And "fighting" about which one was better documented is possibly the ultimate childish fight.

 

In the second place, if you'd care to read your own quotes of my post, you'd see that I was mainly comparing the Atari 8-bit with the Atari ST. I compared them because those are the ones where I have enough expertise to make a meaningful comparison. I am sorry to disappoint you, but I wasn't comparing the Atari with any of the Commodore machines.

 

In the third place, and this is even funny (or sad), my comment about "not being fair" was actually defending your side and not mine (from your fighting point of view). I was saying that the documentation for an 8-bit computer is expected to be "better" (in the sense of being more complete and reaching a lower level) than the documentation for a 16-bit computer. That means that even if the Atari 8-bit documentation was "better" than the Amiga's (something I really don't know; again, I was actually comparing with the ST, not the Amiga), it still wouldn't count as a point in the 8-bit's favor. So if you had tried to understand what I was saying (or at least trying to say), it would be obvious that I wasn't "defending" the Atari at all.

 

So, just in case I wasn't clear enough: I was saying that the A8 documentation was late but great (without judging whether the C64 or the Amiga had better or worse docs), and that 8-bit computers are normally better documented (or at least expected to be) than 16-bit ones. First, because it is easier to document a less complex computer (there are simply fewer "things" to document). And second, because the 8-bit usually needs "better" documentation, since you try to squeeze every possible byte of memory and every possible cycle out of the CPU (not that this is never done on a 16-bit computer, but it is more common on the 8-bit ones).

 

This actually happens not just with computers but also with processors. I know many probably won't agree because of the "illegal opcodes", but (IMHO) the 6502 is much better documented than the 68000.


The 68000, like most modern processors, has illegal opcodes in the true sense.

 

Hence the "Illegal Instruction" exception type. Where the 6502 was built to a budget and didn't provide a mechanism to cope with illegal opcodes (despite some of them providing useful functions), most subsequent processors provide a trapping mechanism for undefined opcodes.

 

AFAIK, the makers of the 6502 didn't acknowledge or document the actual functions performed by the illegal opcodes, and didn't necessarily endeavour to make later revisions compatible so far as actions performed by non-documented opcodes go.


AFAIK, the makers of the 6502 didn't acknowledge or document the actual functions performed by the illegal opcodes, and didn't necessarily endeavour to make later revisions compatible so far as actions performed by non-documented opcodes go.

That broke many a program on the C64 during its day. They must have had over a dozen revisions of the thing, each worse than the last.

Edited by Artlover

This actually happens not just with computers but also with processors. I know many probably won't agree because of the "illegal opcodes", but (IMHO) the 6502 is much better documented than the 68000.

How exactly was the 6502 better documented than the 68000?

I have some of the Motorola docs on the 68000. There are complete explanations of the programmer's model, timing diagrams, cycle counts... you name it. This was available from the manufacturer when the CPU was released. It doesn't look any different from the docs I've seen for any other 8-bit CPU. But then, these are books from the manufacturers that aren't commonly carried in the local bookstore. You would need to order them from an electronics distributor or the manufacturer.

But then, if you've never programmed embedded systems, you may never have seen the manufacturer docs. They aren't for beginners, but everything you need is there. If anything, manufacturers were more forthcoming with developer info on later CPUs than earlier ones. I'm sure that was because of past experience. When the first CPUs like the 8008 or 6800 were introduced, there wasn't the demand for widely available documentation. With the exploding personal computer and embedded CPU market, that changed.

 

There might have been more third-party books on the 6502 because it entered the home computer market sooner, but that doesn't mean better documentation.

If the books I have for the 6502 are any indication... quantity does not equal quality. It looks like publishers and authors were in a hurry to release anything for the 6502 and Z80. I even have a book that was published under two different names in an attempt to resell the same useless book twice.

In a way the 68000 was better documented because it didn't have undocumented opcodes like the 8-bits did. There was nothing undocumented about the 68000. Sure, you could discover ways to do the same pieces of code with fewer instructions by using them in less obvious ways... but that's STILL happening to this day.

 

BTW, the 68000 series is not 16-bit. The series is available with an 8-, 16- or 32-bit data bus, but it has 32-bit registers. The 68000 is often referred to as 16/32-bit due to the 16-bit external bus. Many Amiga and ST machines came with full 32-bit versions.

 

I see illegal instruction interrupts as more modern and better in a lot of ways, but that totally depends on what you want to use the CPU for.

The 6809 and Z80 don't have illegal opcode interrupts either, but the compatible Hitachi 6309 and 64180 (aka Z180) do, so it's not like it wasn't available in the 8-bit world. I believe the 65816 has illegal instruction traps as well, but I haven't programmed it, so I'm not 100% certain.

Edited by JamesD

How exactly was the 6502 better documented than the 68000?

I have some of the Motorola docs on the 68000. There are complete explanations of the programmer's model, timing diagrams, cycle counts... you name it.

 

I'll name some: Where is the exact timing of the DIV instructions documented? Where is the exact timing of the bit-manipulation instructions on data registers documented? Where is it documented that CLR is a read/modify/write instruction? Where are the upper bits of the lowest word of a group 0 exception stack frame documented?

 

Where is the exact behavior of the prefetch with self-modifying code documented? For that matter, where is the cycle-by-cycle execution of each instruction documented?

 

And just in case you wonder: for each of the above there is at least one ST program that depends on those undocumented features.
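
To unpack just the CLR item, a hedged sketch of why it matters (the port address below is made up; the fact being referenced is that the 68000 and 68008 perform a read of the destination before writing zero):

#include <stdint.h>

#define STATUS_PORT (*(volatile uint8_t *)0xDF0000u)  /* hypothetical I/O register
                                                         where a read acknowledges
                                                         a pending event */

void clear_port(void) {
    STATUS_PORT = 0;  /* if the compiler emits CLR.B here, the 68000 reads the
                         port first, potentially eating an event you never saw;
                         MOVE.B #0,<ea> writes without the hidden read */
}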

 

In a way the 68000 was better documented because it didn't have undocumented opcodes like the 8-bits did. There was nothing undocumented about the 68000.

 

The 6502 doesn't exactly have "undocumented" opcodes. It has undesigned illegal opcodes, which is quite a different thing. Undocumented opcodes (at least in the strict sense of undocumented as hidden and secret) are present in some Intel x86 processors.

 

BTW, the 68000 series is not 16-bit. The series is available with an 8-, 16- or 32-bit data bus, but it has 32-bit registers. The 68000 is often referred to as 16/32-bit due to the 16-bit external bus. Many Amiga and ST machines came with full 32-bit versions.

 

Beg to differ. The 68000 is considered a 16/32-bit CPU because of its internal architecture, not because of the data bus width. Most internal buses are 16 bits wide. And most important, the ALU is 16 bits. Of course, later 680x0 CPUs, starting with the 68020, are 32-bit.


That broke many a program on the C64 during its day. They must have had over a dozen revisions of the thing, each worse than the last.

 

I am aware of two NTSC VIC-II revisions, the most serious being a change from 64 cycles/line to 65, and I am aware of three ROM versions. I'm also aware that the Apple //c switched to a CMOS 6502, which got rid of the bonus opcodes the NMOS version had. The C64 always used NMOS, though, I thought. What meaningful changes were there in its CPU?


Where is the exact behavior of the prefetch with self-modifying code documented? For that matter, where is the cycle-by-cycle execution of each instruction documented?

 

The documentation specifies the behaviors upon which programmers may rely. I've seen quite a few chips which explicitly state certain aspects of behavior upon which programs must not rely. Such behaviors may vary in different revisions of a chip, or may be affected by a rather complex variety of factors. One factor of particular importance is hardware emulation and development platforms. Things like prefetch buffers may behave differently in a chip that's running "at speed", than in a chip which is being single-stepped on a hardware emulator. Emulators would be much more complicated if they had to mirror the minute internal details of chips; this complexity would not be worth the cost for the users of such emulation equipment. Some implementors of copy-protection schemes may exploit such behavioral differences to try to prevent reverse-engineering of their code, but in most cases the manufacturer makes abundantly clear that the tricks they use may not work in future chip revs.

 

One advantage 2600 developers have today is that we know that all of the 2600s that are ever going to be built will support instructions like LAX, DCP, etc. A 2600 developer in 1980 who wanted to use such instructions would run the risk that his game might not work in the next year's 2600s.


Beg to differ. The 68000 is considered a 16/32-bit CPU because of its internal architecture, not because of the data bus width. Most internal buses are 16 bits wide. And most important, the ALU is 16 bits. Of course, later 680x0 CPUs, starting with the 68020, are 32-bit.

 

 

The Motorola 68000 is a 32-bit CISC microprocessor core designed and marketed by Freescale Semiconductor (formerly Motorola Semiconductor Products Sector).


The Motorola 68000 is a 32-bit CISC microprocessor core designed and marketed by Freescale Semiconductor (formerly Motorola Semiconductor Products Sector).

 

Motorola/Freescale might use 32-bit ALUs in all of today's 68xxx designs, but the original 68000 built by Motorola used a 16-bit ALU.

 

On a true 32-bit CPU, adding two 32-bit registers should take no longer than adding two 16-bit registers. On the original 68000, 32-bit register operations were slower than 16-bit: a register-to-register ADD.W takes 4 cycles, while ADD.L takes 8.

 

Even though the Z80 supports some 16-bit arithmetic instructions (e.g. ADD HL,DE), few people would call it a 16-bit CPU, since such instructions are performed using multiple 8-bit steps (ADD HL,DE takes 11 T-states versus 4 for an 8-bit ADD A,r). The same argument might be made for regarding the 68000 as simply a 16-bit CPU with some 32-bit features (in the sense that the Z80 is an 8-bit CPU with some 16-bit features), but the fact that most of its instruction set is organized around 32-bit registers would argue that it's more than that. Hence the 16/32 designation.


That broke many a program on the C64 during its day. They must have had over a dozen revisions of the thing, each worse than the last.

What meaningful changes were there in its CPU?

Admittedly, now that you mention it, I may be recalling compatibility with the C64 mode of the C128's 8502. :dunce:


The documentation specifies the behaviors upon which programmers may rely. I've seen quite a few chips which explicitly state certain aspects of behavior upon which programs must not rely. Such behaviors may vary in different revisions of a chip, or may be affected by a rather complex variety of factors.

 

I don't disagree with that, at least not completely, and hence I'm not exactly "blaming" Motorola. Some things should have been documented though.

 

It is true that some particular behavior might change in different revisions, but that actually didn't happen with the 68000. All revisions behave identically in all the aspects I mentioned (and some more that I forgot). Furthermore, Motorola did document some aspects that they explicitly warned could change (even though, in fact, they never did).

 

Documenting cycle-by-cycle execution is, however, very important. Most 8-bit CPU docs do this; most 16-bit ones do not. It is true that a 16-bit CPU might need a whole book for that, and not just a few pages. And it is also true that the more advanced the processor, the less you are supposed to rely on low-level behavior. That's exactly what I meant when I said that an 8-bit computer/CPU is expected to be "better" documented than a 16-bit one.

 

One factor of particular importance is hardware emulation and development platforms. Things like prefetch buffers may behave differently in a chip that's running "at speed", than in a chip which is being single-stepped on a hardware emulator. Emulators would be much more complicated if they had to mirror the minute internal details of chips; this complexity would not be worth the cost for the users of such emulation equipment.

 

ICE and hardware emulators should ideally emulate the internal CPU behavior. Yes, I agree that sometimes it might not be worth it, or not even possible. But if possible, they should. And if they don't, then whatever behavior might differ from a real CPU should be documented.

 

You probably can't emulate this for a modern Pentium (for that matter, I'm not sure it is technically feasible to completely emulate a modern Pentium at full speed). But you could easily do that for some modern ARM devices.

 

Some implementors of copy-protection schemes may exploit such behavioral differences...

 

It is a common mistake to assume that "undocumented" behavior is used only in copy protections. That is perhaps the most common reason to use it intentionally. But many "normal" programs rely on it, not intentionally by design, but because of bugs. I have seen many of these cases.

 

So let's assume some piece of software doesn't work because of an elusive bug. It might even be a bug in the BIOS/bootstrap code. You try using a hardware debugger, and suddenly it does work. Then you really curse the hardware debugger for not being 100% accurate, and you curse the manufacturer for the missing documentation. Yeah, once again, I agree that in some cases it might not be worth it, or not possible.

 

And of course, nowadays, when we want software emulators and hardware clones of vintage computers, you want/need to emulate the most minute details of the original hardware.


How exactly was the 6502 better documented than the 68000?

I have some of the Motorola docs on the 68000. There are complete explanations of the programmer's model, timing diagrams, cycle counts... you name it.

 

I'll name some: Where is the exact timing of the DIV instructions documented? Where is the exact timing of the bit-manipulation instructions on data registers documented? Where is it documented that CLR is a read/modify/write instruction? Where are the upper bits of the lowest word of a group 0 exception stack frame documented?

Where is the exact behavior of the prefetch with self-modifying code documented? For that matter, where is the cycle-by-cycle execution of each instruction documented?

The standard Programmer's Reference Manual just breaks down the instructions; only the more hardware-oriented docs for a specific CPU had timing info. You still had to calculate cycles for a specific addressing mode, but that's easy. I have a non-Motorola book somewhere that had the info as well. Seems to me it was the same publisher as the 6809 book I used, and I even think I have 6502 and Z80 books by the same author and publisher.

 

Exception stack frames are diagrammed in the Programmer's Ref Manual... but I wouldn't count on all versions of the 68000 to exhibit any undocumented behavior from there. I'm guessing you'd need to refer to the more hardware-oriented stuff or errata for behavior that differs from the Programmer's Ref Manual.

 

I'd have to look up specific instructions to see whether it states if they are read/modify/write or not, but the Programmer's Reference Manual clearly states that on the 68008 and 68000 the CLR instruction exhibits that behavior. Other CPUs in the series do not.

 

You mean the self-modifying code that Motorola and Amiga said *not* to write because it wouldn't work on all CPUs?

Sorry, I never wrote self-modifying code on the 68000 because my code was either in ROM or had to run on CPUs where it would break... but then I wasn't a game developer.

However, if you were to need such information... you'd need to look at when the prefetch takes place in the clock cycle diagrams. I think that's in the more hardware-oriented docs... it wouldn't be in the Programmer's Reference Manual. It wouldn't be specifically documented, but you could look at where the prefetch takes place in an instruction's execution to see if the instruction was fetched before you modified it. I wouldn't count on it working the same way on a 68020 or higher CPU. For that matter, I wouldn't even count on it being the same in later embedded versions of the CPU where they went to a newer die process. It might be, but I wouldn't count on it.
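
A toy model of the effect being discussed, in plain C (nothing here is Motorola's; a real 68000 simply keeps roughly two words of instruction stream ahead of execution):

#include <stdint.h>
#include <stdio.h>

#define QUEUE_WORDS 2

static uint16_t memory[64];          /* tiny code memory */
static uint16_t queue[QUEUE_WORDS];  /* the prefetch queue */
static unsigned fetch_pc, queued;

static void prefetch(void) {         /* refill the queue ahead of execution */
    while (queued < QUEUE_WORDS)
        queue[queued++] = memory[fetch_pc++];
}

static uint16_t next_opcode(void) {  /* consume one word, then refill */
    uint16_t op = queue[0];
    queue[0] = queue[1];
    queued--;
    prefetch();
    return op;
}

int main(void) {
    memory[0] = 0x1111;              /* instruction A */
    memory[1] = 0x2222;              /* instruction B, which A will overwrite */
    prefetch();

    uint16_t a = next_opcode();      /* A executes; B is already queued */
    memory[1] = 0x3333;              /* the "self-modifying" store into B */
    uint16_t b = next_opcode();      /* the stale word comes from the queue */

    printf("A=%04x B=%04x memory=%04x\n", a, b, memory[1]);
    /* Prints B=2222: the modified word is never seen by the CPU. */
    return 0;
}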

 

About the only thing I remember about the prefetch is writing code that ran faster on the 68010 because the loops were only two instructions and both were in the CPU thanks to the 68010 taking advantage of the prefetch buffer.

 

Since playing well with the OS was important on the Amiga I didn't mess with certain interrupts... though I did use interrupts through the OS for some code.

 

All that stuff is interesting but did the software written with those tricks work on later versions of the machines?

I remember a lot of games from Europe breaking on the Amiga due to people being clever. Expecting Chip RAM only, a certain amount of Chip RAM, a 68000 CPU... but that's a different issue.

 

In a way the 68000 was better documented because it didn't have undocumented opcodes like the 8-bits did. There was nothing undocumented about the 68000.

 

The 6502 doesn't exactly have "undocumented" opcodes. It has undesigned illegal opcodes, which is quite a different thing. Undocumented opcodes (at least in the strict sense of undocumented as hidden and secret) are present in some Intel x86 processors.

Since there is no "illegal" instruction trap and they are executed, I would assume they are legal but not documented.

Some of the "undocumented" opcodes on the Z80 are now "documented" and legal on the eZ80. I'm pretty sure that's how they worded it... pardon me for following Zilog's their lead.

 

BTW, the 68000 series is not 16-bit. The series is available with an 8-, 16- or 32-bit data bus, but it has 32-bit registers. The 68000 is often referred to as 16/32-bit due to the 16-bit external bus. Many Amiga and ST machines came with full 32-bit versions.

 

Beg to differ. The 68000 is considered a 16/32-bit CPU because of its internal architecture, not because of the data bus width. Most internal buses are 16 bits wide. And most important, the ALU is 16 bits. Of course, later 680x0 CPUs, starting with the 68020, are 32-bit.

I stand corrected... however, it is still 16/32... not 16. There are true 16-bit CPUs that don't have 32-bit instructions or registers, so there is a difference.


Since there is no "illegal" instruction trap and they are executed, I would assume they are legal but not documented.

Some of the "undocumented" opcodes on the Z80 are now "documented" and legal on the eZ80. I'm pretty sure that's how they worded it... pardon me for following Zilog's their lead.

 

The problem is assuming that just because said instruction executes, it's legal. That is not the case: in chips like the 6502 and 6510, illegal opcodes are a side effect of the low-level design of the CPU, and they are undocumented because the instructions don't exist outside of accidental cross-circuit logic. Many of these illegal opcodes don't even execute properly; it just so happens, by sheer luck, that some do, with unintended but useful results.
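
A toy model of how one of the useful ones falls out. LAX (0xA7 in its zero-page form) behaves as if the decode logic fired the LDA and LDX control paths at the same time, which is roughly what the NMOS decode PLA does. Emulator-style C, with the register handling simplified:

#include <stdint.h>
#include <stdio.h>

static uint8_t A, X, P;              /* accumulator, X index, status flags */

static void set_nz(uint8_t v) {      /* N and Z, as both LDA and LDX set them */
    P = (P & ~0x82) | (v & 0x80) | (v == 0 ? 0x02 : 0);
}

static void lda(uint8_t v) { A = v; set_nz(v); }
static void ldx(uint8_t v) { X = v; set_nz(v); }

/* The undecoded opcode 0xA7 ends up driving both load paths together. */
static void lax(uint8_t v) { lda(v); ldx(v); }

int main(void) {
    lax(0x5A);
    printf("A=%02x X=%02x\n", A, X); /* both registers receive the operand */
    return 0;
}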

 

Your reference to Zilog's Z80 is really a somewhat different case. Those undocumented opcodes, if I recall, concern the index registers, which were designed as flexible 16-bit pointers whose halves were 8-bit accessible by design, even though Zilog chose to document them as 16-bit only. This is really a case of legal-by-design opcodes existing but simply being undocumented.


This is the definition I use:

 

1. Undocumented - The designer/supplier has decided (perhaps incorrectly) that an understanding of a particular behavior of the device is not important for successful usage of the device.

 

2. Illegal - The designer/supplier has specified that a particular behavior of the device should not be relied upon and may be subject to change.

 

Of course, illegal opcodes are generally undocumented as well.

 

What's interesting is when a CPU has undocumented opcodes that work because real microcode was put in for those instructions. They're sort of vestigial opcodes. The 6502 has no microcode, and thus all undocumented opcodes are acted upon by the same mechanisms as for documented ones.


Documenting cycle-by-cycle execution is, however, very important. Most 8-bit CPU docs do this; most 16-bit ones do not. It is true that a 16-bit CPU might need a whole book for that, and not just a few pages. And it is also true that the more advanced the processor, the less you are supposed to rely on low-level behavior. That's exactly what I meant when I said that an 8-bit computer/CPU is expected to be "better" documented than a 16-bit one.

 

Try to find a source of new Z80 CPU chips for sale today. Not terribly difficult. Now try to find a source for 8086 chips.

 

There seems to have been a different design philosophy with the 8-bit chips than with the higher ones. The 8-bit chips were seen as replacements for logic, while the higher ones were seen as replacements for computers. Whereas logic often needed to operate at precisely controlled speeds, computer designs generally subscribed to a 'faster is better' philosophy where software would, ideally, run instantaneously except when it was necessary to wait for some external condition.

 

While there were certainly some situations where precise timing might be needed even in a 16-bit computer system, such situations were often handled by letting an 8-bit microcontroller or specialized piece of logic take care of them. Since many applications for 16-bit and higher processors would include things like interrupts, DMA, multitasking, etc. it would not be possible to achieve the level of microsecond precision available with simpler 8-bit applications.

 

ICE and hardware emulators should ideally emulate the internal CPU behavior. Yes, I agree that sometimes it might not be worth it, or not even possible. But if possible, they should. And if they don't, then whatever behavior might differ from a real CPU should be documented.

 

Precise emulators do generally exist since, among other things, they're needed by the chip developers themselves. On the other hand, many chips include features to allow emulation of all documented features to be performed at a cost far below that of a 'real' emulator.

 

Some implementors of copy-protection schemes may exploit such behavioral differences...

 

It is a common mistake to assume that "undocumented" behavior is used only in copy protections. That is perhaps the most common reason to use it intentionally. But many "normal" programs rely on it, not intentionally by design, but because of bugs. I have seen many of these cases.

 

I use undocumented opcodes in my Atari 2600 games, but in that case I know that all the 2600s that are ever going to be produced are compatible. That's a far different situation from someone designing a game during the heyday of, e.g., the Atari ST.

 

And if code has bugs that unintentionally rely upon undocumented behaviors, I'm not sure what difference it would make if they were documented.

 

So let's assume some piece of software doesn't work because of an elusive bug. It might even be a bug in the BIOS/bootstrap code. You try using a hardware debugger, and suddenly it does work. Then you really curse the hardware debugger for not being 100% accurate, and you curse the manufacturer for the missing documentation. Yeah, once again, I agree that in some cases it might not be worth it, or not possible.

 

How much is 'perfect' emulation worth? Given a choice between: (1) an in-circuit debug module which plugs into a ten-pin header and offers most of the features of an ICE, including accurate emulation of all documented features, for $1000, or (2) a hardware emulation module which requires soldering on a massive ugly 144-pin connection where the CPU goes, is prone to crashing if anything is bumped, and costs $20,000, which seems a wiser purchase?

 

And of course, nowadays, when we want software emulators and hardware clones of vintage computers, you want/need to emulate the most minute details of the original hardware.

 

Far more critical with 8-bit CPUs than 16-bit ones, and even there mainly for tricky hardware interfacing (including 'beam chasing' video or copy-protection loaders).
