
The real Atari 8-bit Computer Successor



Except it did not really work well for quite some time, and there was no practical use for the AVERAGE user.

 

Instability issues were mostly due to low memory. With the same amount of memory it was at least as stable as the ST. Then, the Amiga was never limited to only 4 windows, and programs never had to be stopped to switch to another program, which were very practical advantages to the average user.


 

Instability issues were mostly due to low memory. With the same amount of memory it was at least as stable as the ST. Then, the Amiga was never limited to only 4 windows, and programs never had to be stopped to switch to another program, which were very practical advantages to the average user.

 

I'm going to say that I don't really challenge you, and I'm going to guess that you know more about computers and stuff (i.e. you're a fine fella and a great AA member), but is it not the case that the Amiga lacked the "protected" memory that modern systems have, with **MODERN** memory management? As I understand it (and I may not), the concept of "PROTECTED" memory (that is, NOT CRASH-HAPPY) didn't come until the 68030 and beyond? Am I mistaken? Anything less than "protected" memory is indeed "crash-happy," is it not? Just asking. Modern stuff has "advanced" (hardware) MMUs that the OS kernel uses, don't they? (as opposed to those early days with no such equivalent?)


This thread would be breaking news if it were 1984 (some 30 years ago).

 

...

 

Not really. The Atari connection to the Lorraine was very well known back in the day. It was covered by most industry magazines, and in some magazines repeatedly. Anyone attending a user group meeting likely heard the same repeated. The Atari user group I attended even had a SIG for the latest Lorraine news.

 

The first Amiga owners broke down pretty evenly into three groups: people who had Atari 8-bits and knew the Amiga was designed by the same group of wizards; people with C64s who thought the Commodore logo meant it was invented at Commodore; and the rest, who had no prior experience or used some other computer and got the Amiga because they were blown away by the specs. A lot of that third group would probably have been Atari 8-bit fans if they had known about the 8-bits earlier.

 

Today, nearly everything is a black box and the average computer user knows less about how it works than about their car. But the nature of using a computer in the time of the 800 meant most users had to acquire more technical proficiency than people have today. Many Atari 8-bit users had decent familiarity with the internals, mainly because it was an interesting box with exotic hardware (and once Atari became less stingy with docs, this became more widespread). I've never seen an Atari 8-bit user who at least knew what ANTIC and GTIA could do have any problem following a discussion of Denise/Copper capabilities, because it is a natural progression in design, starting from the identical aspect ratios. For Atari 8-bit nerds the Amiga hardware is obvious.

 

C64 people... not so much. The C64 camp has a higher proportion of people who only learned to type the arcane string to run a game from disk. Even for the programmers and more technically oriented users, the way the Amiga hardware worked was much more foreign to their mindset. Where are the color cells? Indirection, what?

 

Eventually, the Amiga SIG left the Atari user group because the territory was becoming less friendly. ST users didn't like monthly demonstrations reminding them what they were missing. The DeluxePaint, 64-color, Extra Half-Brite demo was a particularly bad day for them, but I think it was someone's 68020 accelerator that just clipped onto the expansion slot of the 1000 that really got us kicked out. It was funny listening to people who previously justified buying an Atari 800 vs. an Apple or C64 due to better graphics and sound specs turn around and try to find reasons for the ST. It has the Atari logo on it! Then later, it's $1 per kilobyte!!

 

Atari as everyone likes to remember it died in 1984 and became the new Commodore, run by (ex-)Commodore execs managing (ex-)Commodore engineers. The only thing "Atari" about it after that was the logo. To me, logo worship is not what makes a computer.


The Amiga didn't really have anything over the ST insofar as hardware assistance geared towards multitasking goes - OK, maybe better hardware timers, but little else.

 

In any case, the early 68K CPUs offered little anyway. For "proper" multitasking you really want virtual memory as a minimum, and neither had that.

A compromise for VM might be something like a movable base address, such that programs don't have to be relocatable, but neither system had a flexible implementation of that either.

 

Probably a good deal of the instability of the early Amigas was due to the fact that it was doing in software what more modern machines have in hardware.

Given what was out there at the time though, they did pretty well with it. Took MS another 5 years and Apple nearly 10 years to offer anything similar.


 

I'm going to say that I don't really challenge you, and I'm going to guess that you know more about computers and stuff (i.e. you're a fine fella and a great AA member), but is it not the case that the Amiga lacked the "protected" memory that modern systems have, with **MODERN** memory management? As I understand it (and I may not), the concept of "PROTECTED" memory (that is, NOT CRASH-HAPPY) didn't come until the 68030 and beyond? Am I mistaken? Anything less than "protected" memory is indeed "crash-happy," is it not? Just asking. Modern stuff has "advanced" (hardware) MMUs that the OS kernel uses, don't they? (as opposed to those early days with no such equivalent?)

 

Yes. And at the time, this was sufficient for the Amiga and the ST. These were single-user computers before the advent of internet exploits (and while the 68000 has a supervisor vs. user mode, it didn't come with an MMU doing memory protection).

 

Memory protection is a significant drain on performance, especially on slower computers like the first Amiga and ST. It's like wearing a raincoat all the time in Arizona just in case it rains. At the time the Amiga was introduced, memory protection would have been protecting the system merely from lazy, sloppy programmers. Free that which you allocate. Don't dereference NULL pointers. Basic kindergarten rules -- clean up your own mess. And then the system doesn't crash.
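To make those "kindergarten rules" concrete, here is a minimal C sketch of the discipline being described. It uses standard malloc()/free(), but the same pattern applied to AmigaOS's AllocMem()/FreeMem() which, as I recall, additionally required you to pass the block size back when freeing:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* The discipline described above: check every allocation, never
 * dereference a failed (NULL) result, and free exactly what you
 * allocated. On AmigaOS the calls were AllocMem()/FreeMem(), and
 * FreeMem() also needed the original size passed back. */
static char *duplicate_string(const char *src)
{
    size_t len = strlen(src) + 1;
    char *copy = malloc(len);
    if (copy == NULL)          /* allocation can fail: handle it... */
        return NULL;           /* ...instead of crashing the machine */
    memcpy(copy, src, len);
    return copy;
}

int main(void)
{
    char *s = duplicate_string("Guru Meditation avoided");
    if (s == NULL) {
        fputs("out of memory\n", stderr);
        return EXIT_FAILURE;   /* fail gracefully, don't guru */
    }
    puts(s);
    free(s);                   /* clean up your own mess */
    return EXIT_SUCCESS;
}
```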

 

By 1991 I was using an Amiga 3000 with 18MB of RAM. Although it had an MMU built in, the system didn't use it for memory protection. Software programmers had improved to the point where I could run paint programs, desktop publishing, 3D modeling/ray tracers, and word processors (at the same time) all day without any crashes. And no memory protection. And no resource tracking.

 

Memory protection schemes become practical on multi-user systems like Unix, to prevent sloppy programmer A from ruining the system for user B. Today, it is also a good idea to help protect against bad programs attempting intentional exploits. However, even with memory protection in every modern OS, bad things can still happen. Windows and Linux systems still do crash.


Probably a good deal of the instability of the early Amigas was due to the fact that it was doing in software what more modern machines have in hardware.

Given what was out there at the time though, they did pretty well with it. Took MS another 5 years and Apple nearly 10 years to offer anything similar.

 

The Amiga at release had 256K, and the first version of the OS would assume any request to allocate memory would be successful. Most of the Amigas I saw with that OS were store demo models. When you figure the size of DeluxePaint and the amount of memory needed for 16-color 640x400 graphics, that's a formula for an instant reboot.
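For a rough sense of the numbers: 16 colors means 4 bitplanes on the Amiga's planar display, so a 640x400 screen alone takes 640 x 400 x 4 / 8 = 128,000 bytes -- half of a 256K machine before DeluxePaint itself has even loaded.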

 

With more memory, running out of memory happened less frequently. When the consumer systems finally arrived, the next OS update was close behind, and the OS's assumption that allocations would succeed bit less often. And as programmers got better, they learned to write programs that behaved better.


 

The Amiga at release had 256K, and the first version of the OS would assume any request to allocate memory would be successful. Most of the Amigas I saw with that OS were store demo models. When you figure the size of DeluxePaint and the amount of memory needed for 16-color 640x400 graphics, that's a formula for an instant reboot.

Actually, no, it did not. AllocMem() did not work any differently in the later Os versions, and there was no substantial change in the Os code, except *plenty* of bug fixes. It's just that programmers were (and are) lazy bastards and did (and do!) not prepare their programs for exceptional situations, such as "how to react if I run out of memory". That's not much different today, actually, but the toy operating systems AmigaOs (and Tripos) had no means to stop programs that went berserk. Any sane Os, such as all the Unix derivatives, can stop a program and release all its resources, leaving a working system behind.

 

When I started programming on the Amiga, I found it pretty hard to get a program solid in the sense that it would be prepared for every eventuality that might come - be it OOM, or some other event. It's not much different nowadays; exception handling is still hard, except that you can still be lazy and the Os will remove your program gracefully from the system in case you did not get it right. When I first had the chance to work on an *ix machine I was astonished by how many stupid things you could do in a program and *still* keep the machine running. Amiga was more like "found a bug in the program - reboot the machine".
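A small POSIX C sketch (my illustration, not from the posts) of exactly that difference: under a protected-memory Unix, the kernel stops a program that dereferences NULL and reclaims its resources, while the rest of the system keeps running.

```c
#include <signal.h>
#include <stdio.h>
#include <stdlib.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void)
{
    pid_t pid = fork();           /* run the buggy code as its own process */
    if (pid < 0) {
        perror("fork");
        return EXIT_FAILURE;
    }
    if (pid == 0) {               /* child: the lazy programmer's bug */
        volatile int *p = NULL;
        *p = 42;                  /* dereference NULL... */
        _exit(0);                 /* never reached */
    }

    int status;
    waitpid(pid, &status, 0);     /* the kernel has already stopped the child */
    if (WIFSIGNALED(status) && WTERMSIG(status) == SIGSEGV)
        puts("child died with SIGSEGV; the rest of the system keeps running");
    return EXIT_SUCCESS;          /* no reboot required */
}
```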


The Amiga didn't really have anything over the ST insofar as hardware assistance geared towards multitasking goes - OK, maybe better hardware timers, but little else.

 

In any case, the early 68K CPUs offered little anyway. For "proper" multitasking you really want virtual memory as a minimum, and neither had that.

A compromise for VM might be something like a movable base address, such that programs don't have to be relocatable, but neither system had a flexible implementation of that either.

 

Probably a good deal of the instability of the early Amigas was due to the fact that it was doing in software what more modern machines have in hardware.

Given what was out there at the time though, they did pretty well with it. Took MS another 5 years and Apple nearly 10 years to offer anything similar.

There's a bit more story to tell on all that. Actually, there were a couple of systems with a 68K *and* virtual memory. I don't remember which machine used it, but while the 68K had no means to safely store its state in case of a bus error and continue from that state, in this specific machine an external dedicated MMU chip locked out the main 68K by requesting the bus; the MMU would then trigger a *second* 68K, which would load the necessary page into RAM and then let the primary 68K continue. Pretty complicated and expensive design...

 

Relocatable base addresses: Yes, and there was something like this as well. The 68K Macs had a construction called "Handles" and "Resources". If a program required the Os to display a requester (or open a window), the window layout was stored on disk in the "resource fork" of the program. What the program got in return was not a pointer to the window, but a "handle", a double-indirect pointer. As soon as the system ran low on memory, the window resources could be either relocated, adjusting the handle, or wiped completely from memory and then transparently re-loaded from disk when required. It was a pretty smart system, though also a pretty slow and complicated one.
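A toy C sketch (my own illustration of the idea, not actual Mac Toolbox code) of why double indirection lets the system move memory behind a program's back: the program holds a pointer to a master pointer, and the memory manager may relocate the block freely as long as it updates the master pointer.

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* A "handle" is a pointer to a master pointer. Programs keep the
 * handle; only the memory manager touches the master pointer. */
typedef void **Handle;

static void *master[16];              /* tiny master-pointer table */

static Handle toy_new_handle(size_t size)
{
    for (int i = 0; i < 16; i++) {
        if (master[i] == NULL) {
            master[i] = malloc(size);
            return master[i] ? &master[i] : NULL;
        }
    }
    return NULL;                      /* table full */
}

/* The manager may relocate (or purge and later reload) the block;
 * every handle stays valid because it points at the slot, not the block. */
static void toy_compact(Handle h, size_t size)
{
    void *moved = malloc(size);       /* pretend this is a better address */
    if (moved == NULL)
        return;
    memcpy(moved, *h, size);
    free(*h);
    *h = moved;                       /* update the master pointer only */
}

int main(void)
{
    Handle h = toy_new_handle(32);
    if (h == NULL)
        return EXIT_FAILURE;
    strcpy((char *)*h, "resource data");
    toy_compact(h, 32);               /* the block moves... */
    puts((char *)*h);                 /* ...but the handle still works */
    free(*h);
    return EXIT_SUCCESS;
}
```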

 

However, the way the Mac did multitasking really made me cry: MacOs programs typically installed many patches into the "system traps", by altering the Os trap dispatcher at runtime. Thus, when Apple started offering the "MultiFinder", the MultiFinder task switcher would remove all the patches, load the new program, and install its patches back into the Os. Needless to say, multitasking on the Mac was a pain; it was damn slow to switch tasks, and inter-process communication was crawling. All AmigaOs had to do was go through the exec task dispatcher, which was about a hundred instructions in size, maybe even less: save the registers to the stack, load the new registers from the new stack, continue.


There's a bit more story to tell on all that. Actually, there were a couple of systems with a 68K *and* virtual memory. I don't remember which machine used it, but while the 68K had no means to safely store its state in case of a bus error and continue from that state, in this specific machine an external dedicated MMU chip locked out the main 68K by requesting the bus; the MMU would then trigger a *second* 68K, which would load the necessary page into RAM and then let the primary 68K continue. Pretty complicated and expensive design...

 

 

I recall reading that before. I thought that would be Sun's first 68000-based Unix workstations, but it looks like it was something from MIT called NU.

 

The next 680x0 versions solved the problem of restarting an instruction after a page fault.


You can't really rely on bus errors as a means of implementing VM though.

 

Proper VM implementation has address translation for everything, so you can have shared areas, and each task can have address ranges that are the same as another's but map uniquely. And of course you have the ability to do paging - which in itself is generally implemented by having mappings that are marked invalid and in fact get used as an index into the paging file(s).
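A bare-bones C sketch of that idea (my own illustration, with made-up field sizes): a virtual address splits into a page number and an offset; the per-task table maps the page either to a physical frame or, when the entry is not present, to a slot in the paging file.

```c
#include <stdint.h>
#include <stdio.h>

#define PAGE_BITS 12                       /* 4K pages */
#define PAGE_SIZE (1u << PAGE_BITS)

/* One per-task page-table entry: a physical frame number when present,
 * otherwise an index into the paging file. */
typedef struct {
    uint32_t present;
    uint32_t frame_or_slot;
} pte_t;

static uint32_t translate(const pte_t *table, uint32_t vaddr)
{
    uint32_t page   = vaddr >> PAGE_BITS;
    uint32_t offset = vaddr & (PAGE_SIZE - 1);
    pte_t entry = table[page];

    if (!entry.present) {
        /* page fault: the "mapping" doubles as a paging-file index */
        printf("fault: load page %u from paging-file slot %u\n",
               (unsigned)page, (unsigned)entry.frame_or_slot);
        return 0;  /* a real kernel would load the page, map it, retry */
    }
    return (entry.frame_or_slot << PAGE_BITS) | offset;
}

int main(void)
{
    pte_t table[4] = {
        { 1, 7 },                          /* page 0 -> physical frame 7 */
        { 0, 42 },                         /* page 1 -> paging-file slot 42 */
    };
    printf("0x0123 -> 0x%x\n", (unsigned)translate(table, 0x0123));
    translate(table, 0x1123);              /* takes the fault path */
    return 0;
}
```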

 

The early 68Ks were somewhat half-baked in many regards; the external FPU is another thing that comes to mind - although of course the contemporaries were the 8086 and 80286, which themselves were just as primitive in most regards.

 

All that aside though, for what they were, the 68K was an excellent architecture which IMO outshone the Intel offerings by a big margin, at least until the mid/high-range '386 came along.


 

Instability issues were mostly due to low memory. With the same amount of memory it was at least as stable as the ST. Then, the Amiga was never limited to only 4 windows, and programs never had to be stopped to switch to another program, which were very practical advantages to the average user.

Not really, but it helped. It seemed like over a year before it became mostly stable, and with it being released 6 months after the ST it was way behind. I was a dealer for both back at that time. The Amiga was a hard sell. Again, the average user was single-tasking; I was not referring to geeks, but only the general public, so for them there was no practical use, especially since it was unstable. Heck, it used to guru with 3 or 4 windows of the simple line-draw demo open. :-D


 

The Amiga at release had 256K, and the first version of the OS would assume any request to allocate memory would be successful. Most of the Amigas I saw with that OS were store demo models. When you figure the size of DeluxePaint and the amount of memory needed for 16-color 640x400 graphics, that's a formula for an instant reboot.

 

With more memory, running out of memory happened less frequently. When the consumer systems finally arrived, the next OS update was close behind, and the OS's assumption that allocations would succeed bit less often. And as programmers got better, they learned to write programs that behaved better.

As a dealer there were no demo systems for us, just the same ones the public got: yes, 256K, then later the 256K expansion, which helped. Well constructed, but released about a year too soon given all the bugs; but I suppose there was the need to try to keep up with Atari (whatever that means in the context of all the mixed associations).


I really don't see either machine as being a successor since they are both so different.

The lineage is clearly there though. Compare Jay & crew's first three generations.

 

2600:

- Crazy simple "sprite" system consisting of two players, two missiles, and a ball. The CPU was responsible for syncing the display. Custom sound chip.

- Display was built line by line; mid-line register changes were possible with tight code

8-bit:

- Two custom chips created to handle the display, relieving the CPU of that task.

- 5 independent player/missile objects

- More advanced custom sound chip

- Display still built line by line

Amiga:

- More custom silicon to relieve the CPU (Display Lists & interrupts now handled without the CPU)

- "IIRC, 8 hardware "sprites"

- Display still built line by line, with much more capability as far as register changes

 

While the Amiga may not have shared any actual silicon (at the logic design level) the way the 2600 and 8-bits did, its design was clearly an evolution of the ANTIC/GTIA/POKEY architecture.


Not really. The Atari connection to the Lorraine was very well known back in the day. It was covered by most industry magazines, and in some magazines repeatedly. Anyone attending a user group meeting likely heard the same repeated. The Atari user group I attended even had a SIG for the latest Lorraine news.

I wasn't clear enough in my commentary. What I really meant was that this was hardly new news THEN, nor is it new news NOW, yet this thread seems to treat it as such. It's old news now, and (as you correctly point out) it was old news then. It's old as the hills. That was my point. The great question, of course, is why is this of such concern in 2014, when I'm sure there are 10 years of threads on this subject matter?

 

 

The first Amiga owners broke down pretty evenly into three groups: people who had Atari 8-bits and knew the Amiga was designed by the same group of wizards; people with C64s who thought the Commodore logo meant it was invented at Commodore; and the rest, who had no prior experience or used some other computer and got the Amiga because they were blown away by the specs. A lot of that third group would probably have been Atari 8-bit fans if they had known about the 8-bits earlier.

My experience has been different. I think you're on the money as far as the three groups go, but I never observed evidence (nor do I see reason) to believe that it broke down "evenly." Why should it? People aren't that reasonable; if people were [generally] that reasonable, the phenomenon that we know of as the "fanboy" wouldn't exist. I present the obvious continuing existence of the [undeniable] "fanboy" as evidence in this regard.

 

The *vast* majority of Amiga owners that I knew were previous C-64 owners. Likewise, the *vast* majority of the ST owners that I knew were Atari-8 owners. Whether or not this made logical sense, it was observed. If it was observed (regardless of whether or not it made sense), then there were probably some reasons for its occurrence. My conjecture is as such: (1) You got to keep your fanboy-based brand loyalty (not saying that makes sense, but let's acknowledge that it exists, whether we're talking microcomputers, automobiles, or any other consumer commodity). (2) You got to keep your computer dealer (assuming your dealer didn't deal in both, in which case you're golden either way, but I observed a bit of exclusivity in regard to dealers). (3) You got to keep your user group. Since so many Atari-8 users bought Atari-16 and so many Commodore-8 users bought Commodore-16, it seemed to make *some* sense, and regardless of whether one chooses to believe that it was sensible, it *did* in fact happen.

 

Today, nearly everything is a black box and the average computer user knows less about how it works than about their car. But the nature of using a computer in the time of the 800 meant most users had to acquire more technical proficiency than people have today. Many Atari 8-bit users had decent familiarity with the internals, mainly because it was an interesting box with exotic hardware (and once Atari became less stingy with docs, this became more widespread). I've never seen an Atari 8-bit user who at least knew what ANTIC and GTIA could do have any problem following a discussion of Denise/Copper capabilities, because it is a natural progression in design, starting from the identical aspect ratios. For Atari 8-bit nerds the Amiga hardware is obvious.

I submit that the *user* experience for the Amiga user was not as such. While the Amiga was a capable and admirable machine, I think that most Amiga users (most of whom were Commodore 64 users, and even if they were not) didn't give a crap about any interpreted "lineage" from the Atari 800. Aside from the Atari 800 users who jumped ship to the Amiga, of what relevance was it? None! They (once again, mostly C-64 users) just applauded the machine (the Amiga) on its own merits and capabilities, which were nothing to sneer at. The Amiga was quite a capable machine in its own right, and didn't need a "tip of the hat" to the Atari 800 to stand on those capabilities.

Once again, I submit that most C64-->Amiga users didn't give a crap about the alleged "lineage" between the A8 and the Amiga, never regretted their original C64 purchase (nor do I suggest that they should have), accepted the quite-capable Amiga for what it was (and could afford to pay for it, at least in 1985), and moved on. It's only in the purview of the A8 user who "went Amiga" to try to construct this "loyalty to the architecture" that really wasn't there. Apple had the IIgs, Commodore had the C128, and A8 users had nothing, as far as the "true successor" department goes. Perhaps chip designers (what fraction of the population is that???) could observe Miner's work.

But as Commodore users of the time made the jump from C64 to Amiga, I don't think for a second that they factored in that the Atari 800 was the "true" predecessor to the Amiga. They didn't give a shit about that; they either (1) bought the machine as the next offering from Commodore (whom they loved), or (2) bought the machine on its own merits. Neither would really be a mistake. The notion of "Atari 800 successor" was quite irrelevant. To pose a question: Why should it have been relevant, then, now, or ever? It didn't look like an A8, didn't act like an A8, and wasn't compatible with an A8. Had it done any of those things, it likely would have been less palatable to upgrading C64 users, who were the primary demographic for Amiga purchases. While (undeniably) some A8 users bought Amigas, this was a small minority; most C64 users either held on to their C64 until it was time to move to PC, or bought an Amiga and moved to PC from the Amiga.

 

 

C64 people... not so much. The C64 camp has a higher proportion of people who only learned to type the arcane string to run a game from disk. Even for the programmers and more technically oriented users, the way the Amiga hardware worked was much more foreign to their mindset. Where are the color cells? Indirection, what?

I'm not sure what you mean by all of that. The C64 wasn't quite as easy to boot a disk on as the A8, but it wasn't that hard, if you could read a line from the manual. The fact that the C64 was *so much more* popular than the A8 (overall) means that, aside from the cheap price, it wasn't so hard. Operating the Amiga hardware had absolutely *nothing* to do with operating the C64 (or the Atari 8 for that matter), so I don't know where you're going with that. Suffice it to say that C64 fellas had no problem launching software, or the C64 wouldn't have whipped the A8 (in the marketplace, mind you) as it did.

 

Eventually, the Amiga SIG left the Atari user group because the territory was becoming less friendly. ST users didn't like monthly demonstrations reminding them what they were missing.

I don't know what "SIG" you're referring to. I never argued that the ST hardware was superior, and as you were not an ST user, how can you claim to speak for what ST users did or did not like? You seem to be mistaking this line of reasoning as somehow shitting on the Amiga, and I'm clearly not. The ST offered a still-impressive, significant advancement in performance for an absolutely phenomenal price. That's my contention, then and now. I don't purport to understand what Amiga users liked, nor should you what ST users liked. But I'll tell you: they liked bang for the buck, and they got it with the ST, even if there were other options with more capabilities at increased cost, as there always are. As I've already argued, the A500 brought this down to a competitive price later, so please acknowledge the timeline.

 

 

The DeluxePaint, 64-color, Extra Half-Brite demo was a particularly bad day for them, but I think it was someone's 68020 accelerator that just clipped onto the expansion slot of the 1000 that really got us kicked out. It was funny listening to people who previously justified buying an Atari 800 vs. an Apple or C64 due to better graphics and sound specs turn around and try to find reasons for the ST. It has the Atari logo on it! Then later, it's $1 per kilobyte!!

So, you're making a mockery of the price/performance of the ST? It's been almost 30 years; it doesn't matter anymore. You no longer have to feel guilty for buying an Amiga, and nobody (including me) wants you to. There never was a need to "find a reason" for the ST; it offered uncompromised bang for the buck. I think you're a little too emotionally involved, and I remind you - once again - that it's been 30 years, so there's no need for it to be so. Choose what meets your need, for the buck. If you could afford (or were only willing to spend) enough for the Amiga 1000, good choice! If you could afford (or were only willing to spend) enough for the Atari ST, good choice! What's the beef? Is that not true?

 

Atari as everyone likes to remember it died in 1984 and became the new Commodore, run by (ex-)Commodore execs managing (ex-)Commodore engineers. The only thing "Atari" about it after that was the logo. To me, logo worship is not what makes a computer.

Very well. If that's what it takes to justify your purchase of an Amiga, that's pretty F-ing weak. I'd rather have had you say that you preferred the Amiga because you (1) liked the machine, and (2) could afford to pay for it. It sounds like you're trying to justify something else (abandoning the Atari brand???) with that line of "reasoning." Just make your choice, and that's what you did. Why this latent need to "justify" it? I don't get what your motive is, unless you feel some need to "justify" why you jumped ship and abandoned Atari, and I assure you, there's no need to do that, at least from this end, so you don't need to shit on the ST or Atari - at least from THIS end. From YOUR end, I suggest you just let it go, as you purchased the machine you desired and could afford to pay for - as I and others did the same - and there's no guilt, unless you choose to bring it in and then attempt to justify it by shitting on the later Atari Corp and the ST; you don't need to. Just say that you preferred the Amiga; that's all. That answer shall be accepted.

 

 

Yes. And at the time, this was sufficient for the Amiga and the ST. These were single-user computers before the advent of internet exploits (and while the 68000 has a supervisor vs. user mode, it didn't come with an MMU doing memory protection). Memory protection is a significant drain on performance, especially on slower computers like the first Amiga and ST.

That's why it was just an interesting experiment, and I'll give it that. Nowadays, when I'm encoding digital video in the background with 10 browser windows open and multiple random events occurring - AND NO CRASHING - I appreciate multitasking. Back then, when computers operated at a relative snail's pace, not so much. I'll hand the Amiga the crown for multitasking, much as I'd hand a 1957 Chevy "Fuelie" the first crown for fuel injection. I can't, however, overlook how much better modern implementations of multitasking work, nor can I overlook how much better modern implementations of fuel injection work (particularly direct injection, but that's another topic). But oh, how those were the first!!! At the time, I was satisfied with the print spooler on the ST, because that was some sort of multitasking, didn't crash, and not having to wait for the blasted slow printer was most appreciated. However, it wasn't Amiga-worthy multitasking.

 

It's like wearing a raincoat all the time in Arizona just in case it rains.

Actually, it's like expecting a reliable system that doesn't crash. That's why both hardware and operating systems have evolved to assure a reliable and not crash-happy system. If it wasn't necessary, it wouldn't have happened. It did happen. Modern systems with protected memory and ultra-reliable multitasking are the norm today. I can't believe you'd ever *purport* to argue otherwise. I'll assume I've misunderstood your argument, and that you're not *really* arguing the non-necessity of modern, protected-memory systems and likening them to a raincoat "all the time in Arizona."

 

At the time the Amiga was introduced, memory protection would have been protecting the system merely from lazy, sloppy programmers. Free that which you allocate. Don't dereference NULL pointers. Basic kindergarten rules -- clean up your own mess. And then the system doesn't crash.

Well, I back off the above. I guess you're arguing that modern, protected-memory systems are NOT necessary, in order to justify the Amiga. You're a die-hard - and I admire your dedication - but I respectfully disagree. I cite the emergence of modern, protected-memory systems as evidence toward my side of the argument. Note that I don't take credit for them (I couldn't possibly!!!); I merely cite their existence and popularity, and submit that, since necessity is the mother of invention, they exist for a reason (and work quite well).

 

By 1991 I was using an Amiga 3000 with 18MB of RAM. Although it had an MMU built in, the system didn't use it for memory protection. Software programmers had improved to the point where I could run paint programs, desktop publishing, 3D modeling/ray tracers, and word processors (at the same time) all day without any crashes. And no memory protection. And no resource tracking.

That's a nice system, and I like the 3000 better than the 4000. But it doesn't eliminate the need for memory protection. I submit (once again) the fact that modern systems ***ALL*** use memory protection as evidence of the need for memory protection. Or are all these people (those who developed memory protection) fools who should have consulted you first and dismissed such efforts?

 

 

Memory protection schemes become practical on multi-user systems like Unix, to prevent sloppy programmer A from ruining the system for user B. Today, it is also a good idea to help protect against bad programs attempting intentional exploits. However, even with memory protection in every modern OS, bad things can still happen. Windows and Linux systems still do crash.

Of course they crash, but so much less so than before memory protection. In this last statement, it sounds like you're both FOR and AGAINST memory protection. I'm going to take it (from earlier paragraphs) that you're generally AGAINST MEMORY PROTECTION. That's fine, but I think you're in the minority, and memory protection seems to work out quite well for the rest of us.


And after a long revisit to the 80's, the final decision is........... for me, to run over to eBay and get an A500 to play with :). That will be a cool discovery for a fella like me who jumped on the PC bandwagon from an A8 and never had his hands on a 68000 machine of any lineage.


 

[1] "Today, nearly everything is a black box and the average computer user know less about how it works than their car.. But, the nature of using a computer in the time of the 800 meant most users had to acquire more technical proficiency than people have today. Many Atari 8-bit users had decent familiarity with the internals, mainly because it was an interesting box with exotic hardware."

 

[2] "It was funny listening to people who previously justified buying an Atari 800 vs an Apple or c64 due to better graphics and sound specs turn around and try to find reasons for the ST. It has the Atari logo on it! "

 

[3] "Atari as everyone likes to remember it died in 1984 and became the new Commodore run by (ex)Commodore execs managing (ex)Commodore engineers. The only thing "Atari" about it after that was the logo. "

BAM! BAM! and BAM!! Exactly how I feel. Atari 8-bit users in the early 80's were smarter than your average bear. Today's electrical appliances are black-as-coal little boxes made of "Uncomprehendium" to today's user. Nothing wrong with that, just the truth. I never warmed up to the ST because it didn't feel like an Atari. The Amiga came too late and the IBM grabbed my attention. If I had known Jay Miner and crew made the Amiga, I might have swallowed my pride and bought something made by Commodore. By 1984, everyone that made Atari, well, *Atari*, had left the company.

 

There are two basic rules to being a good manager:

1) Surround yourself with good people.

2) Take care of those people, or someone else will.

 

I give kudos to Ray Kassar for having the vision to hijack the "2600 follow-on" and get into the ring with Apple, but unfortunately he inherited #1 and completely sabotaged #2.


 

Instability issues were mostly due to low memory. With the same amount of memory it was at least as stable as the ST. Then, the Amiga was never limited to only 4 windows, and programs never had to be stopped to switch to another program, which were very practical advantages to the average user.

No, it wasn't. During those early days we had both set up side by side. The Amiga crashed just sitting there sometimes; we replaced it twice. It wasn't until a newer WB arrived that things seemed OK. I would agree with general stability by the time the A500 showed up, or the late-'86 A1000.


BAM! BAM! and BAM!! Exactly how I feel. Atari 8-bit users in the early 80's were smarter than your average bear. Today's electrical appliances are black-as-coal little boxes made of "Uncomprehendium" to today's user. Nothing wrong with that, just the truth. I never warmed up to the ST because it didn't feel like an Atari. The Amiga came too late and the IBM grabbed my attention. If I had known Jay Miner and crew made the Amiga, I might have swallowed my pride and bought something made by Commodore. By 1984, everyone that made Atari, well, *Atari*, had left the company.

 

There are two basic rules to being a good manager:

1) Surround yourself with good people.

2) Take care of those people, or someone else will.

 

I give kudos to Ray Kassar for having the vision to hijack the "2600 follow-on" and get into the ring with Apple, but unfortunately he inherited #1 and completely sabotaged #2.

Technically, Commodore died as well when Jack et al. left.


What's sad is that these companies didn't have viable 16-bit projects in the works the whole time. So, one company bought a design and the other threw one together as quickly as possible. This isn't how real computer companies act.


What's sad is that these companies didn't have viable 16-bit projects in the works the whole time. So, one company bought a design and the other threw one together as quickly as possible. This isn't how real computer companies act.

Typical "bean counter" vs. "visionary" scenario. Jay & crew knew the 2600 wouldn't be more than a fad. They wanted a successor ASAP. Same with the 8-bit platform. As soon as it was out the door, designs for the next generation were started. Management wanted nothing to do with it, and look where we ended up.


Typical "bean counter" vs. "visionary" scenario. Jay & crew knew the 2600 wouldn't be more than a fad. They wanted a successor ASAP. Same with the 8-bit platform. As soon as it was out the door, designs for the next generation were started. Management wanted nothing to do with it, and look where we ended up.

Yep, and Apple was always trying to change the world and they're still with us. They built some stinkers, but they took the personal computer very seriously.
