
Classic Computers -- what if they would have lived on?



I totally agree about the Amiga, which of course was really an Atari design (at least mostly). I would much rather have seen it stay independent and thrive. The Amiga was doing so many things way ahead of its time; it was so revolutionary in concept. I remember seeing the 1000 demoed at release and I was blown away, and ended up getting a 500 a couple/few years later. Basically, I totally agree with everything you said. Except I'm not so sure about Amiga taking the crown from Apple for imaging and video. Not only for the marketing and distribution reasons, but Apple also had a lot of great partnerships for font and printing technologies. Now, if the Amiga had had those instead of Apple, that really would have been the be-all-end-all!


 

Personally, I would have liked to see the Amigas continue. Their audio/visual capabilities just seemed so much further ahead of anyone else's, and I think we would have seen capabilities in animation and design advance a lot sooner than we did with the PC/Mac standard. If the Amiga hadn't failed, I don't think Apple would have taken the crown for imaging/video.

 

Historically, the worst possible consumer products in our country have succeeded all because of marketing. The intelligent few know this. And that's quite embarrassing when you think about it.


I would much rather have seen it stay independent and thrive.

 

GM - Saturn, end of story.

 

Not saying it would have been possible, just saying I would have liked to have seen it happen. It is possible... Apple isn't a part of Microsoft, and contrary to popular belief, the Microsoft investment was fairly small and didn't really play a big role in saving the company. Plus, Saturn made very cheap derivative cars that cost too much, and after a while their quality slipped. Being semi-independent wasn't their only problem by far (and they weren't even as independent as they claimed anyway).


Some of the later-model Saturns were actually turning out to be pretty good in their own right. I just think that they took too long to get to the point where they weren't, as you said, "cheap derivatives", which is what the market deemed them.

 

But that's beside the point. :P You have to remember that the auto industry has been fraught with an anti-technological sentiment for generations -- big business is inimical to technology, up to a point. Which is what we see in the computer industry of the time, I think. Technology and the free market are oftentimes at odds with each other -- the best technology simply does not win the battle, because user friendliness, reliability, and other factors come into play. As save2600 said, that has a lot to do with Amiga's downfall -- too much R&D on the tech side and not enough marketing savvy. A lot of the time the BEST product from every standpoint just does not win the battle -- Amiga is a perfect example: it did everything better than the competition in almost every way, but still couldn't keep afloat vs. the PC market, which couldn't do anything as well but was more "approachable" and had the Microsoft marketing machine behind it.

Edited by CebusCapucinis

Right, I totally agree that marketing will usually win with any product, and it's one of the many reasons that Commodore and Atari both failed so dramatically. If that weren't true, most people would be using far different products than the ones they're actually using. It is sad... very sad. (And off topic: I think Saturn started out well and became decent again... it was just those pesky middle years where things seemed to go all wrong.)


Simply put: brilliant people from GM broke away and started their own company called Saturn -- which GM ultimately financed, but that was about the extent of it. Saturn USED to be extremely independent from GM until the early 2000s, when GM took a greater hold of them. Saturn was positioned not as a cheap alternative but as a high-end, Euro-flavored GM offering (the LS1/LS2 frame, suspension, and transmission were based on designs from Opel, GM's German subsidiary). I bought a brand new LS1 (2003) and it blew away anything GM had at the time for comparable dollars. By the mid-to-late 2000s GM had totally taken Saturn over, released a multitude of unnecessary Saturn models that all competed amongst themselves AND with the rest of GM (not unlike Atari and Commodore, to a lesser extent, back in the 80s), and look what happened. Saturn was a superior concept that was highly successful until corporate politics and greed took over. Same old story.

Edited by save2600

Oh, gotcha, if Amiga had stayed independent, maybe they'd have had a shot... I think that's what you're saying? Of course, if they'd have stayed independent, they wouldn't have had the marketing muscle to succeed (and still didn't, but that's another story entirely). Eh, let's face it, as things were, the Amiga et al were pretty much doomed. My point is that that's too bad, since it was a fantastic machine with so much potential.


Oh, gotcha, if Amiga had stayed independent, maybe they'd have had a shot... I think that's what you're saying? Of course, if they'd have stayed independent, they wouldn't have had the marketing muscle to succeed (and still didn't, but that's another story entirely). Eh, let's face it, as things were, the Amiga et al were pretty much doomed. My point is that that's too bad, since it was a fantastic machine with so much potential.

 

Absolutely, 100% agree Mirage :) :thumbsup: :-D


The problem is that designing the things does not come for free. And if you're designing a whole new computer from scratch, you have to create an infrastructure for it from scratch too. It was a lot cheaper to make yet another 8088 PC clone, and you didn't have to create a whole new set of software for it.

 

BASIC only mattered because, in the early days, most of the people who wanted computers wanted them in order to program. It then transitioned into "a way to run someone else's program slowly," and once the 16-bit era meant enough memory for compilers (yes, you could run one on an 8-bit, but it was slow and cramped), it stopped mattering at all.

 

Apple got lucky. They were hurting multiple times, and Desktop Publishing was probably what gave them any chance at all at first. (It certainly wasn't games.) None of the others could keep their focus long enough. (Focus meaning sticking with their core competency, rather than, say, Coleco trying to turn a very decent video game system into a crappy kludge of a computer because "everybody knew" that video game consoles were dead and home computers were the new hawtness.)


When I saw the proposed future products for Atari 8-bit systems, etc., it seemed to really have an obsolescence-friendly design. There seemed to be too much hardwired and hard-addressed expansion, with no regard for the future. Scalability? Eh, not so much. Today, you can put as much or as little RAM as you want in a system, as well as hard drive space, etc. With hardware only coming from one company in a self-contained unit, and little 3rd party support, it didn't seem destined to last forever.


When I saw the proposed future products for Atari 8-bit systems, etc., it seemed to really have an obsolescence-friendly design. There seemed to be too much hardwired and hard-addressed expansion, with no regard for the future. Scalability? Eh, not so much. Today, you can put as much or as little RAM as you want in a system, as well as hard drive space, etc. With hardware only coming from one company in a self-contained unit, and little 3rd party support, it didn't seem destined to last forever.

 

I agree with you...for that period. But now, look how things have changed. Most people don't do any modification of their computers. They buy a laptop, an iMac, or even a desktop computer, and that's it. They're done. Maybe they'll add on a USB device or three, though. And that's just like adding an SIO device.

 

So Atari's design wasn't necessarily bad. It was just executed at the wrong time in the marketplace. But they're not alone. Same goes for the original Mac design.

 

But, yeah, at the time, prices on memory and disk space were coming down so quickly and made such a big difference that people were adding RAM and (later) HDD space all the time. Plus, the average user of those computers tended to be more technically-oriented than today's average user.

 

I guess you can cut it into four eras: the build-it-yourself era of the Apple I, Heathkit, MITS, and the rest; the program-it-yourself era of Atari, Commodore, TI, and the rest; the upgrade era of the mid-80s to the new millennium; and the present era of essentially self-contained devices. Call it the "magic box" era if you like. :)

Edited by Ransom

That does point out one of the reasons the Mac stayed around.

 

Maybe not from the inception of the project, but at least from the time they decided to use the 68000 (instead of the 6809) and pillage the Lisa project, the decision was made that Thou Shalt Not Talk Directly To The Hardware, with the penalty being that the hardware was not guaranteed to remain the same. This worked because they did such a good job on optimizing the code in ROM (particularly QuickDraw) that you wouldn't want to do it yourself, and on providing such a big API. They also left enough abstraction (and found enough ways to hide more data in structures used by the ROM) to keep stuff working after doing something major like adding color. Even those who played fast and loose with holes in the API (like the high bit of pointers) eventually paid the price.
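
To make that concrete, here's a rough sketch in C (in the style of the old MPW/THINK C Toolbox headers; treat the details as illustrative rather than exact) of what "going through the API" looked like. The app calls the ROM routines and never hard-codes a frame-buffer address, which is exactly why the same code kept working when the hardware underneath changed:

    #include <Quickdraw.h>   /* classic Mac Toolbox header (MPW / THINK C era) */

    /* Sketch only: assumes the usual Toolbox initialization has run and a
       window's GrafPort is current. The app just asks QuickDraw to draw;
       it never needs to know where screen memory lives or how deep the
       pixels are, so code like this survived changes such as the move to
       Color QuickDraw. */
    static void DrawDiagonal(void)
    {
        MoveTo(20, 20);     /* set the QuickDraw pen position         */
        LineTo(200, 150);   /* draw into whatever the current port is */
    }

    /* By contrast, apps that stashed flags in the high byte of a handle
       (which "worked" under 24-bit addressing) broke the moment Apple
       moved to 32-bit clean addressing. */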

 

It also helped that 1) the 68000's successors stayed backward compatible apart from certain changes like exception handling, and 2) the awesome 68000 emulator that was put into the PPC Macs let 68000 code run at a speed similar to the CPUs the PPC replaced, and as an almost completely equal citizen.

 

There's stuff from back in the '80s that would still run under Classic on a PPC OS X Mac.


Not saying it would have been possible, just saying I would have liked to have seen it happen. It is possible... Apple isn't a part of Microsoft, and contrary to popular belief, the Microsoft investment was fairly small and didn't really play a big role in saving the company. Plus, Saturn made very cheap derivative cars that cost too much, and after a while their quality slipped. Being semi-independent wasn't their only problem by far (and they weren't even as independent as they claimed anyway).

The Microsoft investment was to instill confidence in the viability of Apple, not to own it.

The $150 million was less than a charge Apple posted for the 2nd quarter of '97 alone but I'd say it had a lot more to do with Apple's survival than you give it credit for.

It instilled consumer confidence and it probably helped them obtain financing in a period where Apple was experiencing major losses.

 

As for Saturn, GM repeatedly had to infuse the company with more cash because, when creating Saturn, GM had made too many deals with the labor union that preserved jobs rather than reducing labor costs through automation. There was a writeup on it while I was taking a management class in college.

Saturn was never fully independent from GM and was subject to corporate political decisions from day one.


I'm very curious as to why there are really only two computer standards now.
Commodore and Atari ran themselves into the ground as businesses, knocking them out. PCs took over because of cost. Having essentially an open platform let a million PC makers into the game, dropping the price of components. That led to the dominance of Windows software, imho. With so many more units out there, it made sense to develop for the largest base. I love my Macs, but there isn't really a "low price" option unless you buy used. PCs give you the whole range.

 

I think it's a fun "what if" to ponder what would have happened if PCs never really took over and Commodore and Atari were strong enough to survive. You really only had three major non-PC players: Apple, Commodore and Atari. I think you would have still seen some shakeout in the market, with two manufacturers claiming most of the market share and the third being more nichey. My guess would be Apple & Commodore taking 80% of the market, with Atari being 20%. Obviously those figures are pulled out of my butt. =)

 

I think the education market would be split pretty evenly. Apple obviously had inroads there, but Commodore & Atari had price advantages.

I think Commodore would have taken the Amiga and run with the graphics and video market, for what that's worth.

Business software is hard to say - Apple probably had the best chance to capture it due to its strength in desktop publishing, but "business" is so broad, who can really say? I loved my Amiga, but the interface was butt ugly, and that matters. The Mac OS (and to a lesser extent GEM) was much nicer looking, and appeared more business-like.

For gaming and general home computing, I could see a pretty even split w/ the three big makers, but the Amiga was a gaming powerhouse at a great price so I could see the Amiga line having the biggest share.

 

As for the actual technology, that's tough. I think you would have still seen some standardization on things like USB ports and hard drive interfaces (I don't know too much about the history of those, they were probably well on their way before the PC became dominant).

 

On the downside, the "standardized" PC market let 3D card makers flourish. Not sure that would have happened (or at least happened as fast) if you had three or four different architectures splitting market share. Maybe one of the big three would have worked on real "modern" 3D rendering hardware first and made big strides.

Edited by BydoEmpire

I loved my Amiga, but the interface was butt ugly

I'd have to agree with that. Starting with that cursor arrow that looked like it was drawn with a crayon. I know the reason was so they could be used with cheap TV sets, but I think the crappy resolution of cheap TV sets was a big problem with the "home computer" industry. Okay for games, crappy for doing "serious" stuff. S-video really wasn't around then, except in the 2-plug version used by Commodore, and even then the monitor wasn't exactly cheap. But a good monitor, even 640x480 15" VGA, was a lot more expensive.

 

The Mac wasn't exactly cheap, and it didn't have color, but damn the picture was sharp. And it was a decent size for the day.

 

As for the actual technology, that's tough. I think you would have still seen some standardization on things like USB ports and hard drive interfaces (I don't know too much about the history of those, they were probably well on their way before the PC became dominant).

USB didn't even exist until 1996. That's quite a long time after the "home computer" market vanished. IDE didn't get decent until 1994 or so with EIDE.

 

In fact, the reason Apple ended up with so many dead-end hardware technologies was that they needed these things before there was a good industry standard, and had to pick something, sometimes making their own. Then the market finally picked something else. ADB, AppleTalk, NuBus and SCSI. FireWire was a partial success (yes, Apple invented it), and still works better than USB at equivalent speeds.


Some of the later-model Saturns were actually turning out to be pretty good in their own right. I just think that they took too long to get to the point where they weren't, as you said, "cheap derivatives", which is what the market deemed them.

 

But that's beside the point. :P You have to remember that the auto industry has been fraught with an anti-technological sentiment for generations -- big business is inimical to technology, up to a point. Which is what we see in the computer industry of the time, I think. Technology and the free market are oftentimes at odds with each other -- the best technology simply does not win the battle, because user friendliness, reliability, and other factors come into play. As save2600 said, that has a lot to do with Amiga's downfall -- too much R&D on the tech side and not enough marketing savvy. A lot of the time the BEST product from every standpoint just does not win the battle -- Amiga is a perfect example: it did everything better than the competition in almost every way, but still couldn't keep afloat vs. the PC market, which couldn't do anything as well but was more "approachable" and had the Microsoft marketing machine behind it.

Not really. Commodore started by marketing an overpriced, generally not-ready-at-release machine against a machine that was perceived as both easier to use and much cheaper: the Atari ST. That mistake cost them two years in the marketplace. The ST, on the other hand, blew that great lead in sales by not improving the product quickly enough. As it turned out, neither improved their products fast enough, and they both sucked at marketing.

I would have been much more interested in seeing the ST advance, keeping in mind the old "power without the price" thing.


Not saying it would have been possible, just saying I would have liked to have seen it happen. It is possible... Apple isn't a part of Microsoft, and contrary to popular belief, the Microsoft investment was fairly small and didn't really play a big role in saving the company. Plus, Saturn made very cheap derivative cars that cost too much, and after a while their quality slipped. Being semi-independent wasn't their only problem by far (and they weren't even as independent as they claimed anyway).

The Microsoft investment was to instill confidence in the viability of Apple, not to own it.

The $150 million was less than a charge Apple posted for the 2nd quarter of '97 alone but I'd say it had a lot more to do with Apple's survival than you give it credit for.

It instilled consumer confidence and it probably helped them obtain financing in a period where Apple was experiencing major losses.

 

As for Saturn, GM repeatedly had to infuse the company with more cash because, when creating Saturn, GM had made too many deals with the labor union that preserved jobs rather than reducing labor costs through automation. There was a writeup on it while I was taking a management class in college.

Saturn was never fully independent from GM and was subject to corporate political decisions from day one.

Not to mention Saturn was originally formed to be the American version of the "Japanese way" and to emulate those companies, an idea that turned me off from the beginning. Now that it's all over for Saturn, it has been published that the company never made any money for GM. What a mess...


I loved my Amiga, but the interface was butt ugly

I'd have to agree with that. Starting with that cursor arrow that looked like it was drawn with a crayon. I know the reason was so they could be used with cheap TV sets, but I think the crappy resolution of cheap TV sets was a big problem with the "home computer" industry. Okay for games, crappy for doing "serious" stuff. S-video really wasn't around then, except in the 2-plug version used by Commodore, and even then the monitor wasn't exactly cheap. But a good monitor, even 640x480 15" VGA, was a lot more expensive.

 

You are right, Bruce. The blue, white and orange interface was used because, at the time, the research indicated that many users would hook up to an existing composite monitor. After monitor usage went up they changed the color scheme to grayscale, which was the easiest on the eyes for the dot pitch of the early RGB monitors. I remember Amiga World or one of the other big magazines would always point out that the choice of large cartoonish icons and high-contrast colors put off a lot of folks. Commodore got the hint, but their UI was never terrific.

If you ever saw the shareware MagicWB applied to a stock Amiga, you could really see how terrific the Amiga's OS could have looked right out of the box.

Edited by FastRobPlus

Aside from marketing failure, what really killed off most of the old 8-bit and 16-bit computer lines was the lack of evolution of their CPUs, and the programmers' extreme reliance on the architecture itself.

 

All the major 8-bit computers used either the 6502 or Z80 (or some variant of those two). These were excellent processors at the time, primarily because they were both cheap and quite effective. However, we're still talking about 1-4MHz CPUs here, usually running with limited audio/video hardware. In order for programmers to get the most out of the machine, they had little choice but to write in machine language and to use direct access to the extra hardware in the machine. The 6502 had a successor in the 65816, but very few hardware manufacturers chose to release machines based on this chip (the Apple IIGS is one of the few machines to do so). Meanwhile, the Z80 arguably led into the x86 architecture, so there was no upgrade path for the 8-bit machines, and even when there was, reliance on built-in supplementary chips usually meant that you sacrificed most backwards compatibility in the process.
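
Here's a tiny illustration of what that chip-level dependence looked like, written in C for readability (think cc65 targeting the C64; real code of the era would usually be hand-written 6502 machine language, and the register address is the C64-specific part):

    /* $D020 is the border-colour register of the C64's VIC-II chip.
       Code like this only makes sense on a machine with that exact chip
       mapped at that exact address -- there is no abstraction to port. */
    #define VIC_BORDER (*(volatile unsigned char *)0xD020)

    int main(void)
    {
        unsigned char c;
        for (c = 0; c < 16; ++c) {
            VIC_BORDER = c;   /* cycle the border colour by poking the chip */
        }
        return 0;
    }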

 

The Amiga, Atari, and Mac lines of computers all hit a wall with the 68000 as well. Motorola made many successors to the 68k architecture, and each company made their own attempts to capitalize on newer and faster chips. The Atari ST died out before things got too far, and the Amiga line died out with the transition to PowerPC (Yeah, yeah, PPC Amiga stuff exists... but it's a real mess in my opinion). Both systems had too much of a dependence on their hardware, and the software just couldn't evolve with the newer systems... this led to problems like OCS/ECS/AGA incompatibilities in the Amiga, and BIOS incompatibilities with both systems. Apple managed to carry on with things and do a damn good job keeping things going... but even they had to bail out when the architecture just didn't evolve in the right directions.

 

The Z80 had kind of a special place in history with its use of the CP/M operating system. To some degree, CP/M machines could share programs. Unfortunately though, this abstraction wasn't quite as pervasive as it is with Windows, so there was still lots of stuff that could not be transferred between computers, often due to silly things like incompatible disk drives. Things we take for granted today were simply not standardized enough at the time.

 

This is where the x86 architecture and MS-DOS (and later, Windows) really did a fairly good job. As the x86 line of processors evolved, each one was extremely backwards compatible. Meanwhile, these machines relied on add-on boards for all your video and audio... since programs could never be sure what actual chips were in the machine, they had to be written in a way that would (hopefully) work with a wide variety of third-party cards. Popular brands of cards eventually led to de-facto standards like AdLib and Sound Blaster, and eventually programs were abstracted to enough of a degree that the same program could be run on almost any hardware, made by any manufacturer, even stuff that was made 10 years after the program was written.
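
As a rough sketch of the two paths a DOS-era program had (Borland-style real-mode C, written from memory, so take the exact names as approximate): the "de-facto standard" path poked colour text memory at B800:0000, which worked only because every CGA/EGA/VGA clone copied IBM's layout, while the BIOS path let the firmware worry about whatever card was actually installed:

    #include <dos.h>    /* Borland/Turbo C style DOS header: MK_FP, int86, REGS */

    int main(void)
    {
        unsigned char far *text = (unsigned char far *)MK_FP(0xB800, 0);
        union REGS r;

        text[0] = 'A';          /* character cell 0 (top-left of the screen) */
        text[1] = 0x1E;         /* attribute byte: yellow on blue            */

        r.h.ah = 0x0E;          /* BIOS "teletype output", int 10h / AH=0Eh  */
        r.h.al = 'B';
        int86(0x10, &r, &r);    /* let the BIOS deal with the actual hardware */
        return 0;
    }

The same split played out with sound: write straight to the AdLib/Sound Blaster ports because "everyone" cloned them, or go through a driver layer once one existed.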

 

These days, when people write programs for PC or Mac, they never access the hardware directly... instead, they ask Windows or MacOS to do things for them, and the operating system takes care of talking to the hardware directly. This extra level of software is essential to developing a platform which can evolve over time, and the companies of the 80's and 90's just didn't manage to get it right in time and fell behind. When you really think about it, there simply wasn't any good way of doing this on an 8-bit computer. Relying on an operating system just took up far too many system resources, which would severely hinder the system (ever tried using GEOS on a C64? Impressive, but slow and limited). The "OS Revolution" was bound to happen eventually, and the IBM PC architecture just had a head start on things.
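
A minimal modern counterpart (a Win32 GDI sketch; today you'd more likely use a higher-level framework, but the principle is the same) shows the program asking the OS to draw rather than touching any hardware itself:

    #include <windows.h>

    int main(void)
    {
        HDC dc = GetDC(NULL);               /* device context for the screen       */
        TextOutA(dc, 10, 10, "Hello", 5);   /* "please draw this text for me"      */
        ReleaseDC(NULL, dc);                /* the OS and its drivers did the rest */
        return 0;
    }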

 

--Zero


I just found a wonderful e-book that provides an overview of the state of the industry from the 1960s through the 1990s: A History of the Personal Computer: The People and the Technology

 

Particularly relevant to the discussion in this thread is Chapter 11: Competitive Computers [in the 1980s] and Appendix A: Some Technical Details of Various Personal Computers.


Slightly right and slightly wrong there, Ze-ro, re: the 8-bit machines and their reliance on 8-bit processors.

 

Going from memory, according to a thread I saw on the A8 forum, Atari (pre-Tramiel) were apparently working with Rockwell on developing a "pseudo" 16-bit 6502, or was it Rockwell working with Atari (nothing to do with the 65816 by WDC)? I dunno what happened to that particular project.

 

Also, one of the designers of the 6502 (Bill Mensch) did a deal with Apple to develop a 16-bit 6502 (which eventually became the 65816). Unfortunately for Bill, during initial development Apple decided to change their role in the deal and didn't want any involvement in developing the processor; they only wanted a preferred supplier of that processor for their upcoming IIGS system.

 

And apparently Commodore was even getting MOS to tinker with 16-bit successors to the aging 650x family; I dunno if any of those projects ever got off the drawing board.

Edited by carmel_andrews
