
'Amiga-ised' 8-bit computers... can you remember any that actually succeeded?


carmel_andrews


A really good question here is: if Apple hadn't stolen the GUI idea from Xerox PARC, would we be using GUIs on computers now, or would it still be all command-line interfaces? Xerox showed no interest in using or marketing the GUI that was developed at PARC. So would it exist if it weren't for Apple?

 

The one thing I always thought was interesting about the Amiga, though not interesting enough to make me want one, was that instead of the CPU having to run the entire system, it had dedicated co-processors for things like sound and video. That freed up the main CPU to do what it needed to run the program.
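
To make that concrete, here is a minimal C sketch of the idea behind one of those co-processors, the Amiga's Copper. The register addresses are the standard custom-chip ones at 0xDFF000, but treat this as an illustration rather than a complete program: real code would allocate the list in chip RAM, take over the display, and enable Copper DMA. The CPU builds a short instruction list once; the Copper then replays it at the start of every video frame while the CPU is free to run the program.

/* A rough sketch: the CPU builds a small "Copper list" once, then the
   Copper co-processor replays it every frame with no further CPU work.
   Addresses are the Amiga custom-chip registers at base 0xDFF000; a real
   program would put the list in chip RAM and enable Copper DMA (DMACON). */
#include <stdint.h>

#define CUSTOM   0xDFF000u
#define COLOR00  0x180u                /* background colour register      */
#define COP1LC   (CUSTOM + 0x080u)    /* Copper list address (32 bits)   */
#define COPJMP1  (CUSTOM + 0x088u)    /* strobe: restart the Copper list */

/* Copper instructions are pairs of 16-bit words. */
static uint16_t copper_list[] = {
    0x6401, 0xFFFE,    /* WAIT for the beam to reach scanline 100 (0x64) */
    COLOR00, 0x0F00,   /* MOVE red into COLOR00: background turns red    */
    0xC801, 0xFFFE,    /* WAIT for scanline 200 (0xC8)                   */
    COLOR00, 0x000F,   /* MOVE blue into COLOR00                         */
    0xFFFF, 0xFFFE,    /* conventional end marker: a WAIT that never ends */
};

int main(void)
{
    /* Hand the list to the Copper; the hardware re-runs it automatically
       at the top of each frame, so the colour split is redrawn forever
       while the 68000 spends zero cycles maintaining it. */
    *(volatile uint32_t *)COP1LC  = (uint32_t)(uintptr_t)copper_list;
    *(volatile uint16_t *)COPJMP1 = 0;  /* any write restarts the list */
    return 0;
}

The point isn't the colours; it's that the main CPU does no per-frame work to keep the effect on screen, which is exactly the division of labour described above.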

 

IMHO the Apple IIGS was an attempt on Apple's part to lure users of the Apple II line over to the Macintosh, since GS/OS worked much like the Mac OS. The GS is a very capable computer, whether you use it with nothing but GS/OS or simply as a souped-up Apple II.

Steve Jobs/Apple paid Xerox for access to PARC. I'm not sure anyone knows what the contract said, but I don't think you could say for certain whether it was stolen or not.

I think Xerox was just happy to recoup some of the R&D money invested in it, and it's very possible they sold some sort of rights to Apple.

 

I totally agree with the rest of your post.


Someone else would've come up with the GUI idea. It was kind of inevitable.

I think you could start a new thread on this topic: was the GUI inevitable? I think I could make a pretty good argument that the industry might have gone in the direction of larger and larger keyboards with more and more buttons. Remember those "F-key" templates they used to have for IBM keyboards when you were using WordPerfect?

Proponents of the GUI pointed out that it was user-friendly and self-guiding, because all of the possible actions you could take were listed on a menu for you. But this came at the expense of a huge amount of computing power and memory. Recall that the first generation of Mac couldn't really do anything. IBM/DOS users in the early days of Macintosh would argue that by learning a few simple commands and keeping your hands on the keyboard, you could actually get more done in the same amount of time, compared with someone whose right hand was constantly reaching for the mouse and who spent time opening menus and making selections.


IBM/DOS users in the early days of Macintosh would argue that by learning a few simple commands and keeping your hands on the keyboard, you could actually get more done in the same amount of time, compared with someone whose right hand was constantly reaching for the mouse and who spent time opening menus and making selections.

 

That's the trade-off though, isn't it? By removing the need to learn even simple commands, the GUI opens a machine up to people who wouldn't otherwise have used it. In that context, the loss of productivity, speed, and even resources is probably considered worthwhile, and the money men behind the companies are never going to sneeze at an enlarged customer base.

 

As for the original topic: an "Amiga-ised" C64 is a C64DTV2. =-)


Recall that the first generation of Mac couldn't really do anything. IBM/DOS users in the early days of Macintosh would argue that by learning a few simple commands and keeping your hands on the keyboard, you could actually get more done in the same amount of time, compared with someone whose right hand was constantly reaching for the mouse and who spent time opening menus and making selections.

 

Well, "the first generation of Mac couldn't really do anything" is a huge stretch. The very first one, the 128k, couldn't do anything due to lack of RAM. But the first gen in general was very capable for their time. Your other points are certainly interesting though. The first Macs did have some keyboard shortcuts, but not nearly as many as CLI-based machines did, making the CLI at the time more efficient. I think that was a valid argument for power users, but to bring computers "to the masses" as they are now, I think the GUI was inevitable. But, I can see the keyboard-based paradigm being stretched out for a bit longer, especially with Microsoft as the primary market force.

 

EDIT: Didn't see TMR's response until I replied... exactly, that's the trade-off that had to happen to drive computers into more widespread use.


Take a look at the Xerox 860 dedicated word processor. The keyboard included dedicated keys to select words, lines, and paragraphs. It also had a pointer navigation device called a "Capacitive Activated Transducer" or "CAT" - a "CAT" instead of a "mouse", get it?

 

http://www.old-compu....asp?c=488&st=1

 

Here's another good link:

 

http://www.digibarn....x860/index.html

 

A particularly interesting feature: since this was a dedicated word processor, it made sense to rotate the monitor 90 degrees to portrait mode, so that the image you saw on the display looked like the sheet of paper you would end up printing. Wouldn't that make more sense for viewing HTML pages on the Internet? Have you noticed how one major feature of the iPad and similar tablets is that they automatically change orientation from landscape to portrait when you rotate the device? Hmmm...


IBM/DOS users in the early days of Macintosh would argue that by learning a few simple commands and keeping your hands on the keyboard, you could actually get more done in the same amount of time, compared with someone whose right hand was constantly reaching for the mouse and who spent time opening menus and making selections.

 

That's the trade-off though, isn't it? By removing the need to learn even simple commands, the GUI opens a machine up to people who wouldn't otherwise have used it. In that context, the loss of productivity, speed, and even resources is probably considered worthwhile, and the money men behind the companies are never going to sneeze at an enlarged customer base.

 

As for the original topic: an "Amiga-ised" C64 is a C64DTV2. =-)

On the other hand: put training wheels on a bicycle and more people can get up on it and ride sooner, but if you really want to ride a bike and go fast, you have to take the training wheels off.

 

Don't get me wrong, I do believe that, ultimately, the GUI is a better way to use a computer. I'm sort of baffled that some Linux purists seem to think a GUI has no place on a Linux web server. But I'm suggesting that the GUI we ended up with (mouse, drop-down menus, the folders/directories metaphor) was not necessarily inevitable. There were a lot of ideas floating around at the time, including light pens, touch screens, keyboards with a LOT of buttons, and even speech interfaces. Remember the "Speak & Spell"?

 

I think that, ultimately, Lisa was a more "complete" computer than Mac. It had an entire suite of office applications. Macintosh came out of the gate with MacWrite and MacPaint, and that was all. What if you wanted to write programs for Macintosh? Guess what? You needed a Lisa to do that. But the Lisa was a $10,000 computer when it was introduced in 1983 - almost double what you'd pay for a new Dodge Ram 50 from the same era.

 

By the time Mac came out, a Lisa was only about $1,000 more expensive, but that didn't matter. Steve Jobs was fully behind Macintosh, and Lisa was destined for retirement as a failure. Multiple sources claim that a significant number of Lisa computers ended up buried in a landfill in Logan, Utah, in 1989. I think it is very telling that by then, Woz had already left Apple. You can draw your own conclusions as to the reason, but I would say that the "spirit" of what Jobs and Woz had originally been trying to do had been lost.

 

This swings back around to the "spirit" of what Nolan Bushnell had originally tried to do - bring the experience of the computer video game to the masses, but without using a $30,000 mini-computer to do it. To me, this is the essence of the early pioneers of the personal technology age. The quest was the answer to the question, "How can we bring the joy and the improvement of the human condition based on computing technology to the masses?" Baer, Bushnell and Dabney did it with TTL circuits while the world waited for the microprocessor to be invented. Once the microprocessor was realized in the form of the MOS 6502, Woz and Jobs took the discrete technological pieces they found, put them together, made them work in a package that didn't require advanced technical skills, and built an entire company around it. Call it the first expression of the "plug and play" concept and the baby steps towards the personal computer as an "appliance".

 

I find it curious now that the same battles of philosophy seem ready to be fought again with Siri technology, iOS, Android, and Microsoft Windows 8. Does personal computing mean that you have all of your software and data on your personal device? Or is it the same thing if you've licensed access to everything you need through a "cloud"? Ultimately, I want to pick up, or sit down in front of, some kind of device and do stuff with it. Whether that means I'm sliding optical discs into a drawer, pushing some kind of device into a USB port, or accessing everything I need through a cloud, it doesn't really matter.


This swings back around to the "spirit" of what Nolan Bushnell had originally tried to do - bring the experience of the computer video game to the masses

 

Technically a computer game - there was no video involved in Computer Space. But I get what you're going for, as seeing Computer Space, combined with his coin repair work at Lagoon, created that spark. Nolan then had the aha moment of using video after he had Ted explain to him the technical details of syncing and the way the beam works in a video display. Ted then created the solution by designing the spot motion circuitry that put the game's moving spots onto a TV set via the video signal, which in turn allowed them to do it cheaply enough to go for the masses.
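
For anyone curious what "the way the beam works" buys you, here's a toy C model of the spot-motion idea. This is my own illustration, not Ted's actual circuit (the real hardware moved a spot by nudging counter timing rather than storing coordinates, and the counts below are rounded NTSC-ish numbers). Two free-running counters track the beam across each line and down each field; a comparator drives the video signal high whenever the beam is inside the spot, so there's no frame buffer and no computer.

/* Toy model of beam-synced "spot motion": counters track the beam, a
   comparator decides when the video signal goes high. Illustrative only. */
#include <stdio.h>

#define H_TOTAL 455   /* horizontal counts per scanline (rounded)  */
#define V_TOTAL 262   /* scanlines per field (rounded)             */
#define SPOT      4   /* spot size, in counts                      */

int main(void)
{
    int spot_h = 100, spot_v = 120;            /* the spot's "position"  */

    for (int v = 0; v < V_TOTAL; v++)          /* vertical counter       */
        for (int h = 0; h < H_TOTAL; h++) {    /* horizontal counter     */
            /* the comparator: is the beam inside the spot right now?  */
            int video_high = (h >= spot_h && h < spot_h + SPOT &&
                              v >= spot_v && v < spot_v + SPOT);
            if (video_high)
                printf("line %d, count %d: video high\n", v, h);
        }
    return 0;
}

Run it and you get the handful of "video high" moments that would paint a 4x4 spot on that field; change spot_h and spot_v between fields and the spot moves.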

 

Baer, Bushnell and Dabney did it with TTL circuits while the world waited for the microprocessor to be invented.

 

There were no TTLs in Ralph's circuits. He used DTL circuitry.


Technically a computer game - there was no video involved in Computer Space.

Wasn't it just the reverse? I thought the "magic" of Computer Space (...and Pong) was that they were computer-like video games that didn't use a computer.

There were no TTLs in Ralph's circuits. He used DTL circuitry.

I stand corrected! Details! You're awesome, Marty! It's like you and Curt wrote the book... literally!

 

And I'm really looking forward to reading it...

