
Which classic comp offered the best hardware/software integration?


Recommended Posts

For general discussion. Which classic comp do you believe offered the best hardware/software pairing & integration?

 

Biased or not, I vote Apple II because of how easily I learned the ins & outs of Applesoft BASIC and then DOS. The Mini-Assembler and Monitor score points too. Using DOS was like BASIC suddenly getting new commands. It was like a text adventure! New commands to make your character do new things.

 

It was like a hi-tech version of LOAD & SAVE for cassette, a concept all of us were familiar with. All that forced waiting for something to load left plenty of time to contemplate how things operated. So we cassette users knew two primary DOS commands straight away.

 

Consider how you see the contents of a disk: the Apple II was beyond simple. All a user had to do was type CATALOG, and a list of the disk's files appeared onscreen.

 

The C64's way of doing things was ridiculous. You had to type LOAD"$",8 and then LIST. The six-year-old in me was blown into a sea of confusion. Why would one use the LOAD command to see a list of files? And why the "$",8? Why was I looking on the disk for dollar signs? And after serious thought, why 8? I had only one drive, so why call it "8" and not a sensible "1"? To add insult to brain strain, we then had to LIST it. More tedious typing. And the whole procedure deleted your BASIC program from memory! Doubly absurd.

 

The C64 required 15 keystrokes and killed your in-memory program. The Apple II was simple: 8 keystrokes, and it didn't erase your in-progress work!
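
For anyone who never typed them, the two rituals side by side:

C64:

LOAD"$",8
LIST

(the first line reads the directory into memory as if it were a program, clobbering whatever was there; the second displays it)

Apple II, DOS 3.3:

CATALOG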

 

Atari DOS 2.5 wasn't much better with a menu system. And I can only imagine what other systems of the day put their users through.

  • Like 3

The Apple DOS / BASIC interface was kludgy.  Why do I have to PRINT my DOS commands inside a program, rather than just use them?

 

Which makes more sense?

TI BASIC:

10 OPEN #1:"DSK1.FILE",OUTPUT
20 PRINT #1:"DATA"
30 CLOSE #1


Applesoft:

10 D$=CHR$(4)
20 PRINT D$;"OPEN FILE"
30 PRINT D$;"WRITE FILE"
40 PRINT "DATA"
50 PRINT D$;"CLOSE FILE"

 

Integer BASIC was worse because it didn't even have the CHR$ function.
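
The usual Integer BASIC workaround, as I recall, was to type a literal Control-D keystroke inside the string itself (shown here as <CTRL-D>, since it doesn't print):

10 PRINT "<CTRL-D>OPEN FILE"
20 PRINT "<CTRL-D>WRITE FILE"
30 PRINT "DATA"
40 PRINT "<CTRL-D>CLOSE FILE"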

 

I'd argue that DOS 3.3/ProDOS's command interface was essentially the same thing as the Commodore DOS Wedge.  If you load that into a C-64, all of the DOS commands are available as short keystrokes, and you can display a directory without losing your program by pressing two keys.  The 64's native support of disk devices was complicated; I won't disagree.  Commodore did have a BASIC with all those disk commands, but they expected people to buy tape units in larger numbers than disk drives.
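
With the wedge loaded, the everyday commands looked something like this (syntax from memory):

@$             display the directory without disturbing the program in memory
/FILENAME      load a BASIC program, instead of LOAD"FILENAME",8
@S0:FILENAME   send a scratch (delete) command to the drive
@              read the drive's error channel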

 

The TI 99/4A's file system could use any device attached to the system without really having to know anything about it.  Replace "DSK1.FILE" with "CS1" in the above example and essentially the same program will work with a cassette file instead of a disk (see the snippet below).  The really poor thing about the TI's disk system is that you had to use a cartridge to do a lot of rudimentary things.  Want to copy a file from BASIC?  You have to lose your program, exit BASIC, and run the Disk Manager cartridge.  It was also not possible to display a disk directory from BASIC unless you wrote a program (and not a small one).  That was dumb.  At the same time, cartridges could add devices that the system could use, which was a nice, easy way to extend the system.
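
To make the CS1 point concrete, the earlier disk example becomes this (if memory serves, cassette wants FIXED-length records, but that's the only change):

10 OPEN #1:"CS1",OUTPUT,FIXED
20 PRINT #1:"DATA"
30 CLOSE #1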

 

I'll admit to my biases also :)  My first computer was a TI 99/4A.  My second, a Commodore 128 (which had a BASIC that had all the proper disk commands).   

  • Thanks 1

As much as I love Commodore machines and place them as my favorites, I will admit the BASIC in the earlier home machines was a struggle for the young kid I was. POKE this, PEEK that, etc. I had tape only, so LOAD/SAVE was easy. When I got a TI-99/4A I went from struggling with programs to writing games in a very short time, as the BASIC was so friendly. The downside was that when I looked to do more advanced things I was locked out, and the BASIC was very slow. It was not until I used a BBC machine that I found the best BASIC ever for both simple and complex tasks.

 

I never used an Apple II, as no one I know here in the UK could ever afford one.

  • Like 1
  • Thanks 1

I don't think any of the 8-bit systems had particularly good interfaces.   Low memory and storage were serious constraints, which ensured such interfaces stayed primitive.

 

So whichever you liked was the one that you got used to.   The ones that you weren't used to seemed pretty kludgy in comparison.   In reality they were all kind of kludgy.

 

Even the 16-bit era didn't have great integration.   Well, the Classic Mac seemed nice from my limited exposure to it.   But if I had used it on a daily basis I might have found issues with it.

MS-DOS: horrible

Amiga: powerful OS, but too heavy for a single-floppy system.   Seemed like it really needed a hard drive to get the most out of it.

Atari GEM:  Nice in that it booted from ROM, but light on features, and no built-in CLI for advanced operations.

 

On 6/29/2022 at 7:25 PM, Keatah said:

Atari DOS 2.5 wasn't much better with a menu system. And I can only imagine what other systems of the day put their users through.

My Atari 1050 came with DOS 3.0, which had a decent menu system, I thought.   Unfortunately, because of other issues with 3.0, we were all forced to use DOS 2.0/2.5, and I absolutely hated that menu system; it seemed so hacky.

  • Thanks 1

The real power of an Apple II made its debut when ProDOS came out.  It allowed the integration of hardware much more readily than DOS did.  It was much easier to install drivers for the hardware one wanted to use.  And it was not just limited to hard drives.  Look at the cards that were recently made for the Apple II: various hard drives, Ethernet cards, video overlay cards, ROM cards, memory expansion cards, etc.

 

ProDOS is by far the best DOS of any computer.  Even now, not only can I read and write the old DOS 3.3 disks with it, I can also read FAT12 and FAT16, MFS and HFS, although those last ones are on 3.5-inch floppies or hard drives.

  • Like 1

From what I have seen and what people have been able to squeeze out of a machine, text- and graphics-wise, I will go with the Apple II and/or C64.  What you can do with sound and graphics on the C64 is amazing and is only possible due to its tight hardware and software integration, even if the BASIC is not really great to use.  However, the Apple II is probably superior considering what it can do graphically and with text, as the late Apple II machines (your Apple IIe Platinums and IIc Pluses) really were very utilitarian in what they could do for work and play.  This is just my opinion; I am by no means an expert, but I thought I would throw my two cents into the discussion.

  • Like 1

It really took Commodore until BASIC 3.5 and 7.0 to integrate the hardware and software in a usable way. Those versions of BASIC had full support for all of the graphical features, and BASIC 7.0 had very good support for the sound hardware as well.  At the same time, most of the graphical commands added to BASIC 3.5/7.0 were included in the Super Expander cartridges released for the 64 and VIC-20, but like any add-on, not enough people had them to make them a standard.
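
A rough flavor of what that looked like on a C128, from memory (the exact arguments may be off):

10 GRAPHIC 1,1         : REM hi-res bitmap mode, clear the screen
20 CIRCLE 1,160,100,60 : REM circle in the middle of the screen
30 PAINT 1,160,100     : REM fill it
40 VOL 8
50 PLAY "O4CDEFGAB"    : REM notes by name, no POKEs in sight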


Going back to the TI computers and the ability to add hardware: similar to an Apple II card, anything you could design could be made into a card for the expansion system, or into a device that plugs into the expansion port.  The difference is that on the TI computers, those devices provided their device names in on-board ROM, and as long as a device was coded to the general standard, anything could use it.  An example is the TIPI device, which allows the TI to use a Raspberry Pi as a file system.  With the TIPI interface plugged in, the TI recognizes a device named TIPI for the file system and PI for added functions, and can just use them.  This allows things like:
 

10 OPEN #1:"TIPI.FILES.FILE1", OUTPUT, DISPLAY
20 OPEN #2:"PI.http://some.address.com/file.txt", INPUT, DISPLAY
30 LINPUT #2:A$::PRINT #1:A$
40 IF NOT EOF(2) THEN 30
50 CLOSE #2::CLOSE #1

Also almost any of the software released on cartridge provided options to save/load from tape, disk, or "other future device" - and with a TIPI attached, all of those programs can save their data onto the Raspberry Pi.

  • Like 5
On 7/1/2022 at 2:23 PM, zzip said:

I don't think any of the 8-bit systems had particularly good interfaces.   Low memory and storage were serious constraints, which made sure such interfaces were primitive.

 

So whichever you liked was the one that you got used to.   The ones that you weren't used to seemed pretty kludgy in comparison.   In reality they were all kind of kludgy.

 

Agreed, and you were lucky to even get a menu interface in your 8-bit programs, as those home computers expected you to write your own applications.

 

I liked the ones that let you use a joystick on the 8-bit Ataris.  In fact, I modified a BASIC budgeting program to move the cursor around with the joystick instead of holding the Control key to use the arrow keys.  I wish I'd had the CX22 trackball, which would have made the experience even better!

 

Quote

Even the 16-bit era didn't have great integration.   Well, the Classic Mac seemed nice from my limited exposure to it..   But if I used it on a daily basis I might have found issues with it.

 

I liked using the Macintosh software interface, but the hardware itself felt so limited on the Mac Pluses, and I experienced constant crashes on the Performas.

 

Quote

MS-DOS: horrible

 

I hated the thought of using the DOS CLI and had to buy manuals from college bookstores on how to use it.  Didn't mind when a program like WordPerfect had a mouse based GUI even though it looked more like a BBS screen.

 

Windows 3.x felt so clumsy and unnatural compared to Macs, even though it excelled at multitasking.

 

Quote

Atari GEM:  Nice in that it booted from ROM, but light on features, and no built-in CLI for advanced operations.

It was very spartan, but since you still needed a floppy disk inserted to speed up loading, you might as well add things like desk accessories, auto-running TSRs, and even a much nicer Desktop replacement.  Or just get it out of the way for auto-booting games or full-screen programming languages.  Very versatile, IMHO.

 

Quote

My Atari 1050 came with DOS 3.0, which had a decent menuing system I thought.   Unfortunately because of other issues with 3.0, we were all forced to use DOS 2.0/2.5 and I absolutely hated that menu system,  it seemed so hacky.

Again, I didn't mind the menu interface of Atari DOS and preferred it to the MS-DOS CLI.  But power users who felt otherwise got SpartaDOS as a replacement.

 

Plus I'll still take it over whatever the C-64 used, which couldn't even do auto-booting...

 

  • Thanks 1

The IBM 5110 had a very good set of hardware and software interfaces that handled all the devices that could be added to the base hardware. Over in minicomputer land, there were standard locations that made loading a bootstrap from whatever device was attached easy. Later DEC OSes included SYSGEN, which created a custom version of the OS that would access whatever devices were available, after you answered many questions.

 

Digital Research, beyond its other DEC influences, had a SYSGEN to copy the system onto a disk, and an interactive set of questions for setting up MP/M in GENSYS. The Commodore PET was its own thing, but there were some similarities between its device numbering and IBM's design, which differed greatly from other drive-identifying schemes.

  • Thanks 1
15 hours ago, MrMaddog said:

I hated the thought of using the DOS CLI and had to buy manuals from college bookstores on how to use it.  Didn't mind when a program like WordPerfect had a mouse based GUI even though it looked more like a BBS screen.

All the hate MS-DOS gets always amazes me, because my experience was so completely different. I was just a kid, and all I knew about handling computers before it came from writing some simple BASIC programs on a ZX Spectrum, loading tapes on a C64, or inserting disks into an Amiga, and yet I found MS-DOS supremely easy to handle and quite intuitive. I mean, how hard is it to remember that typing a drive's letter plus a colon takes you to that drive? Navigating by typing dir, cd.., and so on is a breeze, and the other commands are also rather self-explanatory. Sure, I never had to do any really advanced stuff, but I bet that was the same for 95% of normal users like me.
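
A typical session really was just this (SOMEGAME standing in for whatever you wanted to run):

C:\>A:
A:\>DIR
A:\>CD GAMES
A:\GAMES>SOMEGAME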

 

Of course, there's also the fact that Norton Commander was a thing, and it set a standard of computer handling which I'm still using to this day, via Total Commander on Win10. I can do most file operations this way, using the keyboard, much faster than slogging through the mouse-driven Explorer GUI.

 

It's similar with the alleged torment of adjusting autoexec.bat / config.sys to play games, something which in reality was extremely simple and required "writing" a batch file (mostly copied from someone else or a magazine) which would then handle most of everything forever.

  • Like 1
  • Thanks 1
19 hours ago, MrMaddog said:

I hated the thought of using the DOS CLI and had to buy manuals from college bookstores on how to use it.  Didn't mind when a program like WordPerfect had a mouse based GUI even though it looked more like a BBS screen.

 

Windows 3.x felt so clumsy and unnatural compared to Macs, even though it excelled at multitasking.

MS-DOS was a hacked-together OS in the first place; it was literally called QDOS, the "Quick and Dirty Operating System," when Microsoft bought it, and it gained more and more hacked-on features as time went by.   I think it's probably the least user-friendly of the consumer OSes (at least for casual users), and I can't think of another OS that forces the user to deal with memory management issues an OS would normally take care of on its own.

 

Win 3.1 looked kinda slick at the time, but yeah, it operated weird.   It also had cooperative multitasking, which is more like "task switching" than the real preemptive multitasking that later versions of Windows had.

 

19 hours ago, MrMaddog said:

It was very spartan, but since you still needed a floppy disk inserted to speed up loading, you might as well add things like desk accessories, auto-running TSRs, and even a much nicer Desktop replacement.  Or just get it out of the way for auto-booting games or full-screen programming languages.  Very versatile, IMHO.

Getting a hard drive for the ST was a game changer.   I had a bunch of cool desk accessories, TSRs, and a replacement desktop all loading from the hard drive, and it still booted quickly.   But the fact that you wanted replacement desktops and had graphics-accelerating TSRs like QuickST just shows how minimal and unoptimized the default OS/desktop was.

 

19 hours ago, MrMaddog said:

Again, I didn't mind the menu interface of Atari DOS and preferred it to the MS-DOS CLI.  But power users who felt otherwise got SpartaDOS as a replacement.

It's just that the DOS 2.x menu system was so inelegant, especially once I had seen how much cleaner the 3.0 menu was.   2.x got the job done, though.  I never felt tempted by SpartaDOS or other DOS replacements.   But again, the fact that a market existed for replacement DOSes and replacement BASICs was a sign of how weak the default options were.

 

4 hours ago, youxia said:

It's similar with the alleged torment of adjusting autoexec / config.sys to play games, something which in reality was extremely simple and required "writing" (mostly copied from someone else or a magazine) a batch file which would then handle most of everything forever.

That was one of the problems: you would go through so many autoexec/config boot disks until you found the magic combo that handled everything.   This was mostly before the internet was commonplace, so you couldn't Google for best DOS practices.   You often got advice from other DOS users, which was sometimes good, sometimes not.    I eventually found that the MemMaker app that shipped with later versions of MS-DOS did a pretty good job of optimizing your autoexec/config files, but I didn't even know that thing existed until I heard about it from another user.

  • Like 1
  • Thanks 1
On 7/2/2022 at 12:03 AM, Iamgroot said:

The real power of an Apple II made its debut when ProDOS came out.  It allowed the integration of hardware much more readily than DOS did.  It was much easier to install drivers for the hardware one wanted to use.  And it was not just limited to hard drives.  Look at the cards that were recently made for the Apple II: various hard drives, Ethernet cards, video overlay cards, ROM cards, memory expansion cards, etc.

As much as I love the Apple II, power is something I don't associate with the machine, not after about 1985 or so, and not compared with the up-and-coming 16-bit machines, or even the PC. But the A2 kept its versatility and never ceased to amaze me in that department.

 

I never cared much for ProDOS. I was firmly entrenched in DOS 3.3 and barely had enough time to do warez, let alone learn a new DOS. I also didn't see a need for all that structure. All the games I was playing either worked with whatever DOS they came with or with good old DOS 3.3.

 

I didn't see any real advantage to having better driver support. That makes a difference on the PC for sure, but on the Apple II? Not so much. Interface/expansion cards remained simple throughout the II series' life. And whatever program, telecom or word processor or whatever, seemed to support the necessary hardware just fine.

 

When I got my 10MB HDD I found I could change volumes (disks) faster than recalling and typing a /prefix/. After years of playing with the machine I could even swap a floppy faster than looking up and typing that prefix.

When it came to integration of hardware and software, the TRS-80 Color Computer had the best BASIC, hands down, for years, and its DOS was similar to what someone posted for the TI.
It supported software sprites with GET and PUT commands (not to be confused with the GET and PUT used for file I/O), plus LINE, CIRCLE, and DRAW (which would draw complex shapes using Up, Down, Left, Right, Move, etc.).
DRAW was sort of a human-readable version of the Apple shape tables.
There was sound support too, including a command called PLAY that could play an entire song contained in a string, with notes given by name.
But the expansion interface and hardware in carts instead of inside slots was kinda weird.   It worked, but it was weird.
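
Something like this, if I remember my Extended Color BASIC right:

10 PMODE 4,1:PCLS:SCREEN 1,1
20 DRAW "BM128,96;U20;R20;D20;L20"  'draw a small box mid-screen
30 PLAY "T4;O3;CDEFGAB;O4;C"        'play a scale, notes by name
40 GOTO 40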

 

  • Like 2
14 hours ago, JamesD said:

When it came to integration of hardware and software, the TRS-80 Color Computer had the best BASIC, hands down, for years, and its DOS was similar to what someone posted for the TI.
It supported software sprites with GET and PUT commands (not to be confused with the GET and PUT used for file I/O), plus LINE, CIRCLE, and DRAW (which would draw complex shapes using Up, Down, Left, Right, Move, etc.).
DRAW was sort of a human-readable version of the Apple shape tables.
There was sound support too, including a command called PLAY that could play an entire song contained in a string, with notes given by name.
But the expansion interface and hardware in carts instead of inside slots was kinda weird.   It worked, but it was weird.

 

The CoCo also had really good hardware and software integration; it is definitely up there with the A2 and C64.  Also, for me personally, the CoCo 3 is one of the most underappreciated and underrated machines of the 8-bit era.

  • Like 1
On 7/7/2022 at 4:04 PM, zzip said:

That was one of the problems,  you would have so many autoexec/config boot disks until you found the magic combo that handled everything.   This was mostly before the internet was common place, so you couldn't Google for best DOS practices.   You often got advice from other DOS users which was sometimes good, sometimes not. 

I'm sorry this was your experience, but mine, and that of everybody I knew, was completely different. There was no need to use countless different files, just one config, certainly not magical, which would just work, and that would be passed on to everybody else. And it's precisely the same stuff I use every day on MiSTer's ao486 core these days.

  • Like 2
17 minutes ago, youxia said:

I'm sorry this was your experience, but mine, and that of everybody I knew, was completely different. There was no need to use countless different files, just one config, certainly not magical, which would just work, and that would be passed on to everybody else. And it's precisely the same stuff I use every day on MiSTer's ao486 core these days.

That just means that somebody tuned it for you.   It doesn't mean the original design (or lack thereof) of MS-DOS was good.

Consider that the original market for PCs was business users and business apps; these don't tend to be highly technical people or tinkerers.  The amount of knowledge needed to get MS-DOS working properly was much higher than for any other home platform I can think of, especially before they created software wizards to help with the process.

  • Like 1
19 minutes ago, zzip said:

That just means that somebody tuned it for you.

If you want to call it that, that's fine, but unless you were some kind of whiz-kid who learned how to handle everything from, I don't know, analyzing the logic gates on the chips or some such, then it means you had help too, be it in person or in written form, and so this something was "tuned for you" too. There's not a single CLI-based interface which wouldn't require at least some effort to understand on the part of the user. In fact, you can easily extend that to most of the GUI ones from that era, or indeed any era. Why do you think "For Dummies" books got so popular? It's probably only with the advent of smartphones that these devices became somewhat intuitive (to an extent).

 

And compared to other platforms, MS-DOS was nowhere near as complicated as modern myths would have it. Literally every single one of my mates was able to troubleshoot the DOS environment, and every single one had only the most basic knowledge of how computers and OSes work. Judging by how popular DOS gaming was, I think it's pretty safe to say it was a universal experience.

32 minutes ago, zzip said:

the original market for PCs was business users and business apps-  these don't tend to be highly technical people or tinkerers.

No, but business people either had IT departments which did it for them, had helplines, or simply learned on the job. The latter, again, because this whole MS-DOS thing was nowhere near as complicated as modern myths would have it. Even a secretary or data-entry monkey (been there) could learn a few basic commands that would take them to Norton Commander if it didn't autostart, or that a: means the floppy and c: your HDD, and that dir will show you what's in there. It's really not rocket science.

 

And again, the proof is in the pudding (no matter how many times we go through the motions with this tired argument). Being relatively simple to handle was one of the reasons PCs took over. If they had required some Matrix (aka Linux)-level involvement from the normal user, they would never have gotten so popular and conquered the world. And PC dominance, even in the pre-GUI-Windows era, was already crushing.

  • Like 2
5 minutes ago, youxia said:

If you want to call it that, that's fine, but unless you were some kind of wiz-kid who learned how to handle everything from, I don't know, analyzing the logic gates on the chips or some such - then it means you had help too, be it in person, or written form, and so this something was "tuned for you" too.

Of course I did.   I had lots of bad advice as well as good advice as to what to put into those config.sys and autoexec.bat files.

 

7 minutes ago, youxia said:

There's not a single CLI based interface which wouldn't require at least some effort to understand on the part of the user. In fact, you can easily extend it to most of the GUI ones from that era - hell, any era - too.

The CLI nature of it is the least of the problems.   That's fairly intuitive: you type the name of the program and it runs.

 

Part of it is the nature of the pre-386 Wintel platform: the segmented memory and the various schemes for getting past 640K.  But MS-DOS passed the burden of memory management onto the end user, where other platforms might make it the developer's problem, or the OS would try to manage it for you.    The average DOS user could barely articulate the difference between extended memory and expanded memory, or why 640K was a big deal when they had 2 MB installed, but they were expected to configure all this anyway.
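
For anyone who never had the pleasure, this is the sort of thing a user was expected to get right in CONFIG.SYS (a typical shape rather than a recipe; the driver paths are whatever your install happened to use):

FILES=30
BUFFERS=20
DEVICE=C:\DOS\HIMEM.SYS
DEVICE=C:\DOS\EMM386.EXE NOEMS
DOS=HIGH,UMB
DEVICEHIGH=C:\DRIVERS\MOUSE.SYS

And choosing NOEMS versus RAM on that EMM386 line was exactly the expanded-vs-extended decision most users couldn't articulate.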

 

Yeah people did shell out lots of money for books and utilities from Norton and the like to help them with these problems.    But when your system creates a secondary market to make it work properly-  I'd argue there's a problem with the system.

  • Like 1
3 hours ago, youxia said:

I'm sorry this was your experience, but mine, and that of everybody I knew, was completely different. There was no need to use countless different files, just one config, certainly not magical, which would just work, and that would be passed on to everybody else. And it's precisely the same stuff I use every day on MiSTer's ao486 core these days.

Yes. That was my experience too. I may not have maximized the upper and conventional memory as much as possible, but I had a configuration that worked with everything except one single game. I was OK with that. I made a boot disk at first, but then I made a simple batch file (sketched below) that copied in new CONFIG and AUTOEXEC files and rebooted. After rebooting it restored the original "everything" files and started the game.

 

1 fucking game! Someday I will fix that!
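
The swap looked roughly like this, reconstructed from memory with the names changed:

REM SETGAME.BAT - stash the everyday files, drop in the lean ones, reboot
COPY C:\CONFIG.SYS C:\CONFIG.SAV
COPY C:\AUTOEXEC.BAT C:\AUTOEXEC.SAV
COPY C:\BOOT\GAME.CFG C:\CONFIG.SYS
COPY C:\BOOT\GAME.BAT C:\AUTOEXEC.BAT
REBOOT

REM C:\BOOT\GAME.BAT, now acting as AUTOEXEC.BAT, restores the originals
REM on the way back up and then starts the game:
COPY C:\CONFIG.SAV C:\CONFIG.SYS
COPY C:\AUTOEXEC.SAV C:\AUTOEXEC.BAT
CD \GAMES\THATGAME
THATGAME

(REBOOT here stands for whichever little warm-boot utility was on the disk.)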

2 hours ago, zzip said:

Of course I did.   I had lots of bad advice as well as good advice as to what to put into those config.sys and autoexec.bat files.

Tell me about it. Even in recent times, when discussing PCs in general there's always someone that thinks they can do better with config and autoexec. I say have at it!

 

2 hours ago, zzip said:

Yeah people did shell out lots of money for books and utilities from Norton and the like to help them with these problems.    But when your system creates a secondary market to make it work properly-  I'd argue there's a problem with the system.

I don't know about that. The PC was about ten years old, on average, when these third-party tools and books came out. Limits were being explored and pushed, and these books explained how to do that. I'd say the problem is with the original documentation that comes with the system.

 

When I got the Apple II it came with about 800-1000 pages spread across 4 or 5 manuals. It tried to be (and succeeded at being) all-inclusive: theory of operation, schematics, Monitor ROM listings, a BASIC tutorial and reference, a DOS tutorial and reference, and general explanations for many things. Useful to a kid as well as to a developer.

 

When I got my 486 it also came with about 1200 pages spread across 5 or 6 manuals. Most of it was procedural: it explained how to do a certain task by listing the individual steps. Very tedious. It didn't explain the how or why of things like the Apple II documentation did. There was a 300-page MS-DOS reference manual, and that was useful; it showed all the conventions and options and gave some background. But otherwise, forget it.

  • Like 1
13 minutes ago, Keatah said:

When I got the Apple II it came with about 800-1000 pages spread across 4 or 5 manuals. It tried to be (and was successful) all-inclusive. Had theory of ops, schematics, monitor rom listings, basic language tutorial and reference, dos tutorial and reference, and general explanations for many things. Useful to a kid, as well as a developer.

 

When I got my 486 it also came with about 1200 pages spread across 5 or 6 manuals. Most of it was procedural. Most explained how to do a certain task by listing the individual steps. Very tedious. It didn't explain the how or why of things like Apple II documentation did. There was a 300 page MS-DOS reference manual, and that was useful. Showed all the conventions and options. Also gave some background. But otherwise forget it.

That's a good point.   In the 8-bit days there wasn't a huge amount of documentation, but it was on point.   It was written assuming the audience wasn't familiar with some concepts; for many of us, it was our first computer.   I remember the Atari BASIC reference manual that came with the XLs was rather short, but it was clear enough that with it and a little bit of BASIC knowledge you could achieve a lot quickly.

 

When the 16-bit era came, documentation became voluminous, and always seemed to be much longer than it needed to be.   Later when I worked with technical writers in my job, I noticed they had a penchant for padding stuff out.    The documentation was often overwhelming and it wasn't always clear where to go to get started.   It seemed to assume much more prior knowledge.

  • Like 3

As far as integration of hardware and software goes, the Amiga was really well integrated.  Probably too well.
Disks could have logical names rather than just drive numbers, and you could remove a disk from one drive, put it in another, and a program could use it as if nothing had changed, provided it was written to use the logical name rather than the drive number.
If you plugged in a RAM board, hard drive controller, etc... it just worked. 
The autoconfig added the RAM to the memory pool, and the driver for the hard drive was just there.
Properly written software (not all was) could just use the RAM or hard drive without jumping through hoops. 
When installing software on a hard drive, you could assign a logical drive name at startup to the directory where a program lived, so the program could find all its components/data (see the ASSIGN example after this list).
For custom hardware that was out of the ordinary, the manufacturer could write a library you could use from your program for easy access.
There were standard formats for pictures, sounds, etc... that made exchanging files between programs easy.
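
The ASSIGN trick mentioned above looked like this from the Shell (directory made up):

ASSIGN DPAINT: Work:Apps/DPaint

From then on, anything that asked for DPAINT: found the right directory, wherever the program actually lived.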

I even wrote a library for loading pictures from Deluxe Paint or other paint programs, or sounds. You could do that just by opening the library and calling the functions by name, and you could fade the current screen image out and fade the new one in.
It could even handle color cycling.
It was as simple as this (just from memory):
  OpenLibrary("effects.library");
  LoadILBM(filename, fadeoption);
  StartCycle();
  StopCycle();
etc...

Sadly, the OS and GUI were a little too tightly coupled to the existing hardware, which made it tough to move the OS to completely different hardware.

Edited by JamesD
  • Like 1
53 minutes ago, Keatah said:

Yes. That was my experience too. I may not have fully maximized the upper and conventional memory as much as possible, but I had a configuration that worked with everything except for one single game. I was ok with that. I made a boot disk at first. But then I made a simple batch file that copied new CONFIG & AUTOEXEC files and rebooted. After rebooting it restored the original "everything" files and started the game.

At one point I had a menu system where you selected the game/program you wanted from a menu, and it set up the config/autoexec files appropriately!   It seemed like the best solution until I found out about MemMaker.   MemMaker optimized the memory usage of my TSRs in autoexec/config, and I finally had a single set of files with enough free memory for everything.

  • Thanks 1
37 minutes ago, Keatah said:

Tell me about it. Even in recent times, when discussing PCs in general there's always someone that thinks they can do better with config and autoexec. I say have at it!

 

I don't know about that. The PC was on average about 10-years old when these 3rd party tools and books came out. Limits were being explored and pushed. And these books explained how to do that. I'd say the problem is with the original documentation that comes with the system.

 

When I got the Apple II it came with about 800-1000 pages spread across 4 or 5 manuals. It tried to be (and was successful) all-inclusive. Had theory of ops, schematics, monitor rom listings, basic language tutorial and reference, dos tutorial and reference, and general explanations for many things. Useful to a kid, as well as a developer.

 

When I got my 486 it also came with about 1200 pages spread across 5 or 6 manuals. Most of it was procedural. Most explained how to do a certain task by listing the individual steps. Very tedious. It didn't explain the how or why of things like Apple II documentation did. There was a 300 page MS-DOS reference manual, and that was useful. Showed all the conventions and options. Also gave some background. But otherwise forget it.

The Amiga came with two or three manuals, depending on when you got it: AmigaDOS, AmigaBASIC, and... I can't remember the other one.
For programming you could buy the ROM Kernel Manuals which, when stacked, were around two and a half feet thick, with complete documentation of the built-in libraries, devices, and hardware.

  • Like 1
