
Which classic comp offered the best hardware/software integration?


Recommended Posts

I had the manuals that came with the A1000 and A500. The consumer-oriented stuff. Plus a couple of books for AmigaDOS, AmigaBASIC and general programming. Some from Compute!

 

Coming from the Apple II, the Amiga was severely overwhelming to me. I didn't have the time or inclination to get into programming it. Felt like I had no traction, adrift on a frictionless sea. I was also growing up and had more responsibilities like jobs and cars and women and other annoyances. So sitting on my fat ass learning another system was out of the question. Time with the Amiga wasn't as carefree as it was 8 years prior with the Apple II. But one thing was always memorable: going from BlazingPaddles to PhotonPaint was HUGE! Discovering real digital painting with more than 8 colors made me feel like I was 10 years in the future.

 

When I got into MS-DOS and Windows 3.1 I had decided to stick with the application/user aspect and shied away from programming. Not because developing was inaccessible or anything but because there was so much I had to catch up on.

Link to comment
Share on other sites

23 minutes ago, JamesD said:

As far as integration of hardware and software goes, the Amiga was really well integrated.  Probably too well.

I felt that way too. But I never fully grasped how Workbench, Intuition, Kickstart, AmigaDOS, and AmigaBASIC all fit together, or which parts were hardware specific. Having a hard disk to tie it all together into one package might have helped somewhat.

 

23 minutes ago, JamesD said:

I even wrote a library that handled loading pictures from Deluxe Paint or other paint programs, as well as sounds. You could do all of that just by opening the library and calling the functions by name, and you could fade the current screen image out and fade the new one in.
It could even handle color cycling.
It was as simple as this (just from memory):
  OpenLibrary("effects.library");
  LoadILBM(filename, fadeoption);
  StartCycle();
  StopCycle();
etc...

Like I say, not a developer, but it seems to me that that one effect (for example) would have a certain look and feel about it. And other proggies using that library and function would naturally look identical.
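For anyone curious what that pattern actually looks like in code, here's a rough C sketch of the calling convention JamesD is describing. OpenLibrary()/CloseLibrary() are the real Exec calls every Amiga shared library goes through; "effects.library" and the LoadILBM/StartCycle/StopCycle calls are just the hypothetical names from his post, so the prototypes and the FADE_IN flag below are made up purely for illustration.

  /* Rough sketch, not a shipping library: open the library, call its
     functions by name, close it again. */
  #include <exec/types.h>
  #include <proto/exec.h>

  /* Prototypes a real effects.library header would supply (hypothetical). */
  void LoadILBM(const char *filename, long fadeoption);
  void StartCycle(void);
  void StopCycle(void);
  #define FADE_IN 1L               /* made-up flag value for the sketch */

  struct Library *EffectsBase;

  int main(void)
  {
      EffectsBase = OpenLibrary("effects.library", 0);   /* 0 = any version */
      if (!EffectsBase)
          return 20;               /* library not on disk or in ROM: fail */

      LoadILBM("picture.iff", FADE_IN);
      StartCycle();
      /* ... */
      StopCycle();

      CloseLibrary(EffectsBase);   /* always paired with OpenLibrary */
      return 0;
  }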

 

This didn't happen on the PC till 3D APIs became a thing. Everything had to be programmed separately; little or no common framework existed yet. Thus each DOS game had a unique look and feel to it. DOOM was different from DUKE-3D, Tempest 2000 from Atari Interactive different from Hi-Octane, Zone Raiders different from Comanche, different from Nano-Tank, different from Outpost.

 

Games started looking alike when things like 3DFx's Glide hit the scene.

Games for OpenGL and Direct3D fared better for longer, but they too succumbed to a cookie-cutter, me-too look. Especially today's GPU-accelerated games. All have the same flavor and atmosphere about them.

 

23 minutes ago, JamesD said:

Sadly, the OS and GUI were a little too tightly coupled to the existing hardware, which made it tough to move away from it to completely different hardware.

Agree. Or even just faster hardware.

Link to comment
Share on other sites

6 minutes ago, Keatah said:

Coming from the Apple II, the Amiga was severely overwhelming to me. I didn't have the time or inclination to get into programming it.

Yeah I was hitting hurdles programming on the ST as well. I had GFA Basic, which was good, and I did a lot with it. But it wasn't clear how to do some of the more advanced stuff with it. In the 8-bit days we referred to memory maps. We'd read/write to certain memory locations and interacted with the hardware that way. In the 16-bit era it was API calls or interrupt-driven stuff. It was a different concept. I had a book called The Atari Compendium or something that explained all the ST hardware, TOS/GEM/AES/VDI calls, Line-A interrupts and whatnot, but it wasn't obvious how to utilize it in GFA Basic. There was a missing link there.
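That shift is easy to show side by side. Here's a little C sketch of the two habits: the 8-bit one (write straight to a documented address from the memory map; the address below is the Atari 8-bit background-colour shadow register, the famous POKE 710) and the ST one (go through an OS binding; Setcolor() is the XBIOS palette call as exposed by the osbind.h header most ST C compilers shipped, though the exact header and binding names varied by compiler, so treat them as assumptions). The two fragments obviously target different machines; they're only here to illustrate the difference in mindset.

  #include <osbind.h>   /* TOS/XBIOS bindings, as shipped with typical ST C compilers */

  /* 8-bit habit: poke a documented address out of the memory map.
     Atari 8-bit background-colour shadow register, decimal 710 ($02C6). */
  void set_background_8bit(void)
  {
      *(volatile unsigned char *)0x02C6 = 0x94;   /* dark blue */
  }

  /* 16-bit habit: ask the OS instead of touching the hardware yourself.
     Setcolor() sets a palette register via an XBIOS trap. */
  void set_background_st(void)
  {
      Setcolor(0, 0x0777);   /* colour register 0 = white */
  }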

 

I really wanted to learn C because that's what the professionals seemed to use, and maybe it would all become clear. I got a C compiler off somebody; it had some API references. I checked C programming books out of the library and would try the beginner examples, and they would inevitably fail because they would tell you to include stdio.h, and apparently the ST was using a completely different library to do basic I/O, but I had no idea what that was or how to find out. Again there was a missing link in the documentation I had available that kept me from getting from point A to point B.

 

Of course this was all pre-internet.  I'm sure I could find the answers within 10 minutes to the things that stumped me back then.

  • Like 3
  • Thanks 1
Link to comment
Share on other sites

If you had the proper documentation it wouldn't have been an issue.  Borland Turbo C came with two books, a command reference and a user's guide.  Writing pixels to the screen was documented.  Of course it's a lot of work to go from pixels to graphics.

  • Thanks 1
Link to comment
Share on other sites

37 minutes ago, mr_me said:

If you had the proper documentation it wouldn't have been an issue.  Borland Turbo C came with two books, a command reference and a user's guide.  Writing pixels to the screen was documented.  Of course it's a lot of work to go from pixels to graphics.

Exactly. I had plenty of reference documentation that assumed you were already familiar with C. The "Learn C" books I had were generic. I was missing a "getting started with C on the ST" book or magazine articles. I'm sure they existed, just not at my bookstore or library. I didn't learn C until I got a PC and the examples in my books actually worked.

  • Like 3
Link to comment
Share on other sites

2 hours ago, zzip said:

Yeah I was hitting hurdles programming on the ST as well.   I had GFA Basic,  which was good, and I did a lot with it.  but it wasn't clear how to do some of the more advanced stuff with it.    In the 8-bit days we referred to memory maps.  We'd read/write to certain memory locations and interacted with the hardware that way.   In the 16-bit era it was API calls or interrupt driven stuff.  It was a different concept.   I had a book called The Atari Compendium or something that explained all the ST hardware, TOS/GEM/AES/VDI calls,  LineA interrupts and what not,  but it wasn't obvious how to utilize it in GFA Basic.   There was a missing link there.

 

I really wanted to learn C because that's what the professionals seemed to use, and maybe it would all become clear.   I got a C compiler off somebody,  it had some API references.   I checked C programming books out of the library and would try the beginner examples, and they would inevitably fail because they would tell you to include stdio.h,  and apparently ST was using a completely different library to do basic i/o,  but I had no idea what that was or how to find out.    Again there was a missing link in the documentation I had available that kept me from getting from point A to point B.   

 

Of course this was all pre-internet.  I'm sure I could find the answers within 10 minutes to the things that stumped me back then.

I had that same problem on the ST too; GEM made everything more complicated, and it wasn't as well documented as, say, MacToolkit was...

 

Nearly all the programming I did for the ST was in STOS BASIC, but that was because it completely bypassed TOS and hit the bare metal, since it was primarily made for games. In other words, it felt more like programming on an 8-bit micro. But once I got up to high-level compilers like C, forget it! I got a copy of GST C after taking a C programming course, but both the compiler & the documentation were too complex for me to use.

 

Someone mentioned how easy the documentation of Turbo Pascal & C was. Well, there were ST versions called Pure C & Pure Pascal that were 100% compatible with Turbo C & Pascal for the PC and still used GEM library calls. But they were only sold in Germany...

 

  • Like 1
Link to comment
Share on other sites

6 hours ago, zzip said:

Of course I did.   I had lots of bad advice as well as good advice as to what to put into those config.sys and autoexec.bat files.

 

The CLI nature of it is the least of the problems.   That's fairly intuitive-  you type the name of the program and it runs.

 

Part of it is the nature of the pre-386 Wintel platform: the segmented memory and various schemes for getting past 640K. But MS-DOS passes the burden of memory management onto the end user, where other platforms might make it the developer's problem, or perhaps the OS would try to manage it for you. The average DOS user could barely articulate the difference between extended memory and expanded memory, or why 640K is a big deal when they have 2 MB installed, but they were expected to configure all this anyway.

 

Yeah, people did shell out lots of money for books and utilities from Norton and the like to help them with these problems. But when your system creates a secondary market to make it work properly, I'd argue there's a problem with the system.

I'd say the secondary market for PC books was largely due to the OS being pirated; the same goes for popular application software. The first PC I worked on in the eighties came with a pirated copy of PC-DOS installed, and our supplier provided books from third parties. It's true the OS was lacking, and something like Norton Commander was a really nice file manager. And third parties seemed to come out with key utilities first, like the QEMM memory manager; Microsoft would later integrate something similar. Isn't the PC's segmented memory a problem for low-level programmers? Using high-level compilers, I didn't know anything about segmented memory, or maybe that's because my programs were small.

 

Upper memory was just a workaround for a computer that was designed for programs to use only 640 KB of RAM, including the OS. Once Microsoft provided upper memory support, it wasn't that big a deal to figure out, as all the upper memory command options are documented. The other thing people forget is that besides the commands and options, you use a utility like msd.exe to tell you what upper memory locations are available or not. Otherwise you're using the options blind. The MS-DOS installer walks you through MemMaker, if I'm not mistaken.
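For anyone who never had to live through it, the kind of setup being described looked roughly like this. A minimal sketch: HIMEM.SYS, EMM386.EXE, DOS=HIGH,UMB, DEVICEHIGH and LH are the real documented pieces from the MS-DOS 5/6 era, while the exact paths and which drivers/TSRs you load are just placeholder examples.

  REM --- CONFIG.SYS: order matters, HIMEM.SYS before EMM386.EXE ---
  DEVICE=C:\DOS\HIMEM.SYS
  DEVICE=C:\DOS\EMM386.EXE NOEMS
  DOS=HIGH,UMB
  DEVICEHIGH=C:\DOS\SETVER.EXE
  FILES=30
  BUFFERS=20

  REM --- AUTOEXEC.BAT: LH/LOADHIGH pushes TSRs into upper memory blocks ---
  @ECHO OFF
  PATH C:\DOS;C:\
  LH C:\DOS\SMARTDRV.EXE
  LH C:\MOUSE\MOUSE.COM

And as mr_me says, you'd then run something like MSD.EXE or MEM /C to see what actually landed where.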

  • Like 1
Link to comment
Share on other sites

 

4 hours ago, Keatah said:

I had the manuals that came with the A1000 and A500. The consumer-oriented stuff. Plus a couple of books for AmigaDOS, AmigaBASIC and general Programming. Some from Compute!

 

Coming from the Apple II, the Amiga was severely overwhelming to me. I didn't have the time or inclination to get into programming it. Felt like I had no traction, adrift on a frictionless sea. I was also growing up and had more responsibilities like jobs and cars and women and other annoyances. So sitting on my fat ass learning another system was out of the question. Time with the Amiga wasn't as carefree as it was 8 years prior with the Apple II. But one thing was always memorable: going from BlazingPaddles to PhotonPaint was HUGE! Discovering real digital painting with more than 8 colors made me feel like I was 10 years in the future.

 

When I got into MS-DOS and Windows 3.1 I had decided to stick with the application/user aspect and shied away from programming. Not because developing was inaccessible or anything but because there was so much I had to catch up on.

As someone who basically had to learn to program the machine from a handful of examples and the ROM Kernel manuals... it was the learning curve from hell. 
I had barely learned C, and only knew one person who had done any programming on the machine, and he wasn't helpful at all. In fact, he told me what I was working on wasn't possible (he just didn't know how).
 

  • Like 1
Link to comment
Share on other sites

4 hours ago, zzip said:

Yeah I was hitting hurdles programming on the ST as well.   I had GFA Basic,  which was good, and I did a lot with it.  but it wasn't clear how to do some of the more advanced stuff with it.    In the 8-bit days we referred to memory maps.  We'd read/write to certain memory locations and interacted with the hardware that way.   In the 16-bit era it was API calls or interrupt driven stuff.  It was a different concept.   I had a book called The Atari Compendium or something that explained all the ST hardware, TOS/GEM/AES/VDI calls,  LineA interrupts and what not,  but it wasn't obvious how to utilize it in GFA Basic.   There was a missing link there.

 

I really wanted to learn C because that's what the professionals seemed to use, and maybe it would all become clear.   I got a C compiler off somebody,  it had some API references.   I checked C programming books out of the library and would try the beginner examples, and they would inevitably fail because they would tell you to include stdio.h,  and apparently ST was using a completely different library to do basic i/o,  but I had no idea what that was or how to find out.    Again there was a missing link in the documentation I had available that kept me from getting from point A to point B.   

 

Of course this was all pre-internet.  I'm sure I could find the answers within 10 minutes to the things that stumped me back then.

One of the real problems with the Amiga was that there was a glut of info, but it lacked examples.  Basically, no context to piece things together.
Once easier books came out some of that wasn't as much of an issue, but I'd already moved on.
 

  • Like 1
Link to comment
Share on other sites

Examples are important. I loved the Apple tutorials, both from Apple themselves and some 3rd-party material. Read the purpose, read how it works, type in an example, then adapt the subroutine to your own use.

 

It was like installing the software clock into the BBS. Read the theory of ops, read the tutorial, looked through the demonstration program, and when it came time to incorporate it into the BBS it just kinda happened. And months later when I got a real hardware clock, a TimeMaster II H.O. from Applied Engineering, all I had to do was read the specs and data format. The examples and theory from the previous software clock tutorial carried right over.

 

Loved some of the early magazines. They weren't really magazines so much as huge big-ass newsletters. Or newsletters disguised as magazines. The flavor was distinct from pulp publications. Peelings II would be one of those, Hardcore Computing another. Very grassroots. I felt that was missing from the Amiga. Then again, all I really read was AmigaWorld. It felt very conceptual and philosophical. Layman reading.

  • Like 1
Link to comment
Share on other sites

13 hours ago, MrMaddog said:

I had that same problem too on the ST, GEM made everything more complicated and it wasn't as well documented as say MacToolkit was...

Yeah, I was writing some GEM apps in GFA Basic with full menus and everything,  but I always had this nagging feeling that I wasn't doing it correctly.  There were things in the documentation that didn't make sense so I just did what seemed to work.   Any custom dialog boxes were complete hack jobs on my part, never built using a resource editor because I just didn't know how to integrate that.   I'm sure the end user was none the wiser.   Of course you could completely bypass GEM as I often did for simple programs and games.

 

14 hours ago, MrMaddog said:

Nearly all the programming I've done for the ST was using STOS BASIC but that was because it completely bypassed TOS and hit the bare metal cause it was primary made for games.  In other words, it felt more like programming on an 8-bit micro.  But once I got up to high level compilers like C, forget it!  I got a copy of GST C after taking a C programming course but both the compiler & the documentation were too complex for me to use.

I was tempted by STOS, seeing all the games produced by it.   Yeah you could do almost anything with GFA, but I got the sense STOS made games easier?

 

Glad it's not just me having trouble with a C compiler!

 

14 hours ago, MrMaddog said:

Someone mentioned how easy the documentation of Turbo Pascal & C was,well there was ST versions called Pure C & Pure Pascal that were 100% compatible with Turbo C & Pascal for the PC and still used GEM library calls.  But they were only sold in Germany...

There was also Personal Pascal by OSS that seemed pretty powerful and popular. I had Pascal courses in high school and also in college, but was taught that Pascal was "just a teaching language" and not used in the real world, so that biased me against using Pascal for my personal stuff.

 

The school also taught us COBOL, "because it's out there and you will encounter it".    Funny enough my first job had me using Pascal and I never encountered COBOL again :)

 

That school didn't teach any C courses.   One professor argued that C is just like assembly language.   "we already taught you (x86) assembly, why do you need to know another assembly language?" 

  • Like 1
Link to comment
Share on other sites

For me, it would be BBC BASIC. It offered full access to all the machine's features. It even had a built-in assembler. You could write assembler *in-line* with your BASIC code, and it would just work.

 

The language could be esoteric though. *FX commands, anyone? VDU commands? All a bit weird, and probably arose out of squeezing as much into the ROM as possible.

 

A very good machine for its time. It had named procedures (DEF PROC!), integer variables, etc. Excellent.

  • Like 3
Link to comment
Share on other sites

13 hours ago, mr_me said:

  Isn't the PC segmented memory a problem for low level programmers.  Using high level compilers, I didn't know anything about segmented memory, or maybe it's because my programs were small.

Possibly. It was definitely an issue in ASM, and I've seen old C programs for PC that had "near" and "far" pointers.   Other than that, it's possible the language would hide the pain from you.
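For what it's worth, here's a minimal sketch of what that looked like in 16-bit DOS C (Borland/Turbo C style; MK_FP() comes from <dos.h>). A far pointer carries a segment:offset pair, so it can reach memory outside the single 64 KB data segment a near pointer is confined to, colour text-mode video memory at segment B800 being the classic example.

  #include <dos.h>   /* MK_FP() macro in Borland/Turbo C */

  void hello_corner(void)
  {
      /* far pointer = segment:offset, can point outside the 64 KB data segment */
      unsigned char far *screen = (unsigned char far *)MK_FP(0xB800, 0);

      screen[0] = 'A';    /* character cell at the top-left of the text screen */
      screen[1] = 0x1F;   /* attribute byte: bright white on blue */
  }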

 

13 hours ago, mr_me said:

Upper memory was just a workaround for a computer that was designed for programs to use only 640KB ram, including the OS.  Once microsoft provided upper memory support, it's not that big a deal to figure it out as all the upper memory command options are documented.  The other thing people forget is that besides the commands and options, you use a utility like msd.exe to tell you what upper memory locations are available or not.  Otherwise you're using the options blind.  The ms-dos installer walks you through Memmaker, if I'm not mistaken.

I don't think MS-DOS automatically ran MemMaker; I always had to run it after the fact, and also after installing all my TSRs and drivers, so that it optimized my final autoexec/config and not the initial one. These kinds of tools are still hacky, created to deal with issues created by the Intel architecture and MS-DOS. A better-integrated OS would make these issues more seamless, if not hide them completely. Eventually we got there with the 32-bit architecture and OSes like Windows.

 

16 hours ago, JamesD said:

One of the real problems with the Amiga was that there was a glut of info, but it lacked examples.  Basically, no context to piece things together.

So it seems like a common problem in that era. Imagine the kinds of software we might have had if this information had been easier to obtain.

  • Thanks 1
Link to comment
Share on other sites

5 hours ago, zzip said:

There was also Personal Pascal by OSS that seemed pretty powerful and popular.   I had Pascal courses in high school and also in College, but was taught that Pascal was "just a teaching language" and it's not used in the real world, so that biased me against using Pascal for my personal stuff.   

 

The school also taught us COBOL, "because it's out there and you will encounter it".    Funny enough my first job had me using Pascal and I never encountered COBOL again :)

 

At the time I used to call them "Dead Programming Languages" because the old professors expected you to learn outdated languages like they were Latin.

 

And meanwhile C & C++ were powering many commercial shrink-wrapped programs & even games. I had to take an elective course on C programming to keep relevant, but it sure added a huge strain to my overall course work that semester!

 

Quote

That school didn't teach any C courses.   One professor argued that C is just like assembly language.   "we already taught you (x86) assembly, why do you need to know another assembly language?" 

What an idiot... didn't that person even know the difference between high- and low-level languages?

 

Assembly is also useful, and after learning x86 assembly in college, I applied that knowledge to learning it on the 68K.

 

Quote

@Mr_Me - Isn't the PC segmented memory a problem for low level programmers.  Using high level compilers, I didn't know anything about segmented memory, or maybe it's because my programs were small.

MS-DOS is no problem if you're running a single program within the 640K of conventional memory; it's when you start adding things in the background like TSRs and drivers that should be loaded into upper memory (very carefully).

 

The real issue is you can't just access the megabytes of memory on a 386/486 computer using only 16-bit MS-DOS. You either have to run the DOS program in Windows, which uses virtual 8086 mode & its own 32-bit disk I/O, or compile the program with a DOS extender, which uses 32-bit protected mode to access the extra memory and switches to 16-bit real mode to make DOS calls.

Edited by MrMaddog
  • Like 1
Link to comment
Share on other sites

9 minutes ago, MrMaddog said:

At the time I used to cal them "Dead Programming Languages" because the old professors expected you to learn outdated languages like it's Latin.

Demand for COBOL programmers did later rise with Y2K, and I'm sure the COBOL-powered mainframes are still in operation out there, but I never pursued that world even though I kind of liked COBOL.

 

12 minutes ago, MrMaddog said:

What an idiot...didn't that person even know the difference between high and low languages? 

Yes he did. But C kind of sits in the middle between high-level and low-level. So I get why he said that, I just don't agree. It was obvious to me that C/C++ were extremely useful and in-demand. Instead they taught us Modula-2 as a "modern" language. Modula-2 is basically Pascal with library support (modules), and I've never encountered it in real life.

  • Like 1
Link to comment
Share on other sites

18 hours ago, JamesD said:

One of the real problems with the Amiga was that there was a glut of info, but it lacked examples. 

In rethinking that, I recall having stacks of books for other micros I had access to: C64, CoCo, TI-99/4A, Timex Sinclair, Atari 400/800. And I never got into those because of time. I only focused on the Apple II.

Link to comment
Share on other sites

14 hours ago, zzip said:

Demand for COBOL programmers did later rise with Y2K,  and I'm sure the COBOL powered mainframes are still in operation out there,  but I never pursued that world even that I kind of liked COBOL.

 

 

I remember jobs for COBOL programmers being advertised around 1998-99 for Y2K, and that was because some old programmers from the '70s didn't think they'd still be using the same mainframes by the year 2000, so they made the year fields two digits.  :roll:

 

Quote

Yes he did.   But C is kind sits in the middle between high-level and low-level.   So I get why he said that,  I just don't agree.   It was obvious to me that C/C++ were extremely useful and in-demand.    Instead they taught us Modula-2 as a "modern" language.   Modula-2 is basically Pascal with Library support (modules) and I've never encountered it in real life.

Again, that's my complaint against university education... they only teach outdated concepts instead of what is of practical use in the real world. Pascal and Modula-2 were made by the same guy who thought other languages like BASIC were "broken", even though those were what got used outside the ivory tower campuses.

 

Even now I'm taking online courses for CompTIA certifications just to get any kind of IT work, because all the stuff I learned for my B.S. in Computer Science is obsolete (and nearly drove me into eternal student loan debt).

 

  • Like 1
Link to comment
Share on other sites

2 hours ago, MrMaddog said:

I remember jobs for COBOL programmers being advertised around 1998-99 for the Y2K, and that was because some old programmer from the 70's didn't think they still use the same mainframe by the year 2000 so they made the two digit year fields.  :roll:

I think it was more that storage and memory were at a real premium back then, so anywhere you could save a few bytes you took it.   They say a lot of COBOL programmers came out of retirement and earned some big money in the Y2K era.

2 hours ago, MrMaddog said:

Again, that's my complaint against university education...they only teach outdated concepts instead of what is of pratical use in the real world.  Pascal and Modula-2 were made by the same guy who thought other languages like BASIC were "broken" even though that was used outside the Ivory Tower campuses.

Yeah it was always high on theory, low on practical examples.    Sometimes you would get a professor who had real-world experience and it would be a totally different kind of class.  I remember having a networking class taught by a guy who worked for Bell (now Verizon), and that was one of the more interesting classes.

 

And the real world doesn't care how proper/elegant a language's design is, just that it can be used for real work and you can find programmers/support for it.

  • Like 2
Link to comment
Share on other sites

Enhanced practical Pascal had a large user base in the late 70s and early 80s. Witness the success of UCSD Pascal and especially the Apple Pascal variant. Relatively modest memory requirements and a lot of college graduates with knowledge of it made it a common choice for development. Pascal growth largely ended with UCSD selling the line to a military contractor who priced it out of reach of most colleges. Academia switched to the cheaper world of Unix and its included C compilers. Meanwhile, Standard Pascal did whatever was possible to prevent the use of Pascal in normal business settings. Turbo Pascal carried the banner for a decade but a single company can only do so much to push a language. 

 

C is a horrible language for beginning programmers. So many surprises await as different compilers would give different results to the same code and all the results would be correct according to the standard. 

  • Like 1
Link to comment
Share on other sites

9 minutes ago, Krebizfan said:

C is a horrible language for beginning programmers. So many surprises await as different compilers would give different results to the same code and all the results would be correct according to the standard. 

I can understand it not being the first language taught, but it shouldn't be excluded from a college curriculum completely for dumb reasons when it was a widely used and important language.

 

Pascal compilers weren't perfect either.   I remember my final project in my high school Pascal class just would not run correctly.   I went over the code again and again and could not find a bug.   Showed it to the teacher and she couldn't find an issue with it either.   But something was not getting compiled correctly and the results were incorrect.   I don't recall the exact Pascal dialect, but it was one of the Apple II ones.

  • Like 1
Link to comment
Share on other sites

14 minutes ago, zzip said:

I can understand it not being the first language taught,  but it shouldn't be excluded it from a college curriculum completely for dumb reasons when it was a widely used and important language.

 

Pascal compilers weren't perfect either.   I remember my final project in my high school Pascal class just would not run correctly.   I went over the code again and again and could not find a bug.   Showed it to the teacher and she couldn't find an issue with it either.   But something was not getting compiled correctly and the results were incorrect.   I don't recall the exact Pascal dialect, but it was one of the Apple II ones.

C wasn't excluded. It was reserved for grad students whose thesis was creating an extension to Unix. UCSD's grad students had already created the development environment needed for Pascal, so there wasn't much left for new theses. The transition of C into an undergrad language was a mistake. Properly programming in C requires devoting a lot of effort to error trapping, which the undergraduate courses didn't bother with. A lot of C programmers graduated without the knowledge of how to write larger programs that didn't have hidden bugs. Pascal insulates the programmer from those bugs, allowing the student to focus on learning to design algorithms and later using those good practices with other languages.
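To put a concrete face on that: nothing in C stops you from ignoring failures, so every call that can fail has to be checked by hand, or the bug just hides until much later. A tiny illustration in standard C, nothing platform-specific:

  #include <stdio.h>
  #include <stdlib.h>

  int main(void)
  {
      FILE *f = fopen("scores.dat", "rb");
      if (f == NULL) {                     /* fopen quietly returns NULL on failure */
          perror("scores.dat");
          return 1;
      }

      char *buf = malloc(4096);
      if (buf == NULL) {                   /* so does malloc when memory runs out */
          fclose(f);
          return 1;
      }

      size_t n = fread(buf, 1, 4096, f);   /* and fread can legitimately come up short */
      printf("read %lu bytes\n", (unsigned long)n);

      free(buf);
      fclose(f);
      return 0;
  }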

  • Like 2
Link to comment
Share on other sites

Man... I love all old computers, but I have to say, hands-down, the Tandy 1000.

 

Aside from the old classic Tandy 1000s, they sold specific newer versions of the slimline Tandy 1000 (TL and RL models) that supported the following:

 

  • Super VGA + Tandy-specific graphics modes
  • Tandy 3-voice polyphony integrated sound
  • 286+ processor w/ at least 1 MB of RAM
  • Slimline case w/ 3.5" 1.44 MB floppy drive
  • 40 MB+ internal hard drive
  • Built-in left and right joystick ports

 

They could support ALL IBM/PC games, as well as games with specific Tandy requirements. Many of these systems even came pre-packaged with Adlibs or early Sound Blasters, while STILL supporting integrated Tandy Sound.

  • Like 1
Link to comment
Share on other sites

On 7/13/2022 at 9:37 AM, Krebizfan said:

Enhanced practical Pascal had a large user base in the late 70s and early 80s. Witness the success of UCSD Pascal and especially the Apple Pascal variant. Relatively modest memory requirements and a lot of college graduates with knowledge of it made it a common choice for development. Pascal growth largely ended with UCSD selling the line to a military contractor who priced it out of reach of most colleges. Academia switched to the cheaper world of Unix and its included C compilers. Meanwhile, Standard Pascal did whatever was possible to prevent the use of Pascal in normal business settings. Turbo Pascal carried the banner for a decade but a single company can only do so much to push a language. 

 

C is a horrible language for beginning programmers. So many surprises await as different compilers would give different results to the same code and all the results would be correct according to the standard. 

Well, there was a bit of snobbery in academia. "You can't be taught anything once you've learned BASIC," "I don't like Pascal because...," and then everyone takes that to heart and it's no longer taught.
Modula-2 was never really taught, and Oberon wasn't really taught in many places.
Let's face it, changing the name constantly didn't help.
UCSD had a rule against making a profit, so they had to sell UCSD Pascal.  Seriously... that's why they sold it.

Plus they were all in on object-oriented programming, and Pascal didn't really have an answer for that.
Borland came out with Delphi, but it's proprietary.

C may be horrible for beginners, but they are teaching Java to beginners.
Yeah, I had to help a friend with his intro programming class.
It was all "never mind what this does yet, because we haven't taught you objects, and everything is an object."
Once I explained things to him, he stopped having problems.

 

Edited by JamesD
Link to comment
Share on other sites

On 7/12/2022 at 11:36 AM, zzip said:

So seems like a common problem in that era.   Imagine the kinds of software we might have had if this information was easier to obtain?

Operating systems and GUIs were just enormous compared to what people were used to.
There were two camps of C compilers when I started on the Amiga: Manx and Lattice C.
The two required very different code at times.  Even if you had an example, it might not work with your compiler.
Lattice had a more Unix-like dev environment, so I went that route, and it turned out to be the right choice.

Edited by JamesD
Link to comment
Share on other sites

2 hours ago, JamesD said:

Well, there was a bit of snobbery in academia.  You can't be taught anything once you've learned BASIC, I don't like Pascal because, and then everyone takes that to heart and it's no longer taught.
Modula 2 was never really taught, and Oberon wasn't really taught many places. 
Let's face it, changing the name constantly didn't help.
UCSD had a rule against making a profit, so they had to sell UCSD Pascal.  Seriously... that's why they sold it.

Plus they were all in on object oriented programming and Pascal didn't really have an answer for that. 
Borland came out with Delphi but it's proprietary.

C may be horrible for beginners, but they are teaching java to beginners.
Yeah, had to help a friend with his intro programming class.
Never mind what this does yet because we haven't taught you objects and everything is an object. 
Once I explained things to him, he stopped having problems.

 

For about 14-15 years of my adult life, I was a computer programmer... meaning that this is literally all I did. I loved being a programmer. Now I mostly sit in meetings, send e-mails, etc. But at one of the companies I worked for earlier in my career, I used Delphi. I'd learned to program in Pascal when I was in high school, so Delphi was second nature to me. I loved Delphi. The last version I ever used was 7, and then I never used it again for work.

 

I spent the last few years of my programming career doing C#/.NET and ColdFusion. I mean, occasionally I'll write a Python script or do something in PowerShell or whatever... but my programming days are long gone. :(

 

I also used to program in a language called "MUMPS." It stood for Massachusetts General Hospital Utility Multi-Programming System. Essentially, the operating system was an interpreter (like BASIC) and also the database at the same time. It was a hierarchical database, non-relational.

 

Here's an example:

[screenshot of a MUMPS routine]

 

 

Basically, because MUMPS was originally made to work on computers that had limited RAM, it was designed so that every command, function, etc., could be shortened to a single character. Anything with a "$" denotes that it is a function. Anywhere you see a "." is the start of a loop (I'm not sure I see any in here though). S is SET, F is FOR, etc. I forget what @ does, but it did something important... oh well, it's been a while. Love that language.

  • Like 2
Link to comment
Share on other sites
