
50 Years of BASIC - TIME


Mr SQL


 

I respectfully disagree. Today's computers and computing devices are orders of magnitude more useful in the household because of their connectivity. Of course, that connectivity also makes them a greater tool for distraction, but in any case, if we're being honest, it took a lot more effort to actually do useful things with computers back in our youth. Computers were also more for us back then, rather than for everyone like they are today. That in and of itself has pluses and minuses.

You stopped too soon. Connectivity, speed, memory, data storage, ease of use, size...

 

Agreed on doing useful things. Who would have even thought of using a computer that fits in your pocket to get turn-by-turn directions somewhere?

That also applies to programming, though. Now there are libraries to do just about everything; you just have to learn how to use them.

One of the biggest differences now is that examples of how to do almost anything you might want to try are available for download.

 

Some of the problems with BASIC back in the day, for me:

 

#1 Line numbers

#2 Interpreted

#3 Designed around completeness instead of performance

 

When I was younger I wanted to make games, and BASIC was just too slow. There was the odd interesting language that was designed for speed, but these were not what you got with the machine. Then, when the ST/Amiga came along, you had STOS/AMOS/Blitz, which were OK.

 

Compiled BASICs like IntyBasic and Batari Basic are totally different: they are designed to make games, designed around the limitations of the hardware. They are BASIC, but more closely related to assembly - just sugar-coated.

 

I actually wish I had learned assembly back in the 80s, because it was never as hard as I thought it was.

 

Every time I started working on a game on the CoCo, the graphics engine would just start looking good and I'd run out of RAM. It takes so much memory for graphics, sound, music, etc... that by the time you have that stuff the way you want it there isn't room for a lot of game logic on 8 bit computers. It took 12K just to have double buffered hi-res graphics on my CoCo. That leaves about 12K of free RAM for program space, of which the page flipping and sprite engine took around 4K, and I really needed at least 20K just for the game logic.
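As a rough sanity check on those figures, the arithmetic works out if we assume a 32K machine with 256x192 one-bit-per-pixel hi-res pages (which matches the CoCo's PMODE 4); the 8K reserved for BASIC and system use below is my own assumption, inferred from the numbers in the post:

```python
# Back-of-the-envelope memory budget for double-buffered hi-res
# graphics on an 8-bit machine, per the figures described above.

screen_page = 256 * 192 // 8        # one 1-bpp hi-res page: 6144 bytes (6K)
double_buffer = 2 * screen_page     # two pages for flipping: 12288 bytes (12K)

total_ram = 32 * 1024               # assuming a 32K machine
system_use = 8 * 1024               # assumed BASIC/system workspace (a guess)

program_space = total_ram - double_buffer - system_use
print(program_space // 1024)        # roughly 12K left for everything else
```

Subtract the ~4K page-flipping and sprite engine mentioned above and only ~8K remains for game logic, well short of the 20K needed.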

 

The first time I ever used a compiler I said screw BASIC, this is how things should work. I had sort of understood that before, but seeing it in action was an eye opener. If I had known Pascal in high school, I probably would have bought a Pascal compiler, which would have helped with the speed and memory limitations, but that just kicks the bottleneck down the road to something else. I would have needed to know assembly to create custom libraries, and there was nobody around I could ask questions about that in high school. (Oh, the joys of rural living.) I still managed to embed some assembly in BASIC programs I wrote, but I didn't fully understand it until college. I think it was literally the first week of my assembly language class when everything clicked and I was off and running in assembly. But by then I didn't have the time I used to have in high school. I had to work part of the time, and I was more interested in spending my free time on women instead of computers.

 

 

The one area where I see the old BASICs having an advantage is complexity.

The complexity of programming under modern systems has drastically increased.

Using a compiler requires learning a lot before you can even print "HELLO WORLD".

Just setting up a development environment can be complex.

Back then? Turn on the computer, type PRINT "HELLO WORLD" and hit Return. Wow, look what I made the computer do!

You don't even have to declare all your variables.

I was helping a friend with a Java class a couple months ago.

Java requires setting up a development system, some understanding of compilers, and some understanding of object oriented programming before you can output hello world.

By that time several members of the class are wondering why anyone would want to do this for a living.

Edited by JamesD

  • 1 month later...

 

What I object to is elevating it as the end-all-be-all perfect programming language (as you sometimes seem to be doing), with such flexibility to invite the novice and support the expert. I disagree with this because BASIC has many flaws and does not lend itself to learning good habits, which in my mind is very important for a novice; and for experts, there are many other languages better suited to advanced applications and higher cognitive thinking.

On the contrary, BASIC's flaws help one learn good habits. As soon as someone tries to debug a program they wrote, they think "Well, maybe I shouldn't have used GOTO so often." ;)


Yes very much that.

 

When I learned to program as a kid I was quite happy to imagine a mathematical formula in my head, and then write it out in BASIC. I could do this very easily and quickly in Applesoft. Simply turn on the machine, and within a fraction of a second I could be typing away. And moments later my program would take shape.

 

Basic, Fortran, Pascal, 6502, whatever. Basic was the first language I learned because it was so prevalent. I think I said it before somewhere, but the local supermarket had magazines with Basic type-in programs. The TRS-80 Pocket Computer used Basic. To a kid it was everywhere! The idea of computer literacy back then meant knowing some sort of language. Today parents are impressed that little junior can touch app buttons.

 

Anyway, I found GOTO statements valuable when I'd think of a BBS feature late at night and go, ooh, I just gotta put this here. But my program was short on line numbers, so I used GOTO to "escape" to a bigger range and then come back. In any case, it all worked and got the job done.

 

When trying to go back and edit it some months later or something it wasn't too difficult. Just follow the program as if you were the computer. But of course I would begin using REM statements at some point. Eventually I would make complete sentences describing exactly what was going on. It's not hard to do.


  • 3 months later...

I was helping a friend with a Java class a couple months ago.

Java requires setting up a development system, some understanding of compilers, and some understanding of object oriented programming before you can output hello world.

By that time several members of the class are wondering why anyone would want to do this for a living.

 

HelloWorld in Java, here you have it :D

https://gist.github.com/lolzballs/2152bc0f31ee0286b722

 

Every time I hear someone say Strategy-Pattern, Solid-Principle, Dependency-Injection, Singleton, etc. I cringe a little.


 

HelloWorld in Java, here you have it :D

https://gist.github.com/lolzballs/2152bc0f31ee0286b722

 

Every time I hear someone say Strategy-Pattern, Solid-Principle, Dependency-Injection, Singleton, etc. I cringe a little.

 

I don't know... I thought those phrases "have legs", so long as you look at it from the perspective of "the totality of the circumstances." :grin:


  • 3 weeks later...

Well let me tell you people something... Anyone who ever says BASIC was a bad unstructured language was doing it wrong! All that "spaghetti code" that people complain about was a result of not planning ahead and just adding stuff as they were writing it.

 

You can write procedural and structured programs in BASIC if you know how. I took a course in college that taught how to write procedures using GOSUB and WHILE commands without having to use a single GOTO command. It can be done, folks. I learned BASIC programming this way after many years of typing in spaghetti-code programs from magazines, and I went on to learn PASCAL, COBOL and even C. And anyone who says nobody can learn other languages after BASIC was just a lousy teacher.


I think in some ways the 70s and the 80s were way better in terms of computing, compared to today.

 

Today we have loads of spyware, virtually all the major communication platforms are social media and are all designed to get you addicted and spy on you...

 

By contrast Usenet had a big philosophy of singling out conflicts of interest and keeping information free.

 

I feel like BASIC had a little to do with the democratic nature of early tech. It was simple enough for anyone to learn, on machines where it effectively was the operating system, and it captured a lot of people's imagination. Imagine 12K RAM games today... nobody can make a game that small anymore.

 

I personally hate modern computing because it is so bloated, and all the AI in everything feels like a robotic used-car salesman.

 

I've commented in this thread before - but I think for what it was, for beginners, it was pretty good. You'd just have to graduate to something better if you were gonna get serious.

 

Personally I'd prefer a BASIC or a bash shell over my Android phone any day - at least then you knew what was running on your computer. Now you have to root your phone to get access to all that, and void your warranty.

 

Too many marketing departments and people with business degrees.

 

Not enough collective hardware and software engineering.


Well let me tell you people something... Anyone who ever says BASIC was a bad unstructured language was doing it wrong! All that "spaghetti code" that people complain about was a result of not planning ahead and just adding stuff as they were writing it.

 

You can write procedural and structured programs in BASIC if you know how. I took a course in college that taught how to write procedures using GOSUB and WHILE commands without having to use a single GOTO command. It can be done, folks. I learned BASIC programming this way after many years of typing in spaghetti-code programs from magazines, and I went on to learn PASCAL, COBOL and even C. And anyone who says nobody can learn other languages after BASIC was just a lousy teacher.

 

Right, simply avoid using GOTO, as I mentioned previously. But still, many (most?) BASICs on home computers back in the day did not support named subroutines or labels, which are pretty integral to top-down design and modular programming.


Here's an excerpt from Dr. Niklaus Wirth justifying his invention of the Pascal programming language:

 

 

The desire for a new language for the purpose of teaching programming is due to my dissatisfaction with the presently used major languages whose features and constructs too often cannot be explained logically and convincingly and which too often defy systematic reasoning. Along with this dissatisfaction goes my conviction that the language in which the student is taught to express his ideas profoundly influences his habits of thought and invention, and that the disorder governing these languages imposes itself into the programming style of the students.

 

*emphasis mine

 

In that, I agree with Dr. Wirth. This is not to say that BASIC causes "brain damage" (as Dr. Dijkstra famously claimed), nor that Pascal is necessarily better than BASIC; but that the latter's lack of rigor and allowance of loose structure necessarily influences the habits and style of burgeoning programmers -- in very much the same way in which the spoken language to which you are accustomed colours your perception of the world by the expression it affords to your thoughts and ideas.

 

-dZ.


 

Right, simply avoid using GOTO, as I mentioned previously. But still, many (most?) BASICs on home computers back in the day did not support named subroutines or labels, which are pretty integral to top-down design and modular programming.

 

Yeah, you're right; although there were a few procedural BASICs that did use labels, you still had a lot that used line numbers. But you can still plan ahead and use the line numbers as reference points, much like the Dewey Decimal System libraries use.

 

From what I remember from my college course, the main routines were in the thousands and maybe the sub-routines were only in the hundreds. Say you put the print out routine at line 3000 so you use GOSUB 3000 and at the end of that routine you use RETURN.

 

But if you still need labels, then you can use variables for your GOSUB commands like this...

1000 REM Main Routine

1010 PRINTOUT = 3000

1020 GOSUB PRINTOUT

1030 END

3000 REM Print Out Routine

3010 PRINT "Blah Blah Blah..."

3020 RETURN


Anyway, this is what colleges at the time were teaching to transition "Street BASIC" users into a more disciplined mindset for learning other languages like Pascal. (Of course, it all goes out the window when trying to learn assembly...)
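The listing above can be simulated in a few lines of Python to show exactly what GOSUB-through-a-variable does: the interpreter resolves the name to its value (falling back to a literal line number) and pushes a return point. This is a sketch of the semantics, not any real interpreter's code:

```python
def run(program, start):
    """program: {line_number: statement}. Statements are tuples:
    ("REM", text), ("LET", name, value), ("PRINT", text),
    ("GOSUB", name_or_number), ("RETURN",), ("END",)."""
    variables, stack, output = {}, [], []
    numbers = sorted(program)          # execution order, like a line editor
    pc = numbers.index(start)
    while True:
        stmt = program[numbers[pc]]
        op = stmt[0]
        if op == "LET":
            variables[stmt[1]] = stmt[2]
        elif op == "PRINT":
            output.append(stmt[1])
        elif op == "GOSUB":
            # A variable name resolves to its value; a number is used as-is.
            target = variables.get(stmt[1], stmt[1])
            stack.append(pc + 1)       # push the return point
            pc = numbers.index(target)
            continue
        elif op == "RETURN":
            pc = stack.pop()           # pop back to the caller
            continue
        elif op == "END":
            return output
        pc += 1                        # REM and everything else falls through

# The program from the post, transcribed:
listing = {
    1000: ("REM", "Main Routine"),
    1010: ("LET", "PRINTOUT", 3000),
    1020: ("GOSUB", "PRINTOUT"),
    1030: ("END",),
    3000: ("REM", "Print Out Routine"),
    3010: ("PRINT", "Blah Blah Blah..."),
    3020: ("RETURN",),
}
```

Running `run(listing, 1000)` follows line 1020's GOSUB to line 3000 via the PRINTOUT variable and comes back through the RETURN at 3020.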


Yeah, you're right; although there were a few procedural BASICs that did use labels, you still had a lot that used line numbers. But you can still plan ahead and use the line numbers as reference points, much like the Dewey Decimal System libraries use.

 

From what I remember from my college course, the main routines were in the thousands and maybe the sub-routines were only in the hundreds. Say you put the print out routine at line 3000 so you use GOSUB 3000 and at the end of that routine you use RETURN.

 

But if you still need labels, then you can use variables for your GOSUB commands like this...

1000 REM Main Routine

1010 PRINTOUT = 3000

1020 GOSUB PRINTOUT

1030 END

3000 REM Print Out Routine

3010 PRINT "Blah Blah Blah..."

3020 RETURN


 

When I was in High School in the mid-to-late 80s, my friend and I took a "Computer Class" which taught BASIC. I was already pretty proficient in BASIC and had moved on to Assembly Language (on the C=64), and my friend was a rather advanced hacker, so the class was a bit lame for me and my buddy.

 

Nonetheless, the teacher tried to mentor me by introducing me to Structured Programming and other "advanced" topics. The concept just blew my mind! I can't overstate how eye-opening that insight was; it seemed so obvious in retrospect.

 

At the time I was writing an archiving/indexing database system for my (huuuuuge!) collection of floppies which the teacher thought was good enough to apply to the problem of organizing the High School software library as well. It was good and efficient, but mostly written in pure, adulterated spaghetti code, with extra messy sauce. Teenage-hacker style!

 

I printed the whole thing out on green-and-white tractor-feed paper on the library's printer and spent a week or so "refactoring" it by hand on the printout, with a highlighter, a red pen, and a bunch of pencils. The whole thing came to about 40 feet of paper. I went through it obsessively, line by line, identifying repeating patterns of code and potentially re-usable chunks.

 

At the end, I re-wrote the entire thing and it came out to be less than half the size of the original, tight and even more efficient -- and more importantly -- so easy and pleasurable to read and debug.

 

This has always been a very powerful turning point in my programming life. Yet it was not BASIC that opened my eyes. I'd go as far as to say that it happened in spite of BASIC. In fact, my biggest problem with BASIC is that its syntax and friendly demeanor never made it obvious, or even slightly apparent, that it was possible to do things differently from the spaghetti way to which I was accustomed.

 

This is the reason that, as soon as I discovered other languages such as Pascal, C, Perl, etc., I moved out of BASIC and never ever looked back. I am very much not alone in this regard.

 

I owe some things to BASIC, yes, but mostly for its role as a doorway into a greater world -- a world of power, creativity, and control which I always felt it teased me about, yet never really fulfilled. At least not for me.

 

Anyway, this is what colleges at the time were teaching to transition "Street BASIC" users into a more disiplined mindset for learing other languages like Pascal. (Of course it all goes out the window when trying to learn Assembly...)

 

Why do you say that? I write very structured Assembly Language code. Just as you can write really crazy spaghetti code in Pascal (I've seen it! It's out there!), you can just as easily write elegant, structured, and modularized code in Assembly Language. Like with many things in life, it just takes discipline, hard work, and a little common sense.

 

-dZ.


 

Yeah, you're right; although there were a few procedural BASICs that did use labels, you still had a lot that used line numbers. But you can still plan ahead and use the line numbers as reference points, much like the Dewey Decimal System libraries use.

 

Even such a ghetto solution as that has a significant limitation: a renumber operation will blow it all up. Sure, you can try to number things carefully from the start so that you can insert or move code where needed, but the reality is that it can be a chore to manage, and if a program is large or complex enough, a renumber operation becomes more a necessity than a luxury.

 

 

But if you still need labels, then you can use variables for your GOSUB commands like this...

1000 REM Main Routine

 

1010 PRINTOUT = 3000

 

1020 GOSUB PRINTOUT

 

1030 END

 

3000 REM Print Out Routine

 

3010 PRINT "Blah Blah Blah..."

 

3020 RETURN

 

 

Anyway, this is what colleges at the time were teaching to transition "Street BASIC" users into a more disciplined mindset for learning other languages like Pascal. (Of course, it all goes out the window when trying to learn assembly...)

 

 

Hehe, LOL. Been there, done that. Just one thing, though: now instead of having to fix line 1020 to correctly point to the "printout" subroutine, the chore has simply shifted to maintaining line 1010!
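That trade-off is easy to demonstrate: a renumbering pass can rewrite literal GOTO/GOSUB targets because it can see them, but an assignment like 1010 PRINTOUT = 3000 is just an expression to the renumberer, so the variable silently keeps the old line number. Below is a hypothetical renumberer sketched in Python; real RENUM commands also fixed THEN and ON...GOTO targets, error handling, and so on:

```python
import re

def renum(lines, start=100, step=10):
    """Renumber a {line_number: source_text} program, rewriting only
    literal GOTO/GOSUB targets. (Hypothetical, for illustration.)"""
    mapping = {old: start + i * step for i, old in enumerate(sorted(lines))}

    def fix(match):
        target = int(match.group(2))
        return match.group(1) + str(mapping.get(target, target))

    return {
        mapping[old]: re.sub(r"\b(GOTO |GOSUB )(\d+)", fix, text)
        for old, text in lines.items()
    }

# The variable-as-label program from earlier in the thread:
program = {
    1000: "REM Main Routine",
    1010: "PRINTOUT = 3000",   # invisible to the renumberer
    1020: "GOSUB 3000",        # literal target: gets rewritten
    3000: "REM Print Out Routine",
}
```

After `renum(program)`, line 120 correctly reads `GOSUB 130`, but line 110 still assigns 3000 - a line number that no longer exists.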


 

 

When I was in High School in the mid-to-late 80s, my friend and I took a "Computer Class" which taught BASIC. I was already pretty proficient in BASIC and had moved on to Assembly Language (on the C=64), and my friend was a rather advanced hacker, so the class was a bit lame for me and my buddy.

 

Nonetheless, the teacher tried to mentor me by introducing me to Structured Programming and other "advanced" topics. The concept just blew my mind! I can't overstate how eye-opening that insight was; it seemed so obvious in retrospect.

 

At the time I was writing an archiving/indexing database system for my (huuuuuge!) collection of floppies which the teacher thought was good enough to apply to the problem of organizing the High School software library as well. It was good and efficient, but mostly written in pure, adulterated spaghetti code, with extra messy sauce. Teenage-hacker style!

 

I printed the whole thing out on green-and-white tractor-feed paper on the library's printer and spent a week or so "refactoring" it by hand on the printout, with a highlighter, a red pen, and a bunch of pencils. The whole thing came to about 40 feet of paper. I went through it obsessively, line by line, identifying repeating patterns of code and potentially re-usable chunks.

 

At the end, I re-wrote the entire thing and it came out to be less than half the size of the original, tight and even more efficient -- and more importantly -- so easy and pleasurable to read and debug.

 

This has always been a very powerful turning point in my programming life. Yet it was not BASIC that opened my eyes. I'd go as far as to say that it happened in spite of BASIC. In fact, my biggest problem with BASIC is that its syntax and friendly demeanor never made it obvious, or even slightly apparent, that it was possible to do things differently from the spaghetti way to which I was accustomed.

 

This is the reason that, as soon as I discovered other languages such as Pascal, C, Perl, etc., I moved out of BASIC and never ever looked back. I am very much not alone in this regard.

 

I owe some things to BASIC, yes, but mostly for its role as a doorway into a greater world -- a world of power, creativity, and control which I always felt it teased me about, yet never really fulfilled. At least not for me.

 

 

Why do you say that? I write very structured Assembly Language code. Just as you can write really crazy spaghetti code in Pascal (I've seen it! It's out there!), you can just as easily write elegant, structured, and modularized code in Assembly Language. Like with many things in life, it just takes discipline, hard work, and a little common sense.

 

-dZ.

 

 

That's such an awesome story!! I really enjoyed reading that.


Line numbers and text labels are both abstract references that point to a memory address. The text labels can be friendlier, but the numeric labels provide more information.

 

Programming in BASIC is like being in the Matrix in that regard because you can't see the underlying memory address.

 

Addresses are labels too for assembly programmers - back in the day, assemblers used to show the memory address on the left, just where you would see a line number or text label in BASIC.

 

Modern Assemblers can use numeric or alphanumeric labels just like BASIC to gain a similar level of abstraction except when the programmer must write page aligned code.

 


Line numbers and text labels are both abstract references that point to a memory address. The text labels can be friendlier, but the numeric labels provide more information.

 

Programming in BASIC is like being in the Matrix in that regard because you can't see the underlying memory address.

 

Addresses are labels too for assembly programmers - back in the day, assemblers used to show the memory address on the left, just where you would see a line number or text label in BASIC.

 

Modern Assemblers can use numeric or alphanumeric labels just like BASIC to gain a similar level of abstraction except when the programmer must write page aligned code.

 

Line numbers were an artefact of implementation on resource-constrained machines -- a "leaky abstraction," if you will. Interpreted BASICs used a linked list to string together the statement lines of a program. Each node in this list carries at its head a numeric value, used to sort the lines in order during editing and garbage-collection operations, and to find specific lines.

 

Scanning the linked list for a line by its number is almost trivial and requires very cheap machine operations. Moreover, the line number can be stored in a single 16-bit word (two bytes) on most machines.

 

Implementing labels would require specialized storage for strings (which is not cheap) and expensive string-compare operations to find what you're looking for. Moreover, if you had any hope of finding a specific line in your program without visually scanning the whole thing from beginning to end, you would have to assign a label to every line, which is impractical.
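The scheme just described fits in a few lines. This is a Python illustration of the data structure, not any actual interpreter's code:

```python
# Sketch of how a line-numbered BASIC might store program lines as a
# linked list, with a 16-bit line number at the head of each node.

class Line:
    def __init__(self, number, text):
        assert 0 <= number <= 65535      # fits in one 16-bit word
        self.number = number             # numeric key: cheap to compare
        self.text = text                 # tokenized statement (here, a string)
        self.next = None

class Program:
    def __init__(self):
        self.head = None

    def store(self, number, text):
        """Insert or replace a line, keeping the list sorted by number."""
        node = Line(number, text)
        prev, cur = None, self.head
        while cur and cur.number < number:
            prev, cur = cur, cur.next
        if cur and cur.number == number:  # same number: replace the line
            cur.text = text
            return
        node.next = cur
        if prev:
            prev.next = node
        else:
            self.head = node

    def find(self, number):
        """Linear scan by numeric key -- the cheap lookup GOTO/GOSUB rely on."""
        cur = self.head
        while cur and cur.number != number:
            cur = cur.next
        return cur
```

Typing a line with an existing number replaces it; typing a new number splices it in sorted order, which matches the editing behaviour of a line-numbered BASIC.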

 

Also, labels offer a higher level of abstraction than line numbers precisely because they let you treat the source code as groups of self-contained, named blocks. This leads directly to a compartmentalized mental model and structured programming. In contrast, line numbers more or less force you to work at the individual statement level: hunting, editing, and typing in individual anonymous lines of code that may or may not be related.

 

It is very much the same difference as programming on a Machine Code Monitor (remember those?), where you enter individual op-codes per memory address; and writing more stylized and structured Assembly Language, with procedures, macros, and re-usable chunks, etc.

 

 

Addresses are labels too for Assembly programmers - bitd Assemblers used to show the memory address on the left just where you would see a line number or text label in BASIC.

 

That's not an assembler. Assemblers have always abstracted that away from the programmer -- that is their job: you give the machine instructions in order, and it will "assemble" them into memory using the known memory map. The only difference in this regard between an assembler and a compiler is that the compiler performs a translation between the source code and the machine code, while the assembler uses mnemonics directly representing machine operation codes, with some syntax conventions for accepting and translating operands. A compiler may translate one statement into many operations; an assembler will always convert a mnemonic into one specific op-code.
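That one-to-one mapping, and the address bookkeeping it implies, can be sketched in a toy form. The three encodings below are real 6502 ones; the origin address is arbitrary, and a real assembler does far more (labels, expressions, multiple addressing modes):

```python
OPCODES = {                 # mnemonic -> (opcode byte, total size in bytes)
    "NOP": (0xEA, 1),
    "LDA": (0xA9, 2),       # LDA immediate: opcode + 1 operand byte
    "JMP": (0x4C, 3),       # JMP absolute: opcode + 2 address bytes
}

def assemble(instructions, origin=0x0200):
    """Turn a list of (mnemonic, operand_bytes) into (address, bytes) pairs,
    advancing the program counter by each instruction's encoded size."""
    pc, out = origin, []
    for mnemonic, operands in instructions:
        opcode, size = OPCODES[mnemonic]
        encoded = [opcode] + list(operands)
        assert len(encoded) == size     # one mnemonic, one fixed encoding
        out.append((pc, encoded))
        pc += size                      # programmer never tracks addresses
    return out
```

The programmer supplies only the instruction stream; the addresses on the left of an assembler listing fall out of the program-counter arithmetic.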

 

What you describe sounds more like a "machine code monitor," which gives you a window into a particular chunk of the machine's memory and lets you inject op-codes directly into it. It is like "POKEing" instructions directly into specific memory locations.

 

Some machine code monitors may also include an assembler and "assembly mode," and some even accept assembly language mnemonics while in "absolute-addressing mode." However, these are more like classical "on-line debuggers."

 

Modern Assemblers can use numeric or alphanumeric labels just like BASIC to gain a similar level of abstraction except when the programmer must write page aligned code.

 

It is important to note that these facilities, including sophisticated macro languages, existed early on in assemblers -- even before BASIC languages started supporting them.

 

 

-dZ.


Here's an excerpt from Dr. Niklaus Wirth justifying his invention of the Pascal programming language:

 

 

 

*emphasis mine

 

In that, I agree with Dr. Wirth. This is not to say that BASIC causes "brain damage" (as Dr. Dijkstra famously claimed), nor that Pascal is necessarily better than BASIC; but that the latter's lack of rigor and allowance of loose structure necessarily influences the habits and style of burgeoning programmers -- in very much the same way in which the spoken language to which you are accustomed colours your perception of the world by the expression it affords to your thoughts and ideas.

 

-dZ.

I did so much Turbo Pascal coding in the 80s and 90s that I still type things like

 

PlayerX := PlayerX + PlayerDirX;

In IntyBASIC

 

Edit:Forgot the ;

;)


 

Line numbers were an artefact of implementation on resource-constrained machines -- a "leaky abstraction," if you will. Interpreted BASICs used a linked list to string together the statement lines of a program. Each node in this list carries at its head a numeric value, used to sort the lines in order during editing and garbage-collection operations, and to find specific lines.

 

Scanning the linked list for a line by its number is almost trivial and requires very cheap machine operations. Moreover, the line number can be stored in a single 16-bit word (two bytes) on most machines.

 

Implementing labels would require specialized storage for strings (which is not cheap) and expensive string-compare operations to find what you're looking for. Moreover, if you had any hope of finding a specific line in your program without visually scanning the whole thing from beginning to end, you would have to assign a label to every line, which is impractical.

 

Also, labels offer a higher level of abstraction than line numbers precisely because they let you treat the source code as groups of self-contained, named blocks. This leads directly to a compartmentalized mental model and structured programming. In contrast, line numbers more or less force you to work at the individual statement level: hunting, editing, and typing in individual anonymous lines of code that may or may not be related.

 

It is very much the same difference as programming on a Machine Code Monitor (remember those?), where you enter individual op-codes per memory address; and writing more stylized and structured Assembly Language, with procedures, macros, and re-usable chunks, etc.

 

 

 

That's not an assembler. Assemblers have always abstracted that away from the programmer -- that is their job: you give the machine instructions in order, and it will "assemble" them into memory using the known memory map. The only difference in this regard between an assembler and a compiler is that the compiler performs a translation between the source code and the machine code, while the assembler uses mnemonics directly representing machine operation codes, with some syntax conventions for accepting and translating operands. A compiler may translate one statement into many operations; an assembler will always convert a mnemonic into one specific op-code.

 

What you describe sounds more like a "machine code monitor," which gives you a window into a particular chunk of the machine's memory and lets you inject op-codes directly into it. It is like "POKEing" instructions directly into specific memory locations.

 

Some machine code monitors may also include an assembler and "assembly mode," and some even accept assembly language mnemonics while in "absolute-addressing mode." However, these are more like classical "on-line debuggers."

 

 

It is important to note that these facilities, including sophisticated macro languages, existed early on in assemblers -- even before BASIC languages started supporting them.

 

 

-dZ.

 

You're right, dZ, it was the monitor programs that showed the memory addresses, and some of those accepted assembly mnemonics as you pointed out; there were also some assemblers that actually used line numbers:

 

post-30777-0-01610600-1518972327.jpg

 

post-30777-0-75356100-1518972315_thumb.jpg

 

When BASIC came out it was compiled, like its predecessor Fortran. I'm curious how the designers of the first BASIC compiler at Dartmouth implemented line-number lookups, and whether their routine could have handled text labels too; a native interpreter on a home computer is going to be very resource-constrained, as you described above, particularly compared to a compiler on a mainframe.

 

I designed Atari Flashback BASIC (which is also compiled) so that when the programmer does not provide a label (either text or numeric), a hidden incrementing line number is created in the assembly output as a label for that line of code, similar to what is shown in the assembler pic above.
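A sketch of that scheme (hypothetical code, not Atari Flashback BASIC's actual implementation): unlabeled lines receive a hidden, incrementing label in the generated assembly, while programmer-supplied labels pass through untouched.

```python
def emit_labels(lines):
    """lines: list of (label_or_None, statement) pairs from the compiler
    front end. Returns (label, statement) pairs for the assembly output,
    with hidden incrementing labels filled in where none was given."""
    out, hidden = [], 0
    for label, stmt in lines:
        if label is None:
            hidden += 1
            label = f"_L{hidden:04d}"   # hidden incrementing line label
        out.append((label, stmt))
    return out
```

This gives every compiled line an assembly-level label to branch to, mirroring the address column in the assembler screenshot above.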

 

After assembly, the output is bound with the runtime for the BASIC; that component is the same whether the BASIC is compiled or interpreted.


there were also some Assemblers that actually used line numbers:

 

That's not the "assembler"; that's a monitor with an assembler built-in allowing you to write assembly in absolute-address mode. Assemblers, by design, abstract addresses from the user. Otherwise there wouldn't be a point to "assemble" anything, it would just be a mnemonic machine code editor.

 

I am assuming that the above addresses were automatically defined by the interactive monitor, and not entered by hand. If you had to enter the addresses by hand (and account for the length of op-codes yourself), then well... that's a very crappy assembler. :P

 

 

When BASIC came out it was compiled, like its predecessor Fortran. I'm curious how the designers of the first BASIC compiler at Dartmouth implemented line-number lookups, and whether their routine could have handled text labels too; a native interpreter on a home computer is going to be very resource-constrained, as you described above, particularly compared to a compiler on a mainframe.

 

I believe that the original Dartmouth BASIC did the same thing: used a linked list to track the lines and their numbers. My understanding is that line numbers were a way to identify individual lines, owing to the dependency on line-oriented teletypes/teleprinters as the main input and output device.

 

I designed Atari Flashback BASIC (which is also compiled) so that when the programmer does not provide a label (either text or numeric) a hidden incrementing line number is created in the Assembly output as a label for that line of code, similar to what is shown in the Assembler pic above.

 

After Assembly the output is bound with the runtime for the BASIC, that component is the same for compiled or interpreted BASIC.

 

Those numeric values in the picture you posted are not really labels, they are absolute addresses. The difference is that they increment not only by one, but by the size of the op-code, which may contain one or more operands. An assembler takes care of this by knowing the size of each operation once translated, and incrementing the program counter accordingly.
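The address bookkeeping described above can be shown with a minimal two-pass assembler sketch. The mnemonics and their byte sizes are assumed for illustration only: pass one assigns each label the current program counter, advancing it by the size of each op-code plus operands; pass two substitutes the resolved addresses.

```python
# Minimal two-pass assembler sketch. Mnemonic sizes are invented for
# illustration; a real assembler knows the size of each translated op.

SIZES = {'JMP': 3, 'LDX': 2, 'ADDA': 2, 'NOP': 1}  # assumed byte sizes

def assemble(lines, org=0x1234):
    # Pass 1: assign each label the address of the op it precedes,
    # advancing the program counter by the size of every op-code.
    symbols, pc = {}, org
    for label, op, _arg in lines:
        if label:
            symbols[label] = pc
        pc += SIZES[op]
    # Pass 2: emit (address, op, resolved operand) tuples.
    out, pc = [], org
    for _label, op, arg in lines:
        operand = symbols.get(arg, arg)  # substitute label addresses
        out.append((pc, op, operand))
        pc += SIZES[op]
    return symbols, out

src = [
    ('BEGIN', 'JMP', 'START'),
    (None,    'LDX', 0x30),
    ('START', 'NOP', None),
]
symbols, listing = assemble(src)
# START lands past JMP (3 bytes) + LDX (2 bytes), not at "line 3".
assert symbols['START'] == 0x1234 + 5
```

The addresses in a listing file increment exactly this way, by op-code size rather than by one, which is what distinguishes them from BASIC line numbers.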

 

What do you mean by "the output is bound with the runtime for the BASIC"?

 

-dZ.


 

 

DZ-Jay, on 18 Feb 2018 - 3:58 PM, said:

 

That's not the "assembler"; that's a monitor with an assembler built-in allowing you to write assembly in absolute-address mode. Assemblers, by design, abstract addresses from the user. Otherwise there wouldn't be a point to "assemble" anything, it would just be a mnemonic machine code editor.

 

I am assuming that the above addresses were automatically defined by the interactive monitor, and not entered by hand. If you had to enter the addresses by hand (and account for the length of op-codes yourself), then well... that's a very crappy assembler. :P

 

 

 

 

Close: rather than a monitor with an Assembler built in, it's an Assembler with a monitor built in. The manual for Edtasm+ in this thread explains it.

 

 

 

I believe that the original Dartmouth BASIC did the same thing: used a linked list to track the lines and their numbers. My understanding is that the line numbers were a way to identify individual lines due to the dependency on line-oriented teletypes/teleprinters as the main input and output devices.

 

Those numeric values in the picture you posted are not really labels, they are absolute addresses. The difference is that they increment not only by one, but by the size of the op-code, which may contain one or more operands. An assembler takes care of this by knowing the size of each operation once translated, and incrementing the program counter accordingly.

 

 

 

I think absolute addresses qualify as labels because we can jump to them; GOTO and GOSUB must look up the absolute address from the line label. The Dartmouth BASIC compiler likely outputs a binary using absolute addresses and computes them all from the linked list at compile time.

 

Unlike interpreters, compilers can include full text labels with no impact on the size of the compiled program because, like comments, all labels (including line numbers) are for people only.

 

I've seen the type of Assembler/monitor you describe, but in Edtasm+ line numbers cannot be referenced as labels; they are only for execution order.

 

What do you mean by "the output is bound with the runtime for the BASIC"?

 

 

The BASIC runtime library must be bound to the compiled BASIC program to create a single binary executable. IntyBASIC and compiled BASICs from back in the day do that too, while modern languages generally have a separate runtime library.
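In the simplest case, "binding" as described above amounts to laying the runtime and the compiled program into one image. A toy sketch, with entirely invented byte contents, might look like this:

```python
# Illustrative sketch of binding a runtime to a compiled program:
# the runtime library and the compiler's output are concatenated into
# a single executable image. The byte contents below are invented.

def bind(runtime_image: bytes, program_image: bytes) -> bytes:
    """Produce a single binary: runtime first, program right after it."""
    return runtime_image + program_image

runtime = b'\x4c\x00\x10'          # pretend runtime (e.g. a jump table)
program = b'\xa9\x01\x8d\x00\x20'  # pretend compiled BASIC statements
binary = bind(runtime, program)
assert len(binary) == len(runtime) + len(program)
assert binary.startswith(runtime)
```

A real toolchain also has to fix up entry points and addresses, but the end result is the same: one self-contained binary with the runtime inside it.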


Close: rather than a monitor with an Assembler built in, it's an Assembler with a monitor built in. The manual for Edtasm+ in this thread explains it.

 

I just read through the manual. Those are not arbitrary "line numbers" like in BASIC. Those are the absolute addresses. From what I can see in the manual, you only see those in the "Assembly Listing" file, which is the assembled source as it appears in memory. All assemblers offer some sort of listing file to allow you to review and debug your program as it is arranged in memory.

 

You can let the assembler decide where in memory to store your program, or you can tell it via the ORG and other directives. You can edit an existing program by locating individual operations via their absolute address. You can even change the assembled location by injecting an ORG to an existing program, but then you'll have to re-assemble in order to see the changes in the listing file.

 

This is the impression I get from the manual, which seems to follow my expectation of how an assembler works. It is very much different from how BASIC (compiled or interpreted) works, since line-numbers are merely arbitrary numeric labels, albeit kept in sequential order.

 

 

I think absolute addresses qualify as labels because we can jump to them; GOTO and GOSUB must look up the absolute address from the line label. The Dartmouth BASIC compiler likely outputs a binary using absolute addresses and computes them all from the linked list at compile time.

 

Hmmm... perhaps we should get down to semantics. A label is not something "we can jump to." A label is an arbitrary symbolic representation of an absolute address. You don't "jump to the label," you jump to the underlying address. The label offers a more intuitive way to keep track of locations by letting the programmer associate a symbolic string with it. These symbolic strings presumably have higher cognitive resonance than a mere abstract number. They are conceptually different things, just like your name or avatar is a symbolic representation of you, but it is very much not you. ;)

 

Yes, the Dartmouth BASIC compiler originally output object code. However, this is not the same as translating all line numbers to absolute addresses, because each statement line may compile into multiple op-codes each in its own address. There isn't a 1:1 relation between line-numbers and addresses. As a matter of fact, depending on the way the compiler works, there may not even be a correlation between the ordering of the code itself. For instance, IF/ELSE blocks may be reversed in order to optimize the flow. I have no idea if Dartmouth BASIC did any of that, though.

 

 

Unlike interpreters, compilers can include full text labels with no impact on the size of the compiled program because, like comments, all labels (including line numbers) are for people only.

 

That is true, but I do not know what point you are trying to make with that assertion. Did I give the impression that this was not the case? :? If so, I'm sorry. The point I was trying to make was that labels on interpreted BASIC dialects would require additional resources to store and manage, which were not necessarily available on 8-bit micros. This is probably why they didn't support that.

 

The other point I was making was that the line-numbers in the original Dartmouth BASIC were strictly for ordering the source and editing them using a line-oriented device such as a teleprinter; and that this was kept in the interpreted versions because it was more efficient to order, sort, edit, and manage that way in memory.

 

I've seen the type of Assembler/monitor you describe, but in Edtasm+ line numbers cannot be referenced as labels; they are only for execution order.

 

Again, I do not know what point you are trying to make. Those are not line numbers, those are addresses and the assembler defines them. You cannot jump directly to an address for the simple reason that you do not know (or at least are not expected to know) the address at which a particular op-code will be assembled; the assembler handles that. You can tell it on which address to start, you can tell it to assemble a particular chunk of code starting at a specific address, and you can edit an op-code at a specific location.

 

However, you cannot write your Assembly code with manually entered arbitrary addresses (except in absolute-address "Monitor" mode) like you can with BASIC.

 

In BASIC, I can do this:

100  PRINT "DZ"
500  GOTO 100
5    REM This is my silly program
150  PRINT "JAY"

And the interpreter or compiler will sort them accordingly. In Assembly, however, I do this:

 

          ORG  $1234
BEGIN     JMP  START
          LDX  #DATA
          ADDA #$30
 

And the assembler will start assembling at location $1234 and will not expect me at all to know at which location START would be; it will do the translation itself.

 

 

The BASIC runtime library must be bound to the compiled BASIC program to create a single binary executable. IntyBASIC and compiled BASICs from back in the day do that too, while modern languages generally have a separate runtime library.

 

I'm still not sure what you mean by "bound"... Do you mean "linked"?

 

IntyBASIC doesn't include a linker. In fact, it doesn't even output object code. It generates an Assembly Language file which can then be assembled with AS1600. What IntyBASIC does is include library modules with a basic engine and utility functions. These are not "bound" in any way. They are just either "injected" in-line into the compiler's Assembly output stream, or "included" as any other library module.

 

The result is a combined Assembly Language program with co-routines and libraries, which will then be assembled accordingly into a single program by the assembler. IntyBASIC does not have to worry about where any of that stuff resides, or how they fit together; that's the assembler's job. It does not even use or reference any addresses at all, it just uses symbols (labels) for everything and lets the assembler worry about translating them.
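The inject-in-line approach described above can be sketched as a text transformation: the compiler emits assembly source for each statement, then splices in the library modules so the assembler sees one combined program. The file names and the one-line "translation" below are invented for illustration; this is not IntyBASIC's actual layout.

```python
# Sketch of injecting library modules into a compiler's assembly output.
# File names and the placeholder translation are invented, not IntyBASIC's.

def compile_basic(statements):
    """Pretend compiler: one assembly comment line per BASIC statement."""
    return ['\t; ' + s for s in statements]   # placeholder translation

def emit(statements, libraries):
    out = ['\tINCLUDE "prologue.asm"']        # hypothetical engine header
    out += compile_basic(statements)
    for lib in libraries:                     # spliced library modules
        out.append('\tINCLUDE "%s"' % lib)
    return '\n'.join(out)

asm = emit(['PRINT "HI"'], ['runtime.asm', 'sound.asm'])
assert 'runtime.asm' in asm
```

The compiler never computes an address anywhere in this flow; everything downstream of the emitted text is the assembler's problem.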

 

-dZ.


 

I just read through the manual. Those are not arbitrary "line numbers" like in BASIC. Those are the absolute addresses. From what I can see in the manual, you only see those in the "Assembly Listing" file, which is the assembled source as it appears in memory. All assemblers offer some sort of listing file to allow you to review and debug your program as it is arranged in memory.

 

You can let the assembler decide where in memory to store your program, or you can tell it via the ORG and other directives. You can edit an existing program by locating individual operations via their absolute address. You can even change the assembled location by injecting an ORG to an existing program, but then you'll have to re-assemble in order to see the changes in the listing file.

 

This is the impression I get from the manual, which seems to follow my expectation of how an assembler works. It is very much different from how BASIC (compiled or interpreted) works, since line-numbers are merely arbitrary numeric labels, albeit kept in sequential order.

 

 

 

Hmmm... perhaps we should get down to semantics. A label is not something "we can jump to." A label is an arbitrary symbolic representation of an absolute address. You don't "jump to the label," you jump to the underlying address. The label offers a more intuitive way to keep track of locations by letting the programmer associate a symbolic string with it. These symbolic strings presumably have higher cognitive resonance than a mere abstract number. They are conceptually different things, just like your name or avatar is a symbolic representation of you, but it is very much not you. ;)

 

Yes, the Dartmouth BASIC compiler originally output object code. However, this is not the same as translating all line numbers to absolute addresses, because each statement line may compile into multiple op-codes each in its own address. There isn't a 1:1 relation between line-numbers and addresses. As a matter of fact, depending on the way the compiler works, there may not even be a correlation between the ordering of the code itself. For instance, IF/ELSE blocks may be reversed in order to optimize the flow. I have no idea if Dartmouth BASIC did any of that, though.

 

 

 

That is true, but I do not know what point you are trying to make with that assertion. Did I give the impression that this was not the case? :? If so, I'm sorry. The point I was trying to make was that labels on interpreted BASIC dialects would require additional resources to store and manage, which were not necessarily available on 8-bit micros. This is probably why they didn't support that.

 

The other point I was making was that the line-numbers in the original Dartmouth BASIC were strictly for ordering the source and editing them using a line-oriented device such as a teleprinter; and that this was kept in the interpreted versions because it was more efficient to order, sort, edit, and manage that way in memory.

 

 

Again, I do not know what point you are trying to make. Those are not line numbers, those are addresses and the assembler defines them. You cannot jump directly to an address for the simple reason that you do not know (or at least are not expected to know) the address at which a particular op-code will be assembled; the assembler handles that. You can tell it on which address to start, you can tell it to assemble a particular chunk of code starting at a specific address, and you can edit an op-code at a specific location.

 

However, you cannot write your Assembly code with manually entered arbitrary addresses (except in absolute-address "Monitor" mode) like you can with BASIC.

 

In BASIC, I can do this:

100  PRINT "DZ"
500  GOTO 100
5    REM This is my silly program
150  PRINT "JAY"

And the interpreter or compiler will sort them accordingly. In Assembly, however, I do this:

 

          ORG  $1234
BEGIN     JMP  START
          LDX  #DATA
          ADDA #$30
 

And the assembler will start assembling at location $1234 and will not expect me at all to know at which location START would be; it will do the translation itself.

 

 

 

I'm still not sure what you mean by "bound"... Do you mean "linked"?

 

IntyBASIC doesn't include a linker. In fact, it doesn't even output object code. It generates an Assembly Language file which can then be assembled with AS1600. What IntyBASIC does is include library modules with a basic engine and utility functions. These are not "bound" in any way. They are just either "injected" in-line into the compiler's Assembly output stream, or "included" as any other library module.

 

The result is a combined Assembly Language program with co-routines and libraries, which will then be assembled accordingly into a single program by the assembler. IntyBASIC does not have to worry about where any of that stuff resides, or how they fit together; that's the assembler's job. It does not even use or reference any addresses at all, it just uses symbols (labels) for everything and lets the assembler worry about translating them.

 

-dZ.

 

In the sample programming exercise in Chapter 3, the author explains that the line numbers are for the programmer's convenience, and that Edtasm+ can edit, delete, and renumber lines just like you do in BASIC.

 

There is still a one-to-one relation between line numbers (or text symbols) in BASIC and memory addresses, even though a statement can compile into multiple op-codes that each have an address; those statements' memory addresses are not visible to the BASIC programmer. Only the line-number lookups are, and those always have a 1:1 correlation to an address.
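That 1:1 view can be sketched as a compile-time table: each numbered line may compile to several op-codes, but a GOTO target only ever needs the address of the line's *first* op-code. All names and sizes below are invented for illustration.

```python
# Sketch of a compile-time line-number table. Each line compiles to
# op_count bytes of code, but the table records only the address of the
# first op-code per line number. All details here are invented.

def compile_lines(lines, org=0x0000):
    """lines: list of (number, op_count). Returns {number: start address}."""
    table, pc = {}, org
    for number, op_count in lines:
        table[number] = pc          # address of the line's first op-code
        pc += op_count              # the rest of the line's code follows
    return table

table = compile_lines([(10, 4), (20, 1), (30, 7)], org=0x8000)
assert table[20] == 0x8004          # line 20 starts right after line 10
assert table[30] == 0x8005
```

Once the table is built, the line numbers themselves can be discarded from the binary, which is the sense in which they cost nothing in a compiled program.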

 

An interesting related aspect is that the original authors envisioned every BASIC statement having its own symbol.

 

When the dialect was expanded with the concatenator, it lost that functionality: concatenated statements have no exposed addresses, while classic BASIC has a 1:1 relation.

 

 

While Edtasm organizes with line numbers like BASIC, it lacks another set of lookups that would have enabled it to use the line numbers as symbols - you can only jump to labels.

 

From your description of IntyBASIC I think we're down to semantics again because Flashback BASIC and Visual bB work similarly, they just wrap everything so that you click the play button and the IDE builds a binary.

 

If the IntyBASIC Assembly output that's ready for AS1600 to turn into a binary contains both the BASIC program and the IntyBASIC runtime, I would consider the runtime bound to the program; how else would you describe it? From my perspective, you described a linker too.

 

With interpreted BASIC, the runtime is also bound alongside the interpreter. Semantically, the programmer might have the runtime linked in from another file when assembling the language instead of inline, but it's necessarily bound together with the interpreter when the binary is created.
