
New GUI for the Atari 8-bit


flashjazzcat


@Prodatron: I removed the context switch from the end of MessageSend and of course everything works fine - just "different". :) The test application sent a DrawMenuBar message to the UI, then fell straight into its mainloop, calling MessageSleepReceive, causing a context switch back to the UI process, which received the menu message, drew it, and then started yielding since there was nothing else to do (currently), and there's nothing to yield to (yet) since the test application is asleep waiting for a message from the UI. Thanks again for your advice and guidance over this past year or so.

I am really glad that it seems to work so well! :) I'm afraid I somehow missed it, but how fast is the 6502-based context switch now from one process to the next (including stack and zero page)? It's crazy that it already works so well! I completely underrated the 6502 back in the day :P

Btw, did you already implement the idle process? The one which only receives CPU time if no one else uses it (which means that everyone else with a higher priority has gone into idle or sleep mode). OK, it shouldn't change the behaviour of the system, but it's an interesting and completely simple-to-implement piece of the whole thing! (And you can calculate the real CPU usage :) )

Edited by Prodatron

I am really glad that it seems to work so well! :)

Me too, but without your insights, I doubt it would have worked so well. :)

 

I'm afraid I somehow missed it, but how fast is the 6502-based context switch now from one process to the next (including stack and zero page)? It's crazy that it already works so well! I completely underrated the 6502 back in the day :P

Very fast until we hit four processes. The stack is split into four, and - following advice again - I decided we'll allocate page zero for each process from a pool instead of copying a small block every context switch. We have lots of page zero space, after all (since the Atari's OS has been jettisoned), and an application won't need more than eight or so bytes of page zero at the most, I'd say. Jac! has even offered to help with a tool which builds relocation tables (both absolute and zero page) for "standard" binaries from assemblers other than MADS (which can create the necessary tables during normal assembly). The UI process and the "desktop" are mandatory processes, so their page zero references are hard coded. Only when we exceed four processes do we need to start copying stacks, and some time ago I worked out what will hopefully be an efficient optimisation which tends to keep ready processes in the hardware stack space and cache the stacks of sleeping or low-priority tasks: thus we have to undertake stack copying quite infrequently, and background processes waiting on messages from the UI (as most will be) won't be occupying contended hardware stack slots until they receive a message. So it all looks pretty positive.
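The slot scheme described above can be sketched in high-level pseudocode (Python here, since the real implementation is 6502 assembly). The names and the exact eviction policy are my own illustration, not taken from the project: ready processes keep one of the four hardware stack slots, and when a fifth runnable process needs one, the stack of a sleeping occupant is copied out to a cache buffer and its slot reused.

```python
# Hypothetical sketch of the four-slot hardware stack scheme: slots are
# reused in place while <= 4 processes are ready; stack copying happens
# only when a sleeper must be evicted to make room.

NUM_SLOTS = 4

class Process:
    def __init__(self, name, sleeping=False):
        self.name = name
        self.sleeping = sleeping
        self.slot = None          # hardware stack slot index, or None
        self.cached_stack = None  # software copy when evicted

def acquire_slot(proc, slot_owner):
    """Give `proc` a hardware stack slot, evicting a sleeper if needed."""
    # Prefer a free slot: no copying at all.
    for i in range(NUM_SLOTS):
        if slot_owner[i] is None:
            slot_owner[i] = proc
            proc.slot = i
            return i
    # Otherwise evict a sleeping occupant (its stack is copied out).
    for i in range(NUM_SLOTS):
        victim = slot_owner[i]
        if victim.sleeping:
            victim.cached_stack = f"saved stack of {victim.name}"
            victim.slot = None
            slot_owner[i] = proc
            proc.slot = i
            return i
    raise RuntimeError("no evictable slot: more than 4 ready processes")
```

Since most background processes sleep waiting on UI messages, evictions (and hence stack copies) should stay rare in practice, which is the whole point of the optimisation.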

 

Btw, did you already implement the idle process? The one which only receives CPU time if no one else uses it (which means that everyone else with a higher priority has gone into idle or sleep mode). OK, it shouldn't change the behaviour of the system, but it's an interesting and completely simple-to-implement piece of the whole thing! (And you can calculate the real CPU usage :) )

I didn't, but I'm in a bit of a hurry to get there. A CPU usage meter is one of the things I want to add next, since it's most intriguing to see on an 8-bit machine. I imagine the formula for calculating the percentage CPU usage by each process is pretty simple? There are lots of things the system idle process could probably do; one thing would be working out the free cluster count on FAT16 volumes (not that we have a filesystem process yet) to avoid a long pause when first requesting a directory listing. Not an issue with FAT32, of course.

Edited by flashjazzcat

For a somewhat coarse measurement you could keep a running tally of the difference between VCOUNT at entry and exit for each process. Then every so often report the sum over the total for each process and reset the counters. Problem is you'd probably need 16-bit tallies to even things out over many frames. And you'd need a 16-bit divide to compute the percentage. Though you could probably just use a look-up table on the high byte if you read the counters at precise intervals, like say every 50 VBIs.
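A rough model of that VCOUNT-tally idea, with 16-bit per-process tallies and a periodic percentage report (Python as illustrative pseudocode; the function names and `lines_per_frame` value are assumptions, not from the thread):

```python
# Each context switch credits the outgoing process with the scan lines
# it consumed (VCOUNT delta, wrapped modulo the frame length); every
# reporting interval the tallies become percentages and are reset.

def credit(tallies, pid, vcount_entry, vcount_exit, lines_per_frame=312):
    """Add the scan lines used by process `pid` to its 16-bit tally."""
    delta = (vcount_exit - vcount_entry) % lines_per_frame
    tallies[pid] = (tallies[pid] + delta) & 0xFFFF  # 16-bit wrap

def report(tallies):
    """Convert tallies to integer percentages, then reset them."""
    total = sum(tallies)
    pct = [t * 100 // total if total else 0 for t in tallies]
    for i in range(len(tallies)):
        tallies[i] = 0
    return pct
```

The division in `report` is the expensive part on a 6502; reading the counters at a fixed interval (e.g. every 50 VBIs) makes the total a known constant, so the percentage can indeed come from a look-up table indexed by the tally's high byte instead.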

 

Also, maybe it's better to have the idle task truly idle (i.e. jmp *) and put any background stuff in an actual process. That way, if the free cluster count algorithm is blocking waiting for data to be put into a buffer, for example, the real idle process can run instead. The nice thing is that you won't have to allocate any ZP for the idle process.


I decided we'll allocate page zero for each process from a pool instead of copying a small block every context switch. We have lots of page zero space, after all (since the Atari's OS has been jettisoned), and an application won't need more than eight or so bytes of page zero at the most.

That means that if you want to keep 128 ZP bytes for the system, you could handle 16 processes when each uses 8 ZP bytes. I think that's already enough! :)

 

I imagine the formula for calculating the percentage CPU usage by each process is pretty simple? There are lots of things the system idle process could probably do; one thing would be working out the free cluster count on FAT16 volumes (not that we have a filesystem process yet) to avoid a long pause when first requesting a directory listing. Not an issue with FAT32, of course.

At least on the CPC, MSX and PCW it's hardly possible to calculate the CPU usage for each process, as you only have one IRQ, which occurs only every 1/50 or 1/300 second. Maybe on the A8 you have better possibilities because of your more flexible IRQ system?

So on my side it's only possible to calculate the remaining CPU time, which is quite easy: you have a system counter, which is incremented every 1/50 second, and you have the idle process, which does nothing else than increment its own counter all the time. By comparing both counters you can figure out the exact free CPU time (which is the time the idle process is "working").
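In other words, something like the following (a minimal Python model; the calibration constant `idle_per_tick` is an assumption: it would have to be measured once on an otherwise idle machine):

```python
# The system tick counter advances every 1/50 s; the idle process spins,
# incrementing its own counter. The ratio of actual idle increments to
# the maximum possible on an unloaded machine gives the free CPU time.

def free_cpu_percent(idle_count, tick_count, idle_per_tick):
    """idle_per_tick: idle-loop iterations per tick on an unloaded
    system (calibrated at boot). Returns free CPU time in percent."""
    if tick_count == 0:
        return 0
    max_idle = tick_count * idle_per_tick
    return min(100, idle_count * 100 // max_idle)
```

CPU *usage* is then simply 100 minus this value; the scheme needs no extra IRQs at all, which is why it works even on machines with a single 1/50 s interrupt.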


That means that if you want to keep 128 ZP bytes for the system, you could handle 16 processes when each uses 8 ZP bytes. I think that's already enough! :)

Yep: we can either decide on a "block" size (8 bytes) or just allocate them ad-hoc. As long as we allocate PZ locations in pairs (for pointer use) there'll be no problems allocating and deallocating from a "pool"; probably 64 or so bytes would do in that case (some small processes might not require any PZ at all). If we use blocks, on the other hand, the space is used less efficiently but we can do easy swapping when we run out. My preference is for no swapping (and no fixed blocks) and the most efficient allocation from the pool.
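A pool allocator of the kind described - pair-granular, first-fit, over a 64-byte region - might look like this in outline (Python as pseudocode; the base address, pool size and names are all hypothetical):

```python
# ZP pool allocator sketch: a bitmap with one flag per byte pair.
# Allocating in pairs guarantees any allocation can hold 16-bit
# pointers, and keeps the bitmap small (32 entries for 64 bytes).

POOL_BASE = 0x80   # assumed start of the free ZP region
POOL_PAIRS = 32    # 64 bytes = 32 pointer-sized pairs

class ZPPool:
    def __init__(self):
        self.free = [True] * POOL_PAIRS

    def alloc(self, nbytes):
        """First-fit allocation of nbytes (rounded up to whole pairs).
        Returns the ZP address, or None if the pool is exhausted."""
        pairs = (nbytes + 1) // 2
        run = 0
        for i in range(POOL_PAIRS):
            run = run + 1 if self.free[i] else 0
            if run == pairs:
                start = i - pairs + 1
                for j in range(start, i + 1):
                    self.free[j] = False
                return POOL_BASE + 2 * start
        return None

    def release(self, addr, nbytes):
        """Return a previously allocated region to the pool."""
        start = (addr - POOL_BASE) // 2
        for j in range(start, start + (nbytes + 1) // 2):
            self.free[j] = True
```

With ad-hoc sizes like this there is some fragmentation risk, but with allocations of only a few pairs each and a 64-byte pool it should rarely bite; the fixed-block alternative trades that risk for wasted space.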

 

At least on the CPC, MSX and PCW it's hardly possible to calculate the CPU usage for each process, as you only have one IRQ, which occurs only every 1/50 or 1/300 second. Maybe on the A8 you have better possibilities because of your more flexible IRQ system?

We've got the VBlank NMI running at 1/50 (or 1/60) of a second, but I'm using a timer IRQ at the same frequency for the scheduler, as it's a little more flexible and can be restarted on an early context switch. There are three timer IRQs, and one's used for the mouse (1/800 second), leaving one free for "other stuff".

 

It would sure be nice to display CPU usage for each process, however. I suppose the best we can reasonably calculate is how many full CPU slices (or part thereof) a process receives during each pass through the run queue.
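That slice-counting idea reduces to something very simple (illustrative Python; all names invented):

```python
# The scheduler tick handler bumps a counter for whichever process is
# running; when the run queue wraps back to its head, the counters from
# that pass become the usage estimate and are reset.

def tick(counters, current_pid):
    """Credit the currently running process with one timer slice."""
    counters[current_pid] = counters.get(current_pid, 0) + 1

def end_of_pass(counters):
    """Snapshot and reset the per-pass slice counts."""
    snapshot = dict(counters)
    counters.clear()
    return snapshot
```

It is coarse (a process that yields early still gets credited a whole slice), but it costs only an increment per tick, which matters at 1.7MHz.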

 

So on my side it's only possible to calculate the remaining CPU time, which is quite easy: you have a system counter, which is incremented every 1/50 second, and you have the idle process, which does nothing else than increment its own counter all the time. By comparing both counters you can figure out the exact free CPU time (which is the time the idle process is "working").

Sounds clear - thanks. ;)

Edited by flashjazzcat

Why couldn't Atari do this 30 years ago!?!

That's a good question and I think the answer is that we needed to witness the development which occurred during the past 30 years in order to write this now. It also takes a LONG time to develop (and how could we forget that) when done as a hobbyist project. I really loved Prodatron's comments over on the MSX forum (in the SymbOS topic), reacting to an old video of Bill Gates promoting Windows 95 and saying "finally, computers are powerful enough to run a proper GUI", or words to that effect. It's absolutely laughable when you see what a 1.7MHz CPU is capable of. I know we already discussed the real-life reasons for this, however, and they include it being totally impractical to write commercial products in machine code. I mean: the output listing of this thing is already over 20,000 lines long, and will probably top out at around 40-45,000 lines, and that's not counting applications.


Simply amazing! Why couldn't Atari do this 30 years ago!?!

** Disclaimer: The following is solely my personal opinion and does not necessarily represent the GUI author's or other Atari users' opinions **

 

They could have, but a commercial establishment has to think about sales. The Atari 8-bits, even though they are capable of running many different kinds of software, were (and still are) mainly geared toward games. Commercial-quality games have almost always depended on dedicated use of the computer's resources like the CPU and the GPU, and took the whole machine over when they started to run, so that they could control the hardware the way they needed to and squeeze every bit of power out of the beast; this was unavoidable given that those machines were very slow and primitive compared to today's technology. I am sure Atari executives never saw a GUI environment as a viable business solution for a computer that was essentially designed to be a game platform. (How many business machines had sprites (or PM graphics), multi-channel sound, paddles or joysticks back in the day?) I think there is a reason why the first usable Atari GUI environment came with the ST series: with those machines Atari wanted to get into the business and music markets, where a graphical environment made sense, and the technology was also more advanced and could support such an environment.

 

A GUI, and even a simple multitasking environment, is possible on an 8-bit computer (as will soon be demonstrated by the author of this thread), but possible doesn't always mean viable, especially if your main objective is to increase sales rather than keep the wow factor high....


I wonder if an Apple //e with 128K (but only 1.02 MHz CPU) could pull this off, given that there was a Mac-like desktop for it.

 

Dunno much about that machine, but am I right in thinking the video access is interleaved with CPU, so all processor cycles are available for program execution?


 

Dunno much about that machine, but am I right in thinking the video access is interleaved with CPU, so all processor cycles are available for program execution?

Yes, both the Apple and C64 ran their CPUs at half the RAM speed so you don't have DMA overhead for every byte displayed. Of course, this means the CPU is still slow during VBLANK.


** Disclaimer: The following is solely my personal opinion and does not necessarily represent the GUI author's or other Atari users' opinions **

 

Who cares what your opinion is, after all the trolling you've done in this thread? You yourself agreed to BUTT OUT of this thread in this post of yours, back on page 80, much earlier. But you just couldn't stay out. Why don't you follow your own advice? Why are you here????

 

You have a definite pattern which you've repeated in many threads and with many people here, demonstrative of some type of mental issues. For the sake of brevity, I'll stick to the confines of this (flashjazzcat's GUI thread), because it's all that's necessary to observe your pattern.

 

(1) You start out playing nice, as you did in this post and shortly thereafter, again right here. (both page 74)

 

(2) You then start to be a critical jerk in this post, (page 78) where you start to question the motives behind creation of this GUI - (1) as if it's your place to question the motive, (2) as if anybody cares what you think, and (3) as if you are even capable of a fraction of the skill and knowledge employed in this project. For the record, I haven't the foggiest how he's doing this project, but I don't question it; on the contrary, it's one of the most impressive - if not the single most impressive - software effort I've seen on an Atari.

 

(3) On page 80, you get "treated" with the HILARIOUS Stewie video, which I'll link to again for your enjoyment. Also on page 80, you agree to "butt out" of the thread; it's the first post of yours that I began this post with, but here is the same link again, as I want to make it as easy as possible for you to track your own words, taste your own foot, and perhaps follow your own advice.

The Stewie video again, for your pleasure:

https://www.youtube.com/watch?v=x3i4I5KXnsY

 

(4) On page 84 (after being smacked down, of course) you start to play nice again in this post, congratulating flashjazzcat on his new job. Oh, all is well and atari8warez is happy, right? Well, of course not, or the pattern wouldn't be complete.

 

(5) After indeed butting out for a while, you come back in page 102 being a critical jerk again, in this rude post where you sarcastically chide fjc for his timeline - which is entirely none of your business. Since then, "it's on again" in this thread, and every other that you can pollute with your childish vitriol.

 

(6) Also on page 102, you try to play innocent in this post where you take a laughingly-poor stab at playing innocent by claiming, "This whole feud between me and him started long time ago when I (rather honestly) commented on this project..." HA! I've never seen such a piss-poor attempt at trying to shroud one's own nefarious, antagonistic, and wicked behavior in a wrapper of alleged innocence. :lol: :lol: :lol: :lol:

 

(7) By page 103, when several other AtariAge users rightfully object to your attitude and hostility (likely driven by some kind of jealousy or inferiority complex), you claim in this post that you "Haven't been laughing so hard lately until I read the last few messages here." That message was posted at 2:46am (according to the time zone Atariage has me in).

You then continued to troll, the whole day in this thread!!! You trolled at the aforementioned 2:46am (then presumably slept), then again at 1:54pm, 2:01pm, 2:20pm, and 11:34pm. You made a day of it!! :lol: :lol: :lol: :lol:

 

(8) In the Centron 3D thread, you saw a video that reminded you of your earlier Stewie video. That must have fired you up again, and you renewed your nastiness, proving the effectiveness of the earlier Stewie video in getting to you. My compliments to the artist. You then began doggedly defending the [fellow] trolling huckster behind that scam, probably because you're both comfortable in the doghouse. Here's that video, btw:

https://www.youtube.com/watch?v=xvqNBbMHqQk

 

And you say you pity me? Sorry buddy, but the emotional yo-yo that you've ridden in this thread alone (not to mention others) is clearly something to be pitied, if only for the mental affliction that drives such a pattern - one that I've clearly illustrated by linking directly to your own words, rather than any doing of mine. I'll get right to that pity - as soon as I quit laughing. :lol: :lol: :lol:

 

I've seen you somewhere attempt to "dismiss" this GUI project as "academic." It sort of is academic, but that certainly doesn't warrant a dismissal. The techniques and methodology discussed here are clearly beyond both of our comprehension - and many people's comprehension - but are, rather, academic. It reads about as academic as a graduate thesis in Computer Science does. That's quite a compliment! I don't understand those, either.


They could have, but a commercial establishment has to think about sales. Atari 8 bits even though they are capable of running many different kinds of software were (and still are) mainly geared for games.

Yeah. I remember, at the time, I didn't have any concept of a GUI or multitasking. Even business machines were still console-based, single-application experiences. DOS and Lotus 1-2-3, or WordPerfect.

 

...there is a reason why the first usable Atari GUI environment came with the ST series, with those machines Atari wanted to get into the business and music markets where a graphical environment made sense and the technology was also more advanced to support such an environment.

And GEM really got in the way sometimes. Many games booted directly from floppy to avoid it.

 

(as will soon be demonstrated by the author of this thread)

I like this. Regardless of your thoughts about the usefulness or commercial viability of the project, this kind of acknowledgement is reasonable and encouraging. The high road.

Edited by pixelmischief

Man, I am soooo sorry! I was just trying to say that this is an excellent demonstration of what the machine is capable of, while simultaneously criticizing Atari's direction in not exploiting the hardware to its fullest potential.

For the sake of further argument I'll leave it at that.

 

I follow this thread and learn something about OS design almost daily. Fascinating stuff!


Man, I am soooo sorry! I was just trying to say that this is an excellent demonstration of what the machine is capable of, while simultaneously criticizing Atari's direction in not exploiting the hardware to its fullest potential.

No problem: this is what I assumed you meant. My theory about the benefit of hindsight and thirty years of learning about the hardware obviously holds no water, though, since presumably there have been multi-tasking GUIs kicking around for the 8-bits since the mid-eighties, but they just weren't commercially viable. :)


Yeah. I remember, at the time, I didn't have any concept of a GUI or multitasking. Even business machines were still console-based, single-application experiences. DOS and Lotus 1-2-3, or WordPerfect.

 

True, and I did use some business apps on the Atari 800XL, like the SYN series from Synapse, etc. None of which really broke any performance records at the time, even without a GUI.

 

 

I like this. Regardless of your thoughts about the usefulness or commercial viability of the project, this kind of acknowledgement is reasonable and encouraging. The high road.

 

I never doubted he can do it (despite his negative comments about my own skills ;-)); I only questioned the usefulness of a GUI, but how dare I?


No problem: this is what I assumed you meant. My theory about the benefit of hindsight and thirty years of learning about the hardware obviously holds no water, though, since presumably there have been multi-tasking GUIs kicking around for the 8-bits since the mid-eighties, but they just weren't commercially viable. :)

 

You have a very clever way of distorting my words. Where did I say "multi-tasking GUIs kicking around for the 8-bits since the mid-eighties"? I said it could have been done, but Atari software engineers were busy creating salable and usable products instead of spending their time on ego boosters.

Edited by atari8warez

You have a very clever way of distorting my words. Where did I say "multi-tasking GUIs kicking around for the 8-bits since the mid-eighties"? I said it could have been done, but Atari software engineers were busy creating salable and usable products instead of spending their time on ego boosters.

I don't think Atari was capable of producing something like this (meaning, it would have sucked). I mean, what sort of usable products did they blow their R&D on? DOS 3?

 

Developing the best GUI possible for the Atari is a fun project whether anyone uses it or not. People who aren't interested in the project shouldn't be clogging up the topic with constant bickering. I've ignored tons of topics in my time here. It's easy!

 

And... I know you weren't writing to me. :P


I don't think Atari was capable of producing something like this (meaning, it would have sucked). I mean, what sort of usable products did they blow their R&D on? DOS 3?

 

Developing the best GUI possible for the Atari is a fun project whether anyone uses it or not. People who aren't interested in the project shouldn't be clogging up the topic with constant bickering. I've ignored tons of topics in my time here. It's easy!

 

And... I know you weren't writing to me. :P

 

Assuming that Atari software engineers couldn't have written a multitasking GUI is simply a gross exaggeration and a direct insult to their skills. The technology already existed on various other platforms (especially multi-tasking), and an Atari 8-bit wasn't the most complex to deal with (although definitely one of the most limited as far as hardware resources go). If you're so sure about this, tell me what trick employed today in this GUI could not have been done 30 years ago? I am really curious to know :)

 

Developing the best GUI possible for the Atari is a fun project whether anyone uses it or not.

Oh I have no arguments with that.

 

And... I know you weren't writing to me. :P

That's totally alright, so long as it's a normal conversation like it is here.

Edited by atari8warez

Assuming that Atari software engineers couldn't have written a multitasking GUI is simply a gross exaggeration and a direct insult to their skills.

When I read this, I am reminded of the history of the "4-minute mile". For those who are unfamiliar, it basically goes like this. For years, scientists maintained that it was not only dangerous, but that it was physiologically impossible for a human being to run a mile in 4 minutes. For a thousand years, it was largely regarded as simply unattainable.

 

Then, on May 6, 1954, Roger Bannister did it. 3:59.4. Barely a year later, someone else did it. Now, it's done all the time. Even strong High-School runners do it routinely.

 

What's the point? For humans, it's all about faith and focus. As soon as people came to understand, as a matter of fact, that a 4-minute mile was possible, the "friction" of doubt disappeared. Suddenly it was easy. I suspect that what developers are NOW able to do on the Atari 8 is powered by a similar phenomenon. Understanding what today's computers are - what they have become - opens up their conceptual range for what systems are capable of in general. Armed with these expectations, they squeeze more from the 8-bit than was even conceivable at that time. And as the 4-minute mile showed us, if it isn't conceivable, it isn't achievable.

Edited by pixelmischief
