
sdcc 3.4.0 released yesterday


PkK


A new version of sdcc, the compiler used with both Daniel's and my ColecoVision development tools, was released yesterday. Compared to the previous release, the improvements relevant to the Z80, and thus to ColecoVision development, are mostly moderate gains in code size and speed, bug fixes, and (for those using some forms of Megacarts) support for named address spaces in ROM.
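For those wondering what named address spaces look like in practice, here is a minimal sketch based on the __addressmod syntax described in the SDCC manual; the bank-switching functions and the data are hypothetical, and the exact setup depends on your Megacart mapper, so check the manual before relying on this:

void map_bank0(void); /* hypothetical: makes Megacart bank 0 visible */
void map_bank1(void); /* hypothetical: makes Megacart bank 1 visible */

__addressmod map_bank0 const bank0; /* ROM address space switched in by map_bank0() */
__addressmod map_bank1 const bank1; /* ROM address space switched in by map_bank1() */

const bank0 unsigned char title_tiles[256] = {0x3C /* , ... */}; /* data placed in bank 0 */
const bank1 unsigned char level_data[256]  = {0x01 /* , ... */}; /* data placed in bank 1 */

The point is that the compiler then inserts the bank-switch calls automatically whenever title_tiles or level_data is accessed.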

 

Here's the release announcement from Maarten:

 

Hello SDCC friends,

Today a new release of SDCC was made. We are now at version 3.4.0.
You can get it at:
http://sourceforge.net/projects/sdcc/files/

So what's new?
* New TLCS90 (Toshiba Z80 clone) target support
* New STMicroelectronics STM8 target support
* Support for named address spaces in ROM
* Detects supported devices by gputils when building SDCC

And of course numerous feature requests and bug fixes are included as
well.

I hope you will enjoy using this new release.

Maarten Brock
SDCC 3.4.0 Release Manager

 

 


  • 9 months later...

When I tried to compile Bagman with the latest version of sdcc, I got these messages:

 

sdcc -mz80 -c -I/d/colecodev/sdcc/include --std-c99 --vc -I/d/devperso/COLECO/bagman/ miscobj.c
Warning: Non-connected liverange found and extended to connected component of the CFG:iTemp137. Please contact sdcc authors with source code to reproduce.
Warning: Non-connected liverange found and extended to connected component of the CFG:iTemp137. Please contact sdcc authors with source code to reproduce.
Warning: Non-connected liverange found and extended to connected component of the CFG:iTemp139. Please contact sdcc authors with source code to reproduce.
Warning: Non-connected liverange found and extended to connected component of the CFG:iTemp139. Please contact sdcc authors with source code to reproduce.
Warning: Non-connected liverange found and extended to connected component of the CFG:iTemp141. Please contact sdcc authors with source code to reproduce.
Warning: Non-connected liverange found and extended to connected component of the CFG:iTemp141. Please contact sdcc authors with source code to reproduce.
Warning: Non-connected liverange found and extended to connected component of the CFG:iTemp143. Please contact sdcc authors with source code to reproduce.
Warning: Non-connected liverange found and extended to connected component of the CFG:iTemp143. Please contact sdcc authors with source code to reproduce.
Warning: Non-connected liverange found and extended to connected component of the CFG:iTemp145. Please contact sdcc authors with source code to reproduce.
Warning: Non-connected liverange found and extended to connected component of the CFG:iTemp145. Please contact sdcc authors with source code to reproduce.
Warning: Non-connected liverange found and extended to connected component of the CFG:iTemp146. Please contact sdcc authors with source code to reproduce.
Warning: Non-connected liverange found and extended to connected component of the CFG:iTemp146. Please contact sdcc authors with source code to reproduce.
Warning: Non-connected liverange found and extended to connected component of the CFG:iTemp142. Please contact sdcc authors with source code to reproduce.
Warning: Non-connected liverange found and extended to connected component of the CFG:iTemp142. Please contact sdcc authors with source code to reproduce.

What does it mean?

Also, the ROM is 800 bytes bigger than with sdcc 3.3, so I will continue to use sdcc 3.3. :/


  • 3 weeks later...

When I tried to compile Bagman with the latest version of sdcc, I got these messages:

What does it mean?

Also, the ROM is 800 bytes bigger than with sdcc 3.3, so I will continue to use sdcc 3.3. :/

 

The warning is not a big issue, and should only have a minor effect on code quality (possibly slightly bigger code for the function being compiled).

If you care about code size, you might want to use --opt-code-size and possibly --max-allocs-per-node with a suitable argument (the default for --max-allocs-per-node is 3000; lower values speed up compilation, higher values result in more optimization).
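For example, an invocation along the lines of your command above might look like this (the value 10000 is just an illustration; tune it to taste):

sdcc -mz80 -c --std-c99 --opt-code-size --max-allocs-per-node 10000 miscobj.c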

 

Philipp

 


  • 1 month later...

Hi, I was wondering if there is something that would cause sdcc to gobble up every bit of memory my computer has.

 

I currently have 6 GB of memory in my desktop, and I'm having problems with sdcc running out of memory during compilation and crashing. I have increased the pagefile size to a 6 GB minimum and 9 GB maximum; that hasn't helped.

 

I'm using 64-bit Windows 7.

 

I have ordered additional RAM, so I'll have 12 GB in a couple of days, but it just seems odd, because I can compile it on a Windows tablet that has only 4 GB of RAM.

 

EDIT: I tried again and it failed on the 4 GB tablet this time. It must be something in my program it doesn't like.

 

Thanks.


Here is the output. I even put in 8 GB of memory temporarily and got the same result: it uses every available bit of the 8 GB and still runs out.

 

Try to compile : level1.c
C:\Users\gerry\Desktop\ERIS TEMP DOCKETS\coleco programming\z80\Mr_Turtle>sdcc -mz80 -c --std-c99 --oldralloc level1.c
Try to compile : Mr_Turtle.c
Mr_Turtle.c:174: warning 85: in function score_print unreferenced function argument : 'y'
C:\Users\gerry\Desktop\ERIS TEMP DOCKETS\coleco programming\z80\Mr_Turtle>sdcc -mz80 -c --std-c99 --oldralloc Mr_Turtle.c
Mr_Turtle.c:2722: warning 110: conditional flow changed by optimizer: so said EVELYN the modified DOG
ERROR - No more memory

I'm still on sdcc 3.3.0 and haven't tried 3.4.0. I think rolling back to 3.3.0 might solve the issue. Using a high amount of memory is typical: Flappy uses almost 2 GB, and that one has a lot of warnings. Spunky SuperCar uses 100-200 MB and has no warnings. Rockcutter, the game I'm working on, uses 500 MB.

Flappy also uses a lot of recursive programming to draw the pipes, move the pipes, and so on. The loop I used, for(ID=0;ID<=4;ID++), was triggering the warning, and the warning goes away if I change it to for(ID=0;ID!=4;ID++). And I tend to have a single C source file.
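A side note on those two loops, since it's easy to miss: they are not equivalent, so swapping one for the other changes behavior (ID is assumed here to be a plain integer loop counter):

unsigned char ID;

for (ID = 0; ID <= 4; ID++) { }  /* runs five times: ID = 0,1,2,3,4 */
for (ID = 0; ID != 4; ID++) { }  /* runs four times: ID = 0,1,2,3 */
for (ID = 0; ID != 5; ID++) { }  /* the != form that keeps five iterations */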

 

I added a goto on the last line of my text adventure game to jump back to the beginning of the program, and the compile time increased sharply; it also took up a lot of RAM. Without that goto, compile time was a normal few seconds. I think the compiler has to explore every possible path the program might take, so the analysis got too big. That game needs to be reprogrammed around an engine instead of being a linear program with a few branches. I'll get to it eventually.

For Mr_Turtle.c:2722: warning 110: conditional flow changed by optimizer: so said EVELYN the modified DOG, I think you should check whether you used '=' instead of '=='. That may clear it up.
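To illustrate the kind of slip that check is after (score and give_bonus() are made-up names):

unsigned char score;

if (score = 100)   /* assignment: sets score to 100 and is always true */
    give_bonus();

if (score == 100)  /* comparison: almost certainly what was intended */
    give_bonus();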


Thanks. I tried sdcc 3.3; it seems to be 32-bit only, so memory is limited to about 3.2 GB, and it just crashed even faster. I compiled another program, and though it compiled, RAM usage maxed out at around 2.96 GB in sdcc 3.3.

I'll check that conditional flow warning; it wasn't stopping anything from working, though it is puzzling. That's a good hint, because I also use the for(i=0;i<5;i++) form.

What I'm attempting to do is dynamically update the sound tables.

So if sound1 = {0x40,0x8f,0x30, ... }, change it to sound1 = {0x40,0xAA,0x30, ... }, so that I could replace any starting data for that song or sound effect dynamically rather than having it hard-coded on import.
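One common way to get that effect is to keep the original table in ROM and patch a working copy in RAM; here is a minimal sketch with made-up names (the real table layout depends on your sound driver):

#include <string.h>

static const unsigned char sound1_rom[] = {0x40, 0x8f, 0x30 /* , ... */};
static unsigned char sound1[sizeof(sound1_rom)]; /* working copy in RAM */

/* Reset the working copy from the ROM template. */
void init_sound1(void)
{
    memcpy(sound1, sound1_rom, sizeof(sound1_rom));
}

/* Patch one byte of the working copy, e.g. patch_sound1(1, 0xAA)
   turns the 0x8f above into 0xAA. */
void patch_sound1(unsigned char offset, unsigned char value)
{
    sound1[offset] = value;
}

The sound driver would then be pointed at sound1 (the RAM copy) instead of the ROM table.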


I shall try. My additional RAM just showed up as well, so 12 GB should be enough to compile a 32K cart. I'll install that first and give it a run.

 

I gotta move on to a Megacart or something soon. I'm spending most of my time finding tricks to further compress everything. I just figured out a good music compression trick yesterday; that should help save some ROM space.

 

 

 

Did you try the --max-allocs-per-node parameter with a low value?


  • 2 weeks later...

Hi, I was wondering if there is something that would cause sdcc to gobble up every bit of memory my computer has.

 

Using a high value for --max-allocs-per-node without using --oldralloc does. It also gives excellent optimization. For --max-allocs-per-node 50000000 I'd recommend at least 64 GB of RAM. The default is equivalent to --max-allocs-per-node 3000.
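In concrete terms, using the command lines already shown in this thread, the trade-off looks something like this (the values are just illustrations):

sdcc -mz80 -c --std-c99 --oldralloc Mr_Turtle.c
    (old register allocator: modest memory use, less optimization)

sdcc -mz80 -c --std-c99 --max-allocs-per-node 1000 Mr_Turtle.c
    (new allocator with reduced effort: faster, less RAM)

sdcc -mz80 -c --std-c99 --max-allocs-per-node 100000 Mr_Turtle.c
    (new allocator with high effort: better code, much more RAM)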

 

Philipp


 

Here is the output. I even put in 8 GB of memory temporarily and got the same result: it uses every available bit of the 8 GB and still runs out.

 

Try to compile : level1.c
C:\Users\gerry\Desktop\ERIS TEMP DOCKETS\coleco programming\z80\Mr_Turtle>sdcc -mz80 -c --std-c99 --oldralloc level1.c
Try to compile : Mr_Turtle.c
Mr_Turtle.c:174: warning 85: in function score_print unreferenced function argument : 'y'
C:\Users\gerry\Desktop\ERIS TEMP DOCKETS\coleco programming\z80\Mr_Turtle>sdcc -mz80 -c --std-c99 --oldralloc Mr_Turtle.c
Mr_Turtle.c:2722: warning 110: conditional flow changed by optimizer: so said EVELYN the modified DOG
ERROR - No more memory

 

 

sdcc essentially compiles one function at a time, so there probably is a single function in your code for which the excessive memory usage happens. Can you isolate it and make a small, compilable code sample? Can you check whether the problem still happens with current development versions of sdcc, such as the snapshot builds?

 

Philipp


I upped my memory to 12 GB of RAM and it is compiling again now. I also played with --max-allocs-per-node, and that helps for sure. I can get a quicker compile by lowering it and a much smaller binary by increasing it, but not by too much: maybe about 30,000 max, or it might happen again. I'll try to isolate the code causing the excessive memory usage, as you said, by removing large chunks of code to figure out which routine is worst and see if I can make it better.

 

Using a high value for --max-allocs-per-node without using --oldralloc does. It also gives excellent optimization. For --max-allocs-per-node 50000000 I'd recommend at least 64 GB of RAM. The default is equivalent to --max-allocs-per-node 3000.

 

Philipp

