Hi All
First post here, so hopefully I'm doing this right?! :-)
This may be me being stupid (entirely possible!), but I'm having a problem using switch on tables larger than 256 bytes.
This is using the latest (just downloaded) version of CodeWarrior, and 'C' of course...
To explain...
I have a look-up table which is 448 rows deep and 5 bytes wide (2,240 bytes in all)... it looks like this...
unsigned char const IncTable [448] [5] =
{
EBeep,EBeep,EBeep,EHorn,0,
EBeep,EBeep,EHorn,EHorn,1,
EBeep,EBeep,EAux,EHorn,0,
EBeep,EBeep,EHnAx,EHorn,1,
EBeep,EBeep,EWait,EHorn,0,
EBeep,EBeep,ELoop,EHorn,0,
EBeep,EBeep,ELPxx,EHorn,0,
EBeep,EBeep,EStop,EHorn,0,
EBeep,EHorn,EBeep,EAux,0,
EBeep,EHorn,EHorn,EAux,0,
EBeep,EHorn,EAux,EAux,1,
etc.

FYI (not relevant, but to explain): in each row the first three entries are what went before, what's there now and what comes after; the last two are what we then do.
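Just to spell out the arithmetic involved, this is roughly what the compiler has to do to fetch one entry (a sketch only; it assumes EVinTab is the plain row number 0..447, not a pre-scaled byte offset):

/* Rough sketch of the fetch - assumes EVinTab is the row number.   */
/* The byte offset can reach 447*5 + 3 = 2238, so the offset itself */
/* has to be 16-bit even though every entry is a single byte.       */
unsigned int offset;
unsigned char action;

offset = (unsigned int)EVinTab * 5u + 3u;
action = *((const unsigned char *)IncTable + offset);

So the entries themselves can stay as bytes; it's only the index arithmetic that needs to be 16-bit.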
This is my switch statement (where EVinTab is a worked-out row offset)...
switch (IncTable [EVinTab] [3])
{
case EBeep:
break;
etc
Now, if the result I want resides in the first 256 bytes of the table, all is dandy. If it's any higher, the switch vectors to the correct case statement but doesn't execute it. Go figure.
If I make the table unsigned shorts it all works fine, but why should I need to do that? I don't really have the memory to give a table of bytes twice its storage just to get the compiler to work correctly, and I don't see why I should have to. ATEOTD it's a big table of single bytes; surely they don't need to be defined as shorts for the compiler to figure out it needs 16-bit math?
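For reference, the sort of workaround I'm considering in the meantime (just a sketch of my own, not anything from the CodeWarrior docs) is to pull the byte into a local with the index forced to 16 bits and switch on that, keeping the table as single bytes:

/* Workaround sketch - my own guess at forcing 16-bit index math.  */
unsigned char action;

action = IncTable[(unsigned int)EVinTab][3];

switch (action)
{
case EBeep:
    break;
/* etc. */
}

But I'd rather understand why the straight switch on the table entry misbehaves in the first place.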
Any thoughts gratefully received.
Best
John