HC12: char and short in a union?


Pang
Contributor I

Message Edited by CrasyCat on 2007-04-13 11:04 AM


bigmac
Specialist III
Hello Pang,
 
Yet another possibility is for the union to use a two-byte array.
 
union DATA
{
   unsigned char chData[2];
   unsigned short shData;
} data; 
 
You can then individually assign a value to either byte.
 
  data.chData[1] = 8;
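 
On the big-endian HC12, chData[0] holds the high byte and chData[1] the low byte of shData. A minimal sketch to see this (the main()/printf scaffolding is only illustrative and assumes a hosted environment with stdio):
 
#include <stdio.h>
 
union DATA
{
   unsigned char chData[2];
   unsigned short shData;
} data;
 
int main(void)
{
   data.shData = 0x1234;
   /* Big-endian target (e.g. HC12): chData[0] == 0x12, chData[1] == 0x34 */
   printf("hi=0x%02X lo=0x%02X\n",
          (unsigned)data.chData[0], (unsigned)data.chData[1]);
 
   data.shData = 0;
   data.chData[1] = 8;                          /* write only the low byte */
   printf("shData=%u\n", (unsigned)data.shData); /* prints 8 on a big-endian target */
   return 0;
}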
 
Regards,
Mac
 

marc_paquette
Contributor V
What processor are you targeting? It appears that the compiler is generating big-endian code.

This setup will give you the behavior you're looking for:

#include <assert.h>

typedef struct twochars
{
    /* Assumes big-endian. */
    /* Reverse the order of hi and lo for little-endian. */
    char hi;
    char lo;
} twochars;

union DATA
{
    twochars chData;
    unsigned short shData;
} data;

int main(void)
{
    data.chData.lo = 8;                       /* data is static, so hi starts at 0 */
    assert(data.shData == data.chData.lo);    /* holds on a big-endian target      */
    return 0;
}

Marc.

Pang
Contributor I
Marc,
 
Thanks for your prompt reply. My target processor will be the MC9S12XDP512. I will try your method.
 
I just tried the same code in Visual C++; there, both chData and shData have the same value in the first byte. So the Intel kingdom is different from the Motorola world.
 
Does the compiler have an option to change that? Anyway, I doubt it. I do remember that when I used LDD $1000 in HC11 assembly, the data at 0x1000 was put into A and the data at 0x1001 into B.
 
Pang

marc_paquette
Contributor V
No, the CodeWarrior compiler does not have an option to change endianness for the HC(S)08 and HC(S)12 targets, because those parts are exclusively big-endian. Other Freescale products, such as the Power Architecture processors, do have configurable endianness.

To handle endianness in a processor-independent manner, you'll have to manipulate data programmatically and not rely on processor/compiler behaviors.
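
For example, a minimal sketch of that (the pack16/unpack16 names are just illustrative): split and reassemble a 16-bit value with shifts and masks, so the byte order in the buffer is whatever you decide, independent of the host CPU.

#include <stdio.h>

/* Store a 16-bit value high byte first, using shifts and masks, */
/* so the result does not depend on the endianness of the host.  */
static void pack16(unsigned char *buf, unsigned short value)
{
    buf[0] = (unsigned char)(value >> 8);    /* high byte */
    buf[1] = (unsigned char)(value & 0xFF);  /* low byte  */
}

static unsigned short unpack16(const unsigned char *buf)
{
    return (unsigned short)((buf[0] << 8) | buf[1]);
}

int main(void)
{
    unsigned char buf[2];

    pack16(buf, 0x1234);
    printf("buf[0]=0x%02X buf[1]=0x%02X value=0x%04X\n",
           (unsigned)buf[0], (unsigned)buf[1], (unsigned)unpack16(buf));
    return 0;
}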

Marc.

pittbull
Contributor III
Pang wrote:
>> Does the compiler have an option to change that?

You could select the byte order at compile time by using the platform-dependent, predefined macros of the compilers.
For example:

typedef struct twochars
{
#if defined (__HIWARE__) // big endian
    char hi;
    char lo;
#elif defined (WIN32) // brain damaged 'intel' byte order
    char lo;
    char hi;
#endif
} twochars;

...but it's generally not a good idea to access parts of a variable by mixing a struct into a union. Compilers might insert alignment and padding bytes into structures. Better to use shifts and bit operations, as in the sketch below; that is much more portable.
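
A minimal sketch of the shift/mask approach (the HI_BYTE/LO_BYTE/SET_LO/SET_HI macro names are just illustrative, not from any particular header):

#include <stdio.h>

/* Read and write the bytes of a 16-bit value with shifts and masks; */
/* no union, no struct layout, no endianness assumptions.            */
#define HI_BYTE(w)    ((unsigned char)((w) >> 8))
#define LO_BYTE(w)    ((unsigned char)((w) & 0xFFu))
#define SET_LO(w, b)  ((unsigned short)(((w) & 0xFF00u) | ((b) & 0xFFu)))
#define SET_HI(w, b)  ((unsigned short)(((w) & 0x00FFu) | (((b) & 0xFFu) << 8)))

int main(void)
{
    unsigned short data = 0;

    data = SET_LO(data, 8);   /* set only the low byte, on any CPU */
    printf("data=0x%04X hi=0x%02X lo=0x%02X\n",
           (unsigned)data, (unsigned)HI_BYTE(data), (unsigned)LO_BYTE(data));
    return 0;
}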

Cheers,
-> pittbull