# Cortex-A vs Cortex-M: different CRC output calculation?

Discussion created by Maxime Dolberg on Apr 27, 2016
Latest reply on Apr 29, 2016 by Maxime Dolberg

Hello,

I have an issue with software that originally ran on an embedded Linux system with an ARM Cortex-A7 core. That software contains a checksum based on the CRC-16 Modbus polynomial. It works very well on the Cortex-A7 core, and even on other architectures (x86, amd64). But when I port the code to the KL25's ARM Cortex-M0+ core, the checksum gives me a wrong output.

My input buffer contains 6 bytes: 0x 01 05 00 02 FF 00 (MSB to LSB, respectively).

The output should be: 0x FA 2D. I double-checked with this link: On-line CRC calculation and free library.

But with the ARM Cortex-M0+, I get: 0x 45 92?!

So, where is the bug? The ALU? The compilation? A data type or sign issue?

Below is the code I used. I did not write it myself; it is just copy-pasted from a website found somewhere on the internet. I believe that if I analysed how that code works and read the link above, I could find a way to solve my problem, but... nah! I am a bit lazy tonight.

I am very curious to understand why this happens. Does anyone have an idea?

```c
unsigned int CRC(char* buf, int len)
{
    unsigned int crc = 0xFFFF;

    int pos;
    for (pos = 0; pos < len; pos++) {
        crc ^= (unsigned int)buf[pos];      // XOR byte into least sig. byte of crc

        int i;
        for (i = 8; i != 0; i--) {          // Loop over each bit
            if ((crc & 0x0001) != 0) {      // If the LSB is set
                crc >>= 1;                  // Shift right and XOR 0xA001
                crc ^= 0xA001;
            }
            else                            // Else LSB is not set
                crc >>= 1;                  // Just shift right
        }
    }
    // Note, this number has low and high bytes swapped, so use it accordingly (or swap bytes)

    return crc;
}
```

Max