Sometimes the interoperability between .NET and Win32 can be a pain in the ass, especially in places where you don’t expect it. I recently worked on a cryptography algorithm: I had the source code in Clarion, and all I had to do was implement it in C#. No problem, I figured. But… the Clarion algorithm used pointers extensively, while I worked with byte arrays: I read the text as a char array and copied the bytes into integer variables using binary shifts. Let’s say we have:
string s = "abcd";
uint ui = (uint)((s[0] << 24) | (s[1] << 16) | (s[2] << 8) | s[3]);
In ui we will get the number 1633837924 (0x61626364).
Let’s look at this from the other side. Our string goes directly into memory, byte after byte: 0x61 0x62 0x63 0x64.
Using a Win32 compiler we want to read it into a ulong variable x from the address px. In Clarion the code to do this looks like this:
x &= (px)
What do we get in x? On an Intel machine, 1684234849 (0x64636261).
Hmm, variable ui in C# and x in Win32 are not the same. Why? Because Intel CPUs store numbers the little-endian way: the least significant byte goes to the lowest address and the most significant to the highest. So our four bytes are read like this (little endian):

0x61 0x62 0x63 0x64 → 0x64636261

and not like this (big endian):

0x61 0x62 0x63 0x64 → 0x61626364
* In Jonathan Swift’s Gulliver’s Travels the Little-Endians broke their eggs at the small end, while the Big-Endians broke theirs at the large end. And the ones were not very fond of the others 😉