Ah, now I see what you guys were saying was wrong: using the & sign in the cast. That’s what I get for following the examples in this thread: Encrypt/Decrypt a string
So, in my head, I’m imagining a memoryBlock of size=4 to basically look like this in memory:
[0xAE][0x12][0x45][0x34] or [1010 1110][0001 0010][0100 0101][0011 0100]
Each element is a char, so if I cast it to a uint32 then, depending on endianness, I’ll get:
0xAE124534 (big-endian) or 0x344512AE (little-endian)
[1010 1110 0001 0010 0100 0101 0011 0100]
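
To make that concrete, here’s a tiny standalone C++ sketch (illustration only, not my actual project code) that reads those same 4 bytes as a uint32 and shows the endianness difference. I’m using memcpy as the alias- and alignment-safe stand-in for the & cast from the thread:

```cpp
#include <cstdint>
#include <cstdio>
#include <cstring>

int main() {
    unsigned char block[4] = {0xAE, 0x12, 0x45, 0x34};

    // Read the 4 bytes as one native-endian uint32_t. memcpy is the
    // well-defined version of *(uint32_t*)&block[0], which can break
    // on strict aliasing and alignment.
    std::uint32_t value = 0;
    std::memcpy(&value, block, sizeof value);

    // Little-endian machine (e.g. x86): prints 0x344512AE.
    // Big-endian machine: prints 0xAE124534.
    std::printf("0x%08X\n", (unsigned)value);
}
```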
I personally don’t care about the value directly. I just care that 0xAE looks like [1010 1110] in memory, and that those bytes/bits are what gets transformed inside the BlowFish::encrypt() call. So as long as I pass Blowfish 4 valid bytes at a time, I don’t think how the cast happens is worth worrying over. But you guys have way more experience than me.
All I know is that, so far, every encrypt()/decrypt() call has given me back exactly what I passed in, so that’s why I didn’t stress over the syntactic correctness of reading a uint32 from a block of chars. It worked as expected against everything I threw at it…
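
In case it helps anyone else later, here’s a minimal sketch of the cast-free way to do that read/write (plain C++; load_u32/store_u32 are just names I made up, not part of the Blowfish class):

```cpp
#include <cstdint>
#include <cstring>

// Load 4 bytes as a native-endian uint32_t; memcpy avoids the
// strict-aliasing/alignment traps of the pointer cast.
std::uint32_t load_u32(const unsigned char* p) {
    std::uint32_t v;
    std::memcpy(&v, p, sizeof v);
    return v;
}

// Store the uint32_t back as 4 bytes using the same convention.
void store_u32(unsigned char* p, std::uint32_t v) {
    std::memcpy(p, &v, sizeof v);
}
```

Since load_u32 and store_u32 use the same byte order, store_u32(buf, load_u32(buf)) always gives back the original bytes, which I think is exactly why my encrypt/decrypt round trip works on one machine no matter what its endianness is. (Moving ciphertext between machines of opposite endianness would be another story, but that’s not what I’m doing.)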