Which is why I should do it more often: to practise written communication and to seem organised and on top of things (though I don't recall missing an assignment deadline or copping a late penalty in the last 4.5 years of uni… bonus marks excluded, because sometimes they are ridiculously hard, and thus fun, but not necessarily solvable).
One thing I should have blogged about a week or so ago was the sheer joy of being reminded of:
#define UINT8 char
Now… this line was written by one of my professors for COMP343 Cryptography and Computer Security @ Macquarie, for a project early last year. No one in the class could figure out why our simple cryptalg.cc was not producing the results we expected from a desk check of the algorithm; a tutor spotted the problem early on, but he never explained why his fix worked.
Have a think about the above line: is it correct?
The signed/unsigned thing has been biting programmers for a looooong time in languages it is relevant to 🙂
If you haven’t figured it out (probably because you aren’t a coder), the following lines probably won’t help greatly either:
#define UINT8 char
#define UINT16 unsigned short int
By now even first-year programmers should be able to see the issue: whether plain char is signed or unsigned is implementation-defined in C, and on most common platforms it is signed. Thus one fix in this context is:
#define UINT8 unsigned char
This and other quirks of C are why, personally, I prefer Python, Java, Scala and even C# far more than C/C++ these days (though there are times when C/C++ is useful, and it was the first real programming language I learnt). Today the performance argument is almost moot; even Cassandra was written in Java. It has long been argued that programmer time is more valuable than computer time, to the point that internet behemoths like Google, Amazon and Yahoo run datacenters of thousands upon thousands (if not millions) of servers while employing far fewer people.