Created 11th December, 2007 12:39 (UTC), last edited 11th December, 2007 12:48 (UTC)

I can probably agree with this sentiment from Martin J. Laubach over at Smalltalk of Oz:

All that just screams that many people in IT today do quite obviously not have the necessary basic training. They might learn how to plug Hibernate into an enterprisey Java application and how to battle XML config files, but there does not seem to be a computing 101 course any more. It's as if a garage mechanic could only read the diagnostic codes from the motor microprocessor but has no idea that there is oil in the engine (or why, for that matter).

And it seems it's not only me: see, for example, Raganwald noticing the same lack of competence in the field.

That scares the shit out of me. Who writes the software to control nuclear power plants again?

What scares me more though is the example he pulls from Alberto Savoia's essay in Beautiful Code:

I checked with several other Java programmers, and most of them were not familiar with the unsigned bit shift operator, or were not 100 percent sure how it worked. For them, seeing calculateMidpoint( low, high ) is more obvious than seeing (low + high) >>> 1.
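For anyone in the same boat, the operator itself is easy to demonstrate. A minimal sketch (the class and method names here are mine, purely for illustration):

```java
public class UnsignedShift {
    // >> is the arithmetic shift: it copies the sign bit down,
    // so a negative number stays negative.
    static int arithmeticHalf(int x) { return x >> 1; }

    // >>> is the logical (unsigned) shift: it shifts a zero bit in,
    // so the result is the bit pattern treated as an unsigned value, halved.
    static int logicalHalf(int x) { return x >>> 1; }

    public static void main(String[] args) {
        System.out.println(arithmeticHalf(-8)); // -4
        System.out.println(logicalHalf(-8));    // 2147483644
    }
}
```

On non-negative values the two operators agree; they only differ once the sign bit is set.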

There's a trade-off here. Do we have a very wide API that contains all of these little functions that do useful things, or do we expect people to knock up their own when they need one?

Clearly we need to have the skills to write our own when we have to, but we should also have the wisdom to use the library whenever possible anyway. Not because we're dunderheads, but because details matter in software development and details are easy to get wrong.

The detail in question here is what (low + high) >>> 1 does when the sum overflows. In C or C++ signed overflow is undefined behaviour, so you can't count on any particular result. In Haskell it'll work fine (though you'd have (low + high) `div` 2 instead) because its default integer type has arbitrary precision. In Java the sum wraps around to a negative int, but the unsigned shift then treats the bits as an unsigned value, so for non-negative low and high the expression actually yields the correct midpoint. As I understand it, that is precisely why this form was used to fix the overflow bug in the JDK's own binary search.
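A quick experiment shows what Java actually does here. This is a sketch; naiveMid and shiftMid are names I've made up for the two candidate expressions:

```java
public class Midpoint {
    // Naive midpoint: low + high wraps past Integer.MAX_VALUE to a
    // negative int, and dividing that by 2 gives a negative "midpoint".
    static int naiveMid(int low, int high) {
        return (low + high) / 2;
    }

    // Unsigned-shift midpoint: the sum still wraps, but >>> treats the
    // bit pattern as unsigned, so the correct value comes back out
    // for non-negative low and high.
    static int shiftMid(int low, int high) {
        return (low + high) >>> 1;
    }

    public static void main(String[] args) {
        int low = 1, high = Integer.MAX_VALUE; // the sum overflows int
        System.out.println(naiveMid(low, high)); // -1073741824
        System.out.println(shiftMid(low, high)); // 1073741824
    }
}
```

The division form is the one with the subtle bug; the shift form survives the overflow.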

I'd expect any competent programmer to be able to read (low + high) >>> 1 and be able to see what it is meant to do. I'd expect a good (and experienced) developer to use calculateMidpoint( low, high ) because it is a little easier to read, but more importantly, much less likely to have subtle bugs in it.

For the curious, a one-liner that avoids the overflow altogether is ( ( high - low ) >>> 1 ) + low, but maybe that wasn't beautiful enough?
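The subtract-first form can be sketched the same way. Because high - low never overflows when 0 <= low <= high, no wraparound is involved at all, which is why it also translates directly to C or C++ (which have no >>> operator; a plain shift or division on the non-negative difference does the job there):

```java
public class SafeMid {
    // Subtract first: for 0 <= low <= high, high - low fits in an int,
    // so the whole calculation stays within range. The halved offset is
    // then added back to low to recover the midpoint.
    static int mid(int low, int high) {
        return ((high - low) >>> 1) + low;
    }

    public static void main(String[] args) {
        System.out.println(mid(1, Integer.MAX_VALUE)); // 1073741824
        System.out.println(mid(3, 9));                 // 6
    }
}
```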