In Transition
It's been a really nice feeling letting everyone around me know about the decision to shift locations. Around work everyone's wishing me well and commenting on how
calm I am, when they'd be bouncing off the walls with an impending move.
Shucks, I just live here. From where I'm sitting, there're very few things worth getting too excited about, and so much is being done for me in this move that I'm inclined to sit back and enjoy the ride. :-)
Some particulars: I'll be doing similar work to what I've been up to for the past year or two; lots of web development, specifically intranet stuff, with rumors of data warehousing and some life-cycle-management stuff, and certainly lots of little urgent projects that will likely come up from day to day. Lots of "data plumbing," by all accounts.
The Problem of Software "Engineering"
Rumination
"Data plumbing" reminds me... A subject that's been in and out of my mind a lot lately is the concept of "software engineering," and what that might mean. On the one hand we've got Microsoft turning out Microsoft Certified System Engineers (MCSEs) by the bushel, but on the other there's no certified discipline in the engineering field that deals with software.
What I do certainly isn't engineering. I likened it to bricklaying once, but Matt helped me clarify: it's more like artisanship; there's a lot of creativity and expertise involved, but it's not nearly as concerned with principles or existing practice as engineering seems to be. Want fries with that? :-)
Software certainly
seems like something that could benefit from some engineering, along the lines of a bridge, a fuel formulation, or a power grid. An application suite like Microsoft Office or an operating system like Linux certainly has the complexity of an engineering project, but in terms of performance, neither exhibits the robustness that a bridge or power grid
must exhibit from day to day.
State of the Art
Part of the problem, I think, is that software as a discipline has only been around for 60 or so years (if we're generous), and as such isn't well enough understood for a discipline to have emerged. We have schools and disciplines within the field of programming: object orientation, scripting, embedded logic; and plenty of toolsets, from myriad languages like the C/C++/Java/C# standbys, to Basic in all its incarnations, to old workhorses like COBOL and Fortran, to fringe innovators like Smalltalk, Perl, Ruby and Python, to odd document-layout pseudolanguages like HTML and whatever XML is becoming, to hard-on-the-metal assembler and straight hexadecimal.
There's a proliferation of tools, to be sure, but when it comes to using them there's precious little standardization or accountability; nobody who's ever used a Linux, Windows or Macintosh computer has escaped program crashes, lockups, inconsistencies from program to program, and the inevitable friction between applications and the hardware upon which they're expected to execute. In the embedded space (cell phones, auto-engine controllers, phone-switching boxes, video-game consoles a la PlayStation and Xbox) the world is simpler and the stakes higher, but again, it's catch as catch can.
It's a sobering statistic that of all the programming projects begun in the wide world of business, only
one in ten will ever see the light of its implementation day. In my experience this is true; I'm actually doing better than that: of the fifteenish big projects on which I've worked over the years, I think something like four have gone live. I'm
beating the odds with these numbers.
Certainly there are many variables around software development: shifting markets, various potential roadblocks at the talent and management levels, caprice in the flow of funding; but these aren't unique to the field of software, and to be fair I'm not sure what the failure rate of projects in other fields is: what is it for buildings, for medicine, for power plants? Anyone know?
And for apps that make it out into the world, Microsoft is an instructive example: MS is widely derided for its first two or three attempts at a product
utterly sucking. Windows 1.0? 2.0? 3.0? All slightly better than previous attempts (someone might argue that Windows 1.0 was better than straight DOS, but not me), and 3.x got a
lot of use, but it wasn't until Windows 95 (otherwise known as version 4.0) that Microsoft was seriously in the ballpark. And it
still sucks, compared to say, Brooklyn Bridge 1.0, or Empire State Building 1.0.
(
Clarification: here, I'm leaning heavily on the Open Source maxim that
all software sucks, and that the best software simply sucks the least. Still, any program that ever loses data should be compared to a bridge occasionally "losing" a car, or a building "losing" an occupant.)
Stopping the Madness
How many programmers out there today would be willing to stake their reputations, incomes and careers on their programs
never failing catastrophically? As in, never crashing to the operating system, losing data or irretrievably hanging? People who write the software in hospitals know a little about this, but how about those programming the computers that will be controlling all the interwoven traffic paths through Boston's Big Dig?
Anyhow, I'm getting way too long here, but I've felt for a long time that the world of software needs some sort of "bulletproof programming" certification, admissible in a court of law, and maintained by some sort of standards body, like real engineering works today.
There's a lot in the way of this sort of thing today: for one, a programmer is only as good as his tools: can high-level tools like Visual Studio .NET or Java be used to build failure-proof apps? The simple answer is no: no one's watching the watchmen, so to speak.
Then there's the matter of testing: how does one certify a software design as "good?" We can't know down to the for-loop or iterator-object what will work and what won't - a simple reversal of variables in a programmer's mind can
still cause a compiler-invisible flaw analogous to badly forged metal in a load-bearing member; hit it with any load at all, and the bridge fails.
My guess is that we're simply not there yet: the tools need to mature, the meta-tools that would catch errors like the one I mention above need to be created. And that's just what I've come up with through idle thinking.
Heh - what will programming be like in fifty years? Will the concept even apply?
-Rich