One Bullet Is Never Enough
Tim Daneliuk (tundra@tundraware.com)
(A version of this originally posted on comp.lang.forth, 17 May, 2002)

Copyright © 2002 TundraWare Inc.  Permission to freely reproduce this material is hereby granted under the following conditions: 1) The material must be reproduced in its entirety without modification, editing, condensing, or any change. 2) No fee may be charged for the dissemination of this material.  Commercial use such as publishing this material in a book or anthology is expressly forbidden. 3) Full attribution of the author and source of this material must be included in the reproduction.

Getting Religion


Every few years someone manages to come up with "Yet Another Revolution In Programming".  These revolutions, we are told, will transform the face of software development, retire all or most previous programming technologies, and generally give us the software Free Lunch we all long for - rapid development of highly efficient, completely correct code.  Sooner or later, a Movement is born to carry forth this new gospel.  At the moment, the True Believers come from the True Church Of Object Orientation, but they are merely the latest in a series of evangelists that have included the AI Apostles, the Functional Fanatics, the Heuristic Harmonizers, and the DBMS Dervishes.

Having been in the software profession for over n years (where n is none of your business), I never cease to be amazed by the implicit assumption, held by so many of our fellow practitioners, that some "silver bullet" is waiting just around the corner that will solve our software productivity and quality problems for all time.  We may be the only profession that spends a considerable amount of skull sweat trying to reduce our practice to a single tool or technique.  I cannot imagine an auto mechanic aspiring to such elegance that he uses only a screwdriver for all repairs.

In my observation, these revolutions pretty much never live up to their claims.  For example, OO's nominal virtues notwithstanding, it frequently makes development efforts more complex, longer in duration, and the final product harder to maintain.  In my opinion, this is because OO never reduces net complexity; it merely changes where that complexity appears.  Moreover, trying to conquer a non-trivial problem with any single software paradigm ends up creating a new set of problems having nothing to do with your primary task.  The designer/programmer spends 80% of their time jamming that paradigm down the throat of the last 20% of the problem, because that last 20% does not map well to the paradigm chosen.

But this is really not an "OO Problem" as such.  It is true of all design/programming approaches; OO merely happens to be the most recent incarnation.  The problem is that ... One Bullet Is Never Enough!
 
 

Church History


I like to use the following matrix with newly-minted programmers (and other True Believers) when explaining why they need More Bullets:
 

Birth Of                                        Source Of Complexity/
Paradigm        Paradigm                        Principal Intellectual Problem
--------        --------                        ------------------------------

1950s           Algorithm decomposition/design  Comprehending/reducing
                                                asymptotic (big-O)
                                                complexity.

1960s           Heuristic algorithms for        Finding heuristics that map
                NP-Complete problems            well to the problem space
                                                and then determining whether
                                                their worst-case behavior is
                                                acceptable.

1960s           Artificial Intelligence         Defining a rule-set that
                                                maps well to the problem
                                                space and then finding
                                                an ordering of those rules
                                                that scales for non-trivial
                                                problems.

1970s           DBMS Approaches                 Ordering the taxonomy
                                                and semantics of data so
                                                as to create a complete
                                                and manageable schema.

1960s           Procedural Programming          Maintaining coherence and
                                                consistency among all the
                                                decomposed pieces of code.

1980s           Object Oriented                 Defining a meaningful
                                                inheritance hierarchy
                                                that appropriately factors
                                                generics from derived
                                                properties.

1980s           Theorem Proving Techniques      Finding mathematical
                                                analogs for the real world
                                                that work for anything more
                                                than "toy" problem spaces.

1980s           Functional Programming          Transforming real-world
                                                problems into functional
                                                equivalents upon which
                                                FP can operate.
 
 

Grasping Theology


In each of these approaches, all that was really done was to transform where the hard problems showed up - much like a Laplace Transform turns a differential equation into algebra, at the new (and hard) cost of figuring out the inverse transform.  Here's an analogy that may make the point better.  We know from algorithm complexity theory (Omega analysis) that there is a lower bound on the amount of work any algorithm of a given type must do.  For instance, sorting by means of comparisons cannot be done with an asymptotic complexity of less than n*log(n).  No matter what language or paradigm you use, if you are comparison sorting, n*log(n) is the least work you can do - though you can certainly do worse!  In other words, different optimal algorithms can only divide that computational work up in different ways; the total stays the same.
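
A quick sketch of why that floor exists - the standard decision-tree argument from any algorithms text, included here purely as illustration:

    % Any comparison sort must distinguish all n! possible input orderings.
    % Viewed as a binary decision tree, a tree with n! leaves has height at
    % least log2(n!), so the worst-case comparison count C(n) satisfies:
    \[
      C(n) \;\ge\; \log_2(n!) \;=\; \sum_{k=1}^{n} \log_2 k
           \;\ge\; \frac{n}{2}\log_2\frac{n}{2} \;=\; \Omega(n \log n) .
    \]

No paradigm shift changes that arithmetic; it only changes where the remaining effort lands.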

That's exactly what the different programming language technologies of the last 50 years have done.  They've solved mostly the same set of problems with different programming paradigms, because the implicit hope was that we could find a "Principal Intellectual Problem" that was easier for the human mind to grasp and conquer.  Well, guess what?  It didn't work, at least not completely.  It turns out that some kinds of problems lend themselves more neatly to certain kinds of programming approaches.  Heuristics opened the door for cracking some really tough problems in Operations Research, but are probably not well suited for doing General Ledger and Payroll.  Similarly, OO works well for many problems until the inheritance hierarchies needed to reflect Reality become so complex that they are incomprehensible to the human mind.  In principle, all Turing Complete languages can compute anything that is computable.  But in practice, they are not all equally suited to all classes of problems.  Most importantly, New Bullets Are Not Inherently Better Than Old Ones.  They each just solve different classes of problems well.
 

Turning Faith Into Practice


Trying to convince the True Believers that this is so is an uphill battle until they actually have to build a significantly complex system that includes UI, data store, transaction protection, fault recovery, and so on.  For example, the first thing the OO crowd runs into is the "impedance mismatch" between how OO inheritance sees the world and how a DBMS schema sees the world.  Add to this the need to transactionally protect key events, and the OO folks walk around mumbling about a yet-to-be-delivered Sun extension to Java or a new Microsoft .NET feature.  It frequently seems not to occur to them that the transactional protection problem was solved long ago, just not with "objects".  So, an immense number of brain cycles gets burned trying to shoehorn transaction processing into, say, EJB instead of mating an old technology with a new one and letting each do what it does best.  Notice that I'm picking on the OO world here only because it is the current Faith for many programmers; this exact behavior has been with us from the beginning of programming.
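
As a toy sketch of mating an old technology with a new one - the schema, class, and function names below are invented for illustration - the objects stay plain and the DBMS supplies the transactional protection it has provided for decades:

    import sqlite3

    class Account:
        """A plain object - no transaction machinery shoehorned into it."""
        def __init__(self, acct_id):
            self.acct_id = acct_id

    def transfer(db, src, dst, amount):
        # The old technology does what it does best: used as a context
        # manager, the sqlite3 connection commits on success and rolls back
        # on any exception, so both updates land atomically or not at all.
        with db:
            db.execute("UPDATE accounts SET balance = balance - ? WHERE id = ?",
                       (amount, src.acct_id))
            db.execute("UPDATE accounts SET balance = balance + ? WHERE id = ?",
                       (amount, dst.acct_id))

    db = sqlite3.connect(":memory:")
    with db:
        db.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")
        db.execute("INSERT INTO accounts VALUES (1, 100.0), (2, 0.0)")
    transfer(db, Account(1), Account(2), 25.0)
    print(db.execute("SELECT id, balance FROM accounts").fetchall())

Nothing revolutionary happens there, which is the point: the two technologies meet at a narrow seam and each does what it already did well.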

Furthermore, real problems are almost always event or condition driven.  That is, the structure of time and events outside the program matters far more than the internal organization of the code itself or what programming paradigm is chosen.  In realtime systems it is hardware events that dictate program structure and flow.  In transaction systems it is typically a business event that initiates the work.  In operating systems it is a combination of hardware, process, and user events that dictates how things get done.  It is amazing, with all the advances we've seen in languages, that so little attention is paid to supporting (a)synchrony, concurrent processing, locking, and rendezvous as first-class properties of the languages themselves.  Forth clearly led the way here, but newer languages like Erlang are starting to think this way too.
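
Here is a toy sketch of that shape (everything in it is invented for illustration): the program's flow is dictated by arriving events, and the rendezvous point is a library queue bolted on after the fact rather than a first-class property of the language - which is precisely the complaint:

    import queue
    import threading
    import time

    events = queue.Queue()   # the rendezvous point, supplied by a library

    def hardware(n):
        """Simulate an external event source, e.g. an interrupt line."""
        for i in range(n):
            time.sleep(0.01)
            events.put(("tick", i))
        events.put(("halt", None))

    def main_loop():
        """Flow is dictated by event arrival, not by the code's internal
        organization or by whatever paradigm it happens to be written in."""
        while True:
            kind, payload = events.get()   # block until something happens
            if kind == "halt":
                break
            print("handled", kind, payload)

    threading.Thread(target=hardware, args=(3,), daemon=True).start()
    main_loop()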

I reserve the right to be wrong, but it seems to me that we ought to be evolving toward a world in which programmers of complex systems write in meta-languages like Python - which embrace OO, procedural, DBMS, and Functional Programming notions - but whose runtimes would be best delivered in something like Forth (or Erlang).  Different software paradigms could thus be used simultaneously within a given software system because they are first-class properties of the selected language - You Would Use Lots Of Bullets.  The intermediate code output would be realized in a highly efficient, easily optimized, event-sensitive runtime environment.  This is probably a pipe dream, because trying to efficiently map the semantics of a late-bound dynamic language like Python onto a sleek Forth runtime is likely too costly in terms of runtime size and speed.
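
A trivial sketch of the "lots of bullets" idea (the sensor example and every name in it are invented for illustration): within one Python program, each sub-problem gets the paradigm that fits it, even though the runtime underneath is plain CPython rather than the sleek Forth imagined above:

    from functools import reduce

    # Procedural: step-by-step control flow for the messy, stateful part.
    def parse_readings(lines):
        readings = []
        for line in lines:
            line = line.strip()
            if line and not line.startswith("#"):
                readings.append(float(line))
        return readings

    # Functional: a declarative transformation of the cleaned-up data.
    def rms(readings):
        return (reduce(lambda a, x: a + x * x, readings, 0.0) / len(readings)) ** 0.5

    # Object oriented: encapsulate the thing that has identity and behavior.
    class Sensor:
        def __init__(self, name, raw_lines):
            self.name = name
            self.readings = parse_readings(raw_lines)

        def report(self):
            return f"{self.name}: rms={rms(self.readings):.3f}"

    print(Sensor("thermo-1", ["# header", "1.0", "2.0", "2.5"]).report())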
 

Heaven Or Hell?


Every year, universities crank out a new generation of programmers who are actually fluent in only one paradigm or language.  For a while it was Data Processing, then it was AI, lately it's been OO.  These graduates come to industry with a hammer and assume everything is a nail.  If they went to a decent school, they've been exposed to other programming models, of course, but they've only mastered one, so this behavior is understandable.  The problem is that this has a really bad longer-term effect on the industry as a whole.  As these people mature in their jobs they bring with them their One True Way.  This means that new problems only get considered in light of how they might be solved using that one way.  Technical and Executive management is rarely able to discern the bull from the cowpatties.  In effect, there is little practical control or oversight of the final work product.  (Someone once observed that the structure of software resembles the structure of the organization that created it.  My corollary to this is that the architecture of a computer system is actually defined by the last programmer to modify or maintain it.)

This results in really ugly things getting done, like writing huge applications or operating systems in C++.  If the results were acceptable, I guess we could all live with "ugly".  But the results are decidedly not acceptable.  In most fields, commoditization and mass production yield higher quality.  It is an irony that as software has increasingly become a commodity and more widely used, it has actually become less reliable, less often on time, and less often on budget.  If you don't believe it, compare today's desktop operating systems with IBM's mainframe OSs of the 1960s.  I am convinced that this "one bullet fits all" mentality is at least in part responsible.  There is no other engineering profession of which I am aware in which the practitioners cling stubbornly to the flavor of the month.  Imagine a Civil Engineer saying, "All bridges, regardless of size or location, will be suspension bridges."  How about an Architect who only designs brick buildings, or a Mechanical Engineer who never uses bearings around rotating shafts, only brass sleeves.

It seems to me that, at a minimum, a professional programmer should be expected (and should expect of themselves) to be proficient in one assembly language, one procedural language, one object language, one scripting language, and several design/definition methodologies.  Only with these "bullets" in hand will they be able to make rational design and implementation tradeoffs.

In any case, I think we're all going to get forced down this route eventually.  Every other profession lives with the specter of professional liability and lawsuits hanging over them.  Sooner or later, the "This software is not warrantied to work ever and if it does you got lucky" school of thought is going to get punished by a smart liability lawyer and a sympathetic jury.  If you can get millions for dumping hot coffee on your lap, it shouldn't be too hard to get millions for selling bad Java.

It is also entirely clear that hacking is moving away from relatively benign amateurs and into the arms of organized crime, professional terror, and foreign powers.  If for no other reason than this, we ought to be serious-as-a-heart-attack about a more systematic "right tool for the right problem" approach to software engineering.