I am of the "Old Ones". My years number sixty and one. I have spoken
with "the machine" for nigh over forty years. Think not that my
knowledge is outdated and I do not know the new ways. I live by the
consultants' sword and I can UML with the best of them.
The machines are simple things. They can but Add, Subtract, Multiply,
Divide, Compare things and move them around. Just six basic things, that
is all (and for those of us that know the metal, they can really only
Add). Much is spoken of the advances in our craft over the years, but I
say to you that little has changed. The machines still do only the same
things they did forty years ago, nothing more. True, it is on a much
grander scale and with much more speed, but these are things of the
realm of hardware, not software.
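If you doubt the old words, consider how Multiply is but Add repeated. Here
is a small sketch of the classic shift-and-add, written in C++; the names and
the shape of it are mine alone, offered only as illustration:

    #include <cstdio>

    // Multiply two numbers using only addition, comparison, and moving
    // bits around (the classic shift-and-add). Illustration only; the
    // function name is invented here.
    static unsigned multiply_by_adding(unsigned a, unsigned b) {
        unsigned product = 0;
        while (b != 0) {          // compare
            if (b & 1u) {         // compare the low bit
                product += a;     // add
            }
            a += a;               // add a to itself (doubling, no multiply)
            b >>= 1;              // move the bits along
        }
        return product;
    }

    int main() {
        std::printf("%u\n", multiply_by_adding(6u, 7u));  // prints 42
        return 0;
    }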
The basic building blocks of our trade, the data structures, the
processing algorithms were all invented many decades ago. Nothing really
new has come forth in many, many years, and yes, this includes all of the
land of OO. Open the covers of any modern-day application, compiler, or
tool and you will find only the building blocks of old.
How, then, have we come to the place we are today? A discipline
filled with "-isms" and "-ions". A discipline where even the simplest
and most basic of definitions cannot be agreed upon. The academics have
written volumes about our craft. They have attempted to define it, to
reduce it to equations and processes. Those of power and of management
rank have attempted to control it, to schedule it, and to somehow mold
it into a manufacturing-like process that they can understand. Let us
also not forget those of Marketing. They are the ones who fuel the
economic engine of our craft by re-inventing and re-naming that which
already exists into something which will be perceived as new by those
gullible enough to seek magic solutions.
Now that I have raised your eyebrows and invoked the wrath of you that
consider yourselves experts in this field, let these words fall upon
those of you that are followers of the Cult of Objects.
The basic premise of OOD is that by mapping the application problem into
a model that simulates real world objects, the problem will be better
understood and thus easier to implement and maintain. This sounds like a
good idea and there is truth here if your objective is to truly model
the problem. However, if your objective is simply to transform data from
one format to another, then such a solution is overly complex.
Modeling and simulation are for things like 3D action games, flight
simulators, etc. The majority of real application programs are but
simple data transformers. We have the data in one form and need it
translated into another. Data is like energy: it can neither be created
nor destroyed, only transformed. (Except for the time I wiped out an
entire bank of 3330s; that time it was pretty much destroyed!)
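To show what I mean by a simple data transformer, here is a small sketch in
C++: records in one shape go in, records in another shape come out, and no
object model is needed. The record layout and every name in it are invented
for illustration only:

    #include <iostream>
    #include <sstream>
    #include <string>

    // A plain data transformer: read "name,amount" lines on stdin and
    // write "AMOUNT<tab>NAME" lines on stdout. No classes, no
    // inheritance -- just data in one form translated into another.
    int main() {
        std::string line;
        while (std::getline(std::cin, line)) {
            std::istringstream fields(line);
            std::string name, amount;
            if (std::getline(fields, name, ',') && std::getline(fields, amount)) {
                std::cout << amount << '\t' << name << '\n';
            }
        }
        return 0;
    }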
OO implementations require more machine resources than procedural ones.
Anyone who tries to tell you that this can be optimized away by wise
compilers knows not the metal. It takes extra machine cycles to call
code from within an object. It takes extra cycles to process inheritance
rules when birthing a new object, and that is just the tip of the OO
performance iceberg.
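For those who would see where those cycles go, here is a sketch of the two
kinds of call side by side: the plain procedural call, whose target is fixed
when the program is built, and the call through an object, which must first
be looked up in the object's table of functions at run time. The names are
mine, and how much of this a given compiler can see through depends on the
program; take it only as illustration:

    #include <cstdio>

    // Direct, procedural call: the target is known when the program is
    // built, so the compiler emits a jump to a fixed address (or inlines it).
    static int area_rect(int w, int h) { return w * h; }

    // Virtual call: the target is found at run time by loading the
    // object's vtable pointer and then the slot for area().
    struct Shape {
        virtual ~Shape() = default;
        virtual int area() const = 0;
    };

    struct Rect : Shape {
        int w, h;
        Rect(int w_, int h_) : w(w_), h(h_) {}
        int area() const override { return w * h; }
    };

    int main() {
        std::printf("%d\n", area_rect(3, 4));   // direct call

        Rect r(3, 4);
        const Shape* s = &r;                    // dispatch through the base
        std::printf("%d\n", s->area());         // indirect (vtable) call
        return 0;
    }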
But be that as it may. These arguments are old and have been debated
without end here and in many other places. I lay before you now other
food for thought on the use of objects.
Object Oriented Design stifles creativity. History repeats itself. Nowhere
is this truer than in the field of software design and
engineering. Each time we have automated a new field by applying
computer technology, we have started with emulation and modeling. We
attempted to recreate and reproduce the process as it was presently done
manually. We created user interfaces that were familiar in vocabulary
and actions to the existing manual process. How can this be bad, you
ask? It sounds like a logical approach to me. The answer is simply this:
we usually discovered, after some pain and failed attempts, that the
computer was not very good at replacing what the person was doing
manually. The computer makes a lousy drafting table, a lousy typewriter,
and so on. It was not until we applied the unique power and
abilities of the computer to the domain problem that we achieved
success. This usually involved going outside of the real world model and
thinking abstractly about how the computer can transform and manipulate
the domain data in new ways to help the user. Only then did we achieve
the full potential of the man-machine interface.
The human being is not Object Oriented. We think in terms of tasks and
actions, results and outputs. We use objects, but only as a means to an
end. I do not go into my workshop, pick up a hammer, and then try to
decide what I could use it on. I go into my workshop with a task in
mind, to build a table, and then I choose the tools to make it so. If my
computer is my tool, why should I be constrained by real world mechanics?
Over time, our computer automation of a task evolves into something useful,
and experience brings the machine into its proper role in the process. But
I say that OOD obstructs rather than encourages this evolution.
Enough for now, it is time for me to go. I must take my medication and
it's time for I Love Lucy. I leave you with these thoughts:
It can only add, subtract, multiply, divide, compare things and move
them around, nothing more.