A New Era
Written By: evenprime
It is customary to do some reflection this time of year, and I've been doing a little thinking about Y2K. I suppose that makes sense, since it was the part of computing that got the most media coverage the past year. It looks like the date change caused very few problems, and most of those were extremely minor. Still, there are lessons that can be learned from the things that did happen.
It took a lot of time, effort and money to ensure that the date change was uneventful. One thing to learn from Y2K is that it is difficult to fix a program after it has been developed and implemented. Getting all the bugs out of a piece of software that's currently in production usually requires having an outside set of eyes look at the code, as the Social Security Administration recently found out. The application of this principle to the open source movement is evident [1], but even closed source developers can benefit from having their work audited by someone outside the development team, or better yet, outside the company. The DVD Copy Control Association has amply demonstrated the dangers [2] of trying to locate your own design flaws instead of letting someone else examine your work.
Look back at how programming has been done, and at what it has achieved. Date-related bugs were everywhere, and had to be fixed. Security bugs are still everywhere. Unchecked input to static buffers, race conditions, and programs that are installed with too many privileges are all around us. All these things come from the same source: a method of software development that focuses on immediate results. It seems like the only concern most developers have is that the program they write works today, in our current network environment, with the input they expect it to receive.
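To make that concrete, here is a minimal C sketch of the unchecked-input-to-a-static-buffer mistake next to a bounded version. The greeting routine is hypothetical, not taken from any real program:

    #include <stdio.h>
    #include <string.h>

    /* "works today", with the input we expect: anything longer than
       31 bytes silently overruns the fixed-size buffer */
    void greet_unsafe(const char *name)
    {
        char buf[32];
        strcpy(buf, name);
        printf("hello, %s\n", buf);
    }

    /* the same job, written to survive input we did not anticipate */
    void greet_safe(const char *name)
    {
        char buf[32];
        strncpy(buf, name, sizeof(buf) - 1);
        buf[sizeof(buf) - 1] = '\0';  /* strncpy may not terminate */
        printf("hello, %s\n", buf);
    }

    int main(int argc, char **argv)
    {
        greet_safe(argc > 1 ? argv[1] : "world");
        return 0;
    }

The bounded version costs two extra lines; the unsafe one is a stack overwrite waiting for the first user who feeds it a long name.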
That's a flawed way to look at software use. Y2K has taught us that the things we write will be used far longer than we expect. Users ensure that our programs will receive input that is not what we anticipated [3]. This may be true even if our intended users are not looking for bugs. :) I once wrote a user management script that, because it did not check operator input, was capable of preventing the entire user population from getting to applications necessary for their jobs. A beginner's mistake, but one that showed me how important it is to design programs so that they fail gracefully.
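That script is long gone, but the check it was missing would have looked something like this. The names and the validation rules here are mine, purely illustrative:

    #include <ctype.h>
    #include <stdio.h>
    #include <string.h>

    /* reject empty input, wildcards and shell metacharacters
       before touching anyone's account */
    int valid_username(const char *name)
    {
        size_t i, len = strlen(name);

        if (len == 0 || len > 32)
            return 0;
        for (i = 0; i < len; i++)
            if (!isalnum((unsigned char)name[i]) && name[i] != '_')
                return 0;
        return 1;
    }

    int main(void)
    {
        char name[64];

        printf("account to disable: ");
        fflush(stdout);
        if (!fgets(name, sizeof(name), stdin))
            return 1;
        name[strcspn(name, "\n")] = '\0';   /* strip the newline */

        if (!valid_username(name)) {
            fprintf(stderr, "refusing to act on '%s'\n", name);
            return 1;   /* fail gracefully: do nothing at all */
        }
        printf("would disable account '%s' here\n", name);
        return 0;
    }

Failing gracefully just means the default outcome of bad input is "nothing happens," not "everyone is locked out."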
The software problems we have are not new. Lions wrote about race conditions back in 1977 [4]. Dr. Mudge was writing about buffer overflows back in 1995 [5]. Where has this gotten us? Last week bugtraq readers were informed of a root compromise via a race condition, and there were six security-related buffer overflows. There are tools [6] and techniques [7] out there to assist in secure programming, but very few people use them, so we keep seeing the same types of mistakes.
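The race condition class Lions described still shows up today as the classic check-then-use window. Here is a minimal illustration in C, a deliberately simplified sketch of a privileged (setuid) program:

    #include <fcntl.h>
    #include <unistd.h>

    int main(void)
    {
        /* time-of-check to time-of-use: access() checks the real
           user's permission, but an attacker who swaps /tmp/report
           for a symlink between the access() and the open() gets a
           privileged write to a file the real user could never touch */
        if (access("/tmp/report", W_OK) == 0) {
            int fd = open("/tmp/report", O_WRONLY | O_APPEND);
            if (fd >= 0) {
                write(fd, "ok\n", 3);
                close(fd);
            }
        }
        return 0;
    }

The usual repair is to stop checking first and trusting the answer later: drop privileges before the open(), or open the file and then verify what was actually opened.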
Politicians have noticed the net, and they tend to think it is fairly important stuff. They have been tossing around terms like "Information Superhighway". Presidential Directives [8] have declared computer networks to be part of "America's Critical Infrastructure". The FBI has set up the National Infrastructure Protection Center to guard our networks. Infrastructures are things that are built to last, and when people begin comparing our computer programs to them, we ought to consider the assumptions being made by the users. The highway analogy is kind of interesting; the engineers responsible for highways add safety berms and guard rails to their designs, and they don't run the roads over quicksand. They try to incorporate safety into the design while it is still in the planning stages.
If the rest of the world thinks that we are designing an infrastructure, this industry needs to step back and look at what it is doing. Y2K has taught us that we may be using today's programs for a long, long time, so perhaps we should begin to develop with a different emphasis. This is a good time to consider abandoning the "functionality first" way of doing things and adopting a "durability first" mindset.
After all, a new millennium seems like a good time to begin a new era of software development.
1. "Open source keeps designers honest. By depriving them of the crutch
of obscurity, it forces them towards using methods that are provably
secure not only against known attacks but against all possible attacks
by an intruder with full knowledge of the system and its source code.
This is real security, the kind cryptographers and other professional
paranoids respect." - ESR
http://www.tuxedo.org/~esr/writings/quake-cheats.html
http://www.tuxedo.org/~esr/writings/
cathedral-bazaar/cathedral-bazaar.html
2. "The lesson: This is yet another example of an industry meeting in
secret and designing a proprietary encryption algorithm and protocol
that ends up being embarrassingly weak. I never understand why people
don't use open, published, trusted encryption algorithms and
protocols. They're always better." - Bruce Schneier
http://www.counterpane.com/crypto-gram-9911.html
#DVDEncryptionBroken
3. "Security engineering involves making sure things do not fail in the
presence of an intelligent and malicious
adversary who forces faults at precisely the worst time and in
precisely the worst way." - Bruce Schneier
http://www.counterpane.com/crypto-gram-9911.html
#WhyComputersareInsecure
4. "The code for swap has a number of interesting features. In
   particular it displays in microcosm the problems of race conditions
   when several processes are running together.... What happens next
   depends on the order in which process A and process B are
   reactivated. (Since they both have the same priority, 'PSWP', it is
   a toss-up which goes first.)" - Lions, J., 1977. p. 15-2,
   "A commentary on the UNIX operating system"
5. http://vapid.dhs.org/Library/bufferov.html
6. http://www.l0pht.com/slint.html
7. http://www.unixpower.org/security/
8. http://www.fas.org/irp/offdocs/pdd/index.html (#62 & #63)