As usual, this crew is right behind the clue-ball, but I was
particularly tickled to consider my sage Gelernter's advice on CS grad
school right now. Guess I'm some sort of wimp unwilling to put my sorry
ass on the industrial line and dedicate the ten best years of my career
to the NFL -- at least, not after hearing the dismal view of CS the NYT
held out in Ernie's clippings!
Reactions? I have to agree with David. Only 3% should grant degrees, and
only 3% of the students in the pipeline should be here. Most would be
better off with practical educations. It's only good for the
really-long-shot dreamers, for the empire-building theses. And for god's
sake, in some POST-cold-war academic CS subdiscipline. So much of it has
been done to death already, even in such a short time. There's still a
ton of AI, compilers, numerical methods faculty out there, and not
enough how-do-we-model-reality profs. Like David.
Content-Disposition: inline; filename="gelernter.html"
THE END OF ACADEMIC COMPUTER SCIENCE AS WE KNOW IT
Copyright 1997, 1998 Mirror Worlds Technologies, Inc. All Rights Reserved.
For more information, contact Mirror Worlds.
January 1, 1998
Computer science is the study of software. This is a controversial
definition but reflects, I think, the consensus view. Hardware design is
understood ordinarily as a branch of electronics more than computer science.
Computation theory and the design and analysis of algorithms are clearly
computer science -- and are aspects of the study of software. Computation
theory is (ultimately) a theory of what software can and can't do in
principle; when you study algorithms, you are studying software construction
techniques in the abstract. Computer science is therefore software science;
for the sake of clarity, it ought to be renamed.
Computer science (or software science, or whatever) emerged as an academic
discipline in the late 1960s, at a time when the software world was
radically different from what it is today. The practical constraints on
software, the market for software and the demands made on software have all
changed dramatically. Compute cycles and memory were rare and expensive in
the 1960s. The ARPAnet was brand new. Private citizens didn't own computers
or buy software, and didn't especially want to.
The world is different today, but academic computer science is basically the
same. In the 1960s, software's goal was mainly to speed the completion of
certain well-defined scientific, military and commercial tasks. Virtually
all computer researchers had scientific backgrounds, and non-scientific
computing tasks tended to be easily defined and circumscribed (compute
bills, do something to the personnel file, sort, archive, look up). Today,
software is underfoot wherever you turn, and its goal is to make life better.
"Solve these equations and then sort those files" is a task that computer
scientists are well-equipped to understand, but "make life better" is not.
Computer researchers are no more imaginative (at best) than anyone else
where everyday life is concerned.
In the 1960s the commercial software industry was small (anyway by modern
standards) and prepared to look to the universities for intellectual
leadership. Today's software industry is huge, and so rich that it no longer
needs to look anywhere for leadership; it can hire its own leaders.
The huge money offers for which Microsoft is famous nowadays will transform
the computing world in roughly the way pro sports were transformed in the
1970s. A scientifically-talented young man who is also (say) a good football
player is crazy nowadays not to go into pro football, because the money to
be made isn't merely a lot, it's life-transforming; you emerge a few years
down the road rich enough to do whatever you want for the rest of your
career. By the same token a good programmer would be demented, nowadays, to
pursue a doctorate or (still crazier) a career in academic research; he can
go to Microsoft for a few years and emerge transformed. Which portends
farther-reaching changes in academic computer science than anything we've
seen so far. Microsoft is widely hated nowadays; a good deal of that hatred
comes down to envy, and in any case it is obviously doing the right thing by
sharing its wealth with technicians -- even if the consequences include the
end of academic computer science as we know it.
Microsoft in turn is part of a larger transformation of American life.
During the first several decades of the postwar era, hard work was
guaranteed (if you were prudent and played your cards right) to earn you a
comfortable middle-class life. Starting in the early '80s, hard work and
prudence were guaranteed to make you rich. Researchers have tended for a
long time to make less money than practitioners, but the freedom and
prestige of research used to be enough to attract top people. But in a
society where researchers and practitioners lead wholly different lives,
where practitioners are rich and researchers remain stolidly middle-class --
research will only attract a handful of fanatics. What are the consequences?
Might it all be for the best? After all, during the postwar generation we
had more researchers than we needed or could handle; maybe only fanatics
OUGHT to go into research. Maybe only a handful of universities OUGHT to
have doctoral programs.
At any rate, computer science has to change. Big changes are in order. For
what it's worth, I'll propose some in the next installment.