Quick scan-through of the PITAC research recommendations

Rohit Khare (rohit@uci.edu)
Fri, 18 Sep 1998 16:02:49 -0700

There's a lot of good bits at http://www.ccic.gov/ac/ that went into the
Latest, Greatest Meta-Exhortation of More Money for Us (TM), the President's
Information Technology Advisory Committee Interim Report,
http://www.ccic.gov/ac/interim/interim_report.pdf . In this memo, I've
excerpted what I found interesting, personally, interspersed with my
commentary in []s. -- Rohit

PS. Not that I found this report in the first place -- my advisor, Dick
Taylor cc:d the group.


3.1.3 Major Recommendation:
*Make fundamental software research an absolute priority.*

Recommendation: Fund more fundamental research in software
development methods and component technologies.

The Committee recommends that research in software methods,
especially in the area of automated support for software development
and maintenance, be aggressively pursued. Such research should
explore and create:

* component-based software design and production
techniques, and the scientific and technological foundations
needed for a software component industry

* techniques for using measurably reliable components and
their aggregation into predictably reliable and fault-tolerant systems

* theories, languages and tools that support automated
analysis, simulation, and testing of components and their
aggregation into systems

* techniques for aggregating provably secure components into
provably secure systems

* standardized protocols and data structures to promote
interoperability of applications running in parallel across
wide-area networks
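[A toy illustration of the "measurably reliable components, predictably reliable systems" idea above: the classic reliability arithmetic, with made-up numbers. Mine, not the report's.]

```python
from math import comb

def series_reliability(rs):
    """System that fails if ANY component fails: multiply reliabilities."""
    out = 1.0
    for r in rs:
        out *= r
    return out

def k_of_n_reliability(r, k, n):
    """System that works if at least k of n identical components work."""
    return sum(comb(n, i) * r**i * (1 - r)**(n - i) for i in range(k, n + 1))

# Made-up numbers: three components, each measured at 99% reliable.
print(series_reliability([0.99, 0.99, 0.99]))  # series composition: ~0.9703
print(k_of_n_reliability(0.99, 2, 3))          # 2-of-3 redundancy: ~0.9997
```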

[and here's one that Adam and I recall painfully piloting in the early 90s:]

Recommendation: Sponsor a national library of software
components in subject area domains.

The Committee recommends that a program be established -- based
on the recommended research on software development methods and
component technologies -- to create a National Electronic Library of
reusable software components in areas useful for Science and
Engineering research and education. This library would initially be a
way to test components and component technologies. The Committee
expects that standards will develop based on technologies for robust
software and will enable widespread registration and sharing of
software. In this way, the library can evolve into a widely used
resource. The Committee is fully aware of previous attempts to
establish national software libraries as well as the reasons for their
lack of success. However, the Committee feels that this is still a
desirable goal and recommends that funding be directed to address
the technical problems associated with the previous failures.

[And here's the finding they're misinterpreting:]

Finding: The Internet has grown well beyond the intent of its
original designers.

The Internet, which connected 2,000 computers in 1985, now
connects 30 million computers, and is continuing to double in size
every year. By the end of 1997, it was estimated that more than 100
million people worldwide were using the Internet. The number of
Internet users worldwide could surpass one billion as early as 2005.
In addition to growing in terms of people accessing the Internet, the
Internet is growing in terms of the types of services provided over the
network. Satellite and wireless systems will soon provide users with
"anytime, anywhere" communications. Directory and search services
help users locate important resources on the Internet. Electronic mail
servers manage and store critical information. Authentication and
electronic payment services handle more and more of the Nation's
commerce. Building blocks for new applications are being developed.
Examples include digital signatures, secure transactions, modeling
and simulation software, shared virtual environments for
collaboration, intelligent agents, tools for discovering and retrieving
information, speech recognition, and low-cost networked sensors.
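[Sanity-checking the growth figures above with my own arithmetic -- the report's numbers, my compounding:]

```python
# Report: 2,000 hosts in 1985, 30 million "now" (1998), doubling yearly.
hosts_1985 = 2_000
doublings = 1998 - 1985
print(f"{hosts_1985 * 2**doublings:,}")  # 16,384,000 -- 30M means a bit faster than doubling

# Report: 100M users at end of 1997, "could surpass one billion as early as 2005".
# That needs only ~33% annual growth, well under doubling:
annual = (1_000_000_000 / 100_000_000) ** (1 / (2005 - 1997)) - 1
print(f"{annual:.0%}")
```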

[Because we should use that growth to think of very different ways to go
about networking instead... something more biological]

Finding: We cannot safely extend what we currently know to
more complex systems.

The continued growth of the Internet -- in terms of the number of
users, the proliferation of new and more demanding services, changes
in underlying technology, and the growing heterogeneity of the
networks and applications -- greatly increases the complexity of the
infrastructure. For instance, the largest packet network ever built is
many orders of magnitude less complex than what must be built to
accommodate the anticipated number of users and services. We
cannot safely extend current technology to new networks that are
orders of magnitude more complex and that can carry many more
kinds of traffic, including voice, and expect to achieve the kind of
quality and reliability represented by today's U.S. telephone systems.
While we understand how to build some, but not all, of these
individual elements of the future infrastructure, we do not know how
to make these elements work together in a reliable, efficient, and
robust way. Similar problems will arise when we try to build scalable
services, say, a search engine that can index terabytes of data or a
payment service that can process millions of transactions per second.
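[Back-of-envelope on what "millions of transactions per second" implies; the per-transaction cost below is an assumption of mine, not the Committee's:]

```python
# Assumed figures (mine): a payment service targeting 5M transactions/sec,
# with each transaction costing 2 ms of server time.
target_tps = 5_000_000
per_txn_ms = 2.0
tps_per_server = 1000 / per_txn_ms          # 500 transactions/sec per server
servers = target_tps / tps_per_server
print(servers)  # 10000.0 -- ten thousand servers, before any redundancy
```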

[They classify the future, you see, as a mere matter of scaling the current
system... and I'd rather bet there's something totally new to do. But that's
not the job of a Presidential commission, the loony thinking...]

To support the growing demand and dependence on the information
infrastructure, advances are needed in at least five major dimensions:

* Scaling to provide robust, high-speed access, with assured
quality of service when required. These advances will
improve the quality of interaction.

* Scaling to provide multi-faceted access. This scaling will
create new ways for people to connect.

* Scaling to provide ubiquitous access. These advances will
increase the number of people with continuous access to
the infrastructure.
* Scaling of the infrastructure services to reliably handle many
users and requests. These services include authentication,
resource directories, search engines, banking, and many
others. Advances in this area will improve the quality of these services.

* Scaling of the security infrastructure to safeguard intellectual
property rights, to protect against all types of failures or
attacks, and to provide privacy of access when needed. These
advances will make information and the infrastructure more trustworthy.

[So that kind of thinking leads them to a very timid agenda indeed: bringing
more short-term players on board an already only-short-term-focused program:]

Recommendation: Expand the Next Generation Internet testbeds
to include additional industry partnerships in order to foster the
rapid commercialization and deployment of enabling technologies.

[Hey, the 3% rule resurfaces! I'm worried about the decline, but not the
absolute shares: not everyone needs a CS degree to work with computers. I'm
much more heartened by the supply of "computerized" specialists in other
fields.]

The academic pipeline and re-training efforts are deficient. Since
1991 the rate of growth in computer science and electrical
engineering doctorates has decreased. The annual number of
doctorates granted in computer science, in particular, not only
stagnated during the 1990s, but is down about 10 percent from the
peak in 1991-92. The short-term outlook shows no immediate
improvement. At a time when we should encourage students to
continue graduate study, electrical engineering and computer science
majors are among the least likely to pursue a Ph.D. In comparison to
the biological and health sciences, electrical engineering and
computer science turn out about half as many doctorates. Computer
science and electrical engineering together account for only about 5
out of every 100 master's degrees, and 6 out of 100 doctorates
granted each year. In addition, an increasing percentage of these
graduate students are foreign students.

[I find it much more telling that the report says virtually nothing about the
ABYSMAL minority participation record, though. Go to any IETF meeting to see
what that means: graying white male beavers, all. So they minimize the dire
nature -- *falling* participation rates -- and then relegate discussion to a
much less prominent amount of real estate than, say, the need to prop up
ailing supercomputer manufacturers]

Recommendation: Encourage increased participation by women
and minorities.

To remain competitive in a global economy, we need to ensure that
every American emerges from school with the general and specific
skills needed to prosper in an information rich society. Current
studies show that women and minorities are vastly underrepresented
in both educational and workplace settings which require the
development and/or use of information technology skills. Our Nation
will not prosper if we do not invest in developing all our human resources.

[how do we get more grad students, then? More toys...]

Although salary is a major factor in a
student's decision not to go to graduate school, an equally important
factor is the perception that universities are no longer the place where
the most exciting work is being done. Substantial increases in funding
for long-term research would change that.

[Sorry, but I don't think any amount of money is going to bring some of the
more exciting industrial ideas back into the lab... e.g. Jim Waldo]

[Unfortunately, their proposal for new work is really uninspiring, to me
anyway. And what's with the belief that all these multiple-PI projects work
so well, anyway? :-)]

...should be expanded emphasis on support for multiple-investigator
teams working on a single integrated project over a number of years.
This model has been successfully used by DARPA for many years. In
addition, to foster research that can have truly dramatic impacts, the
Committee recommends the creation of two types of center-sized
activities. "Expeditions into the 21st Century" will involve large
teams of researchers in explorations of future information
technologies and their impact on society. "Enabling Technology
Centers" will conduct research on the application of information
technology to particular problems of national importance.

[Oh, I see, it's part of the usual fallacy about the "good old days"! I'd
like to see the backup documentation that all was so well in the land of
milk and honey. In fact, it was closer to a cartel, with only a dozen DARPA
campuses, at most. And I'd like to see some cost-effectiveness studies. My
ad-hominem attack of the day? I'd like to see what we bought with AI and
speech bucks.]

This approach was used with great success by DARPA during the
1970s and 1980s, when teams of computer science researchers were
encouraged to imagine and explore dramatically different futures.
These teams were given enough resources and time so they could
concentrate on the problem rather than worry about their next
proposal. The results were dramatic advances in artificial intelligence,
speech recognition, robotics, chip design, high-performance
computing, machine vision, and virtual reality. It is this spirit that the
Committee would like to see reborn and replicated across all Federal
funding programs for information technology. DARPA's role as a
funder of innovative, high risk initiatives in information technology
should be restored as part of a balanced R&D program.

[A converse vision: focused grants to support individual tools and specs and
projects -- like SBIRs, but for academia.

It'd be a rolling grant board that offered fixed 50k (one-student) and
multi-year (250k) packages to seed ideas like Apache or WebDAV -- a *venture
capital* mindset! So anyone that has a software tool idea could just as well
pitch it to this 'public-domain' VC -- and customers, too. After all, that's
what VCs do: they sniff out new companies they sense demand for as well as
read blind submissions.

Why *not* give Kleiner Perkins $10M/year to invest in academic CS?... ]
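[What that $10M/year buys under the 50k/250k packages above -- e.g. splitting the budget evenly between the two instruments, which is my assumption:]

```python
budget_per_year = 10_000_000
seed, package = 50_000, 250_000   # one-student seed vs. multi-year package
half = budget_per_year // 2       # assumed even split between instruments
print(half // seed, "seed grants per year")      # 100
print(half // package, "multi-year packages")    # 20
```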

[So here's the really wacky windup... and if they really do come to pass,
where can I find Sacajawea?]

Fund virtual centers for Expeditions into the 21st Century.

"Expeditions into the 21st Century" will be virtual centers that bring
together scientists, engineers, and computer scientists from academia,
government, and industry to "live in the technological future." The
mission of these expeditions will be to report back to the Nation what
could be accomplished by using technologies that are quantitatively
and qualitatively more powerful than those available today. In
essence, these centers will create "time machines" to enable the early
exploration of technologies that would otherwise be beyond reach for
many years. Just as the Lewis and Clark expedition opened up our
Nation and led to unanticipated expansion and economic growth, the
ideas pursued by information technology expeditions could lead to
unexpected results and nourish the industry of the future, creating
jobs and benefits for the entire Nation.

There are a number of precedents for this "living in the future"
approach. In the private sector one of the most famous examples is
the Xerox Palo Alto Research Center (PARC), where researchers
created an experimental network of computers for use by individuals.
This effort pioneered many of the revolutionary technologies that led
to today's personal computers, including graphical user interfaces,
pointing devices, laser printing, distributed file systems, and
WYSIWYG word processing. In the university community, the
Massachusetts Institute of Technology (MIT) Media Lab has been
conducting similar explorations. Finally there is the example of
ARPAnet, which evolved into today's Internet.

The Committee recommends funding several Expeditions, each with
a different focus. The focus may be on either a discipline-based
theme, such as bioinformatics or multi-scale engineering, or on an
infrastructure-based theme, such as distributed databases or tele-
immersion. To establish a context, each Expedition should be based
on assumptions not true today, for example, ubiquitous computing or
a vast amount of simulation (à la Gelernter's Mirror Worlds). Each
center need not be limited to a single such assumption, but an
Expedition should invest sufficient resources to make exploration of
its assumption areas, those parts of the map of the future, possible.

...The full term of an Expedition would be ten
years. To encourage truly aggressive efforts, very high annual funding
levels should be possible, say up to $40 million per center.

[Would Drs. Byars, Rifkin, and Bolcer like to join me in an Expedition to
the Continuously Entertained Future (a/k/a Bread and Circuses?) All media,
all the time...]

[The other program they propose is more prosaic. They might as well just
admit it and relabel them "Digital Extension Service", after the USDA's Ag
Ex Svc for farmers, and the SBA's ill-conceived one for light manufacturing]

Establish a program of Enabling Technology Centers. [They propose 15 x 10yr
x $10M/yr]

The Committee recommends establishment of centers of excellence
in computer science and engineering applied to particular applications
of information and communications technology. These Enabling
Technology Centers (ETCs), located at university and/or Federal
research institutes, will provide an integrated environment for
academia, industry, and Government to focus on the application of
next-generation IT to important national problems. There is a wide
variety of applications domains where information and
communications technologies could make a difference, including:
computational science and engineering; health care; delivery of
Government services/Digital Government; crisis management;
environmental monitoring; life-long learning; law enforcement and
public safety; arts, culture, and the humanities; intelligent
transportation systems; improving the quality of life for persons with
disabilities; and distributed work (e.g. telecommuting, collaboration
by geographically distributed teams).

[Now *someone* just won a major bureaucratic war here... doesn't this sound
a little inconsistent with the ringing endorsement of DARPA a while back?]

Designate NSF as the lead Federal agency to coordinate
information technology research.

Designate the NSF to serve as the lead organization for coordinating
information technology research within the Federal Government. This
may require institutional innovations internal to the NSF to ensure
that NSF is responsible for defining and coordinating a broad range of
modes of research support, such as centers of diverse sizes and
multiple-investigator projects with longer terms. Roughly half of the
proposed budget increases for information technology should go to
NSF with the rest allocated to other research support agencies. The
majority of the NSF increase should go to the new modes of funding;
the rest should go to the traditional style programs within Computer
and Information Science and Engineering, expanded as appropriate to
projects of larger size and longer duration.

[But the reality is that NSF is much easier to support politically than a 1)
Defense-centric and 2) elitist program that concentrates the wealth (still)
on only a few institutions and Congressional districts]

[Finally, their answer for better oversight is not "more talented program
managers" or "a central clearinghouse", but more layers:]

Expand the current coordination mechanisms used for HPCC and
NGI -- a Federal coordination committee with staff from a national
coordination office and topical working groups, with oversight by a
Presidential advisory committee -- to the entire information
technology research endeavor.

[The conclusion points to a report that I'm going to read next.
The other really fun publication this year was a CSTB report on the
Evolution of Untethered Communications:
http://www.nap.edu/readingroom/books/evolution/ ]

The initiatives presented in this report reinforce the recommendations
made in the Brooks-Sutherland report Evolving the High
Performance Computing and Communications Initiative to Support
the Nation's Information Infrastructure.

[Once more, "evolution" shows up. I really want to see more principled
treatment of what it means to "evolve" a sociotechnical artifact, anyway...
good thing the CORPS theorists are the next door down. I really do think
there's a landmark evolution-of-IT paper to be written.]