[BITS] ICS 280 Paper: The Evolution of the W3C

Rohit Khare (rohit@bordeaux.ICS.uci.edu)
Thu, 19 Mar 1998 14:57:02 -0800

[Feel free to circulate this draft within w3t, webheads -- but keep this
url attached and send me comments! Apologies for the sketchiness of Sec 4.

http://www.ics.uci.edu/~rohit/w3c-evol --RK]


The Evolution of the
World Wide Web Consortium

by Rohit Khare, March 19, 1998
for ICS 280 (Institutional Aspects of Computing -- Theory of Compatibility)



The World Wide Web Consortium (W3C) has developed a novel
organizational form as it attempts to "lead the evolution of the
Web" -- equal parts academic lab, industrial research &
development consortium, trade association, and standards body. In
this paper, we trace the history of W3C's adaptations in structure
and process to accommodate the shifting opportunities of the Web.


1. Introduction

"Realizing the Full Potential of the World Wide Web"
"Leading the Web to its Full Potential"
"Leading the Evolution of the World Wide Web"

In August 1995, about a year after its founding, the World Wide Web
Consortium (W3C, née W3O) held its first all-hands staff meeting. In
addition to a call for a formal document review process and ratifying a
nascent divisional structure, the team debated a new mission statement. Two
days of debate kept returning to the theme of evolution, of guidance, of
shepherding the growth of 'our baby'. In some ways, the mission statement is
the justification, writ large, of the staff's individual decisions to work
for a non-profit consortium rather than for its corporate members.

It is quite logical to draw a straight line from its members' self-image to
its organizational mission because, in the end, the Web Consortium, with all
its hundreds of members, exists by virtue of the staff's decision to pledge
allegiance to Director Tim Berners-Lee, not by any legislative or market
mandate. Organizational psychologists sometimes speak of the life of an
organization as separate from its constituents, shaped by the needs of
society and its objective environment. W3C's birth, though, is not so
immaculate -- it is the story of thirty-odd individual commitments. This
paper attempts to outline the evolution of a standards body from those
commitments.

1.1 History

The usual creation myth of the Web traces its birthplace to CERN in Geneva,
and its early adoption to the High Energy Physics and NeXTstep developer
community ("As a high energy physicist I can only wish that we had been
smarter, and, instead of having people type WWW when they want to surf the
Internet, we had them type HEP; perhaps we would have bigger budgets now." --
Burton Richter, SLAC Director). Other institutions played a key role in
sparking its explosive adoption, though -- notably NCSA, which had laid the
groundwork for a first international Web conference of its own just days
after CERN's (the immediate outcome was the creation of an international
conference committee, which sponsored the first conference in Geneva and the
second in Chicago).

Nevertheless, even as other institutions were shifting the center of gravity
away from CERN, CERN itself was pushing the Web out to make room in its
limited budget for new physics experiments. An international bidding war of
sorts broke out behind the scenes, and the winner of inventor Berners-Lee's
affections was MIT. The Laboratory for Computer Science (LCS) offered to
replicate its widely-praised model of academic-industrial IT R&D innovation,
the X Consortium. LCS began by adopting its membership structure, policies,
even some of its original administrative staff.

US Government funding sponsored the US bid, but the European Community was
similarly committed to its continental aspirations. At the same time, W3C
announced a second agreement with host site INRIA, at three locations in
France. A year later, Japanese membership growth and local research funding
catalyzed a third site at Keio University in Tokyo (which came on line just
last Fall). Future expansion may aim at Southeast Asia, Oceania, and
upgrading a secondary center at the Rutherford Appleton Laboratory in the
UK. The stated criteria are a sufficient number of local Full Members and a
research organization to act as host site; state support should be
considered an implicit criterion as well.

1.2 Structural and Procedural Adaptations

In its ten years, the X Consortium shepherded the growth of a single product
with direct financial and technical support from its member corporations.
The code base had a gravitational effect, grounding all of its activities in
the wire protocol and server implementations the X Consortium "owned". Its
governance placed final authority in Director Robert Scheifler's hands, even
after it spun out of MIT.

W3C initially envisioned its code base, LibWWW, in the same light, but
quickly had to move on because it "owned" so much less -- even something as
fundamental as HTTP was an IETF product. Furthermore, Web-related
specifications are more independent of each other, so W3C's distributed
teams did not have a common 'heartbeat' of new versions. It split into three
Domains -- User Interface, Architecture, and Technology & Society -- and
developed a new process for approving specifications rather than code
releases. Its goals shifted from writing-specs to
leading-groups-that-write-specs, from chasing ambulances (HTML unification,
Web Security) to industry leadership (XML, Web Accessibility Initiative).

1.3 The Evolution of Evolution

Finally, W3C institutionalized its commitment to evolutionary change. In the
history of ideas, information technology is associated with revolution --
like the very rise of the Web itself. Military metaphors in the trade press
herald the arrival of new innovations on the IT battlefield; total change is
the message of the week from consultants and authors and journalists.

While it's true that the arrival of new 'technological frames' or
'paradigms' appears to be the most significant source of innovation, as
[Dosi] points out, as many of the gains to society appear as evolutionary
developments within the frame as outside it. Of course, that does not mean
individual products are never toppled: consider the creation of XML and CSS
to relieve pressure on HTML in the [Hanseth, et al] model of "anticipating
and alternating flexibility."

Several of the 'revolutionaries' at W3C wouldn't even claim the Web is a
shift; they see it emerging as the natural consequence of accreted learning
in several fields: hypertext user interface, markup languages, and naming
schemes. Nevertheless, putting across the key idea of Universal Resource
Identifiers required bold, simultaneous innovation on all three fronts. W3C
does not exist to catalyze another Big Bang; it reflects the rise of a new,
eco-systemic worldview per [Kelly], where individuals no longer make
Promethean changes, nor even benevolently shepherd from above, but merely
posit new species, trying to occupy new niches and hoping it all works out.
Evolutionary development not only described W3C's political agenda
(understanding the 'food chain' well enough to focus on strategic
specifications as precursors), but also its technological agenda (building
extensibility into its products), and its very self-image -- a new
organization defending its niche against encroaching corporations, SDOs
(ISO, IETF), and consortia (OMG, Open Group).

2. Structure

As the cliché has it, no one owns the Web -- and it's unclear anyone can.
W3C, however, wanted to establish itself at the center of a rapidly
expanding universe. Clearly, a reference product suite of 'official' clients
and servers would not suffice, given that W3C controlled none of the popular
tools except the soon-orphaned 'CERN server.' The market was racing ahead of
W3C in mapping out this space and in attempting to anoint de facto
standards.



[Radial diagram labels: Java..., HTTP, Amaya, HTML, CSS; Guidelines, Aural,
XML-Link, Privacy, Crypto -- the layout does not survive in plain text.]

Table 1: a selection of W3C and non-W3C activity areas arranged radially by
relevance to W3C mission and resources. W3C-led efforts are underlined; core
Web technologies are italicized.

W3C faced four challenges in organizing itself to achieve such centrality:
recruiting members, establishing host sites, outlining its range of action,
and deciding what to produce. Where W3C decided on supply-side membership,
globally distributed academic hosts, technical focus, and standards, it is
worth contrasting it with similar organizations. CommerceNet was founded but
six months earlier as a Silicon Valley partnership that rapidly grew into a
user-focused, multi-local corporate office, commercial policy-focused, and
pilot-project driven consortium. X Consortium also had a supply-side focus,
but basically remained in Cambridge, had an internal technical agenda, and
produced software products. The Open Group, born of the merger between
X/Open and the Open Software Foundation, requires a much higher level of
commitment, from a smaller number of supply-side members, is geographically
centralized at a few sites, has a member-driven technical agenda, and
produces a mix of products and pilots, as well as a branding program. The
Object Management Group has a range of commitment levels, resulting in
broad-based membership; has its own central corporate offices, focuses on
distributed-object programming standards, and produces specifications.

              Membership       Support      Sites          Agenda         Products

W3C           SW producers,    $5-50k;      Global;        Web market;    Specifications,
              some users       staff        academic       technical      sample code

X Consortium  Developers,      O($100k)?;   Centralized;   GUI market;    Reference code
              HW vendors       staff loan   initially      technical
                                            academic

CommerceNet   Users            $3-35k       Multi-local    Commercial     Demonstrations

Open Group    Large vendors    $100k-1m+    Centralized    Vendor-set;    Code, pilots,
                                                           research arm   branding

OMG           Users & devs     $100-100k    Centralized    OOP market     CORBA specs

2.1 Membership

Tim Berners-Lee floated the idea of W3O at the first Web conference as an
institutional imperative: he saw a need for an independent, neutral home for
the Web in the face of increasing pressure from the physics-focus at CERN
and the urgency of his corporate callers. His decision not to sign on with a
particular company was arguably a necessary -- and sufficient -- condition
to establish W3C. Funding, and more importantly, commitment to W3C's
principles and products, would come from an assortment of corporate and
research Members.

Actually putting that plan into place revealed a welter of new issues to be
resolved as W3C solicited new members around the world. Initially, there was
a broad concept of membership; the Object Management Group (of CORBA fame)
was already in those days the largest software consortium in the field, with
hundreds of diverse members. Users, content providers, and affected
industries like banks, as well as the whole slate of research institutions
were vaguely expected to sign up.

The initial wave, logically enough, was limited to the major producers of
Web-related software, and the major hardware manufacturers who could afford
it. Dues were fixed at $5k or $50k per year in 3-year subscriptions
depending on sales volume (threshold of $50m). Those amounts, however low,
were a significant deterrent to research institutions, especially in North
America. The time commitment was as significant a barrier as the dollar
value, especially for startup companies. Consultants were completely
ineligible, since voting was tied to membership. OMG, by contrast, has
several layers of participation with separate levels of voting privileges.

This exclusive focus on the supply-side of the Web market quickly reached
critical mass. These members called for, and supported, W3C projects that
created value for producers, like brokering peace agreements in the HTML
wars, while leaving competitive areas like server-extension hooks
fragmented. Combined with the tastes of the technical staff, W3C remained
focused on innovative specifications and tools, not usage guidelines,
certification and branding, or evangelism. Some initial members like MCI and
BT later quit W3C because they were not seeing benefits from being involved
so far upstream. In Europe, where many more 'user' industries were canvassed
(Dassault Aviation, Électricité de France, and so on), user demand and EC
contract mandates led to UseWEB (eventually spun off with two W3C/INRIA
staffers), and later W3C-LA (Leveraging Activities). These efforts, much
like the later arrival of a Promotion & Dissemination (read: PR) Domain,
were distinctly second-class citizens in W3C culture. To this day, W3C does
not, by and large, represent people who create, finance, and deliver Web
content: content providers, MIS departments, or ISPs.

Though there have been financial limits to W3C's growth, especially with the
expiry of some of the early Government support and the relatively high
percentage of new revenue siphoned off by the host institutions, the dues
schedule has not been revised. Clearly, W3C's demonstrated value could merit
a rate hike, or the creation of a new tier, so it's instructive to ask why.
First, there are indeed hidden tiers. Members earn political capital by
sponsoring meetings, catering receptions, and seconding staff to W3C (once,
even in lieu of dues). In return, these Members are better informed about
W3C activities and have better informal access to the staff. Politically,
W3C is also in a position to barter their support in one activity area for
another ("log-rolling"). Note that these mechanisms are basically only
valuable to producers. Second, the agreements with the Host sites limit the
scale of W3C's operations, as discussed in the next section.

W3C has not taken the form of a research & development consortium.
Its leadership chose not to explicitly represent the interests of a few
major industry players and accept significant backing in return, like, say,
the Microelectronics and Computer Technology Corporation (MCC), or more distantly
related models like Sematech or EPRI (Electric Power Research Institute)
[Corey]. They also did not see themselves as a fully-independent think-tank
like the Bootstrap Institute.

2.2 Host Sites

Instead, there appears to be an institutional commitment to remaining
ensconced in academia. Part of the rationale is ongoing feedback from fellow
researchers (although there have not been systematic talk series or graduate
research sponsors); another part is the appeal to its technical staff, who
are willing to accept below-market salaries in exchange for an academic
appointment. In return, the imprimatur of the host site attaches to W3C's
products, as demonstrated by the profusion of articles citing MIT instead of
W3C.

There are limits to operating within this environment, too. Aside from
bureaucratic encumbrances like Personnel and Accounting, note that while
multi-million dollar grants are seen as plums, a multi-million dollar
"business" operating out of an academic department is not. It may be easier
to take in more money, but it is not clear how W3C might spend it --
business-class tickets, PR firm retainers, and graphic artists are not
socially acceptable expenses. And top salaries for talented technologists,
much less 'politicians' (leaders and negotiators) are out of the question.
Instead, personnel are expected to be extremely committed (there are
official T-Shirts for the World Wide Web Widows...) and precisely neutral,
foregoing much of the consulting and lecture circuit.

W3C is also committed to an international presence, fittingly enough. Not
merely because it's an international phenomenon [Malamud], but also because
there's a notion that national policy is at stake. Governments, and their
'national-champion' corporations, lobby for creating local centers,
following the observation of [Kindleberger] that "it is hard to think of
international standards that did not start out as the public good of some
particular country." This may be less true of IT standardization, but it's
certainly the case that we don't look to Geneva for Web standards anymore --
not because W3C moved to the US, but because the Web industry moved to the US.
[Though one might ask, why not a California office?, and the answer might
lie in personal preferences and MIT pride.]

Globalization does not mean offices are interchangeable, though. While W3C's
organizational support and meetings are globally distributed, the work often
splits across divisional and national lines (for example, much of UI is in
France and Architecture in the US). There are also nationally-driven
agendas. The Japanese branch is most interested in internationalization and
localization of the specs, as well as color fidelity, because of the
involvement of display and printer manufacturers. Europe has been the site
of W3C's user outreach activities. The US, as the leader in passing Internet
indecency laws, has been the leading site of content-rating and policy
development. One might wryly expect an Oceanian office would be keenly
interested in compression and caching...

2.3 Range

Already in our discussion, we have outlined several areas as 'what W3C does'
and 'what W3C doesn't.' Member and Public Relations are one example of the
latter from the section above; these functions were supported fitfully by
the administrative staff for almost a year and a half before professional
support arrived. HTML specifications, on the other hand, are entirely in
W3C's hands by now, with eventual ISO ratification trailing behind by years.
There are several rules which are invoked to help draw a boundary around
W3C's bailiwick, presented in order of (perceived) importance.

The first principle puts W3C's self-image on the stand: W3C should do what
W3C's people are uniquely equipped to do in the world. They're not the best
marketers or educators; they're a technical staff. That means W3C's
interests are in code, specification writing, specification editing, and
working group chairing, almost in that order. Almost never training, analyst
briefings, or representing the Web industry; public policy only rarely, and
that only to protest that W3C works on technical mechanisms, not policy
(cf. PICS testimony in the CDA trial; the Platform for Privacy Preferences
Project (P3P) at FTC hearings). Similarly, W3C has not taken a role in the
WWW conference
series, minimizing its involvement to a few cross-posted staffers on the
IW3C2 committee.

Second, W3C's roots in IETF culture guide it toward focusing on the
bits-on-the-wire and bits-on-disk rather than Application Programming
Interfaces. It is more economical to specify the declarative interpretation
of bits than describing the equivalent operational narrative. Hence, W3C's
interest in mobile code and scripting, reflected in the Document Object
Model (DOM) hooks, but not in standardizing Java (Sun became an ISO PAS) or
JavaScript (now ECMAScript). Hence, too, W3C's proposal of an extensible
format for HTTP messages (PEP) rather than a unified server API.

Third, W3C is bounded by territories already staked out by others. While
HTML migrated from IETF to W3C because IETF's agenda includes protocols but
rarely data formats and because the politics of that group required a closed
forum, HTTP has not, and HTTP-NG will probably not, either. In the security
arena, W3C offered to sponsor a quick, closed negotiation on Transport Layer
Security (TLS), but the participants chose the longer, open route at IETF
nonetheless: they had greater credibility in the matter. At the same time, a
topic might fall within another organization's space, but not in time. XML became
a W3C effort and not an ISO WebSGML committee in part because no one
believed ISO could function as quickly as W3C. Proprietary technology can
also hem W3C in: competing font-encoding technologies have stymied W3C in
that area, as have submarine patent disclosures affecting push distribution.

Fourth, W3C does occasionally have the freedom to draw a line in the sand
over matters of architectural elegance (technical judgment). When Netscape
released frames without prior warning, the staff led the working group to
decline standardizing that mechanism in favor of the nascent "layers" approach
within Cascading Style Sheets, Level 1 (CSS1). Web authoring and versioning
required protocol support for uploading documents as well as metadata
formats for describing sites, pages, and properties; W3C decided to focus on
common metadata solutions and left WebDAV to IETF by default.

Finally, Member demands shape the agenda, too. While day-to-day operations
are mostly reacting to shifts in the Web market, over the longer term user
members in Europe and Asia successfully lobbied for W3C-LA and better
outreach to promote electronic commerce, publishing, accessibility, and
other applications. "Coordination Groups" were another concession to offer
quarterly or semiannual reports to a wider array of Members on specific
categories of activity areas (See 3.2).

Individual end-users, however, do not explicitly affect the agenda. "By the
mid-1980s, users had been largely displaced from the standardization process
[of open Unix], " noted [Cargill]; on the Web they never even got in the
door, effectively. W3C not only ruled out individual contribution to its
activities (except by rare invitation), it did not even leverage the mass
constituencies of the HTML Writers' or Webmasters' Guilds, nor the Web
Journal's readership.

2.4 Products

All of those influences also helped narrow the range of products W3C created
as well. It began by inheriting the CERN-developed code libraries and
specifications, and has rarely ventured further. Unlike the X Consortium's
situation, though, there was implicit pressure not to make the released code
too useful. A feature-complete Web browser or easy-to-administer server
could conceivably cut into Member revenues, so spell-checking and GUI
consoles and suchlike remained on the wishlists of W3C code. Along the way,
in consideration of legal liability attached to terms like "reference code"
and the automatic obligation of thirty-day advance release written into the
Membership Agreement, W3C relabeled almost all of its efforts Sample Code.

Demonstration projects have not been a high priority; extending Sample Code
for flashy prototypes has been left as a student project for PICS and P3P;
the Arena browser was entirely shut down in favor of outside Linux-community
development. Pilot projects, in the sense of bringing in early adopters into
the process, have also been marginalized. Conspicuous failures to recruit
merchants and payment instruments to the Joint Electronic Payments
Initiative (JEPI) did not encourage adoption of CommerceNet-style pilots.

Nontechnological artifacts also got short shrift. There have been very few
white papers on applying Web technology or evaluating current products (e.g.
W3C gave up on maintaining up-to-date market overviews in the entire
Technology and Society Domain). Much like IETF, W3C has documented how its
protocols work in far more detail than why. Those documents, in turn, are
basically only available online. After two years' experimentation with an
official W3C-sponsored Web Journal and several books by staffers, W3C still
does not have any coherent publishing program for archiving or promoting its
work. The logo also isn't used for branding or certification of compliant
products, a task seen as 'too political' for a neutral body and 'too
resource intensive' for an academic one.

3. Process

Just as the previous section presented what the W3C is designed to
accomplish, this section considers how. W3C has developed its own processes
for organizing its own technical staff and its Members' representatives, and
for measuring their
progress towards standardization. Culturally, it seems to cut across the
three separate styles in [Kahin]: ad-hoc application design, Internet-style
open consensus, and telecom-style formal management. Perhaps the main reason
is that W3C insists it is not a standards body just as vehemently as it acts
like one.

3.1 Domain Organization

Within the first year, W3C settled on a divisional structure which allocated
its range of activities to its three seniormost managers, yielding the three
'classic' domains, since joined by P&D and WAI. This was chosen instead of,
say, a functional division into coding, specification, and policy, although
those two kinds of partitions were conflated in the development of each
Domain's style.

The User Interface Domain inherited HTML and its related work on style
sheets, fonts, graphics, and multimedia (later returned to Architecture).
All but two of its staff were in France at the time, and the INRIA influence
persists because the Domain head and its central product, the Amaya testbed
browser, are in the Grenoble office. The new Japanese projects are mainly in
this domain. [Footnote: the cultural affinity of the French and Japanese is
much higher than either's for the confrontational American style; it would be
difficult to prove anything without violating confidentiality, though]. Its
efforts have been relatively low-key in the media since the end of the HTML
Wars. Its style is to rely primarily on internal expertise and internal
development.

The Architecture Domain inherited HTTP and URIs and their related work on
mobile code, caching, and, later, metadata. Its main products included the
LibWWW library and the Java-based server, Jigsaw. All but two of its staff were in
the US, which is still the case. It places more emphasis on specifications
than testbeds, though, and organizes its Working Groups to leverage Member
support to that end. There is a fairly sophisticated pattern of separate
chair and editor and subgroups in this Domain. Its goals are often to
negotiate compromise between two or more existing Member proposals and it
expects the industry to follow. XML is a rare case of anticipatory standards
work in this Domain; even HTTP-NG is more driven by experimental inquiry.

The Technology & Society Domain was created around issues in practice:
security, privacy, payments, and content-labeling. It does not have a
unified product base, offering per-project demonstrations instead. Its work
tends to be more anticipatory than the others', and relies extensively on
outsourcing development to Members, reserving a project-management role to
itself. Its personnel, in fact, are professional managers with some Web
experience, as opposed to the Architecture Domain's Web professionals with
some management experience. Its involvement in policy and advocacy also spun
off the WAI International Project Office (IPO) to enhance access for the
disabled, and also made it the primary client of the P&D domain (its success
is measured in press mentions as much as Architecture's is in Internet-Drafts).

In addition to the ultimate hiring and budgetary authority assigned to each
domain, the staff was matrixed with host site management. An Associate
Chair-level post at each site is responsible for liaison work, communication
with the local membership, and equipment & facilities, as well as the myriad
invisible issues of team harmony.

3.2 Working Groups

Note that W3C is unique in having a technical staff at all. OMG and IETF are
examples of successful all-volunteer standards organizations, backed by a
small technically expert core; ISO has an even more passive setup. This
inherently puts the technical opinions of fifty-odd people above those of
thousands of Member employees, simply because it's their job. The benefit,
of course, is improved coordination, since a given Working Group can rely on
its W3C chair or representative to track any conflicts or highlight
potential synergies.

The ultimate authority to charter new work items, monitor progress, and
approve Recommendations is the Advisory Committee (AC), composed of one
representative (and one vote) per Member. Recently, there has been a reform
to elect a subset Board for expedited review between semiannual AC meetings.
Proposals are prepared by the staff, in consultation with a Coordination
Group, if applicable. CGs are open to any Member employee who can attend,
and report on developments in an entire area (all of HTML or T&S, say).
Charters can require a firm commitment of staff in order to join actual
Working Groups; half- to full- FTEs are not uncommon levels. These Working
Groups generally meet weekly by teleconference and actually produce the
documents. Occasionally, they are chaired by outside experts. In the past,
this function used to be called an Editorial Review Board (ERB) which kept
abreast of the W3C staffer doing the drafting (and WG used to mean what
became a CG). The current process evolved to avoid the need for WG elections
or Directorial appointment to an ERB, which smacked of a cabal. Beyond these
measures, the only public review of W3C activities is the annual W3C track
at the International World Wide Web Conference series.

In general, W3C WGs are faster than their counterparts at IETF or OMG
because, first, they are closed fora; second, the staff exercises conscious
agenda control (not as sinister as it sounds; meant in the Political Science
sense); and third, there is a willingness to settle for a majority vote: consensus by
weight rather than by volume. They share the IETF's belief in "the virtues
of partial standardization, the insistence on working models, and constant
communication among the participants over the Internet itself." [Lehr], but
their enforcement mechanisms accelerate standardization much as [Farrell and
Saloner 88]'s game-theoretic model predicted.

3.3 Document Maturity

The end-goal for most W3C projects is a Recommendation, the ultimate rung on
the document ladder beginning at Note (no commitment of resources), to
Working Draft (current WG editing), through Proposed Recommendation
(advanced after WG vote and Domain Leader approval). That final step
requires voting support from the AC which informs the Director's decision
(the controlling authority). Then, depending on the specific project at hand
and the staff resources available, documents may be forwarded by W3C or used
by others in formal SDO operations or certification.

This detailed process was put in place about a year after W3C's founding;
the original documents only defined what a Recommendation was, and that the
Director could make one. Its first test was the Portable Network Graphics
(PNG) spec, which had been largely completed outside the W3C and was
shopping for a forum to standardize within. A W3C contact point prepared it
in the proper format and used it to test if W3C's mechanisms for voting and
addressing formal objections worked. Next up was the first major revision of
HTML, the so-called 'can't-we-all-just-get-along' 3.2 edition, which
successfully demonstrated that political compromises between two major
vendors can be ratified this way.

Documents from the W3C are not expected to fit into an interlocking whole as
a series. It allows for considerable flexibility in content while offering
the full imprimatur of W3C. It's the kind of flexibility not expected of
traditional IT SDOs; W3C is consciously avoiding reverting to the bureaucratic
ISO model as [Drake and Steinmueller] would have us fear. On the other
hand, there has to be some recognizable level of stability for other groups
to cite W3C work, as in the current 'Error 33' situation where WebDAV at the IETF
wants to advance to Proposed Standard status there with references to
Working Drafts of XML.

4. Cases

The W3C is an excellent 'model organism' for studying standards development
because it expresses so many different patterns within the same
organizational structure. Its activities are anticipatory and
retrospective, technological and policy-related, high-internal-commitment
and high-external-commitment, aimed both at appropriable multilateral
benefit to participants and at true public goods, and cover the range of
open IT interface goals in [Libicki]: interoperability (e.g. HTTP),
portability (e.g. URIs, scripting hooks), and data exchange (XML,
stylesheets). W3C also exhibits an understanding of [Libicki]'s families of
standards and of constructing coherent technological clusters (e.g. bringing
DSSSL scripting into XSL for XML where property-setting in CSS was
sufficient for HTML).

In this section, we set up an ontology for describing W3C activities and
apply it to classify cases from each of the three Domains, as well as some
notable activities W3C declined to pursue.

4.1 Classifying Standardization Initiatives

In sorting out the dozens of cases one might elucidate in W3C's history --
it has typically maintained more open activity areas than technical staff!
-- three descriptive criteria help classify activities. First is the scale
of the project: what fraction of the W3C Membership is affected by it or
participates in the WG, and what fraction of W3C resources are committed to
it. Second is the type of product: code or specifications. Third is an
estimate of leadership: how far ahead of or behind the market W3C finds
itself, including accounting for the degree of sponsorship [Katz and
Shapiro] it's up against.

This is not to claim these axes are complete, or even that they are
explanatory variables rather than consequences of other classifications. For
example, nothing above captures projects' interdependencies, nor isolates
the effect of divisional styles.

4.2 User Interface Domain

In the beginning, HTML was only loosely inspired by SGML; two of the first
hires at W3C were the leaders of the effort to establish a formal Document
Type Definition for HTML 2.0. That project began at IETF, but got bogged
down in a continuing stream of new-feature requests and vendors'
unwillingness to discuss new extensions in public before shipping commercial
products. Soon after W3C was formed, many of the key participants decided to
shift their energy to a closed forum at W3C, and the IETF accepted that
outcome; it could be said the IETF has a historical aloofness toward data
formats. HTML 2.0 was the only edition published as an IETF RFC; the next
consolidated edition, a year or so later, was the W3C Recommendation for
HTML 3.2, followed by 4.0.

This project affected the entire membership and the public at large,
involved a significant number of Members in the WG, and was allocated a
commensurate level of resources: two to three people plus a tech writer over
the years. The product was consensus around a specification, though its
features were prototyped in the Arena and, later, Amaya browser projects.
Finally, W3C always seemed to be cleaning up after its Members' messes in
this area; HTML fragmentation is the case discussed in the Appendix.
Leadership failed in the cases of W3C's call for mathematical markup
support and cleaner integration of CSS layering.

Cascading Style Sheets were an attempt to revive a feature which had been in
the very first graphical Web browser, but was swept away in the current of
popular ones. CSS was a quiet proposal even when W3C was founded; when W3C
eventually hired its designers, a smaller group of members hammered out a
specification which led the market by several months. CSS was later revised
as CSS2 and adapted for use with XML, though that community has also adopted
another, more powerful and complex style language, XSL.

The Amaya browser is a shared commitment with the Opera SGML research group
at INRIA. Relatively few Members expect to use this software product, and
those mainly for reverse engineering and reference implementation of
HTML/CSS features. In scale, it is a largely-internal W3C development
project with up to four FTEs. Amaya is a good example of the reference-code
paradox at W3C: Members may be threatened by 'too nice' a reference
platform. National style is at odds with that conclusion, though. In the
quest for perfection, the Amaya team focused on shipping a polished, usable
product, keeping the source code internal until then. As a result, though it
is now freely available, Amaya has not catalyzed a development aftermarket
akin to LibWWW's.

4.3 Technology & Society Domain

Technology and Society opened for business with two areas in its portfolio,
security and content-filtering. Over time, it was fleshed out by JEPI, a
digital signature spinoff, demographics and privacy, and a new policy forum.
This Domain had broad support among the membership -- one AC rep even
declared it the one unique contribution of W3C, since they could do R&D at
home -- but relatively low commitment. This made the Domain's strategy of
aggressively outsourcing technical work problematic.

Security was arguably the first new W3C-tagged effort in its history. The
first working meeting of any kind was a December 1994 security workshop at
MIT where Netscape unveiled Secure Sockets Layer. A straw poll was held
comparing it to Terisa's (then EIT) Secure HTTP as a basis for future W3C
development, which triggered an uproar when it leaked to the press that W3C
had "passed over" SSL. The author came on board precisely to deal with the
"security war" between these two, with a mandate to harmonize the entirely
different protocols. Instead, W3C laid out a strategy promoting
modular security mechanisms to be layered atop HTTP. Eight more quarterly
meetings were held elucidating this principle, but without corporate
implementation support, cryptography development remained outside the bounds
of an international academic body under the current trade restrictions.
Eventually, one component became its own activity, DSIG, and the other, now
TLS, went to IETF. DSIG, in turn, spent a year or so developing a paper
specification with broad Member involvement, but no implementation support;
it, too, has become a Proposed Recommendation but has not been deployed. At
present there is no further Security work at W3C except for bits of HTTP
Authentication in Architecture.

Electronic Commerce was another topic of broad-based Member interest. After
a workshop series focusing on electronic payment protocols (which led, in
part, to SET), a working group germinated around the problem of payment
selection and integration in January 1996. It was structured as a six-month
joint pilot project with CommerceNet to recruit multiple client, server,
payment, and merchant partners to demonstrate such integration. Almost a
year later, with only one participant in each role in the core group, a
demonstration was completed, but the technology has proven premature since
the forecast split between Visa/Netscape and MasterCard/IBM payment
protocols did not occur. Up until the end, however, the outer team of
observers was still quite healthy: every Member claiming to have a role in
e-commerce attended the quarterly meetings, but none committed the full-time
engineers to make it happen. There was also Member support for ongoing
involvement with the Financial Services Technology Consortium (FSTC), a
Federally-funded initiative. This demonstrates the option-value of a
standards working group: if something happens, you had to have been there,
but if nothing's happening, don't rock the boat.

The Platform for Internet Content Selection remains T&S's most lasting
achievement, though it has not been adopted as widely as hoped. W3C
recognized the need for filtering 'objectionable' content in its original
Prospectus, but it took the famously flawed Marty Rimm Cyberporn Time cover
story to light the fuse. Industry leaders immediately convened in New York
to consider technical solutions to avoid regulation, and W3C walked out with
a mandate to host a PICS Technical Committee which would develop a standard
for labeling content and distributing labels. It moved swiftly to draft
solutions under relatively tight secrecy and a limited committee size. The
result was essentially an s-expression encoding a rating vector, where the
axes were described in a separate file, and was shipped about the same time
the CDA was passed. Students developed sample demonstration code alongside
these efforts. These two eventually met in a Philadelphia courthouse, and
PICS was cited as part of the decision that CDA was an onerous form of
restraint -- a textbook answer to [Schoechle]'s query, "who can effectively
represent the public interest and make policy in progressively more
technical environments?"
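
The label format itself can be sketched briefly. The example below follows
the general shape of the published PICS-1.1 label syntax; the rating-service
URL and category names here are illustrative, not drawn from any actual
rating service:

```
(PICS-1.1 "http://ratings.example.org/v1.0"
  labels on "1996.04.16T08:15-0500"
    for "http://www.w3.org/"
    ratings (violence 0 language 1 nudity 0))
```

The "ratings" clause is the rating vector: each axis is named alongside its
value, while the meaning and range of each axis is defined in a separate
machine-readable description fetched from the service URL, just as described
above.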

Finally, the Web Accessibility Initiative also had its roots in this Domain.
Interest in providing better support for disabled access to HTML and
other Web formats, along with US Dept of Education funding, led to the
chartering of a separate International Program Office for WAI, which has
been reviewing W3C
specifications and developing guidelines for accessible site design.

4.4 Architecture Domain

The main products from CERN which W3C inherited were the web site, the CERN
httpd, and LibWWW. Other tools, like NeXUS were already obsolete, or like
the Virtual Library, marginally relevant to W3C's mission. W3C decided not
to support the CERN server, but it did begin life investing in its common
code base as the primary product. To this day, LibWWW is the premier
implementation of all the nooks and crannies of the HTTP/1.1 specification.
For a while, its developers have wanted to move on, and W3C's new
Java-based Jigsaw server appeared to be a plausible new base, but new
versions of the old C code base continue to be published. One reason is that LibWWW
benefited from a large user base of developers who integrated it into other
products, which has not benefited its younger cousin Jigsaw to the same
degree -- probably no library can recapture the ubiquitous awareness LibWWW
once had.

The goal of maintaining such code, however, was not to build a world-class
toolkit, but to support the HTTP specification process at the IETF. This
project naturally had broad Member support, high internal resource
commitment, but operated outside W3C's fora, except for the occasional
workshop on the future of HTTP and caching developments. On the other hand,
the Next Generation (HTTP-NG) project chartered last Summer is an internal
effort, with broad support, high commitment from a few large Members, and
significant engineering resources. Though IETF has held a preliminary
hearing on the subject, it has not yet chartered a working group in the
area, so W3C is leading the way at this point.

Multimedia is another example project in this domain. Over the last year,
motivated staff at W3C have pushed a core group of partner Members to
propose a format for synchronizing multimedia presentation elements. It sits
at the intersection of data format, protocol design, and user interface. On
the other hand, the Mobile Code, Object-Oriented Programming, and
Caching/Replication activity areas have remained on the back burner, with
occasional special-event workshops but without specification work -- or
Member engineering support -- to charter.

Finally, Metadata is an activity which has migrated from T&S. As PICS and
DSIG evolved, they were running into more elaborate proposals like Meta
Content Format (MCF) from Apple/Netscape. Just as XML ended up in this
domain because the key personnel were there rather than with the HTML team
in UI, the Resource Description Framework (RDF) became an Architecture project.
This is an ongoing effort, and it's hard to judge its success as yet. Note,
though, that W3C consolidated several initially-unrelated projects under one
umbrella because its staff recognized the opportunity, and committed itself
to a common metadata solution across the board.

4.5 Projects Not Taken

One tradeoff of that decision was judging that metadata for Web authoring
and versioning -- describing sites, pages, and their properties -- was a
more critical pivot point than protocol support for uploading pages and
tracking changes. WebDAV thus began as an independent effort and
explicitly shopped its proposal to W3C and IETF and chose the latter. The
Internet Printing Protocol made the same decision more informally.

There are other considerations besides limited resources. Demographics, for
example, was a hot area in Fall 1995 when W3C cosponsored the first workshop
in the area. W3C could have acted as a force for the trade in standardizing
definitions of hits, page views, and visitors, and several Members expressed
interest, but the area was seen, in large measure, as too crassly commercial to
pay attention to. 'We didn't build the Web for a 500-channel
advertising-driven future' was a common refrain.

Politics, of course, is another big reason. W3C could have made a case to
standardize the LiveScript/JavaScript language, or even put in a bid to
manage the whole Java language, but these were sufficiently polarizing
industrial political battles that it would have interfered with W3C's value
as a neutral forum on other issues.

5. Conclusions

The previous three sections have discussed how W3C converged on its current
structure and processes as it learned from each new activity it has
undertaken in the last three and a half years. Lamarckian evolution --
iterative design, small changes intentionally leading to large ones --
appears to be an apt metaphor for the rise of this new beast in the standards
world, part research lab, part think-tank, and part legislative body.

Understanding the history of adaptations W3C has made to date might
illuminate its future course, such as the likelihood it might open a branch
office elsewhere in the US (low), spin off a for-profit consulting arm like
CommerceNet's CNgroup (very low), sponsor a working group for CALS-in-XML
(low), define new FORM input types (high), or defend caching under the
fair-use doctrine (moderate).

Beyond modeling particular decisions W3C might make, this framework is best
at explaining entire threads of common work. In a marketplace founded on the
Network Effect [Rohlfs], the Web's designers are less worried about how to
encourage mass adoption of a standard than about Path Dependence [Arthur]--
putting the right standards in place for the next step in the sequence.
W3C's commitment to evolution and divisional synergy best explains why a
point-opportunity like filtering indecent content is answered by a
general-purpose ratings hook, one that was subsequently exploited again and
again as the foundation for digitally signed Web content, for privacy
protection, for collaborative filtering, searching and indexing, and more.
Perhaps we have not outgrown the military metaphors of IT-as-revolution
after all: we see here a band of revolutionaries who stormed the capitol
now plotting strategically to establish control over the countryside.

[This is a working paper; please offer corrections, comments, and
suggestions for publishable derivative work to rohit@uci.edu. The insights
of X Consortium and CommerceNet veterans would be especially useful, as well
as Tim Berners-Lee's April 1998 WWW7 Keynote, "Evolving the Web".]

5.1 Evolution as a Technical Quality Per Se

Another consequence of this framework is the hallmark quality of evolution
within W3C's technical products. The Evolutionary Design of Complex Systems
(EDCS) project here at Irvine speaks to the concept of evolvability as a
quantifiable feature of IT; W3C has also demonstrated its insistence on such
evolvable systems time and again.

End-user extensibility is one of the particular emphases of the Web project:
anyone should be able to throw up a new site, write a new tool, or wire up a
new gateway entirely on their own. Active proxies (language translators,
graphics compressors for handheld displays, cookie-monsters) are another
classic example of third-party extensibility on the Web. W3C has worked
mightily to undo some of the fixed points of its original revolution. HTML's
centralized evolutionary path forbids the creation of ad hoc tags; XML
allows it. Mass-market browsers let users tweak fonts, sizes, and colors
directly; W3C countered with CSS1, which manages them hierarchically and
symbolically, along with other layout features to boot. HTTP let anyone
extend its headers willy-nilly; W3C proposed PEP to bring order and formal
semantics to extensions. Entrepreneurs rolled out customized 'wallets' for
e-payment systems; W3C invested in an open payment hook in JEPI.
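
That symbolic, hierarchical control can be sketched in a few lines of CSS;
the particular selectors and values below are illustrative, not drawn from
any W3C example:

```css
/* One site-wide rule, named symbolically, replaces scattered per-page
   <FONT> tags: change it here and every document follows. */
body    { font-family: "Times New Roman", serif; color: black; }
h1, h2  { color: navy; }        /* applies to every heading, everywhere */
p em    { font-style: italic; } /* hierarchical: only <em> nested in <p> */
```

Because rules cascade, a more specific author or reader rule can still
override these defaults in a controlled, declarative way rather than by
hand-editing markup.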

And yet, W3C does not go so far as to accept optional features of a
standard. One of the lessons learned from ISO, from SGML in particular, is
that optional features can be fatal to widespread adoption. On one hand,
closing participation to only a handful of impacted vendors increases the
returns to partial-standardization; there are profits to be made in
heterogeneity. The integrity of the technical staff appears to be the only
counterweight, the constituency which does not allow 'live and let live' in
the endgame and actually forces tough decisions (e.g. refusing to document
the LAYER tag).

Unlike traditional SDOs, in which design authority lies in the hands of the
proposers, outside the organization, W3C is a horizontally integrated forum
that increases the momentum of this technological frame. In other words,
W3C's central technical staff is in a position to note the potential for
extensibility, and to enforce it -- to write it into its products, and to
actually reuse products 'down the food chain' -- in ways IETF and ISO cannot.

5.2 The Suburbanization of Standardization

While there are valid reasons for the rise of industry consortia --
primarily the new pace of business and the primacy of corporate commitments
over sovereign ones -- most seem to speak to a streak of privatization
affecting so many other areas of public life. If consortia are the gated
communities of standardization, what does that make SDOs? Decaying inner
cities, held down by bloated infrastructure budgets and excelling at
representing diversity of cultures, while fragmenting under the onslaught of

That's no excuse to count out ITU/ISO/IEC/IEEE, though. [David and Shurmer]
discuss how those dinosaurs are adapting to the rise of mammals. There are
still significant institutional factors behind their existence. Beyond the
imprimatur of 'international standard', these bodies also control the marks
of professional approval. The very credo of professionalism argues that
there are abstract standards to be upheld independent of corporate
interests, which is why professionals invest so much time into the process,
and why professional societies establish them. Organizations like W3C are an
implicit affront to professionalism, since their doors are closed to the
merely competent, and their decisions are approved by managers, not
fraternal engineers.

There are also complaints about W3C from its cousins at IETF: that progress
comes at a price, perhaps too high. Yes, some committees, like HTML's, are
too politicized to make any progress in an open IETF WG; but is the degree
of closure and delayed review worth it for other activity areas? Once again,
we see the glint of the double-edged sword which is its technical staff. The
IETF community appeals to W3C staffers on a personal basis, through their
participation in IETF processes, to check such tendencies and to give W3C
the benefit of the doubt over turf matters. PICS was considered, and tabled,
as a handoff between the two; payments was deferred from IETF to W3C; but
keep an eye on where the first HTTP-NG drafts surface.

W3C's published records also illustrate its tendencies to privatization. The
only language in the Membership Agreement mandating confidentiality is a
30-day advance review on reference code publication, yet that precedent is
not read as a statute of limitations. Internal newsletters, project reports,
and the like are never published; even the site's search engine is
restricted to Member access. Its attitude towards establishing a publishing
program has been colored more by distaste for signing contracts with one
publisher-Member over another than by any commitment to establishing a
Journal as "individual membership," as Director Berners-Lee once claimed.
Ironically, W3C seems to have taken the lesson of document liberation from
the Bruno server [Malamud, in Drake] without preserving a legitimating
archival series of any sort. There is essentially one public feedback forum
on record: the W3C track at the annual Web Conference series.

Finally, it is an oft-cited meme at W3C that since it only issues
Recommendations -- not standards -- it is somehow immune to antitrust
claims. In theory, that doesn't make a difference; there are no exemptions
for clean living in [Carlton and Klamer]. The question of fact is whether a
forum was closed enough to allow competitors with an intent to monopolize to
plot against others, even other members. Furthermore, there is a possibility
of abuse given the scope and interlocking nature of Web standards we have
discussed. There are significant asymmetries in the amount of information
and influence upon W3C's agenda amongst its members, the one warning flag in
[Lemley]. Nevertheless, I believe W3C is in the clear on balance, because of
the beliefs of its key managers, if not its procedures and (nonexistent)
sunshine provisions.

5.3 Modeling Member Participation

So far, this analysis has only proceeded through W3C's actions. Future work
should definitely address how Members adapted to the rise of W3C as well.
Clearly, there are differences in Member participation and satisfaction with
W3C, but perhaps rigorous analysis can replace emotional judgments in
explaining why Netscape so often stands askance and Microsoft has
all-but-embraced it (bear hug?). The continuation strategies in [Teece] all
appear in practice as companies use W3C processes to shore up their
installed base or attack their competitors. After all, as [Katz and Shapiro
94] declared, "In systems markets, even more so than in other markets, firms
with established reputations, well-known brand names, and ready, visible
access to capital have competitive advantages. These are the firms that are
less likely to choose an open-system strategy."

6. References

Carl Cargill, Evolution and revolution in open systems, StandardView 2(1),
1994, pages 3-13.

Dennis W. Carlton and J. Mark Klamer, The need for coordination among firms,
with special reference to network industries, University of Chicago Law
Review 50, 1983, pages 446-465.

*E. Raymond Corey, Technology Fountainheads: the Management Challenge of R&D
Consortia, Cambridge: Harvard Business School Press, 1997.

Paul A. David and Mark Shurmer, Formal standards-setting for global
telecommunications and information services, Telecommunications Policy
20(10), 1996, pages 789-815.

Paul A. David and W. Edward Steinmueller, Standards, trade and competition
in the emerging Global Information Infrastructure environment,
Telecommunications Policy 20(10), 1996, pages 817-830.

Giovanni Dosi, Technological paradigms and technological trajectories: A
suggested interpretation of the determinants and directions of technical
change, Research Policy 11(3), 1982, pages 147-162.

William J. Drake, The Internet religious war, Telecommunications Policy
17(9), 1993, pages 643-649.

Joseph Farrell and Garth Saloner, Coordination through committees and
markets, RAND Journal of Economics 19(2), 1988, pages 235-252.

Ole Hanseth, Eric Monteiro, and Morten Hatling, Developing information
infrastructure: The tension between standardization and flexibility,
Science, Technology, and Human Values 21(4), 1996, pages 407-426.

Mark A. Lemley, Antitrust and the Internet standardization problem,
Connecticut Law Review 28, 1996, pages 1041-1094.

Michael L. Katz and Carl Shapiro, Technology adoption in the presence of
network externalities, Journal of Political Economy 94(4), 1986, pages
822-841.

Michael L. Katz and Carl Shapiro, Systems competition and network effects,
Journal of Economic Perspectives 8(2), 1994, pages 93-115.

*Kevin Kelly, Out of Control: The New Biology of Machines, Social Systems,
and the Economic World, Addison-Wesley 1995.

Charles P. Kindleberger, Standards as public, collective and private goods,
Kyklos 36(3), 1983, pages 377-396.

Jeffrey Rohlfs, A theory of interdependent demand for a communications
service, Bell Journal of Economics 5(1), 1974, pages 16-37.

William Lehr, Compatibility standards and interoperability: Lessons from the
Internet, in Brian Kahin and Janet Abbate, eds, Standards Policy for
Information Infrastructure, Cambridge: MIT Press, 1995.

Martin C. Libicki, Standards: The rough road to the common byte, in Brian
Kahin and Janet Abbate, eds, Standards Policy for Information
Infrastructure, Cambridge: MIT Press, 1995.

*Carl Malamud, Exploring the Internet: A Technical Travelogue, Englewood
Cliffs, NJ: Prentice-Hall, 1992.

Timothy Schoechle, The emerging role of standards bodies in the formation of
public policy, IEEE Standards Bearer 9(2), 1995, pages 1, 10.

David J. Teece, Capturing value from technological innovation: Integration,
strategic partnering, and licensing decisions, in Bruce R. Guile and Harvey
Brooks, eds, Technology and Global Industry: Companies and Nations in the
World Economy, Washington, DC: National Academy Press, 1987.

[There aren't many citable sources of Web history to date. One exception is
the Web Journal's People & Projects and Interview series from 95-97 --RK]

Notes: which reading had the definition of forum-shopping?

Appendix: W3C, the Blue Helmets of Cyberspace

Fri, 18 Apr 1997 19:37:20 -0400 (EDT)

[I was particularly proud of these two contributions to today's W3C

A City Upon A Hill

by Rohit Khare

Ye are the light of the world. A city that is set on an hill cannot be hid.

-- New Testament of St. Matthew

Whatever happens, we have got
The Maxim Gun, and they have not.

-- Hilaire Belloc 1870-1953

There's a passionate thread in American history known as the doctrine
of exceptionalism: that in rising afresh on the shores of a virgin
land, America developed a society unlike any other: inherently fairer,
more prosperous, and somehow destined to lead the world. The shining
beacon was lit by the Pilgrims themselves, whose leader planted the
seeds of the American ideal not far from Cambridge with the
proclamation "we shall build here a city upon a hill for all to look
up to".

To put it mildly, American exceptionalism is a somewhat disreputable
school of historiography. But, it is an inspirational tale. It sheds a
little light on the cultural crusade which is America. Today, I cite
this image to highlight how I feel about the World Wide Web.

Amidst the chaos and carnage of technology battles, the World Wide Web
is a certain sort of earthly paradise, a sunny island of relative
tranquility. The Web has evolved at a breakneck pace because of the
standards and brotherly love of the Web Community. We are doing things
with computers today that no one dared imagine in a universe of
successive releases of proprietary products. Who among us would have
the absurd ambition to write a meta-OS for every computational device
in the world?! But now we are set on that course...

For today, it is W3C which leads the way forward. Not by divine right,
not even "because we have Tim Berners-Lee and they do not". We lead by
dint of hard work, fairness, shared community, and a vision that the
Web can remain a level playing field. To be an example to the world --
that is cause for humility, not pridefulness. Let us put our heads
down, together, and set to work!

The Blue Helmets of Cyberspace

Jesse Berst's _AnchorDesk news service_ recently ran two articles
crying out that the feud between Netscape and Microsoft threatened to
split the Web and the Internet. He has since organized _a petition
drive_ to persuade these manufacturers and others to abide by 'net
standards.

In my visits to Microsoft and Netscape, I've come to believe both
companies genuinely want to adhere to public standards. Yet both
companies are also fiercely competitive. In their zeal, they sometimes
step over the line.


The Good News. Despite the current crisis, there is some good
news. Both Microsoft and Netscape are fully involved with the Internet
Engineering Task Force (IETF) and the Worldwide Web Consortium
(W3C). Both companies meet regularly with the standards groups and
with each other to hash out conflicts and reach compromises. In
addition, the standards committees have been streamlining the process
so proposals can be ratified more quickly.

I believe there may be ways to (1) accelerate the work of the
standards bodies and (2) create a logo program so consumers can be
sure they are buying a compatible product. I'm researching these
possibilities now and I'll be back soon to report what I find.

In between those two events, we offered up W3C's perspective on these issues:

Jesse --

You certainly grabbed our attention with two of your recent Berst Alerts--

1. _Microsoft, Netscape Feud Puts HTML's Future at Risk_
2. _Netscape Push Announcement Puts Internet in Even Greater Danger_

We appreciated your approach as readers when you alerted the public to
the state of Java standardization, and we appreciate your efforts to
highlight the state of Web standards today. On the other hand, your
coverage seems to position the froth and margins of industrial
competition as vital threats to the commonweal. In fact, we believe
W3C is doing an excellent job of mediating today's technical
differences while leading the evolution towards a richer World Wide Web.

We'll cover some of the specific situations you allude to below, but
here's the take-home lesson we're talking about:

1. W3C does have the key players around the negotiating table
2. W3C is moving at record speed
3. W3C is coordinating many, many aspects of web evolution according
to a common vision.

A. About W3C

The W3C has been an active voice in industry technology debates for a
little over two years now. Today, we represent over one hundred and
seventy developers, research organizations, government agencies, and
users. We have a technical staff of three dozen folks around the world
working in three Domains on thirty Activity Areas. We are not a:

1. Standards body, because we do not make legally binding decisions
2. Research Think-tank, because we work on the here-and-now
3. Trade Organization, because we represent the public trust

We operate in many different ways: by developing consensus among
current implementors, by building and deploying experimental technology
on our own, and by initiating multilateral implementation
projects. We're not Web cops rapping people on knuckles and holding
them back -- we're more like the Blue Helmets of the UN, keeping the
peace and striving for ever-greater harmony while tackling ever-larger
problems.
B. About '[dD]ynamic HTML'

In some areas, such as HTML, the first step is to stabilize the
patient. HTML 3.2, our first Recommendation in this arena, was
released last year to capture the baseline status of the HTML debate
as of January 1996. That does not mean we are a year behind -- we have
been making rapid, separate progress on many other components of the
next generation of HTML, codenamed Cougar. This includes: an OBJECT
embedding standard, FORMs revisions, accessibility features, scripting
integration, and much more. Future work includes a new Math markup
model, interactions with XML, Web Collections, Style sheets, ... All
work that you suggested "should have been finished last year". Well,
it's not that simple. Remember, the SPA's budget is an order of
magnitude larger than W3C's, and all it does is sue users of pirated
software.

The key is building trust around the table. Our Working Groups for
HTML, Style Sheets, Document Object Model, and others, represent the
leaders across the industry (far more than just the 'big two'). We
have demonstrated a lot of concrete cooperation coming out of these
quiet peacemaking efforts. The CSS Positioning draft, for example, is
coauthored by Microsoft and Netscape representatives. Many, many
aspects of CSS have been developed through implementation experience,
so quite rightly many vendors (SoftQuad, Grif S.A., and more) have
shipped CSS-based products before our CSS specs.

In the current marketing tussle over '[dD]ynamic HTML', a lot of
territory is being cloaked under the fog of war. As you explained in
your piece, this is not any single technology: it is a bundle of
approaches to animating HTML, formatting, and browsers. So one cannot
speak of an entire "incompatible" approach; one has to look at the
constituent technologies. Both sides support HTML 3.2, CSS, and so
on. Netscape has its experiments with JavaScript Style Sheets;
Microsoft has its own. Some parts differ naturally -- we are only beginning the
requirements analysis phase of our new DOM (Document Object Model)
Working Group.

It is never a matter of "how long the W3C took to endorse the
proposal" -- we don't make standards, and we don't endorse. We are an
active partner in leading the evolution of the Web, which is why we
have a staff of the world's best (sometimes only!) experts on Web
technology. These are hard problems, and they need to be solved carefully
since we are designing the legacy systems of tomorrow, today.

"It's all the more frustrating since the two companies could solve the
problem in about a week. Just lock the technical teams in a room with
the mandate to compromise. But then, the Israelis and Palestinians
could end their strife any time, too." The technical work from all
sides -- and not just MS and NS -- belies such claims.

C. About Push Technology

The same dissection refutes the claim that incompatible push
technology threatens the entire Internet. (Of course, push itself
might aggravate existing bandwidth challenges, but W3C is addressing
that in a concerted way, too). Push technology has many parts -- the
TV guide, the content itself, the transport protocols. Many of the
parts are held strongly in common: a web page is still HTML + embedded bits
whether in PointCast or IE4 or Netcaster. We are sponsoring extensive
work on HTTP and HTTP caching to keep the protocols effective in
these scenarios. And if there's competitive debate on 'channel
listing', all the better -- we have a process in place for Members to
raise these concerns, such as the Submission process Microsoft has used
to offer its CDF for review. And even then, the goal for W3C, and the
Web, may not be to define any particular channel format. After all,
their CDF proposal leverages XML, a framework for defining markup
languages which could potentially render many format-compatibility
questions moot. We're after big game here:

* W3C works on hooks for payment systems, not payment protocols
* W3C works on hooks for applets, not APIs
* W3C works on hooks for fonts, not font package formats
* ...
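
The CDF Submission mentioned above illustrates the point: a channel
listing is just an XML document. As a rough sketch -- the element
vocabulary follows the 1997 Microsoft Submission, but the URLs and
titles are invented placeholders -- a minimal channel might read:

```xml
<?xml version="1.0"?>
<!-- A channel is ordinary XML: a top page, a title, an abstract,
     and the items to be pushed to the subscriber. -->
<CHANNEL HREF="http://example.com/channel.html">
  <TITLE>Example News Channel</TITLE>
  <ABSTRACT>Placeholder channel, for illustration only.</ABSTRACT>
  <ITEM HREF="http://example.com/story1.html">
    <TITLE>First Story</TITLE>
  </ITEM>
</CHANNEL>
```

Because the container syntax is generic XML, any push client with an
XML parser can at least read a rival's listing -- which is exactly why
the format-compatibility question may evaporate.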

D. Moving Forward

We've been reading the reader comments, and it's clear you've hit a
nerve by insisting on protecting our investment in Web technology
through open standards. We're thrilled whenever anyone stands behind
that vision and attracts followers (== users who buy products). We're
all waiting to hear about your next moves...