An Evolutionary theory of compatibility standards [Re: The verb "to

Rohit Khare (
Thu, 12 Feb 1998 01:35:40 -0800

> > Now, now, even has its upsides. Sure, it's a technologically top-heavy
> > piece of framed JavaCrypted crap -- it *consistently* crashes netscape --
> > they
> It used to be that if a program crashed when given some input data, the
> program was at fault.
> Now (at least with browsers) it seems that we blame the input data.
> As a Computer Scientist, I'm not convinced this is much of an advance.
> - Joe

A fascinating thought indeed. It's probably because (before netscape went
free), it was more futile to expect the program to change than to affect the
data:
"blame the tape, not the turing machine"

I wonder if XML will make a difference. The W3C Activity Page makes the
outlandish claim (in patent Dan Connolly repeat-until-true epistemology):

: XML as a simple method for data representation and organization will
: mean that problems of data incompatibility and tedious manual re-keying
: will, by and large, be solved.

MIME also has a role to play, especially since even in the far future, not all
data will be in XML (Dan Connolly Alert: Rohit is admitting to doubt, hence
dooming proposition to failure! Bzzt! ... guilty as charged).

In the optimistic case, what's going on in the Big Picture is a sedimentation
of standards. The fulcrum of interoperability is going up in some linear
progression with data formats at the apex.

binary compatibility  : was the first step in a computer market
hardware compatibility: went hand-in-hand as plug-compatible mainframes appeared
OS compatibility      : VMS, UNIX, CP/M, the DOS-era standardization of filesystems
API compatibility     : MacOS, Posix, Win32, other 'services'
App compatibility     : competition within sectors: Baan vs. SAP, Excel vs. 123
???                   : I think of Web standards right in between here
Data compatibility    : formats for information are stable: HL7, MathML, ChemML

Each 'step' in the evolution is another competitive marketplace, and I think
there *couldn't* have been competition between "purchase-order processing tools
that manipulate a commercial standard/inhouse-extended purchase-order document
format" until now. EDI used to be a similar metastandard, and it was 'too
early' to take advantage of a fully competitive market. Now, XML/EDI may hit
the sweet spot.
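To make that concrete, here's a minimal sketch of the kind of competition a shared purchase-order format enables: two independently written tools, agreeing only on the document format, not on each other's code. The element names and the format itself are invented for illustration -- this is not a real XML/EDI schema.

```python
# Sketch: two hypothetical "purchase-order processing tools" competing on
# features while agreeing on a shared XML format. Element and attribute
# names (purchaseOrder, item, sku, quantity) are made up for illustration.
import xml.etree.ElementTree as ET

def write_po(po_number, items):
    """Vendor A's tool: serialize a purchase order to the shared format."""
    root = ET.Element("purchaseOrder", number=po_number)
    for sku, qty in items:
        ET.SubElement(root, "item", sku=sku, quantity=str(qty))
    return ET.tostring(root, encoding="unicode")

def total_quantity(po_xml):
    """Vendor B's tool: consume the same document, knowing only the format."""
    root = ET.fromstring(po_xml)
    return sum(int(item.get("quantity")) for item in root.findall("item"))

doc = write_po("PO-1998-042", [("widget", 3), ("gadget", 2)])
print(total_quantity(doc))  # -> 5
```

The point is that the value sits in the document, not in either program: swap out either tool and the data survives.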

Technologically, on any *given* computer, there have been standards at every
one of these levels. NeXTstep, for example, was:

binary  : x86, 68k, SPARC, PA-RISC
hardware: SCSI, NuBus, and damn little else :-)
OS      : Berkeley-Unix-like Mach flavor
API     : what became OpenStep
App     : Pages vs. FrameMaker (fourth-party plugin competition on top of apps)
???     : common services like the Digital Librarian

But, the overall market has evolved to compete, stabilize, and ascend in the
order I'm claiming (yes, circular logic). The data formats, for example, were
not MEANINGFUL to the mission-critical app at hand, though. They were just
interchange formats at the lowest level, not 'purchase orders'.

In fact, in NeXTstep, arguably the most mature sw marketplace of its day,
custom software was still defined as 'objectware': you bought and sold
Objective-C interfaces, the Turing-complete cookie -- never the data-format
creme filling, which was always hidden. I think of my eText tool as an early
stab at selling the filling instead, by being a multiformat editor (text,
html, rtf, latex).

Anyway, in the long term, data will be sacrosanct; we will see 'data
architects' (hey, didn't I already have that title last summer?) become more
valuable players than programmers, laying out the *permanent* assets while
coding becomes more ephemeral. The market has to progress through each step,
though. Today, the custodians of the format standards (W3C + the users) are
more malleable than the lumpencode we get as browsers -- and so we complain
that frames and JavaCrypt are to blame...

Dysfunctionally yours,