BTW - your Erich Gamma search post rings false to me. You
want to know what he's up to? You have his email address(es)!
email@example.com and firstname.lastname@example.org as you discovered
Ask him directly - don't pretend to be "asking" FoRK.
How the hell should we know? What you were really doing, of course,
was showing off to us the stuff you're reading up on.
Which is fine. Just say so :).
Oh, and transport level issues *DO* matter. A lot. And you know so.
So nyah. Don't let Rohit go putting easily refutable words
in your mouth; it doesn't make you look good.
Here are some idle thoughts I had while reading this. I had previously
never heard of Robert Cailliau, so am willing to be swayed
in any direction in what I think of him or his ideas. If I'm
offending a Great God Of The Web in these comments, I hereby preempt
flaming retorts by abasing myself at the altar of the Church of
Webology and doing the following penance ...
written out in longhand on the blackboard 1,000 times:
XML Saves Our Souls. XML Saves Our Souls. XML Saves Our Souls.
XML Saves Our Souls. XML Saves Our Souls. XML Saves Our Souls.
XML Saves Our Souls. XML Saves Our Souls. XML Saves Our Souls.
XML Saves Our Souls. XML Saves Our Souls. XML Saves Our Souls.
XML Saves Our Souls. XML Saves Our Souls. XML Saves Our Souls.....
So here, with no further ado, my irreverent blasphemy:
> This story has not been told much, but we had all that, and the
> documentation was there as well. We had our own process control
> network; we had our own interpretive language. We had the whole
> thing, and this was 1973-74.
> So in the early '70s, you had networking, mobile code, and byte
> interpreters. You had the equivalent of Java and the Internet.
> In a sense, yes
This is neat revisionism :-).
Rather hard to swallow, but a good tale nonetheless.
Note carefully: Cailliau doesn't say 'byte interpreters' or 'mobile code' -
those are the interviewer's words. He says "interpretive language".
Fine - lots of those around, even back in the 60s and 70s. But
was it cross-platform interpretive? Abstracting (masking) subtle
differences between machine architectures? Compact format
for network transmission? I'm not trying to say Java is fantastic
and first of its kind. Just that an "interpretive language + network"
does not equal "the equivalent of Java".
> and of course from the
> US-perspective, if it isn't done in the US it doesn't exist in
> computing, right?
> Like it never happened.
> Yes, exactly. It's as if it never happened if computing is done
> outside the US.
That kind of stuff is obnoxious. I resent non-American aspersions
on US-centrism as much as I'm offended by actual examples of
American indifference to The World Outside. Linux began in Finland, and is today a global
Internet-based activity. OO began with Dahl and Nygaard in Norway
with Simula. Americans are approximately as good, or as bad, as
anyone else, when it comes to recognizing the achievements
of others, or to burrowing into nationalistic NIH mode.
Not saints, not evil sinners either.
> But the Web certainly made the US sit up and take notice.
Yeah, cuz Andreeeeeesen invented the web ;-). (Just to rile y'all).
> So it was integral. And it displayed GIF?
> No. It did not display bitmaps, bitmaps aren't scalable, right? You
> can't do anything with them. They're nonsensical. Bitmaps are not
> graphics; they're the display result of graphics. You can't express
> graphics in dots, and a bitmap does not have a metric. It has no
Heh. In short, a bitmap is low-grade bits. Dumb. No self-knowledge.
TIFF. Fax. - ugh. Can we web guys and object guys agree on one thing -
that the world would be better off by smashing its fax machines?
What a waste of "digital" technology.
Oh, don't like that idea? Well, when the Cern guys (Tim/Robert)
talk about "the web version is the original version, hardcopy
is a derivative of the online version" - what do you think they
mean? Basically that it's a lot easier to turn smart bits into dumb bits
than the other way around.
> What kind of navigation scheme did you use? Was there anything like
> bookmarks or a history of places you had visited?
> Why would you need that? Every time you clicked, you got a new
> window. If you found an interesting window, you linked it to your
> home page. Your home page, which was a piece of HTML sitting on
> your machine--that's what a home page really means--functioned as
> your "bookmark" page. Since your home page was still there,
> alongside with the newly found interesting page, you would put your
> cursor there, click on your mouse, and drag the link. That's it. If
> you needed more than one page for bookmarks, you would just create
> new local HTML pages.
> It captures the URL for you?
> Yes, so bookmarks make no sense, right? I mean, why do you want to
> separate these things out into special formats for bookmark or a
> special format for this or a special format for that? Everything
> was just simple, pure HTML. No need for anything else, because it
> was instantly editable. Just click here, click there. And it's
> fast. Much faster.
Faster than what? I don't get this claim at all. Really, what he's
suggesting is that you *do* have bookmarks, but on an html page.
Fine. But the way I record bookmarks now in Netscape is with
a quick two-click combo on my menu bar. I can't see how that speed
can be improved. It then gets saved to - guess what - an html file!
So what exactly is the complaint here?
> It was written in pure, flat
> C. Not ANSI C, mind you. The C it was written in was sort of
> compilable by any old C compiler on any old machine. The idea was
> that it had to be accessible to everyone, everywhere, even on an
> IBM PC--AT or lower. At one point I tried to put some order in and
> redevelop the documentation and some of the code. I started putting
> in some indentations so I could at least read the code, but
> unfortunately that was ANSI C, so it didn't compile.
See, this makes me wonder about this guy's credentials altogether.
How does indentation make C code compilable or not? It's
just whitespace, which has always been fine, since C has
been free format since the earliest K&R days. Maybe
he was cleaning up the code in other ways that only
an ANSI compiler would take, but he doesn't say that, does he?
> But this is not the way this crowd
> operates. They can talk to each other without ever seeing each
> other, over e-mail, and then...
> What's wrong with that?
> There's nothing wrong with that, it just doesn't necessarily
> produce what's needed. It's also not the way society works.
Wow. Would FoRKers agree with that? The nerve of this guy,
suggesting we have no "society" :-).
As to producing what's needed, tell that to the Linux guys.
> Mathematicians have
> been working on SGML DTDs for mathematics. But the Internet crowd
> doesn't necessarily want to talk to them and vice-versa. There are
> many missed opportunities here!
Sigh. Replace "mathematicians" and "Internet crowd" with
"all the people involved in commercial software design, realtime
systems, agents, robotics, graphics, AI, knowledge management,
document management, databases, OO, transactional systems,
protocol work, online
gaming, virtual reality, distributed systems theory, parallel
.... don't necessarily want to talk to each other and vice-versa.
There are many missed opportunities here!"
So, we have the same problems showing up in 20 different fields,
and 20 "different" solutions, the same in all but name, conventions,
and attitude. Sigh.
> I really believe that no telecom, no software company, no hardware
> company could have come up with this.
Heh. No telecom, certainly. Software company? Depends. They
come in all sizes and attitudes.
> I personally didn't want the images in line. It's a nuisance
> because you can't keep the image in view. For example, when you
> read a physics paper, you want the diagrams in view while you
> browse the text. You don't want to lose that image, and you don't
> want it to scroll out of sight. But all browsers today do that,
> it's just like a platform-independent presentation of the printed
This also bugs me. I'm willing to be talked out of it, but my
present instinct is that inlined images are good. The human
visual field is always going to be a limiting factor on how
much "real estate" you have for non-inlined windowed views.
It's a pain in the neck to move your eyeballs around all the time
to different windows. Also, using his example, the diagrams
I want to see are only the ones currently relevant to the passages
I'm reading - if every diagram is in a separate window, how do the
"right" windows stay visible as I'm reading about them? Inlining
has a first-order approximation to this, by physically putting the
images close to their points of reference. Perhaps the ideal
would be inlined images which are also links, so that a quick
click on one pops it up in a parallel window, if you happen to want that.
Besides, take his argument to the extreme - why stop at disliking
inlined images? What he really means is that he wants all atomic
elements of information available in a parallel, visual web of windows
on his desktop. It's a pain that the para. ahead of or below this one
is out of sight by being "inline". Solution? A window per para.
But paras. are themselves not atomic elements. Solution? A window
per sentence. Per clause. Per character. Nope, there's a point
where it's the right thing to do to inline "inline", not "i" -> "n" ->
"l" -> "i" -> "n" -> "e".
> Ah, you roll your eyes at XML! I don't know, do you?
> Every time I hear extensible something-something-something I get
> very nervous because it opens the door for incompatibility. It
> means a version that works differently, which must be because you
> don't have these-and-these macros or whatever it is that makes it
> an extension. Yet I have hopes for XML.
Hah. Like Keith says, floor wax and dessert topping. XML. Whitens
your teeth. Housetrains the dog. And cures the common cold.
Oh yeah, solves world hunger too.
> It still is given away. You can still go to France Telecom and pick
> up the basic model for free (although you won't get a telephone
> book). The point is half of the households don't get one.
> Because they're not interested, or they can't handle it. It is too
> complicated an object with too many buttons on it. There's a
> keyboard there, you know...
Because we, the 97%, just can't be bothered. Leave us alone.
The 97% manifesto:
"Look you 3%ers, oh so smug. Look at the numbers: 97. 3.
You're very clever and all. Can you tell which is bigger? Which
is vastly, unfairly, unequivocally bigger? Good. Bigger is better.
Bigger wins. Bigger wins in elections. Bigger wins wars.
Bigger wins in marketing and business. Bigger bullies
win in the schoolyard. We win. You lose. You can stop being
so smug now."
> type. She just doesn't want to deal with the computer that she has to
> install software for, or that she has to reboot. She wants an appliance
> that is failsafe.
Oh, well, then! It's obvious! She should build it with XML, duh.
Didn't we tell you XML is "failsafe" (whatever that means),
in addition to being an excellent
source of Vitamin C, and recommended by the American (and
Canadian!) Dental Associations as effective in the prevention of
tooth decay, when used regularly?
> Even the appliance will run complicated Java applets with
> interactions where you have to respond, and that's where it comes
> back in. Maybe you don't have to handle the configuration in your
> network computer, but you have to learn this Java applet which is
> complex, complicated, different from whatever you saw before. You
> have to declare yourself, you have to get your preferences in, you
> have to configure it.
> Am I hearing that you also don't approve of Java applets?
> No, I didn't say that. I'm saying that with Java programming the
> interface has come back. The diversity of applets is akin to
> configuring my computer. Instead of having one interface to HTML
> pages, I will have an interface to the bank and a different one for
> the grocery store and one for....
Now this is really crap. I'm sorry, but I'm really losing patience now.
So the visual interface an applet presents is confusing because
there are different applets, with different radio buttons & drop-menus
for different domain settings. But somehow the magic of HTML
means that I always have the same controls, regardless of whether
I'm ordering a pizza, or playing an online crossword?? This
is ridiculous. Let's agree on some basics here.
1. Model representation, independent of rendered view, is a
Good Thing (TM). Object/Java/MVC folks know this. So do
web folks. Here's an axiom we can agree on, and quit fighting
about. It doesn't matter who thunk it up first, it's just true,
so leave it alone.
2. For a given domain object/model/representation - whatever
you want to call it, when it comes time to render it in some
view or other, the model will influence that view - it has to.
It's supposed to. A crossword puzzle model should somehow
provide a way to "see" the grid (ascii/lineart/bitmap, 2d/3d, colour/bw,
with neat singing sound effects or without...), and to enter
solution words into the cells of the grid. Meanwhile, a pizza-order
object should render with fields/drop-menus/radio-buttons, etc.
to select anchovies/pepperoni, your name/address/visa number, etc.
A crossword will, and should, present a different view to the user
than a pizza order. Whether it's a Java applet, a windows app,
or an html page. Suggesting that Java does something evil
that web-forms don't is insulting to both of these, imho. Both
of them can render views made from underlying widgets. You
can design good, intuitive views for good, semantically clean
underlying models. Or you can design crappy, hard-to-navigate-and-use
views for semantically broken models.
As Dave Crook, Adam, and everyone seems to agree - *good design*
is what counts. Not Java vs. HTML forms. Ah, but of course,
XML changes this. Because with XML, you can do lousy design,
and still have the miracle cure of "scrub-a-dub" XML to make
it all come out right. (I'm being purposefully nasty, of course,
to make a point. That being that you - karmakids - are over
evangelizing your baby, imho.)
> So what you're saying is that people who can't do tax forms can't handle
> even forms on the Web.
> It's the same complication.
> There's a cut right there.
> Yes, and I'm not saying that this is 50% of the population.
'Fcourse not. He didn't ask FoRK. 97%, of course.
> You were losing clients?
> Physicists were realizing that anything I put on the server was
> visible worldwide. Now if physics experiment A is distributing the
> minutes of its meetings, it doesn't want physics experiment B that
> is also looking for the Nobel Prize to read them as well, right?
> But they're scientists.
> There are a lot of things you discuss with your colleagues, and
> there are a number of things that you don't. So they were holding
> back on further usage of the Web for what it was really for. They
> were using it halfway, but not as a collaborative tool, because it
> was too open. And this was not the idea of putting locks on the
> Web. I mean, I consider a lock just a signal that says if you go in
> ...you're trespassing.
> Yes. You're not respecting the distance that I wish to keep. I am
> not in favor of all this cryptography stuff, right? I have read
> comments saying you should use cryptography even on your home
> computer, because your wife might look at your disk. That's going
> too far, but at least you put up some signal saying this is
> private. I am now in the toilet, close the door. But this was not
> well received in the beginning--not at all.
Adam&Ro - you should send this guy your 'web of trust' papers.
A web without a trust model will not go far at all. It's surprising
it's gotten out the door at all, come to think of it.
> Right. So this young employee is a Unix wizard, and he keeps all
> these things together, and at night he does extracurricular things,
> like with the Web--so let him do it. But it's not in the
> contract. There is, however, no way to reconcile our current
> civilisation with the so-called credo of the Internet: "We don't
> believe in kings, presidents and voting. We believe in rough
> consensus and running code." That credo is fine if you also grow
> your own potatoes and bake your own pizza. Then it's okay, and I
> can live with it.
Yeah, well the "internet crowd" will barter their bits for potatoes,
and then we'll have our anarchist economy, minus kings
and presidents :-). So seriously, what is the exchange rate
for kudos-potatoes? Have you guys thought at all about efficient
bit/atom currency exchange?
I've been thinking about that one a lot. Even if you take out
the central banks by collapsing their ability to generate tax revenue
etc, that still leaves hard problems of getting potatoes in your tummy,
when all you have is a FoRK archive in payment.
> But not if the kings are supporting you.
> To maintain this philosophy you must become independent of
> them. Then if you have found a way to sustain it maybe I'll join
> you. But you must have some order somewhere. You cannot go without
> a system of making decisions, or it just doesn't sustain
> itself. And so maybe it won't. Or maybe I'm too old for this.
Wooh, I'm starting to warm to this guy after all :-). Skeptical, but
he's got an open mind to the Brave New World. Healthy attitude.
Yes, you do need decisions. Rachid & all are
right to put "consensus" at the bottom of the pile of elemental
building blocks to work from. Consensus algorithms can be
done fully distributed, no leaders. See my recent response
to that multicast draft for more....
> Yeah, I mean essentially America has completely wiped out--whether
> this has been done consciously or not historians should find out; I
> will not volunteer an opinion on this--but you have essentially
> wiped out all computing industry in Europe.
[tons of anti-US blather deleted. Sun buys Chorus, but no, there's
no computing industry in Europe. Look, he'd never heard of
SAP. And we're supposed to trust his opinions on this?
Anyway, who cares? The discussion is supposed to be
about the web, right? Why these long boring interjections?]
> No, I think the really innovative guy in computing is not really
> driven by money. I don't think he is. Tim certainly is not. Most of
> the people that I have seen that really have done something,
> they're not driven by money. Those who are driven by money and have
> achieved money, they have not produced any innovative ideas. I
> shall not name them.
Hmm. There's a lot of truth in the above. But it's not always true. Money
& innovation *can* mix; they just don't always.
A recent report in the New England Journal of Medicine has
found that while moderate alcohol consumption (1-2 drinks
per day) has a beneficial effect on reducing heart disease,
and that red wine consumption has even greater systemic
disease prevention abilities, apparently consuming moderate
amounts of alcohol while creating online content using exclusively
XML and style sheets leads to a uniform, across-the-board
lifespan increase. In this remarkable discovery, conducted
in a research program spanning teams in the US, Europe
and Canada, it was found that markup using XML miraculously
removed years of accumulation of arterial plaque, and reversed
the damage to lung tissue caused by heavy cigarette smoking.
When combined with moderate drinking, XML usage seems
to practically assure the reversal of all major causes of bodily
deterioration. Said Dr. Karmakhare, chief scientist responsible
for the study "We believe that XML & booze may be able to
lead to nothing less than human immortality. Ponce de Leon's
quest for the Fountain of Youth may have finally reached its
destination. Apparently neither drinking nor XML is sufficiently
powerful on its own, but the combination is just awesome.
You should try it sometime!"
Ron "Heeee's BACK!!" Resnick, mocker-par-excellence.