Brian, I trust you will keep us informed of any aspersions on your character!
(just so we can collect all seven, you see, Mr Chicago... :-)
------- Forwarded Message
Date: Wed, 10 Dec 1997 23:24:50 -0800 (PST)
From: Declan McCullagh <email@example.com>
Subject: FC: PICS to become worse than the devil?
[A criticism of PICS and a response from someone involved with its development.]
Date: Sat, 6 Dec 1997 11:22:34 -0400
From: Michael Sims <firstname.lastname@example.org>
Subject: Re: PICS to become worse than the devil - news from W3C
Irene Graham wrote:
> Not content with the devil they've created to date, the folk at W3C
> are now developing a new means to block entire domains based on URL
> and a means to ignore labels embedded in documents because "document
> authors can't be trusted to assess their own [content]", etc.
> Whilst I haven't read/deciphered the entire lengthy proposal as yet,
> I find it increasingly difficult to identify any perceivable
> difference between blacklist programs and PICS-facilitated systems.
No Irene, you just don't get it - PICS is not a censoring system,
it's just a label specification, useful for many things in internet
commerce and privacy!
No, actually, this is a censoring system. The PICSRules
specification outlined at http://www.w3.org/TR/PR-PICSRules.htm is a
way to use the PICS labels to deny access (for anyone below you in
the data food chain) based EITHER on PICS labels OR on additional
blacklisting rules, or a combination of both. Basically, they've
written an outline for a censorware product, identical to an outline
of how CyberPatrol works.
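For those who haven't waded through the proposal, a PICSRules policy
looks roughly like this (paraphrased from the spec's examples; the
exact syntax is defined in the document above, and the domain here is
a placeholder):

```
(PicsRule-1.1
 (
  Policy (RejectByURL ("http://*@www.example.com:*/*"))
  Policy (AcceptIf "otherwise")
 )
)
```

The RejectByURL clause is the plain blacklisting path; other clauses
key off PICS labels, and the two can be freely combined.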
As I recall, Ms. Lorrie Cranor is present on this list and has
demonstrated a willingness to reply to criticisms, unlike her cohorts
at W3. Ms. Cranor, how do you justify developing a system which
will be used by repressive governments such as Singapore, China,
and the US to censor huge swathes of the net based on political
correctness, hate speech, sex and violence, or whatever? This system
has no purpose whatsoever save to allow people to censor those under
their control. That it should be developed in the United States is a
shame of the first order.
For F-C's information, here are the people taking credit for the UCS
(Universal Censoring System):
Martin Presler-Marshall, IBM <email@example.com>
Christopher Evans, Microsoft <firstname.lastname@example.org>
Clive D.W. Feather, Demon Internet Ltd. <email@example.com>
Alex Hopmann, Microsoft <firstname.lastname@example.org>
Paul Resnick, University of Michigan <firstname.lastname@example.org>
Scott Berkun, Microsoft
Jonathan Brezin, IBM
Yang-hua Chu, MIT
Lorrie Cranor, AT&T
Jon Doyle, MIT
Ghirardelli Chocolate Co.
Brian LaMacchia, AT&T
Breen Liblong, NetShepherd
Jim Miller, W3C
Mary Ellen Rosen, IBM
Rick Schenk, IBM
Bob Schloss, IBM
David Shapiro, MIT
Ray Soular, SafeSurf
-- Michael Sims
Date: Wed, 10 Dec 1997 23:50:14 -0500
From: Lorrie Faith Cranor <email@example.com>
Michael Sims wrote:
>As I recall, Ms. Lorrie Cranor is present on this list and has
>demonstrated a willingness to reply to criticisms, unlike her cohorts
Yes, Michael, I do read this list when I have time. Sorry I
got a bit behind this week and didn't see your message until today.
(In the future, you can increase your chance of getting a response
from me by emailing your question to me directly.)
>Ms. Cranor, how do you justify developing a system which
>will be used by repressive governments such as Singapore, China,
>and the US to censor huge swathes of the net based on political
>correctness, hate speech, sex and violence, or whatever? This system
>has no purpose whatsoever save to allow people to censor those under
Let me address two issues, one at a time.
1) This system does have other purposes besides censorship; and
2) repressive governments don't need PICS for censorship.
I realize you don't believe this, but PICSRules really does have useful
purposes that have nothing to do with censorship. Indeed, the reason
my name appears in the spec is as follows: Last spring I was helping
W3C develop a prototype of P3P to demo at the FTC Privacy workshop.
P3P is being designed to give people very fine-grained control over
the privacy preferences they specify. However, there has been a lot of
concern that this system will be too complicated for most people.
Rather than trust browser manufacturers to set appropriate defaults
(and who is to say what's appropriate?), the P3P designers would like to
have a way for organizations that people trust to provide recommended
privacy configurations that people can install with the click of a
mouse. But we need a non-proprietary language that anyone can use
to encode these configurations. So, I tried to use the draft PICSRules
spec to encode some sample configurations for the FTC demo. In the
process of doing that I discovered that the PICSRules draft was not
conducive to describing some of the things I wanted to demo. So, I made
some suggestions to the PICSRules committee about ways to make the
language more flexible to accommodate this.
Hopefully PICSRules (or a similar language) will be useful for
facilitating not only applications like P3P, but also
general purpose "trust engines" that can be used in a variety
of applications. For those of you unfamiliar with the concept
of "trust management" I include below an excerpt from a paper by
Chu, Feigenbaum, LaMacchia, Resnick, and Strauss:
"To trust is to undertake a potentially dangerous operation
knowing that it is potentially dangerous. A user might
prefer to have proof of harmlessness, but weaker
forms of evidence may also be sufficient. A recommendation
from a close friend may convince someone to trust
that a piece of software is virus-free. Someone may trust
an insecure channel for transmission of a credit card
number if the credit card company assumes liability for
any fraudulent uses of the number.
"Credentials and policies are the raw materials for
making trust decisions. A credential is a statement,
purportedly made by some speaker. A policy determines
the conditions under which a particular action is allowed.
"Following [BFL], we use the term trust management to refer
to the problem of deciding whether requested
actions, supported by credentials, conform to policies.
Just as a database manager coordinates the input and
storage of data and processes queries, a trust manager
coordinates the collection and storage of policies and
credentials and processes requests for trust decisions."
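As a toy illustration of that division of labor (all names here are
hypothetical, not from PICSRules or [BFL]): a trust manager just
stores policies and credentials, and answers requests for trust
decisions by checking one against the other.

```python
# Toy trust manager in the spirit of the excerpt above: policies are
# conditions on an action, credentials are statements by speakers,
# and the manager answers "is this requested action allowed?"
class TrustManager:
    def __init__(self):
        self.policies = {}     # action -> list of predicate functions
        self.credentials = []  # (speaker, statement) pairs

    def add_policy(self, action, predicate):
        self.policies.setdefault(action, []).append(predicate)

    def add_credential(self, speaker, statement):
        self.credentials.append((speaker, statement))

    def allowed(self, action):
        # Allowed only if every policy for the action is satisfied
        # by the credentials collected so far.
        return all(p(self.credentials)
                   for p in self.policies.get(action, []))

tm = TrustManager()
# Policy: run software only if a trusted friend vouched for it,
# echoing the "virus-free" example in the excerpt.
tm.add_policy("run-software",
              lambda creds: ("close-friend", "virus-free") in creds)
assert not tm.allowed("run-software")  # no supporting credential yet
tm.add_credential("close-friend", "virus-free")
assert tm.allowed("run-software")      # credential satisfies policy
```

Real trust engines do far more (signature checking, delegation), but
the policy/credential split is the same.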
PICSRules is a language for specifying policies, as described
above. It is specifically applicable to policies that reference
URLs. With a little creativity you might come up with a variety
of other applications that could greatly benefit from a good
policy-specification language. One that comes to mind immediately
is a spam filter (however, some extensions to PICSRules
would make it more suitable for referencing the various
components of email headers). Indeed, someone recently submitted
a proposal for an extensible mail filtering language called "Sieve"
to the IETF. I haven't had time to look at it in depth, but I think
it can be used for censorship ;-).
I find it ironic that my friends on this list who cherish the
right to freedom of speech as much as I do are asking
that W3C suppress (or in your terminology "self-censor") a
language because it might be used in a harmful way.
Repressive governments have much more efficient and effective
mechanisms for censorship than PICS. You may disagree, but
I don't believe that the existence of PICS will increase the
frequency with which repressive governments practice censorship.
Note the following anecdotes:
A few months ago one of the vendors of a PC-based filtering product
designed for use by parents told me that his company also makes
a fire-wall based product that is currently being used by an
undisclosed "small country" to effectively filter all Web traffic
into and out of the country. This product does not use PICS.
In April I heard a presentation by the Hong Kong Privacy
Commissioner for Personal Data, Stephen Lau, in which he described
a system under construction for censoring the Chinese Internet.
Here's an excerpt from an article he distributed:
"We will establish a "CICNet" inside China that is not connected
to the Internet except through designated gateways in Hong Kong or
any one of the major cities in China.... Then the question is, how
can we pass information to and from the world? The answer is simple.
We will put all the information from China that is needed by the
world into a giant database in Hong Kong, which is connected to the
Internet. We can set up the database in English so that the world
community can understand the content. At the same time, we can collect
all the information that is needed by China from the world
community and put them into our giant database in Beijing, in Chinese
language so that all the users in China can also understand the content."
(For more on this see http://www.research.att.com/~lorrie/pubs/iwgdpt21.html)
Dr. Lorrie Faith Cranor firstname.lastname@example.org
Senior Technical Staff Member 973-360-8607
AT&T Labs-Research FAX 973-360-8809
180 Park Ave. Room B224
Florham Park, NJ 07932 http://www.research.att.com/~lorrie/
This list is public. To join fight-censorship-announce, send
"subscribe fight-censorship-announce" to email@example.com.
More information is at http://www.well.com/~declan/fca/
------- End of Forwarded Message