> Date: Mon, 22 Jun 1998 13:40:05 -0700 (PDT)
> From: Phil Agre <firstname.lastname@example.org>
> Subject: More on @#$%& software
> An article in the 17 Jun 1998 *Wall Street Journal* (Robert Cwiklik,
> "Honest, mom, I don't even know what those @#$%& words mean", page B1)
> describes a program called "Secret Writer's Society" that is supposed to
> help children write by reading their writing back to them in automatically
> generated speech. Under certain conditions, however, the recitation is
> augmented with every obscenity in the English language. The problem is
> evidently that the program also reads out loud its full dictionary of
> words that it is supposed to filter out.
Nice example of a blacklist exposed.
> Judging from the sound of the conditions under which the problem arises,
> some kind of array bounds check is not being done. Assuming that this
> isn't another in the Wall Street Journal's recent series of urban myths,
> it's a depressing comment on the state of computer programming. Way back
> when I was a college student, we were taught programming languages that
> automatically prevented your program from reading random swatches of
> memory through automatic bounds checking. This was presented as a boring
> and well-established technology, which of course it was. So many of the
> problems reported on Risks result from the failure to apply methods that
> were prevalent 40 years ago.
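Agre's bounds-check hypothesis is easy to sketch. Purely as a guess at the mechanism (the article gives no real details): suppose the child's passage is copied into a fixed-size buffer that happens to sit in memory right next to the filter's banned-word list, and an over-long passage gets truncated without a terminator. A "read until NUL" recitation loop then runs straight off the end of the text and into the blacklist. Here's a toy simulation of that failure mode in Python, with a flat bytearray standing in for raw memory (all names and the "darn/heck/gosh" list are made up):

```python
# Hypothetical reconstruction of the bug: a fixed-size text buffer is
# immediately followed in "memory" by the filter's banned-word archive.

BUF_SIZE = 32

def store_passage(memory, passage):
    """Copy the passage into the buffer WITHOUT a proper bounds check:
    an over-long passage is truncated but never NUL-terminated."""
    data = passage.encode()[:BUF_SIZE]
    memory[0:len(data)] = data
    if len(data) < BUF_SIZE:
        memory[len(data)] = 0   # terminator only written if it fits

def recite(memory):
    """Read until the first NUL byte -- the classic unbounded scan."""
    out = bytearray()
    i = 0
    while memory[i] != 0:
        out.append(memory[i])
        i += 1
    return out.decode()

# 32-byte text buffer directly followed by the (tame, stand-in) blacklist.
blacklist = b"darn\x00heck\x00gosh\x00"
memory = bytearray(BUF_SIZE) + bytearray(blacklist)

store_passage(memory, "See Spot run.")
print(recite(memory))   # short passage: recited correctly

store_passage(memory, "My summer vacation was really quite long")
print(recite(memory))   # over-long passage: scan runs into the blacklist
```

A language with automatic bounds checking, as Agre notes, would raise an error at `memory[i]` instead of happily reciting the forbidden words.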
So I decided to dig up the Wall Street Journal article for fun.
The relevant parts are below.
I'm really wondering what words were on that list...
> Computers are revolutionizing education, sometimes in surprising ways.
> Now there's software that can teach kids how to cuss like a drunken
> stevedore. The program, called "Secret Writer's Society," is meant to
> help seven- to nine-year-olds learn to write by, among other things,
> reciting their compositions back to them in a computer-generated voice.
> BUT A STRANGE BUG sometimes causes the program to do some creative
> rewriting and vocalize streams of obscenities before reciting
> the child's own words.
> One parent who tested the program for SuperKids, an educational-
> software review Web site, describes the foul language as the sort
> heard in a "slasher flick." Another says "This goes way beyond George
> Carlin's seven banned words."
Crummy investigative reporting --- what were the bad words, dammit???
> Kari Gibbs, a marketing manager for Matsushita Inc.'s Panasonic
> Interactive Media, which makes the product, acknowledges the cursing
> problem, but says it's very uncommon. "We've had two reports of it so
> far," she says. The bug only occurs on Macintoshes, she says, and
> only when a lot of the machine's memory is in use.
No wonder Byars likes Macs so much...
> But Andrew Maisel, SuperKids editor in chief, counters that it doesn't
> take much to turn the program's language blue. He says that if a
> passage is longer than a few sentences and the mouse is double-clicked
> rather than single-clicked, the nastiness ensues. "It's got a very
> expressive vocabulary," he says. "I wouldn't want a 15-year-old
> exposed to some of the language this thing has."
> Ms. Gibbs says the problem is caused by a bug in a filter that's
> supposed to prevent the software's text-to-speech engine from reciting
> foul language that users might put in their text. The bug causes the
> program to tap into the filter's archive of forbidden expressions and
> enunciate several concepts not found on the Scholastic Aptitude Test.
> "It's a bad thing if some child is sitting at the computer and all
> of a sudden it starts swearing at you," Ms. Gibbs concedes.
Gee, ya think?
> When alerted by SuperKids, Panasonic promised to replace flawed copies
> of the program with debugged versions. But Mr. Maisel thinks the
> company should recall the product and publicize its flaws to protect
> kids where it might still be in use.
Oh, come on! This will be as popular as those "Type 'I need a drink' or
'I'd like to see you naked' or 'Unable to evaluate my performance' into
Word, highlight it, and use the thesaurus command" features...
(For the record, Word's replies are 'I should say so' and 'I'll drink to
that' and 'Unable to have an erection', nyuck nyuck.)
> Ms. Gibbs says copies of the program that haven't been shipped
> "will be pulled," but that there's "no way to contact every person
> who's purchased it." Still, the company says it will set up an 800
> number so that consumers with the flawed version can request a free,
> nonprofane replacement.
Can a consumer with the nonprofane replacement specifically request the
flawed model?? I need some new bad words to add to my repertoire...
You steal the crown jewel of a man's life, and all you can come up with
is some [wimpy] Hallmark sentiment?
-- A Perfect Murder