RE: How do you teach fundamental logic to someone that doesn't grok it?

From: John Hall (johnhall@evergo.net)
Date: Fri May 04 2001 - 09:38:50 PDT


Obviously, definitions would be included in any discussion before you asked
students to answer a question, but in the context of 'us bitheads' I hardly
figured the definitions were necessary.

Also note that A --> B, B is something I've seen people have trouble
understanding. Including, apparently, on FoRK. I can understand teaching
that one, and how to do it, and why it might be necessary.

But I only introduced it in comparison to A --> B, A. That is the one that
flips my circuits.
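
A quick way to see the difference is to brute-force the truth tables. The
Python sketch below is only illustrative, nothing like it appeared in the
thread: given A --> B and A, the premises force B; given A --> B and B,
the premises leave A wide open.

from itertools import product

def implies(a, b):
    # Material implication: A --> B is false only when A is true and B is false.
    return (not a) or b

# Premises A --> B and A: enumerate every assignment consistent with them.
print({b for a, b in product([True, False], repeat=2) if implies(a, b) and a})
# -> {True}: B is forced (modus ponens).

# Premises A --> B and B: the same exercise.
print({a for a, b in product([True, False], repeat=2) if implies(a, b) and b})
# -> {False, True}: A is left completely undetermined.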

-----Original Message-----
From: Matt Jensen [mailto:mattj@newsblip.com]
Sent: Friday, May 04, 2001 5:17 AM
To: FoRK
Subject: Re: How do you teach fundamental logic to someone that doesn't
grok it?

I took Tony's post as "here's how an average high schooler, without any
logic training, might understand the question." That was the original
context. Monotonic logic versus information theory is just the FoRK
discussion context.

For a typical high schooler, I suspect Tony's description *is*
the simplest framework; it describes the commonsense ways in which people
casually think. In contrast, it's the imposition of the rules of propositional
logic that seems unnatural and arbitrary (and thus, you need a teacher
for it).

We bitheads are so used to our logical forms, and so enamored of the
benefits they have brought us through millions of silicon logic gates, that
we come to see them as the natural way to view things. But they're not
natural; everyone who knows such logic had to learn it at some point. In
contrast, induction is natural. If you eat an onion and don't like it,
you have high confidence that you won't like other onions. You don't
have to inventory the space of "all onions". So it shouldn't be
surprising that a high schooler given [ A-->B, B ] concludes "A,
probably."

-Matt Jensen
 NewsBlip.com
 Seattle

On Thu, 3 May 2001, Jeff Bone wrote:

>
> Tony Berkman wrote:
>
> > I have to disagree. Where does it say they are statements??? I am
> > considering
>
> Ahem, *assuming.*
>
> > them Binary Random Variables over some unknown distribution
>
> Assuming.
>
> > in
> > which case if B is a discrete Random Variable, even without knowing its
> > mass, you know a little bit more about A once you know that B is True.
> >
> > At 10:44 PM 5/3/01, John Hall wrote:
> > >Similarly, if A => B and you know that B is true you have no idea
> > >whether A is true or false. No information. None.
> > >Zero. Zilch. Nada.
> > >
>
> I have to agree with John, Tony. Given the discussion, it was entirely
> obvious that A => B meant "A implies B," with A and B being simple truth
> values. No need to make it more complex; always choose the smallest
> possible context for interpretation of mathematical assertions.
> "Principle of Least Assumption" and all that. No reason to assume that
> the logic of the system is nonmonotonic or contextual unless we're told
> otherwise.
>
> In straightforward (i.e., introductory) monotonic / symbolic logic,
> A => B, B tells you nothing about A's truth value.
>
> Tony does suggest a point, however, in that introducing statistical or
> other relationships or facts about the quantities involved or other
> state makes the problem more interesting. But then, that's moving
> towards information theory --- very interesting indeed, but not
> something John has to figure out how to teach to HS kids. (Unless he's
> very lucky. :-)
>
> jb
>
> PS - though in ASCII I would've said A --> B. ;-)
>
>
>


