Game Theory and the Golden Rule

jbone at place.org
Mon Dec 1 21:33:18 PST 2003


While it's a vast oversimplification, the "Christian" game theory bits 
Joe pointed out earlier do sort of underscore one point I've been 
making for a while about game-theoretic ethics (Christian or otherwise) 
--- namely:

--

	http://objective.jesussave.us/gametheory.html

Game Theory allows the Christian to compare and contrast the strategies 
of the Lord with those that are contradictory, thereby showing the 
apologetic target the merits of the Lord's teachings. For example, let 
us consider two different strategies for interacting with our fellow 
Man. One is given to us by the Lord: "Therefore all things whatsoever 
ye would that men should do to you, do ye even so to them: for this is 
the law and the prophets" (Matthew 7:12, which we will abbreviate as 
"Do unto others", as it is popularly known). The other comes to us from 
noted Satanist, Aleister Crowley: "Do what thou wilt shall be the whole 
of the Law". This produces the following payoff matrix for two 
participants:

Golden Rule vs. Satanist Credo Payoff Matrix

                         Do unto others    Do what thou wilt
   Do unto others             2, 2               1, -1
   Do what thou wilt         -1, 1               0, 0

When the two participants differ in strategy it is a zero-sum game 
since one gains at the other's expense. When they agree to be 
self-serving the sum is zero but no one gains. When they agree to 
follow the Golden Rule they both gain more than either could by 
thinking only of their own desires.

If one participant chooses the self-serving strategy, he will hurt the 
other who will then act selfishly himself in order to even the score, 
thereby leading to a world where all follow the Satanist's credo and 
thus no one wins. If, however, all participants follow the Golden Rule 
as our Lord tells us to do, all will win. The Lord is teaching us the 
proper strategy for human interactions, and Game Theory shows that He 
is correct.

--

Forgetting all the crap, the point is that it's possible to analyze the 
iterated impact of a strategy that weighs outcomes for others against a 
strategy that does not.  QED.  The guy's a loon, but he's using the 
tools in an interesting way.
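
For anyone who wants to poke at it, here's the quoted matrix as a quick 
Python sketch (the labels and variable names are mine, not the site's); 
it just prints which pairings are zero-sum and which are positive-sum:

# The quoted payoff matrix, encoded as a dict keyed by (row, column) strategy.
GOLDEN = "do unto others"
WILT = "do what thou wilt"

# PAYOFFS[(row, col)] = (row player's payoff, column player's payoff)
PAYOFFS = {
    (GOLDEN, GOLDEN): (2, 2),
    (GOLDEN, WILT): (1, -1),
    (WILT, GOLDEN): (-1, 1),
    (WILT, WILT): (0, 0),
}

for (a, b), (pa, pb) in PAYOFFS.items():
    kind = "zero-sum" if pa + pb == 0 else "positive-sum"
    print(f"{a} / {b}: {pa}, {pb}  ({kind})")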

Personally, it's a slight variation on this kind of reasoning that makes 
me relatively secure in the idea that while e.g. Eli-style 
"Friendliness" is probably impossible to induce and ensure, it's 
probably also unnecessary.  A rational being chooses rational 
strategies, and that usually excludes scorched-Earth / purely 
self-serving strategies.
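
Here's the same matrix iterated, as a rough sketch of that argument (the 
tit-for-tat "retaliator" is my stand-in for evening the score; none of 
this is from the quoted page):

# Iterated version of the quoted matrix: a retaliator cooperates until
# wronged, then mirrors the other player's last move (tit-for-tat); a
# purely self-serving player always plays "do what thou wilt".
GOLDEN, WILT = "do unto others", "do what thou wilt"
PAYOFFS = {(GOLDEN, GOLDEN): (2, 2), (GOLDEN, WILT): (1, -1),
           (WILT, GOLDEN): (-1, 1), (WILT, WILT): (0, 0)}

def retaliator(own_moves, their_moves):
    return their_moves[-1] if their_moves else GOLDEN

def self_serving(own_moves, their_moves):
    return WILT

def play(strat_a, strat_b, rounds=100):
    moves_a, moves_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        a = strat_a(moves_a, moves_b)
        b = strat_b(moves_b, moves_a)
        pa, pb = PAYOFFS[(a, b)]
        score_a, score_b = score_a + pa, score_b + pb
        moves_a.append(a)
        moves_b.append(b)
    return score_a, score_b

print(play(retaliator, retaliator))      # (200, 200): everybody wins
print(play(retaliator, self_serving))    # (1, -1): one sting, then stalemate
print(play(self_serving, self_serving))  # (0, 0): the scorched-Earth outcome

In this toy setup the always-defect player never comes out ahead of the 
retaliator, which is roughly the sense in which purely self-serving 
strategies lose out once the game repeats.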

jb


