RE: Morality: Where do you get yours?
May 14, 2012 at 11:04 pm
(This post was last modified: May 14, 2012 at 11:12 pm by Angrboda.)
In response to the OP, I would suggest that a morality based on evolutionary psychology quickly runs afoul of Hume's is-ought problem. The fact that we display a set of morals as biological creatures does not mean that the morality we display is in fact moral. We can imagine a species similar to ours but differing in certain moral judgements; deciding whose morality should prevail reduces to speciesism. I'm fond of giving the example of reproductive strategy here. We are a K-strategy species, giving birth to few offspring and lavishing enormous resources on them. An r-strategy species, on the other hand, reproduces in immense numbers and cares nothing for its offspring. As a K-strategy species, we consider it immoral for a couple to have a child and simply abandon it to the fates; were we an r-strategy species, we wouldn't. But obviously there's nothing inherently more moral about the values of a K-strategy species than those of an r-strategy species. And this gets to the core of all such evolutionarily based moralities: they depend on the assumption that the prosperity of our species is a good, and only from that value do they then derive the consequences for an evolutionarily based ethics. But that initial assumption is purely arbitrary.
I must confess some sympathy for the evolutionary perspective. My own views on morality have their genesis in ideas from computational neuroscience and incorporate similar elements, and for a while their fallout seemed to promise a means of bridging the is-ought gap; lately I'm not so sure, as I seem to have run aground in like fashion. In any case, I don't have the luxury of time to explore these ideas further right now. (I'm heavily involved in book clubs and discussion groups these days, leaving precious little time for my own work, which usually ends up being spent on some bollocks like posting on forums.) So for now, my ideas on the foundations of morality remain incomplete.
In reference to the golden rule, I think it makes two mistakes. The first is that it is an example of G.E. Moore's naturalistic fallacy: because a heuristic or rule seems to share the same properties as our moral intuitions unaided by the rule, we conclude that our moral intuitions (or some part of our judgements) are generated using the rule. Even if I felt the golden rule were an exact mirror of our moral judgements, I would strongly suspect that our brains do not use a variant of the golden rule in generating our moral intuitions. Moreover, there's a logical problem here as well: how do we know that the golden rule accords with our moral intuitions without recourse to those very intuitions to verify that this is the case? It would seem the intuition is primary, and its correspondence to the golden rule secondary, and perhaps merely coincidental. (Christians face a similar quandary with Euthyphro's dilemma and Cafeteria Christianity: how do you know which of God's rules to follow?)

The other problem is that the golden rule assumes fairness as a value. The likely reason for this is that fairness makes tit-for-tat strategies successful for a social species such as ourselves, allowing our species to prosper through individual cooperation and reciprocity. Again, note how this depends on the biological specifics of our species. Moreover, I suspect fairness and altruism are competing moral values, neither purely realizable unconstrained by the other; an appropriate moral judgement would then be a satisficing of the two values, which a simple rule like the golden rule is simply not powerful enough to embody. And if you drop the arbitrary value that the success of our species is a good, the purpose of fairness and reciprocity becomes highly questionable, and without that, the golden rule ceases to exist.
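For the curious, the tit-for-tat dynamic I'm gesturing at can be sketched as a toy iterated prisoner's dilemma. This is only an illustration; the payoff values and strategy names are the standard textbook conventions, not anything from this thread:

```python
# Toy iterated prisoner's dilemma: why tit-for-tat (reciprocity)
# prospers for a social species. "C" = cooperate, "D" = defect.
# Payoff matrix uses the conventional illustrative values.

PAYOFFS = {  # (my_move, their_move) -> my score
    ("C", "C"): 3,  # mutual cooperation
    ("C", "D"): 0,  # sucker's payoff
    ("D", "C"): 5,  # temptation to defect
    ("D", "D"): 1,  # mutual defection
}

def tit_for_tat(opponent_history):
    """Cooperate first, then mirror the opponent's previous move."""
    return "C" if not opponent_history else opponent_history[-1]

def always_defect(opponent_history):
    return "D"

def play(strat_a, strat_b, rounds=10):
    hist_a, hist_b = [], []  # each player's record of the *opponent's* moves
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strat_a(hist_a)
        move_b = strat_b(hist_b)
        score_a += PAYOFFS[(move_a, move_b)]
        score_b += PAYOFFS[(move_b, move_a)]
        hist_a.append(move_b)
        hist_b.append(move_a)
    return score_a, score_b

# Two reciprocators settle into stable mutual cooperation:
print(play(tit_for_tat, tit_for_tat))    # (30, 30)
# Against a pure defector, tit-for-tat is exploited only in round one:
print(play(tit_for_tat, always_defect))  # (9, 14)
```

The point of the sketch is that reciprocity does well without any appeal to fairness being "good" in itself, which is exactly why it remains hostage to the arbitrary premise that our species' prosperity is a good.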
For what it's worth, a week ago I was introduced to what I believe is termed subjective consequentialism, and that framework appears to resolve Violet Lilly Blossom's list of exceptions capably.
I like kılıç_mehmet's suggestion that a society is the proper level at which to examine morality. I'm convinced that truth qua truth is the property of the social group, not the individual. Social groups, by virtue of possessing normative standards, have far more reliable means of arriving at useful truths than individuals do. In a particular individual, a handful, or even a great deal, of truth may be present, yet there is no reliable epistemological procedure for determining which of that individual's ideas are correctly held and which mistakenly held (especially by the individual herself). Groups, on the other hand, which usually have embedded epistemic procedures for arriving at truth over time, can reliably be depended upon to produce whatever passes for truth in the group (whether Turks, scientists, creationists, Raelians, or whatever). In a similar vein, I'm coming to the conclusion that much of modern moral thinking corresponds to Moore's naturalistic fallacy applied to morals: mistaking a rule which produces similar behaviors for the actual cause of those behaviors. This leaves me with the default rules of political efficacy: might makes right (including the might of groups), and some form of social contract theory. (Political philosophy is not my field.)

Unfortunately, kılıç_mehmet ignores a central point: it is the behavior and judgements of individuals which give rise to the behavior of the group. Change the individual socially, mentally, or biologically, and the behavior of the group changes. So suggesting that morals should filter top-down, from the society to the individual, rather than bottom-up, from the individual upwards, would appear to be putting the cart before the horse. Individuals have genes and a local environment which determine their ontogeny; societies do not, and therefore a society is not constrained by morals, any morals. You can't get morality from a source that doesn't itself have it.