RE: Objective vs Subjective Morals
April 23, 2014 at 6:26 pm
(This post was last modified: April 23, 2014 at 6:29 pm by DeistPaladin.)
"Objective" means existing independently of opinions or values. I've found that such things tend to be measurable in mathematical terms. Temperature, velocity, mass, weight, distance, etc. are what we think of as "objective," and these are measurable in numerical expressions.
The closest thing I've ever seen to a system that makes actions measurable on some sort of morality scale is Jeremy Bentham's Utilitarian Principles. Bentham suggested that there is a sum total of pleasure and a sum total of pain in the universe, totals which vary depending on our actions and other influences. Actions that create pleasure or relieve pain can be considered "morally right," while actions that destroy pleasure or create pain are "morally wrong."
Setting aside the fact that evaluations of the sum totals of pleasure and pain are themselves subjective, and that we have no practical way to measure or even acquire such knowledge, it's not a bad concept, and I've found it helpful in understanding morality. That said, it certainly isn't perfect, and taken in isolation from any other principle of morality, it leads to an "ends justify the means" mentality.
Let's consider an extreme example.
Let's say you know that in the future, a super computer controlling America's nuclear stockpile will gain sentience and start a nuclear war. Soon after that war, it will manufacture killer machines to wipe out the remainder of human beings on the planet. You learn about the scientist who will discover a super chip that will lead to the construction of said super computer. The scientist is himself innocent, having peaceful applications in mind for this new breed of self-aware computers. Nonetheless, his invention will lead to the deaths of 3 billion humans and the destruction of human civilization. Killing this innocent man might prevent all this from happening. For argument's sake, let's say it will.
Would it be moral for Sarah to pull the trigger and murder Miles Bennett Dyson to prevent that future?
![[Image: Miles_Dyson_nearly_killed.jpg]](https://images.weserv.nl/?url=img3.wikia.nocookie.net%2F__cb20130114235458%2Fterminator%2Fimages%2Fc%2Fc8%2FMiles_Dyson_nearly_killed.jpg)
I would say no. I think most people would. And yet from a strictly mathematical analysis, it's a no-brainer to the contrary. Murdering 1 man to save the lives of 3 billion is a net gain of 2,999,999,999 lives, to say nothing of saving human civilization.
The fact that the numbers alone can't decisively tell us what is morally right or wrong would suggest that morality is a subjective matter.
(April 22, 2014 at 4:29 pm)alpha male Wrote: Personally I think morality is subjective. When theists speak of having an objective morality, it's only objective from a human point of view. They're generally not arguing that there's some objective code of morality which is independent of and superior to god, which is what a true objective morality would require.
We may be working with different definitions of "objective". It seems to me that something either is or isn't objective. Objective must mean free of ANY opinions, views or values, no matter who or what is making those evaluations.
EDIT to add picture.
Atheist Forums Hall of Shame:
"The trinity can be equated to having your cake and eating it too."
... -Lucent, trying to defend the Trinity concept
"(Yahweh's) actions are good because (Yahweh) is the ultimate standard of goodness. That’s not begging the question"
... -Statler Waldorf, Christian apologist