I'm not comfortable with the question. Usually, when we talk about trusting intuitions, it is in reference to whether and to what extent they conform to objective reality. When I analyze my intuition about whether someone is trustworthy based on very limited information, I am asking whether they are actually trustworthy and whether, given more information and opportunity for analysis, I would reach the same conclusion. That question presupposes an objectively "right" answer. And my sense is that our intuitions are better at approximating right answers in rough proportion to how often our ancestors encountered similar situations throughout our evolutionary history and how important developing those intuitions was to survival. "This doesn't taste right" is more highly correlated with "this will harm me" than "the volume of that lake seems like 100K liters" is correlated with the actual volume.
But before you can justify such a question about morality you must assume there is a right answer and therefore an objective morality that would be independent of what any sentient creature thought. And this is something that has never been demonstrated. I tend to agree that morality is subjective and dependent on the existence of sentient beings. What would morality even mean in a lifeless universe?