(December 24, 2019 at 12:50 pm)possibletarian Wrote: It's always been humanities, religion just being one aspect of it.
Well, for a long time "the humanities" was a term used in contrast to religious studies. The field drew on pagan philosophers and others as a way of countering or finding alternatives to theological views.
But I'm sure you're right that in more recent times it's come to be used in contrast with the "hard sciences." So maybe the term has shifted enough that it now includes its old rival.
Anyway, it's my unprovable opinion that we'd be better off if the humanities had always been the field in which we debate and advocate different visions of the human world. What we often get instead -- especially in corporate-produced entertainment -- is propaganda reproducing the values of the status-quo military-industrial complex.
Marvel movies advocate a narrow range of values and suppress the idea that there are alternatives. A good novel (Proust, Stendhal, Dostoevsky) enriches our experience of ourselves and others, increasing our sympathy, making it harder for us to keep our favorite clichés, and challenging our prejudices.