(April 5, 2022 at 2:33 pm)brewer Wrote: The video is confusing hard science with soft science. Hard science could easily tell which thermometers are inaccurate. Soft science (of which psychology is one, google soft science) can't easily make that distinction, and it is more dependent on fluctuating societal norms than on actual repeatable hard science.
Repeatable results in psychology are more nebulous than in other scientific disciplines.
https://en.wikipedia.org/wiki/Hard_and_soft_science
Publication bias exists in both hard and soft science. Eventually they both work it out or self-correct.
I would say that replicability is equally important for the hard and soft sciences. If it ain't replicable, it ain't science... period. True, replication is a more nebulous thing in the soft sciences, but I would argue that it is a key feature of every science.
Even if publication bias "self-corrects" eventually, I still think it deserves our attention. It isn't just one result or one single fact that gets corrected. Researchers depend on scientific papers being accurate. One single erroneous result could impact dozens of studies because (for better or worse) researchers in psychology assume the results in publications are sound.
And that's the thesis of the video. Published results ought to be more reliable than they are. We could vastly improve on an 85% replicability rate (and the real rate could well be lower than 85%). So I think addressing publication bias in a more immediate way is preferable to waiting for things to work themselves out eventually.