While exploring the website of the Singularity Institute for Artificial Intelligence, I was guided to an essay by Eliezer Yudkowsky entitled "Cognitive biases potentially affecting judgment of global risks" [PDF document].
I experienced several "A-ha!" moments while reading the essay, all of which resonated with my inherent skepticism of supposed expertise. And at the risk of sounding as though I am ignoring Yudkowsky's warnings, I knew it all along.
"The systematic experimental study of reproducible errors of human reasoning . . . is known as the heuristics and biases program in cognitive psychology. . . .
"[ . . . ]
"The . . . program has uncovered results that may startle and dismay the unaccustomed scholar. . . .
"[ . . . ]
" . . . [B]y making you a more sophisticated arguer . . . I have actually harmed you; I have made you slower to react to evidence. I have given you another opportunity to fail each time you face the challenge of changing your mind. . . .
" . . . Awareness of human fallibility is dangerous knowledge if you only remind yourself of the fallibility of those who disagree with you. If I am selective about which arguments I inspect for errors, or even how hard I inspect for errors, then every new rule of rationality I learn, every new logical flaw I know to detect, makes me that much stupider."
Yudkowsky narrates a number of scenarios in which experts convince themselves that they know more than they actually do, particularly through overconfidence in their estimates, and he points out that he has barely scratched the surface of the astonishing number of ways that so-called experts can be wrong. He then goes on to admit his own arrogance in estimates he once made regarding the development of Artificial Intelligence by 2025, with a peak in 2018.
"Why did I ever think I could generate a tight probability distribution over a problem like that? Where did I ever get those numbers in the first place?"
I am reminded of a joke once told to me: that 90 percent of all statistics are made up on the spot. In the spirit of this joke, I am inclined to dismiss virtually all statistics as irrational appeals to authority. At the same time, I am inclined to interpret Yudkowsky's predictions regarding the development of Artificial Intelligence as harmless wishful thinking. Of course, making this sort of excuse for an expert I like is one of the cognitive biases that Yudkowsky warns against, so I guess I'm part of what I have always referred to as the "ongoing worldwide conspiracy of ignorance and incompetence."
Labels: artificial intelligence, heuristics