(Forthcoming in Psychology Today)

THE TL;DR KEY POINTS
Highly rational people have seven habits that can appear irrational:

1. They are confident in things despite “no good evidence” for them
2. They are confident in things which are outright false
3. They countenance the “impossible” and are “paranoid”
4. They avoid risks that don’t happen
5. They pursue opportunities that fail
6. They are often irrational
7. They do things that are often “crazy” or “unconventional”
2. Learn norms of reasoning
3. Think in terms of expected utility theory (see the sketch below)
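To make the expected-utility idea concrete, here is a minimal sketch in Python (my own illustration, not from the original piece; the options, probabilities, and payoffs are hypothetical) of how a decision can maximize expected utility and still fail most of the time:

```python
# A minimal sketch of expected utility reasoning.
# All numbers below are hypothetical.

def expected_utility(outcomes):
    """Sum of probability * utility over all possible outcomes."""
    return sum(p * u for p, u in outcomes)

# Option A: a risky venture that usually fails but pays off big.
venture = [(0.40, 100.0),   # 40% chance of a large gain
           (0.60, -10.0)]   # 60% chance of a small loss

# Option B: doing nothing, a guaranteed zero.
do_nothing = [(1.0, 0.0)]

print(expected_utility(venture))     # 34.0
print(expected_utility(do_nothing))  # 0.0

# The venture has the higher expected utility, so it is the rational
# choice -- even though, 60% of the time, it fails and looks
# "irrational" in hindsight (habit 5: pursuing opportunities that fail).
```

The point of the sketch is habit 4 and habit 5 in one picture: a choice is evaluated by its probability-weighted payoffs before the outcome is known, not by whether the risk happened or the opportunity paid off.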
THE IMPORTANCE OF RECOGNIZING WHAT'S RATIONAL AND WHAT'S NOT

If someone were as rational as could be, with many accurate and trustworthy judgments about the world and with sound decisions, would we recognize it? There are reasons to think the answer is “No”. In this piece, I aim to challenge prevailing intuitions about rationality: I will argue that the philosophy and science of judgment and decision-making reveal a number of ways in which what appears to be rational diverges from what actually is rational.

This piece takes its title from Stephen Covey’s well-known book “The 7 Habits of Highly Effective People”. I will argue that, similarly, there are seven habits of highly rational people, but these habits can appear so counter-intuitive that others label them as “irrational”. Of course, the rationality of these habits might be obvious to specialists in judgment and decision-making, but I find they are often not so obvious to the general readers for whom this piece is written. In any case, not only are these habits potentially interesting in their own right, but recognizing them may also help to open our minds, to better understand the nature of rationality, and to better identify the judgments and decisions we should trust, or not trust, in our own lives.

Without further ado, then, I present…

THE SEVEN "IRRATIONAL" HABITS OF HIGHLY RATIONAL PEOPLE

1. Highly rational people are confident in things despite “no good evidence” for them

The first habit of highly rational people is that they are sometimes confident in things when others think there is “no good evidence” for them.
(Forthcoming in Psychology Today)

THE TL;DR KEY POINTS
THE IMPORTANCE OF RECOGNIZING GOOD EVIDENCE

We all need to form accurate judgments about the world in many diverse and important contexts. What is the correct diagnosis for someone’s medical condition? Does someone have a crush on you? Did the defendant kill the victim? Here, I will discuss how the evidence can reveal the truth about these questions, and potentially about others you might care about, but only if we think in the right ways.

It’s well-documented that various biases can hinder us in our quest for truth. In a recently published paper in Judgment and Decision Making (freely available here), I introduce a new cognitive bias: likelihood neglect bias. Understanding this bias, and how to overcome it, can help us recognize good evidence and find the truth in numerous cases where others might not. To show this, I’ll use a well-known brain-teaser which reveals this bias, the Monty Hall problem, and then I’ll apply the emerging ideas to show how we can find the truth in other realistic cases, including medicine, law and more mundane topics. You might then want to apply these ideas to other cases you care about.
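As a quick check on the Monty Hall intuition, here is a minimal simulation sketch in Python (my own illustration; the paper’s actual analysis is in terms of likelihoods, and this code simply verifies the well-known answer by brute force):

```python
import random

def play(switch, n_doors=3):
    """One round of Monty Hall; returns True if the player wins the car."""
    car = random.randrange(n_doors)
    pick = random.randrange(n_doors)
    # Monty opens a door that is neither the player's pick nor the car.
    opened = random.choice([d for d in range(n_doors)
                            if d != pick and d != car])
    if switch:
        # Switch to the remaining unopened door.
        pick = next(d for d in range(n_doors) if d != pick and d != opened)
    return pick == car

trials = 100_000
for switch in (False, True):
    wins = sum(play(switch) for _ in range(trials))
    print(f"switch={switch}: win rate {wins / trials:.3f}")

# Staying wins about 1/3 of the time; switching wins about 2/3.
```

Switching wins because Monty’s choice of door is itself evidence: he is certain to open that particular door when the car is behind the other unopened one, but only opens it half the time when your first pick was right. Neglecting how likely the evidence is under each hypothesis is, roughly, the bias at issue.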
(Forthcoming in Psychology Today)

THE TL;DR KEY POINTS

THE IMPORTANCE OF TRUSTWORTHY JUDGMENTS
We all make judgments of probability and depend on them for our decision-making. However, it is not always obvious which judgments to trust, especially since a range of studies suggest these judgments can sometimes be more inaccurate than we might hope or expect. For example, scholars have argued that at least 4% of death sentence convictions in the US are false convictions, that tens or even hundreds of thousands of Americans die of misdiagnoses each year, and that experts can sometimes be 100% sure of predictions which turn out to be false 19% of the time. So we want trustworthy judgments, or else bad outcomes can occur.

How, then, can we determine which judgments to trust, whether from ourselves or from others? In a paper recently published here and freely available here, I argue for an answer called “inclusive calibrationism”, or just “calibrationism” for short. Calibrationism says trustworthiness requires two ingredients: calibration and inclusivity.
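To give a feel for what calibration means in practice, here is a minimal sketch in Python (my own illustration; the function name and the toy data are hypothetical, and the paper should be consulted for the actual account) of how one might check whether probability judgments are calibrated, i.e. whether claims made with X% confidence turn out true about X% of the time:

```python
from collections import defaultdict

def calibration_table(judgments):
    """Group (stated_confidence, was_true) pairs into confidence bins
    and compare stated confidence with the observed truth rate."""
    bins = defaultdict(list)
    for confidence, was_true in judgments:
        bins[round(confidence, 1)].append(was_true)  # bin to nearest 10%
    for confidence in sorted(bins):
        outcomes = bins[confidence]
        observed = sum(outcomes) / len(outcomes)
        print(f"stated {confidence:.0%} -> true {observed:.0%} "
              f"({len(outcomes)} judgments)")

# Hypothetical forecaster: calibrated at 60%, overconfident at 90%.
judgments = ([(0.6, True)] * 6 + [(0.6, False)] * 4
             + [(0.9, True)] * 7 + [(0.9, False)] * 3)
calibration_table(judgments)
# stated 60% -> true 60% (10 judgments)   <- calibrated here
# stated 90% -> true 70% (10 judgments)   <- overconfident here
```

An expert who is “100% sure” of predictions that fail 19% of the time would show up in such a table as a miscalibration in the top bin; that is the kind of mismatch calibrationism tells us to watch for before trusting a judgment.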