Results 1 - 10 of 10
1.
Pers Soc Psychol Rev ; : 10888683241251520, 2024 Jun 07.
Article in English | MEDLINE | ID: mdl-38847444

ABSTRACT

ACADEMIC ABSTRACT: Prominent theories of belief and metacognition make different predictions about how people evaluate their biased beliefs. These predictions reflect different assumptions about (a) people's conscious belief regulation goals and (b) the mechanisms and constraints underlying belief change. I argue that people exhibit heterogeneity in how they evaluate their biased beliefs. Sometimes people are blind to their biases, sometimes people acknowledge and condone them, and sometimes people resent them. The observation that people adopt a variety of "metacognitive positions" toward their beliefs provides insight into people's belief regulation goals, as well as into the ways that belief formation is both free and constrained. The way that people relate to their beliefs illuminates why they hold those beliefs. Identifying how someone thinks about their belief is useful for changing their mind. PUBLIC ABSTRACT: The same belief can be alternatively thought of as rational, careful, unfortunate, or an act of faith. These beliefs about one's beliefs are called "metacognitive positions." I review evidence that people hold at least four different metacognitive positions. For each position, I discuss what kinds of cognitive processes generated the belief and what role people's values and preferences played in belief formation. We can learn a lot about someone's belief based on how they relate to that belief. Learning how someone relates to their belief is useful for identifying the best ways to try to change their mind.

2.
Curr Opin Psychol ; 55: 101727, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38035657

ABSTRACT

People vary from one another, and across contexts, in how important it is to them to think in logical, impartial, and evidence-based ways. Recent studies demonstrate that this variation in people's personal standards for thinking predicts the nature and quality of their beliefs. Strong commitments to epistemic virtues motivate careful thinking and protect people from suspicious claims. At the same time, people are more likely to knowingly hold biased or evidentially unsupported beliefs when they think that they are justified to think in biased or evidentially poor ways. People's personal standards for reasoning likely play an important role in shaping how suspect or unreasonable information is received.


Subjects
Problem Solving, Thinking, Humans, Bias
3.
J Exp Psychol Gen ; 153(3): 837-863, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38386386

ABSTRACT

To make sense of the social world, people reason about others' mental states, including whether and in what ways others can form new mental states. We propose that people's judgments concerning the dynamics of mental state change invoke a "naive theory of reasoning." On this theory, people conceptualize reasoning as a rational, semi-autonomous process that individuals can leverage, but not override, to form new rational mental states. Across six experiments, we show that this account of people's naive theory of reasoning predicts judgments about others' ability to form rational and irrational beliefs, desires, and intentions, as well as others' ability to act rationally and irrationally. This account predicts when, and explains why, people judge others as psychologically constrained by coercion and other forms of situational pressure. (PsycInfo Database Record (c) 2024 APA, all rights reserved).


Subjects
Judgment, Problem Solving, Humans, Coercion, Intention
4.
Cognition ; 254: 105958, 2024 Oct 02.
Article in English | MEDLINE | ID: mdl-39362054

ABSTRACT

How do ordinary people evaluate robots that make morally significant decisions? Previous work has found both equal and differing evaluations, with differences going in either direction. In 13 studies (N = 7670), we asked people to evaluate humans and robots that make decisions in norm conflicts (variants of the classic trolley dilemma). We examined several conditions that may influence whether moral evaluations of human and robot agents are the same or different: the type of moral judgment (norms vs. blame); the structure of the dilemma (side effect vs. means-end); salience of particular information (victim, outcome); culture (Japan vs. US); and encouraged empathy. Norms for humans and robots are broadly similar, but blame judgments show a robust asymmetry under one condition: humans are blamed less than robots specifically for inaction decisions, that is, refraining from sacrificing one person for the good of many. This asymmetry may emerge because people appreciate that the human faces an impossible decision and deserves mitigated blame for inaction; when evaluating a robot, such appreciation appears to be lacking. However, our evidence for this explanation is mixed. We discuss alternative explanations and offer methodological guidance for future work on people's moral judgments of robots and humans.

5.
Cognition ; 234: 105379, 2023 05.
Article in English | MEDLINE | ID: mdl-36791606

ABSTRACT

People often engage in biased reasoning, favoring some beliefs over others even when the result is a departure from impartial or evidence-based reasoning. Psychologists have long assumed that people are unaware of these biases and operate under an "illusion of objectivity." We identify an important domain of life in which people harbor little illusion about their biases: when they are biased for moral reasons. For instance, people endorse and feel justified believing morally desirable propositions even when they think they lack evidence for them (Study 1a/1b). Moreover, when people engage in morally desirable motivated reasoning, they recognize the influence of moral biases on their judgment, but nevertheless evaluate their reasoning as ideal (Studies 2-4). These findings overturn longstanding assumptions about motivated reasoning and identify a boundary condition on Naïve Realism and the Bias Blind Spot. People's tendency to be aware and proud of their biases provides both new opportunities, and new challenges, for resolving ideological conflict and improving reasoning.


Assuntos
Ilusões , Humanos , Resolução de Problemas , Julgamento , Princípios Morais , Emoções
6.
Cognition ; 209: 104513, 2021 04.
Article in English | MEDLINE | ID: mdl-33478742

ABSTRACT

When faced with a dilemma between believing what is supported by an impartial assessment of the evidence (e.g., that one's friend is guilty of a crime) and believing what would better fulfill a moral obligation (e.g., that the friend is innocent), people often believe in line with the latter. But is this how people think beliefs ought to be formed? We addressed this question across three studies and found that, across a diverse set of everyday situations, people treat moral considerations as legitimate grounds for believing propositions that are unsupported by objective, evidence-based reasoning. We further document two ways in which moral considerations affect how people evaluate others' beliefs. First, the moral value of a belief affects the evidential threshold required to believe, such that morally beneficial beliefs demand less evidence than morally risky beliefs. Second, people sometimes treat the moral value of a belief as an independent justification for belief, and on that basis, sometimes prescribe evidentially poor beliefs to others. Together these results show that, in the folk ethics of belief, morality can justify and demand motivated reasoning.


Subjects
Morals, Problem Solving, Humans, Moral Obligations
7.
Trends Cogn Sci ; 25(11): 937-949, 2021 11.
Article in English | MEDLINE | ID: mdl-34281766

ABSTRACT

Scientific reasoning is characterized by commitments to evidence and objectivity. New research suggests that under some conditions, people are prone to reject these commitments, and instead sanction motivated reasoning and bias. Moreover, people's tendency to devalue scientific reasoning likely explains the emergence and persistence of many biased beliefs. However, recent work in epistemology has identified ways in which bias might be legitimately incorporated into belief formation. Researchers can leverage these insights to evaluate when commonsense affirmation of bias is justified and when it is unjustified and therefore a good target for intervention. Making reasoning more scientific may require more than merely teaching people what constitutes scientific reasoning; it may require affirming the value of such reasoning in the first place.


Assuntos
Resolução de Problemas , Viés , Humanos
8.
J Pers Soc Psychol ; 119(5): 999-1029, 2020 Nov.
Article in English | MEDLINE | ID: mdl-32437185

ABSTRACT

People think other individuals have considerable control over what they believe. However, no work to date has investigated how people judge their own belief control, nor whether such judgments diverge from their judgments of others. We addressed this gap in 7 studies and found that people judge others to be more able to voluntarily change what they believe than they themselves are. This occurs when people judge others who disagree with them (Study 1) as well as others who agree with them (Studies 2-5, 7), and it occurs when people judge strangers (Studies 1, 2, 4, and 5) as well as close others (Studies 3 and 7). It appears not to be explained by impression management or self-enhancement motives (Study 3). Rather, there is a discrepancy between the evidentiary constraints on belief change that people access via introspection, and their default assumptions about the ease of voluntary belief revision. That is, people tend spontaneously to think about the evidence that supports their beliefs, which leads them to judge their beliefs as outside their control. But they apparently fail to generalize this sense of constraint to others, and similarly fail to incorporate it into their generic model of beliefs (Studies 4-7). We discuss the implications of our findings for theories of ideology-based conflict, actor-observer biases, naïve realism, and ongoing debates regarding people's actual capacity to voluntarily change what they believe. (PsycInfo Database Record (c) 2020 APA, all rights reserved).


Subjects
Attitude, Judgment, Social Perception, Volition, Adult, Humans
9.
Perspect Psychol Sci ; 15(2): 250-272, 2020 03.
Article in English | MEDLINE | ID: mdl-31877108

ABSTRACT

Genetically modified foods (GMFs) have met with strong opposition for most of their existence. According to one account, the consequence-based perspective (CP), lay people oppose GMFs because they deem them unsafe as well as of dubious value. The CP is backed by the data and offers a clear solution for easing GMF opposition. However, several scholars have claimed that the CP is faulty, that lay opposition derives from largely nonrational factors and is consequence blind. One recent statement of this, the moral-absolutism perspective (MAP), contends that GMFs' opponents are principled "moral absolutists" who think that GMFs should be banned no matter their value or risk. Herein we critically weigh key arguments for this proposal. We also present five new studies that probed the clearest data that seem to favor the MAP: opponents affirming the statement that GMFs should be "prohibited," no matter their value or risk. These studies jointly show that (a) most presumed absolutists do not understand the key question and/or (b) cannot validly answer it. We show that taking due steps in clarifying the question and screening for those participants who cannot validly answer it cuts down absolutism to near zero. Finally, we demonstrate that helping GMFs' opponents imagine a world wherein GMFs are safe and constructive makes the majority willing to welcome GMFs in this context.


Subjects
Genetically Modified Foods, Health Knowledge, Attitudes, Practice, Morals, Thinking, Adult, Female, Genetically Modified Foods/adverse effects, Humans, Male
10.
J Exp Psychol Gen ; 148(10): 1701-1732, 2019 Oct.
Article in English | MEDLINE | ID: mdl-30730195

ABSTRACT

Prominent accounts of folk theory of mind posit that people judge others' mental states to be uncontrollable, unintentional, or otherwise involuntary. Yet, this claim has little empirical support: few studies have investigated lay judgments about mental state control, and those that have done so yield conflicting conclusions. We address this shortcoming across six studies, which show that, in fact, lay people attribute to others a high degree of intentional control over their mental states, including their emotions, desires, beliefs, and evaluative attitudes. For prototypical mental states, people's judgments of control systematically varied by mental state category (e.g., emotions were seen as less controllable than desires, which in turn were seen as less controllable than beliefs and evaluative attitudes). However, these differences were attenuated, sometimes completely, when the content of and context for each mental state were tightly controlled. Finally, judgments of control over mental states correlated positively with judgments of responsibility and blame for them, and to a lesser extent, with judgments that the mental state reveals the agent's character. These findings replicated across multiple populations and methods, and generalized to people's real-world experiences. The present results challenge the view that people judge others' mental states as passive, involuntary, or unintentional, and suggest that mental state control judgments play a key role in other important areas of social judgment and decision making. (PsycINFO Database Record (c) 2019 APA, all rights reserved).


Assuntos
Cultura , Emoções , Julgamento , Autocontrole , Tomada de Decisões , Feminino , Humanos , Masculino , Comportamento Social , Adulto Jovem