Results 1 - 20 of 166
1.
Postgrad Med J ; 2024 Jul 15.
Article in English | MEDLINE | ID: mdl-39005056

ABSTRACT

Clinical reasoning is a crucial skill and a defining characteristic of the medical profession, encompassing the intricate cognitive and decision-making processes needed to solve real-world clinical problems. However, much of our current competency-based medical education has focused on imparting swathes of content knowledge and skills to medical trainees, without adequate emphasis on strengthening the cognitive schema and psychological processes that govern actual decision-making in clinical environments. Flawed clinical reasoning has serious repercussions for patient care: it is associated with diagnostic errors, inappropriate investigations, and incongruent or suboptimal management plans that can result in significant morbidity and even mortality. In this article, we discuss the psychological constructs of clinical reasoning in the form of cognitive 'thought processing' models and real-world contextual or emotional influences on clinical decision-making. In addition, we propose practical strategies that can be adopted in medical training, including pedagogical development of a personal cognitive schema, mitigating strategies to combat cognitive bias and flawed reasoning, and emotional-regulation and self-care techniques, to optimize physicians' clinical reasoning in real-world practice so that learnt knowledge and skill sets translate into good decisions and outcomes.

3.
Cognition ; 250: 105837, 2024 Sep.
Article in English | MEDLINE | ID: mdl-38878520

ABSTRACT

Would you take a gamble with a 10% chance to gain $100 and a 90% chance to lose $10? Even though this gamble has a positive expected value, most people would avoid taking it given the high chance of losing money. Popular "fast-and-slow" dual-process theories of risky decision making assume that to take expected value into account and avoid a loss-aversion bias, people need to deliberate. In this paper we directly test whether reasoners can also consider expected value intuitively, in the absence of deliberation. To do so, we presented participants with bets and lotteries in which they could choose between a risky expected-value-based option and a safe loss-averse option. We used a two-response paradigm in which participants made two choices in every trial: an initial intuitive choice under time pressure and cognitive load, and a final choice without constraints, where they could freely deliberate. Results showed that in most trials participants were loss averse, both at the intuitive and the deliberate stage. However, when people opted for the expected-value-based choice after deliberating, they had predominantly already arrived at this choice intuitively. Additionally, loss-averse participants often showed an intuitive sensitivity to expected value (as reflected in decreased confidence). Overall, these results suggest that deliberation is not the primary route for expected-value-based responding in risky decision making. Risky decisions may be better conceptualized as an interplay between different types of "fast" intuitions rather than between two different types of "fast" and "slow" thinking per se.
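The arithmetic behind the opening gamble can be made explicit. The sketch below is illustrative only (the function name and framing are ours, not the paper's); it computes the expected value that the safe, loss-averse choice forgoes.

```python
# Expected value of the gamble described in the abstract:
# a 10% chance to gain $100 and a 90% chance to lose $10.
def expected_value(outcomes):
    """Sum of probability-weighted payoffs over (probability, payoff) pairs."""
    return sum(p * x for p, x in outcomes)

gamble = [(0.10, 100), (0.90, -10)]
ev = expected_value(gamble)
print(ev)  # positive EV (about $1 per play), yet most people decline the bet
```

Despite the positive expected value, the 90% chance of a loss dominates most people's intuitive choice — which is exactly the loss-aversion pattern the two-response paradigm probes.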


Subject(s)
Choice Behavior , Decision Making , Intuition , Risk-Taking , Humans , Intuition/physiology , Male , Female , Adult , Young Adult , Choice Behavior/physiology , Decision Making/physiology , Gambling , Thinking/physiology
4.
Psych J ; 2024 Apr 15.
Article in English | MEDLINE | ID: mdl-38618751

ABSTRACT

Rule learning is an important ability that enables human beings to adapt to nature and develop civilizations. There have been many discussions on the mechanism and characteristics of algebraic rule learning, but there are still controversies due to the lack of theoretical guidance. Based on the dual-process theory, this study discussed the following arguments for algebraic rule learning across human and animal studies: whether algebraic rule learning is simply Type 1 processing, whether algebraic rule learning is a domain-general ability, whether algebraic rule learning is shared by humans and animals, and whether an algebraic rule is learned consciously. Moreover, we propose that algebraic rule learning is possibly a cognitive process that combines both Type 1 and Type 2 processing. Further exploration is required to establish the essence and neural basis of algebraic rule learning.

5.
J Intell ; 12(4)2024 Apr 03.
Article in English | MEDLINE | ID: mdl-38667709

ABSTRACT

We tested predictions deriving from the "Pleasure-Interest Model of Aesthetic Liking" (PIA Model), whereby aesthetic preferences arise from two fluency-based processes: an initial automatic, percept-driven default process and a subsequent perceiver-driven reflective process. One key trigger for reflective processing is stimulus complexity. Moreover, if meaning can be derived from such complexity, then this can engender increased interest and elevated liking. Experiment 1 involved graffiti street-art images, pre-normed to elicit low, moderate and high levels of interest. Subjective reports indicated a predicted enhancement in liking across increasing interest levels. Electroencephalography (EEG) recordings during image viewing revealed different patterns of alpha power in temporal brain regions across interest levels. Experiment 2 enforced a brief initial image-viewing stage and a subsequent reflective image-viewing stage. Differences in alpha power arose in most EEG channels between the initial and deliberative viewing stages. A linear increase in aesthetic liking was again seen across interest levels, with different patterns of alpha activity in temporal and occipital regions across these levels. Overall, the phenomenological data support the PIA Model, while the physiological data suggest that enhanced aesthetic liking might be associated with "flow-feelings" indexed by alpha activity in brain regions linked to visual attention and reducing distraction.

6.
Int J Psychophysiol ; 199: 112340, 2024 May.
Article in English | MEDLINE | ID: mdl-38574820

ABSTRACT

Sokolov described both phasic and tonic aspects of the Orienting Reflex (OR), but subsequent research and theory development have focussed primarily on the phasic OR at the expense of the tonic OR. The present study used prestimulus skin conductance level (SCL) during a dishabituation paradigm to model the tonic OR, examining its amplitude patterning over repeated standard stimulus presentations and a change stimulus. We expected sensitisation (increased amplitude) following the initial and change trials, and habituation (decrement) over the intervening trials. Prestimulus EEG alpha level was explored as a potential central measure of the tonic OR (as an inverse correlate), examining its pattern over stimulus repetition and change in relation to the SCL model. We presented a habituation series of innocuous auditory stimuli to two groups (each N = 20) at different ISIs (Long: 13-15 s; Short: 5-7 s) and recorded electrodermal and EEG data during two counterbalanced conditions: Indifferent (no task requirements) and Significant (silent counting). Across groups and conditions, prestimulus SCLs and alpha amplitudes generally showed the expected patterns across trials, confirming our main hypotheses. The findings have important implications for including the assessment of Sokolov's tonic OR in modelling the central and autonomic nervous system interactions underlying fundamental attention and learning processes.


Subject(s)
Galvanic Skin Response , Habituation, Psychophysiologic , Humans , Habituation, Psychophysiologic/physiology , Orientation/physiology , Reflex/physiology , Attention/physiology , Acoustic Stimulation
7.
Traffic Inj Prev ; 25(5): 733-740, 2024.
Article in English | MEDLINE | ID: mdl-38629829

ABSTRACT

OBJECTIVE: Jaywalking is an important cause of pedestrian-related automobile accidents. Exploring the factors that influence jaywalking behavior and suggesting appropriate improvement measures are critical for reducing automobile accidents involving pedestrians. METHODS: This study divided traffic situations into high-risk and low-risk situations. Each situation contained three visual attention cues: vehicle, traffic light, and group behavior. On this basis, the role of visual cues in guiding pedestrians' attention and influencing their decisions during jaywalking was examined. Sixty participants, with an average age of 19, were recruited. They were shown 84 crosswalk videos in random order while their crossing decisions and eye-movement data were recorded. RESULTS: In low-risk situations, pedestrians paid more attention to group behavioral cues when making jaywalking decisions. The rate of jaywalking increased with the number of other jaywalking pedestrians. In high-risk situations, pedestrians' total fixation duration on vehicle hazard cues was longer when making jaywalking decisions, and the jaywalking rate decreased. CONCLUSIONS: The results indicate that in low-risk situations pedestrians' jaywalking decisions were based on other pedestrians' illegal crossing cues and automatic associative processes: the more people crossing the street illegally, the more pedestrians joined them. In high-risk situations, pedestrians paid more attention to vehicle hazard cues before making jaywalking decisions, and fewer illegal crossings occurred; these decisions were based on risk assessment, a controlled analytical process. The results verify the effect of visual cues on pedestrians' attentional guidance and decision-making in different traffic situations, as well as the effectiveness of visual attention in predicting decision intention. The findings provide a theoretical basis and data reference for pedestrian safety education and for constructing an intelligent-driving pedestrian trajectory prediction model.


Subject(s)
Accidents, Traffic , Attention , Cues , Decision Making , Pedestrians , Walking , Humans , Pedestrians/psychology , Male , Female , Young Adult , Accidents, Traffic/prevention & control , Walking/psychology , Adolescent , Eye Movements , Adult , Universities , Students/psychology
9.
Front Neurosci ; 18: 1393595, 2024.
Article in English | MEDLINE | ID: mdl-38655110

ABSTRACT

[This corrects the article DOI: 10.3389/fnins.2022.996957.].

10.
Behav Sci (Basel) ; 14(2)2024 Feb 19.
Article in English | MEDLINE | ID: mdl-38392499

ABSTRACT

To describe something in terms of its purpose or function is to describe its teleology. Previous studies have found that teleological beliefs are positively related to anthropomorphism, and that anthropomorphism decreases the perceived unpredictability of non-human agents. In the current study, we explore these relationships using the highly salient example of beliefs about the coronavirus pandemic. Results showed that both anthropomorphism and teleology were negatively associated with perceived uncertainty and threat, and positively associated with self-reported behavioural change in response to the pandemic. These findings suggest that highly anthropomorphic and teleological individuals may view coronavirus as agentive and goal-directed. While anthropomorphic and teleological beliefs may facilitate behavioural change in response to the pandemic, we also found that the associated reduction in uncertainty and threat may be detrimental to behavioural change. We discuss the implications of these findings for messaging about global events more broadly.

11.
Basic Clin Neurosci ; 14(4): 529-542, 2023.
Article in English | MEDLINE | ID: mdl-38050565

ABSTRACT

Introduction: The frontoparietal network (FPN) and cingulo-opercular network (CON) control the cognitive functions needed in deductive and inductive reasoning via different functional frameworks. The FPN is a fast, intuitive system, while the CON is slow and analytical. The default-interventionist model presents a serial view of the interaction between intuitive and analytic cognitive systems. This study aims to examine the activity patterns of the FPN and CON from the perspective of the default-interventionist model during reasoning. Methods: We employed functional magnetic resonance imaging (fMRI) to investigate cingulo-opercular and frontoparietal network activities in 24 healthy university students during Raven and Wason reasoning tasks. Because the CON and FPN operate on different time scales, reaction time was assessed as a behavioral factor. Results: During Raven's Advanced Progressive Matrices (RAPM) test, both the CON and FPN were activated, and response time increased linearly with the difficulty level of the Raven test. In contrast, during the Wason selection task (WST), only FPN activity was observed. Conclusion: The results support the hypothesis that the default-interventionist model of dual-process theory accurately explains the cognitive mechanisms involved in reasoning. Thus, the response method (intuitive/analytical) determines which cognitive skills and brain regions are involved in responding. Highlights: The cingulo-opercular and frontoparietal networks control cognitive functions and processes. The frontoparietal network is a fast, intuitive system that uses short-duration attention, compatible with Type 1 processing. In contrast, the cingulo-opercular network is an analytical, time-consuming system that uses attention and working memory over a longer period, compatible with Type 2 processing. The default-interventionist model of dual-process theory states that our behavior is controlled by Type 1 processing unless we are confronted with novel and complex problems with which we have no prior experience. Plain Language Summary: The present study examined the activity of two task-based brain networks during different types of reasoning tasks. The frontoparietal and cingulo-opercular networks are the two task-based brain networks responsible for cognitive control. These networks direct the use of the cognitive skills and executive functions needed to perform cognitive tasks, especially higher-order tasks such as reasoning. Since inductive and deductive reasoning tasks require bottom-up and top-down cognitive control, respectively, different cognitive skills are needed, which affects the activity of the frontoparietal and cingulo-opercular networks. Our results showed that during the inductive reasoning task (assessed with the Raven test), both the frontoparietal and cingulo-opercular networks were activated, whereas during the deductive reasoning task (assessed with the Wason selection card test), only the frontoparietal network was activated. In the deductive reasoning task there appears to be a higher probability of errors, leading to fewer correct responses. Based on our results, subjects did not pay enough attention to details and so failed to update information, which led to erroneous responses. The inactivity of the cingulo-opercular network during the deductive reasoning task clearly showed that bottom-up cognitive control did not occur successfully; as a result, information processing did not proceed properly.

12.
Front Psychol ; 14: 1195668, 2023.
Article in English | MEDLINE | ID: mdl-37809292

ABSTRACT

False memory formation is usually studied using the Deese-Roediger-McDermott (DRM) paradigm, in which individuals incorrectly remember words that were not originally presented. In this paper, we systematically investigated how two modes of thinking (analytical vs. intuitive) can influence the tendency to create false memories. The increased propensity of intuitive thinkers to generate more false memories can be explained by one or both of the following hypotheses: a decrease in the inhibition of the lure words that come to mind, or an increased reliance on the familiarity heuristic to determine whether a word has been previously studied. In two studies, we conducted tests of both recognition and recall using the DRM paradigm. Our observations indicate that a decrease in inhibitory efficiency plays a larger role in false memory formation than reliance on the familiarity heuristic.

13.
PeerJ ; 11: e15751, 2023.
Article in English | MEDLINE | ID: mdl-37529214

ABSTRACT

Introduction: The fast, intuitive, and autonomous System 1, together with the slow, analytical, and more logical System 2, constitute the dual-system processing model of decision making. Whether acting independently or influencing each other, both systems would, to an extent, rely on randomness in order to reach a decision. The role of randomness, however, would be more pronounced when arbitrary choices need to be made, typically engaging System 1. The present exploratory study aims to capture the expression of a possible innate randomness mechanism, as proposed by the authors, by trying to isolate System 1 and examine arbitrary decision making in autistic participants with high-functioning Autism Spectrum Disorder (ASD). Methods: Autistic participants with high-functioning ASD and an age- and gender-matched comparison group performed the random number generation task. The task was modified to limit the contribution of working memory and to allow any innate randomness mechanism expressed through System 1 to emerge. Results: Using a standard analysis approach, the random number sequences produced by autistic individuals and the comparison group did not differ in their randomness characteristics. No significant differences were identified when the sequences were examined using a moving-window approach. When machine learning was used, however, features of the random sequences could discriminate between the groups with relatively high accuracy. Conclusions: Our findings indicate the possibility that individual patterns during random sequence production could be consistent enough between groups to allow an accurate discrimination between the autistic and the comparison group. In order to draw firm conclusions about innate randomness and further validate our experiment, our findings need to be replicated in a larger sample.
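The abstract does not specify which sequence features were analyzed, so the sketch below only illustrates the kind of descriptive measure commonly computed over random number generation sequences; the two functions and the example sequence are our own assumptions, not the study's materials.

```python
# Two simple descriptive features of a random number generation (RNG)
# sequence: how often a digit is immediately repeated, and how often
# successive digits are neighbours (counting up or down by one).
def repetition_rate(seq):
    """Fraction of successive pairs with the same value."""
    pairs = list(zip(seq, seq[1:]))
    return sum(a == b for a, b in pairs) / len(pairs)

def adjacency_rate(seq):
    """Fraction of successive pairs that differ by exactly 1."""
    pairs = list(zip(seq, seq[1:]))
    return sum(abs(a - b) == 1 for a, b in pairs) / len(pairs)

seq = [3, 4, 4, 7, 1, 2, 2, 9, 8, 5]  # a toy 10-digit response sequence
print(repetition_rate(seq), adjacency_rate(seq))
```

Feature vectors built from measures like these (over many windows of a sequence) are the sort of input a machine-learning classifier could use to discriminate groups, as the study reports.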


Subject(s)
Autism Spectrum Disorder , Autistic Disorder , Humans , Autism Spectrum Disorder/diagnosis , Memory, Short-Term
14.
Cogn Res Princ Implic ; 8(1): 47, 2023 07 25.
Article in English | MEDLINE | ID: mdl-37488460

ABSTRACT

Over the course of training, physicians develop significant knowledge and expertise. We review dual-process theory, the dominant theory in explaining medical decision making: physicians use both heuristics from accumulated experience (System 1) and logical deduction (System 2). We then discuss how the accumulation of System 1 clinical experience can have both positive effects (e.g., quick and accurate pattern recognition) and negative ones (e.g., gaps and biases in knowledge from physicians' idiosyncratic clinical experience). These idiosyncrasies, biases, and knowledge gaps indicate a need for individuals to engage in appropriate training and study to keep these cognitive skills current lest they decline over time. Indeed, we review converging evidence that physicians further out from training tend to perform worse on tests of medical knowledge and provide poorer patient care. This may reflect a variety of factors, such as specialization of a physician's practice, but is likely to stem at least in part from cognitive factors. Acquired knowledge or skills may not always be readily accessible to physicians for a number of reasons, including an absence of study, cognitive changes with age, and the presence of other, similar knowledge or skills that compete in what is brought to mind. Lastly, we discuss the cognitive challenges of keeping up with standards of care that continuously evolve over time.


Subject(s)
Clinical Decision-Making , Physicians , Humans , Heuristics , Knowledge , Cognition
15.
J Behav Ther Exp Psychiatry ; 81: 101884, 2023 12.
Article in English | MEDLINE | ID: mdl-37307645

ABSTRACT

BACKGROUND AND OBJECTIVES: Cognitive theories assume an imbalance of intuitive and analytical reasoning in paranoid ideation. The argumentative theory of reasoning offers an account of the primary function of reasoning and its flaws, assuming social exchange to be the main purpose of reasoning. We applied this theory to delusion research and tested experimentally whether social exchange, in the form of producing and evaluating arguments, influences subsequent reflective reasoning. Additionally, we examined whether the social network and the frequency of/preference for discussions are associated with distorted reflective reasoning and paranoid ideation. METHODS: 327 participants completed the Social Network Index (SNI), the Paranoia Checklist (PCL), and the Cognitive Reflection Test-2 (CRT2). Moreover, preference for and frequency of discussions were assessed. In the discussion group (N = 165), participants produced arguments and evaluated counterarguments on two socially relevant topics. The control group (N = 162) watched a nature video instead. RESULTS: The discussion group showed less distorted reflective reasoning than the control group. Discussion preference and/or frequency was associated with the frequency and disturbance of paranoid thoughts, as well as with overall paranoid ideation. LIMITATIONS: Due to the online format, environmental factors could not be held constant, and no intrasubject comparison of the CRT2 could be computed in the trial. Furthermore, the sample mainly consisted of psychology students. CONCLUSION: The results contribute to the understanding of distorted reflective reasoning and provide preliminary evidence that the argumentative theory of reasoning may offer a promising perspective for delusion research.


Subject(s)
Paranoid Disorders , Problem Solving , Humans , Pilot Projects , Paranoid Disorders/psychology , Neuropsychological Tests
16.
Integr Psychol Behav Sci ; 57(4): 1273-1283, 2023 12.
Article in English | MEDLINE | ID: mdl-37101099

ABSTRACT

Metacognitive monitoring and control processes are important parts of our cognitive system. In this article, they are considered in the light of dual-process theory and interpreted as occurring at the level of Type 1 and Type 2 information processing. Associative connection is the main factor that allows us to divide these processes into two types. Accordingly, metacognitive monitoring of the first type occurs when feelings of rightness/error appear automatically along with a certain judgment; the second type occurs when a controlled inference is made about whether a judgment is true or false. Metacognitive control of the first type occurs when the decision to reject, revise, or accept the received judgment is associated with the feelings of rightness/error and appears automatically when one of these feelings arises. The second type takes place when a person rejects the results of the first type of metacognitive control (or those results are unclear) and deliberately decides what to do with the received judgment: reject, revise, or accept it.


Subject(s)
Metacognition , Humans , Cognition , Judgment , Emotions
17.
Behav Sci (Basel) ; 13(4)2023 Apr 07.
Article in English | MEDLINE | ID: mdl-37102833

ABSTRACT

Empirical studies have found that although humans often rely on heuristic intuition to make stereotypical judgments during extreme base-rate tasks, they can at least detect conflicts between stereotypical and base-rate responses, which supports the dual-processing view of flawless conflict detection. The current study combines the conflict detection paradigm with moderate base-rate tasks of different scales to test the generalization and boundaries of flawless conflict detection. After controlling for possible confounding by the "storage failure" factor, the conflict detection results indicated that reasoners providing stereotypical heuristic responses to conflict problems were slower to respond, less confident in their stereotypical responses, and slower to indicate their reduced confidence than reasoners who answered no-conflict problems. Moreover, none of these differences were affected by different scales. The results suggest that stereotypical reasoners are not blind heuristic performers and that they at least realize that their heuristic responses are not entirely warranted, which supports the argument for flawless conflict detection and extends the boundaries of flawless conflict detection. We discuss the implications of these findings for views of detection, human rationality, and the boundaries of conflict detection.

18.
Cognition ; 237: 105451, 2023 08.
Article in English | MEDLINE | ID: mdl-37058838

ABSTRACT

Base rate neglect refers to the well-documented tendency for people to primarily rely on diagnostic information to identify event probabilities while discounting information about relative probabilities (base rates). It is often postulated that using base rate information requires some form of working memory intensive processes. However, recent studies have put this interpretation into doubt, showing that rapid judgments can also lead to base rate use. Here we examine the idea that base rate neglect can be explained by the degree of attention paid to diagnostic information, which predicts that having more time should lead to greater rates of base rate neglect. Participants were presented with base rate problems either with a limited time to respond or with no time restrictions. Results show that having more time results in a decrease in base rate use.
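The abstract does not reproduce a worked base-rate problem, so the sketch below uses hypothetical numbers (the 995/5 group split and the 80% diagnosticity are illustrative assumptions, not figures from the study) to show what normative use of the base rate looks like under Bayes' rule.

```python
# Hypothetical base-rate item: a sample of 1000 people contains 995 nurses
# and 5 doctors; a randomly drawn person's description seems 80% diagnostic
# of "doctor". Bayes' rule shows the base rate should dominate the judgment.
def posterior(prior, hit_rate, false_alarm_rate):
    """P(hypothesis | evidence) via Bayes' rule for a binary hypothesis."""
    evidence = prior * hit_rate + (1 - prior) * false_alarm_rate
    return prior * hit_rate / evidence

p_doctor = posterior(prior=5 / 1000, hit_rate=0.80, false_alarm_rate=0.20)
print(round(p_doctor, 3))  # prints 0.02 -- far below the intuitive "80%" answer
```

Responding with something close to the 80% diagnosticity, while ignoring the 5-in-1000 prior, is the base rate neglect the abstract describes; the study's question is whether extra deliberation time increases or decreases that neglect.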


Subject(s)
Judgment , Memory , Humans , Probability , Attention , Emotions
19.
Cognition ; 235: 105417, 2023 06.
Article in English | MEDLINE | ID: mdl-36870202

ABSTRACT

The capacity to evaluate logical arguments intuitively is a fundamental assumption of recent dual-process theories. One observation supporting it is the standard conflict effect on incongruent arguments under belief instructions: conflict arguments are evaluated less accurately than non-conflict arguments, arguably because logic is intuitive and automatic enough to interfere with belief judgments. However, recent studies have challenged this interpretation by finding the same conflict effects when a matching heuristic cues the same response as logic, even for arguments with no logically valid structure. In this study, we test the matching-heuristic hypothesis across four experiments (total N = 409) by manipulating the arguments' propositions so that matching either cues a response that is (1) aligned or (2) misaligned with logic, or (3) cues no response at all. Consistent with the predictions of the matching heuristic, standard, reversed, and no conflict effects were found in those conditions, respectively. These results indicate that intuitively correct inferences, which are assumed to be evidence of logical intuitions, are actually driven by a matching heuristic that cues responses aligned with logic. Alleged intuitive-logic effects reverse when the matching heuristic cues an opposing response and disappear when there are no matching cues. It therefore appears that the operation of a matching heuristic, rather than intuitive access to logic, drives logical intuitions.


Subject(s)
Illusions , Intuition , Humans , Intuition/physiology , Heuristics , Thinking/physiology , Logic , Judgment/physiology
20.
Conscious Cogn ; 110: 103505, 2023 04.
Article in English | MEDLINE | ID: mdl-37001443

ABSTRACT

Dual-process theories of attitude formation propose that an evolutionarily old associative system automatically generates subjective judgments by processing mere spatiotemporal contiguity between paired objects, subjects, or events. These judgments can potentially contradict our well-reasoned evaluations and hijack decisional or behavioral outcomes. Contrary to this perspective, other models stress the exclusive work of a single propositional system that consciously processes co-occurrences between environmental cues and produces propositions, i.e., mental statements that capture the specific manner in which stimuli are linked. We constructed an experiment on the premise that it would be possible, if the associative system does produce attitudes in a parallel non-conscious fashion, to condition two mutually exclusive attitudes (one implicit, the other explicit) toward the same stimulus. Through explicit ratings, inhibition performance, and neural correlates of performance monitoring, we assessed whether there was a discrepancy between stimuli conditioned with (1) the two systems working in harmony (i.e., producing congruent attitudes) or (2) the two systems working in competition (i.e., producing incongruent attitudes). Compared with congruent stimuli, incongruent stimuli consistently elicited more neutral liking scores, higher response times and error rates, as well as diminished amplitudes in two well-studied neural correlates of automatic error detection (i.e., error-related negativity) and conscious appraisal of error commission (i.e., error-related positivity). Our findings are discussed in the light of evolutionary psychology, dual-process theories of attitude formation, and theoretical frameworks on the functional significance of error-related neural markers.


Subject(s)
Attitude , Cues , Humans , Reaction Time/physiology , Judgment , Consciousness , Evoked Potentials/physiology