ABSTRACT
BACKGROUND: Borderline personality disorder (BPD) is characterized by frequent and intense moment-to-moment changes in affect, behavior, identity, and interpersonal relationships, which typically result in significant deterioration of the person's overall functioning and well-being. Measuring and characterizing the rapidly changing patterns of instability in BPD dysfunction as they occur in a person's daily life can be challenging. Ecological momentary assessment (EMA) is a method that can capture highly dynamic processes in psychopathology research and is therefore well suited to studying intense variability patterns across areas of dysfunction in BPD. EMA studies are characterized by frequent repeated assessments delivered to participants in real-life, real-time settings using handheld devices that register responses to short self-report questions in daily life. Compliance in EMA research is defined as the proportion of planned prompts that participants answer. Low compliance with prompt schedules can compromise the relative advantages of the method. Despite the growth of the EMA literature on BPD in recent years, findings regarding the study design features that affect compliance with EMA protocols have not been compiled, aggregated, and estimated. OBJECTIVE: This systematic meta-analytic review aimed to investigate the relationship between study design features and participant compliance in EMA research on BPD. METHODS: A systematic review was conducted on November 12, 2021, following the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) and MOOSE (Meta-analyses of Observational Studies in Epidemiology) guidelines to search for articles featuring EMA studies of BPD that reported compliance rates and included sufficient data to extract relevant design features.
For studies with complete data, random-effects models were used to estimate the overall compliance rate and explore its association with design features. RESULTS: In total, 28 peer-reviewed EMA studies comprising 2052 participants were included. Design features (sampling strategy, average prompting frequency, number of items, response window, sampling device, financial incentive, and dropout rate) varied widely across studies, and many studies did not report them. The meta-analytic synthesis was restricted to the 64% (18/28) of articles with complete data and revealed a pooled compliance rate of 79% across studies. We did not find any significant relationship between design features and compliance rates. CONCLUSIONS: Our results show wide variability in the design and reporting of EMA studies assessing BPD. Compliance rates appear to be stable across varying setups, and standard design features are likely not directly responsible for improving or diminishing compliance. We discuss nonspecific factors of study design that may have an impact on compliance. Given the promise of EMA research in BPD, we also discuss the importance of unified standards for EMA reporting so that data stemming from this rich literature can be aggregated and interpreted jointly.
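The random-effects pooling of per-study compliance rates described above can be sketched as follows. This is an illustrative sketch only, not the authors' actual analysis: the choice of the DerSimonian-Laird estimator and the study data below are assumptions introduced for demonstration.

```python
# Illustrative sketch: pooling per-study compliance rates (proportions)
# under a DerSimonian-Laird random-effects model. All data are hypothetical.

def pooled_compliance(rates, sizes):
    """Return the random-effects pooled proportion across studies.

    rates: per-study compliance proportions (between 0 and 1)
    sizes: per-study sample sizes
    """
    k = len(rates)
    # Within-study variance of a proportion: p(1 - p) / n
    v = [p * (1 - p) / n for p, n in zip(rates, sizes)]
    w = [1 / vi for vi in v]  # fixed-effect (inverse-variance) weights
    fixed = sum(wi * p for wi, p in zip(w, rates)) / sum(w)
    # Cochran's Q and DerSimonian-Laird between-study variance (tau^2)
    q = sum(wi * (p - fixed) ** 2 for wi, p in zip(w, rates))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)
    # Random-effects weights add tau^2 to each within-study variance
    w_re = [1 / (vi + tau2) for vi in v]
    return sum(wi * p for wi, p in zip(w_re, rates)) / sum(w_re)

# Hypothetical per-study compliance rates and sample sizes
rates = [0.85, 0.72, 0.80, 0.76]
sizes = [50, 60, 40, 90]
print(round(pooled_compliance(rates, sizes), 3))
```

Because the pooled estimate is a weighted average of the study proportions, it always falls within the range of the observed rates; the between-study variance tau^2 only shifts weight toward smaller studies when heterogeneity is present.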
Subject(s)
Borderline Personality Disorder, Ecological Momentary Assessment, Humans, Surveys and Questionnaires, Self Report, Research Design
ABSTRACT
QUESTION: Most adolescents live in low- and middle-income countries (LMIC), and about 10% of them face mental health problems. The mental health provision gap in LMIC could be addressed by evidence-based practices; however, cost is an implementation barrier. Digitalization can improve the accessibility of these tools and offers LMIC an opportunity to use them more easily at low cost. We reviewed free and brief evidence-based mental health assessment tools available for digital use to assess psychopathology across different domains in youth. METHODS: For the current study, instruments from a recent review of paper-based instruments were reused. Additionally, a systematic search was conducted to add instruments for the personality disorder domain. We searched for and classified the copyright and license terms available on the internet with respect to free usage and deliverability in a digital format. When this information was insufficient, we contacted the authors. RESULTS: In total, we evaluated 109 instruments. Of these, 53 were free and digitally usable, covering 11 mental health domains. However, retrieving information on copyright and license terms was very difficult. CONCLUSIONS: Free and digitally adaptable instruments are available, supporting the strategy of using instruments digitally to increase access. The instruments' authors support this initiative; however, the lack of copyright information and the difficulty of contacting authors and license holders are barriers to applying this strategy in LMIC. A comprehensive online instrument repository for clinical practice would be an appropriate next step to make the instruments more accessible and reduce implementation barriers.