Results 1 - 20 of 155
1.
Ann Intern Med; 176(11): 1456-1464, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37903367

ABSTRACT

BACKGROUND: Multiple challenges impede interprofessional teamwork and the provision of high-quality care to hospitalized patients. OBJECTIVE: To evaluate the effect of interventions to redesign hospital care delivery on teamwork and patient outcomes. DESIGN: Pragmatic controlled trial. Hospitals selected 1 unit for implementation of interventions and a second to serve as a control. (ClinicalTrials.gov: NCT03745677). SETTING: Medical units at 4 U.S. hospitals. PARTICIPANTS: Health care professionals and hospitalized medical patients. INTERVENTION: Mentored implementation of unit-based physician teams, unit nurse-physician coleadership, enhanced interprofessional rounds, unit-level performance reports, and patient engagement activities. MEASUREMENTS: Primary outcomes were teamwork climate among health care professionals and adverse events experienced by patients. Secondary outcomes were length of stay (LOS), 30-day readmissions, and patient experience. Difference-in-differences (DID) analyses of patient outcomes compared intervention versus control units before and after implementation of interventions. RESULTS: Among 155 professionals who completed pre- and postintervention surveys, the median teamwork climate score was higher after than before the intervention only for nurses (n = 77) (median score, 88.0 [IQR, 77.0 to 91.0] vs. 80.0 [IQR, 70.0 to 89.0]; P = 0.022). Among 3773 patients, a greater percentage had at least 1 adverse event after compared with before the intervention on control units (change, 1.61 percentage points [95% CI, 0.01 to 3.22 percentage points]). A similar percentage of patients had at least 1 adverse event after compared with before the intervention on intervention units (change, 0.43 percentage point [CI, -1.25 to 2.12 percentage points]). A DID analysis of adverse events did not show a significant difference in change (adjusted DID, -0.92 percentage point [CI, -2.49 to 0.64 percentage point]; P = 0.25). Similarly, there were no differences in LOS, readmissions, or patient experience. LIMITATION: Adverse events occurred less frequently than anticipated, limiting statistical power. CONCLUSION: Despite improved teamwork climate among nurses, interventions to redesign care for hospitalized patients were not associated with improved patient outcomes. PRIMARY FUNDING SOURCE: Agency for Healthcare Research and Quality.
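To make the analytic approach concrete, here is a minimal Python sketch of a difference-in-differences contrast of the kind described above, assuming a hypothetical patient-level table with adverse_event, intervention_unit, and post_period columns (not the study's actual data or code):

```python
# Minimal sketch of a difference-in-differences (DID) analysis.
# Hypothetical columns: adverse_event (0/1), intervention_unit (0/1), post_period (0/1).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("patients.csv")  # hypothetical analysis file

# Logistic model with an interaction term; the coefficient on
# intervention_unit:post_period is the DID estimate on the log-odds scale.
model = smf.logit("adverse_event ~ intervention_unit * post_period", data=df).fit()
print(model.summary())

# Unadjusted DID on the percentage-point scale, mirroring the reported changes.
rates = df.groupby(["intervention_unit", "post_period"])["adverse_event"].mean()
did = (rates.loc[(1, 1)] - rates.loc[(1, 0)]) - (rates.loc[(0, 1)] - rates.loc[(0, 0)])
print(f"DID = {did * 100:.2f} percentage points")
```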


Subjects
Health Personnel, Physicians, Humans, Length of Stay, Quality of Health Care, Surveys and Questionnaires
2.
Am J Transplant; 22(10): 2433-2442, 2022 Oct.
Article in English | MEDLINE | ID: mdl-35524363

ABSTRACT

Racial/ethnic disparities persist in patients' access to living donor kidney transplantation (LDKT). This study assessed the impact of having available potential living donors (PLDs) on candidates' receipt of a kidney transplant (KT) and LDKT at two KT programs. Using data from our clinical trial of waitlisted candidates (January 1, 2014-December 31, 2019), we evaluated Hispanic and Non-Hispanic White (NHW) KT candidates' number of PLDs. Multivariable logistic regression assessed the impact of PLDs on transplantation (KT vs. no KT; for KT recipients, LDKT vs. deceased donor KT). A total of 847 candidates were included, identifying as Hispanic (45.8%) or NHW (54.2%). For Site A, both Hispanic (adjusted OR = 2.26 [95% CI 1.13-4.53]) and NHW (OR = 2.42 [1.10-5.33]) candidates with PLDs completing the questionnaire were more likely to receive a KT. For Site B, candidates with PLDs were not significantly more likely to receive KT. Among KT recipients at both sites, Hispanic (Site A: OR = 21.22 [2.44-184.88]; Site B: OR = 25.54 [7.52-101.54]), and NHW (Site A: OR = 37.70 [6.59-215.67]; Site B: OR = 15.18 [5.64-40.85]) recipients with PLD(s) were significantly more likely to receive a LDKT. Our findings suggest that PLDs increased candidates' likelihood of KT receipt, particularly LDKT. Transplant programs should help candidates identify PLDs early in transplant evaluation.
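As an illustration of the multivariable logistic regression described above, a minimal sketch (with hypothetical file and column names, not the study's actual variables) might look like:

```python
# Minimal sketch: site-specific multivariable logistic regression for receipt of KT.
# Hypothetical columns: received_kt (0/1), has_pld (0/1), hispanic (0/1), age, site.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("candidates.csv")  # hypothetical analysis file

for site, sub in df.groupby("site"):
    fit = smf.logit("received_kt ~ has_pld + hispanic + age", data=sub).fit(disp=False)
    or_table = np.exp(pd.concat([fit.params, fit.conf_int()], axis=1))
    or_table.columns = ["OR", "2.5%", "97.5%"]
    print(f"Site {site}\n{or_table}\n")
```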


Subjects
Chronic Kidney Failure, Kidney Transplantation, Clinical Trials as Topic, Ethnicity, Humans, Chronic Kidney Failure/surgery, Living Donors, Racial Groups
3.
Am J Transplant; 22(2): 474-488, 2022 Feb.
Article in English | MEDLINE | ID: mdl-34559944

ABSTRACT

Hispanic patients receive disproportionately fewer living donor kidney transplants (LDKTs) than non-Hispanic Whites (NHWs). The Northwestern Medicine Hispanic Kidney Transplant Program (HKTP), designed to increase Hispanic LDKTs, was evaluated as a nonrandomized, implementation-effectiveness hybrid trial of patients initiating transplant evaluation at two intervention and two similar control sites. Using a mixed-method, observational design, we evaluated the fidelity of the HKTP implementation at the two intervention sites. We tested the impact of the HKTP intervention by evaluating the likelihood of receiving LDKT, comparing the pre-intervention (January 2011-December 2016) and postintervention (January 2017-March 2020) periods, across ethnicity and centers. The HKTP study included 2063 recipients. Intervention Site A exhibited greater implementation fidelity than intervention Site B. For Hispanic recipients at Site A, the likelihood of receiving LDKTs was significantly higher postintervention compared with pre-intervention (odds ratio [OR] = 3.17, 95% confidence interval [1.04, 9.63]), but not at the paired control Site C (OR = 1.02 [0.61, 1.71]). For Hispanic recipients at Site B, the likelihood of receiving an LDKT did not differ between pre- and postintervention (OR = 0.88 [0.40, 1.94]). The LDKT rate was significantly lower for Hispanics at the paired control Site D (OR = 0.45 [0.28, 0.90]). The intervention significantly improved LDKT rates for Hispanic patients at the intervention site that implemented the intervention with greater fidelity. Registration: ClinicalTrials.gov registered (retrospectively) on September 7, 2017 (NCT03276390).


Subjects
Kidney Transplantation, Living Donors, Culturally Competent Care, Humans, Kidney, Retrospective Studies
4.
J Gen Intern Med; 37(8): 1877-1884, 2022 Jun.
Article in English | MEDLINE | ID: mdl-34472021

ABSTRACT

BACKGROUND: A small number of patients are disproportionately readmitted to hospitals. The Complex High Admission Management Program (CHAMP) was established as a multidisciplinary program to improve continuity of care and reduce readmissions for frequently hospitalized patients. OBJECTIVE: To compare hospital utilization metrics among patients enrolled in CHAMP and usual care. DESIGN: Pragmatic randomized controlled trial. PARTICIPANTS: Inclusion criteria were as follows: 3 or more 30-day inpatient readmissions in the previous year, or 2 inpatient readmissions plus either a referral or 3 observation admissions in the previous 6 months. INTERVENTIONS: Patients randomized to CHAMP were managed by an interdisciplinary team including social work, physicians, and pharmacists. The CHAMP team used comprehensive care planning and inpatient, outpatient, and community visits to address both medical and social needs. Control patients were randomized to usual care and contacted 18 months after initial identification if still eligible. MAIN MEASURES: The primary outcome was the number of 30-day inpatient readmissions 180 days following enrollment. Secondary outcomes were the number of hospital admissions, total hospital days, emergency department visits, and outpatient clinic visits 180 days after enrollment. KEY RESULTS: There were 75 patients enrolled in CHAMP and 76 in the control group. Groups were similar in demographic characteristics and baseline readmissions. At 180 days following enrollment, CHAMP patients had more inpatient 30-day readmissions [CHAMP incidence rate 1.3 (95% CI 0.9-1.8) vs. control 0.8 (95% CI 0.5-1.1), p=0.04], though both groups had fewer readmissions compared to the 180 days prior to enrollment. We found no differences in secondary outcomes. CONCLUSIONS: Frequently hospitalized patients experienced reductions in utilization over time. Though most outcomes showed no difference, CHAMP was associated with higher readmissions compared to a control group, possibly due to consolidation of care at a single hospital. Future research should seek to identify subsets of patients with persistently high utilization for whom tailored interventions may be beneficial. TRIAL REGISTRATION: ClinicalTrials.gov identifier: NCT03097640; https://clinicaltrials.gov/ct2/show/NCT03097640.


Subjects
Hospitalization, Patient Readmission, Hospital Emergency Service, Humans, Inpatients
5.
Pain Med; 23(4): 669-675, 2022 Apr 8.
Article in English | MEDLINE | ID: mdl-34181019

ABSTRACT

OBJECTIVE: To determine the efficacy of a program to limit the use of the intravenous (IV) push route for opioids on the experience of pain by inpatients and on associated safety events. DESIGN: Retrospective cohort study. SETTING: Two inpatient general medicine floor units at an urban tertiary care academic medical center. SUBJECTS: 4,752 inpatient opioid recipients. METHODS: Patients in one unit were exposed to a multidisciplinary intervention to limit the prescription of opioids via the IV push route, with the other unit used as a control unit. The primary study outcome was the mean numeric pain score per patient during the hospital stay. Secondary measures included the hospital length of stay and postdischarge patient satisfaction. Fidelity measures included the percentage of the patient population exposed to each opioid administration route and the amount of opioid administered per route. Safety measures included patient disposition, transfer to intensive care, and incidence of naloxone administration. RESULTS: The intervention was successful in decreasing both the percentage of patients exposed to IV push opioids and the amount of opioid administered via the IV push route, but no associated changes in other study outcomes were identified. CONCLUSIONS: For the treatment of acute pain in medical inpatients, no evidence of benefit or harm was identified in relation to an increase or decrease in the use of the IV push opioid route.


Subjects
Hospital Medicine, Opioid-Related Disorders, Aftercare, Opioid Analgesics/therapeutic use, Hospitalization, Humans, Inpatients, Opioid-Related Disorders/drug therapy, Postoperative Pain/drug therapy, Patient Discharge, Retrospective Studies
6.
BMC Musculoskelet Disord; 23(1): 972, 2022 Nov 10.
Article in English | MEDLINE | ID: mdl-36357880

ABSTRACT

STUDY OBJECTIVE: To describe recent practice patterns of preoperative tests and to examine their association with 90-day all-cause readmissions and length of stay. DESIGN: Retrospective cohort study using the New York Statewide Planning and Research Cooperative System (SPARCS). SETTING: SPARCS from March 1, 2016, to July 1, 2017. PARTICIPANTS: Adults undergoing Total Hip Replacement (THR) or Total Knee Replacement (TKR) who had a preoperative screening outpatient visit within two months before their surgery. INTERVENTIONS: Electrocardiogram (EKG), chest X-ray, and seven preoperative laboratory tests (RBC antibody screen, Prothrombin time (PT) and Thromboplastin time, Metabolic Panel, Complete Blood Count (CBC), Methicillin-Resistant Staphylococcus aureus (MRSA) Nasal DNA probe, Urinalysis, Urine culture) were identified. PRIMARY AND SECONDARY OUTCOME MEASURES: Regression analyses were utilized to determine the association between each preoperative test and two postoperative outcomes (90-day all-cause readmission and length of stay). Regression models adjusted for hospital-level random effects, patient demographics, insurance, hospital TKR and THR surgical volume, and comorbidities. Sensitivity analysis was conducted using the subset of patients with no comorbidities. RESULTS: Fifty-five thousand ninety-nine patients (60% female, mean age 66.1 ± 9.8 [SD] years) were included. The most common tests were metabolic panel (74.5%), CBC (66.8%), and RBC antibody screen (58.8%). The least common tests were MRSA Nasal DNA probe (13.0%), EKG (11.7%), urine culture (10.7%), and chest X-ray (7.9%). Carrying out MRSA testing, urine culture, and EKG was associated with a lower likelihood of 90-day all-cause readmissions. The length of hospital stay was not associated with carrying out any preoperative tests. Results were similar in the subset with no comorbidities. CONCLUSIONS: Wide variation exists in preoperative tests before THR and TKR. We identified three preoperative tests that may play a role in reducing readmissions. Further investigation is needed to evaluate these findings using more granular clinical data.
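As a sketch of how a hospital-level random effect can be incorporated, the following illustrates a linear mixed model for the length-of-stay outcome; the readmission outcome would instead need a mixed-effects logistic model. All file and column names are hypothetical:

```python
# Minimal sketch: regression with a hospital-level random intercept for length of stay.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("sparcs_thr_tkr.csv")  # hypothetical analysis file

model = smf.mixedlm(
    "length_of_stay ~ mrsa_test + urine_culture + ekg + age + female + medicare",
    data=df,
    groups=df["hospital_id"],  # hospital-level random effects
)
fit = model.fit()
print(fit.summary())
```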


Subjects
Hip Arthroplasty, Knee Arthroplasty, Humans, Female, Middle Aged, Aged, Male, Retrospective Studies, Length of Stay, DNA Probes
7.
J Acoust Soc Am; 151(4): 2391, 2022 Apr.
Article in English | MEDLINE | ID: mdl-35461508

ABSTRACT

Distortion product otoacoustic emissions (DPOAEs) offer an outcome measure to consider for clinical detection and monitoring of outer hair cell dysfunction as a result of noise exposure. This investigation detailed DPOAE characteristics and behavioral hearing thresholds up to 20 kHz to identify promising metrics for early detection of cochlear dysfunction. In a sample of normal-hearing individuals with and without self-reported noise exposure (assessed by two questions), DPOAE and hearing threshold measures were examined. The effects on various auditory measures in individuals aged 10-65 years with clinically normal/near-normal hearing through 4 kHz were evaluated. Individuals reporting occupational noise exposures (n = 84) and recreational noise exposures (n = 46) were compared to age-matched nonexposed individuals. The hearing thresholds and DPOAE level, fine structure, and component characteristics for the full frequency bandwidth were examined. The data suggest that DPOAE levels measured using a range of stimulus levels hold clinical utility, while fine structure characteristics offer limited use. Under carefully calibrated conditions, the extension to frequencies beyond 8 kHz in combination with various stimulus levels holds clinical utility. Moreover, this work supports the potential utility of the distortion product place component level for revealing differences in cochlear function due to self-reported, casual noise exposure that are not observable in behavioral hearing thresholds.


Subjects
Hearing Tests, Spontaneous Otoacoustic Emissions, Adolescent, Adult, Aged, Auditory Threshold, Child, Cochlea, Hearing, Humans, Middle Aged, Self Report, Young Adult
8.
J Nurs Manag; 30(6): 2023-2030, 2022 Sep.
Article in English | MEDLINE | ID: mdl-35476274

ABSTRACT

AIMS: To improve the timeliness and quality of discharge for patients by creating the role of the attending nurse. BACKGROUND: Discharge time affects hospital throughput and patient satisfaction. Bedside nurses and hospitalists have competing priorities that can hinder performing timely, high-quality discharges. METHODS: This retrospective analysis evaluated the effect of an attending nurse paired with a hospital medicine physician on discharge time and quality. A total of 8329 patient discharges were eligible for the study, and propensity score matching yielded 2715 matched pairs. RESULTS: In the post-intervention matched cohort, the percentage of patients discharged before 2 PM increased from 34.4% to 45.9% (p < .01), and the median discharge time moved 48 min earlier. In the unmatched cohort, patient satisfaction with the discharge process improved on several questions. While length of stay was not affected, the 30-day readmission rate did increase from 8.9% to 10.7% (p = .02). CONCLUSION: With the new attending nurse role, we positively impacted throughput by shifting discharge times earlier in the day while improving patient satisfaction. Length of stay stayed the same but the 30-day readmission rate increased. IMPLICATIONS FOR NURSING MANAGEMENT: Our multidisciplinary approach to the problem of late discharge times led to the creation of a new role. This role made ownership of discharge tasks clear and reduced competing priorities, freeing up nurses and hospitalists to perform other care-related responsibilities without holding up discharges.
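As an illustration of how a propensity-score-matched cohort like the one described above can be constructed, here is a minimal 1:1 nearest-neighbor sketch with hypothetical column names (real analyses typically add calipers and matching without replacement):

```python
# Illustrative 1:1 nearest-neighbor propensity-score match.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

df = pd.read_csv("discharges.csv")  # hypothetical: exposed = attending-nurse unit
covariates = ["age", "comorbidity_count", "weekend_admission"]

# 1. Estimate the propensity score (probability of being in the intervention group).
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["exposed"])
df["ps"] = ps_model.predict_proba(df[covariates])[:, 1]

# 2. Match each exposed discharge to the nearest control on the propensity score.
treated = df[df["exposed"] == 1]
control = df[df["exposed"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])
matched_controls = control.iloc[idx.ravel()]

# 3. Compare outcomes (e.g., discharge before 2 PM) within the matched cohort.
print(treated["discharged_before_2pm"].mean(),
      matched_controls["discharged_before_2pm"].mean())
```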


Subjects
Patient Discharge, Patient Readmission, Hospitals, Humans, Patient Satisfaction, Retrospective Studies
9.
Stroke; 52(8): 2676-2679, 2021 Aug.
Article in English | MEDLINE | ID: mdl-34162217

ABSTRACT

Background and Purpose: Accurate prehospital diagnosis of stroke by emergency medical services (EMS) can increase treatment rates, mitigate disability, and reduce stroke deaths. We aimed to develop a model that utilizes natural language processing of EMS reports and machine learning to improve prehospital stroke identification. Methods: We conducted a retrospective study of patients transported by the Chicago EMS to 17 regional primary and comprehensive stroke centers. Patients who were suspected of stroke by the EMS or had hospital-diagnosed stroke were included in our cohort. Text within EMS reports was converted to unigram features, which were given as input to a support-vector machine classifier that was trained on 70% of the cohort and tested on the remaining 30%. Outcomes included final diagnosis of stroke versus nonstroke, large vessel occlusion, severe stroke (National Institutes of Health Stroke Scale score >5), and comprehensive stroke center-eligible stroke (large vessel occlusion or hemorrhagic stroke). Results: Of 965 patients, 580 (60%) had confirmed acute stroke. In a test set of 289 patients, the text-based model predicted stroke nominally better than models based on the Cincinnati Prehospital Stroke Scale (c-statistic: 0.73 versus 0.67, P=0.165) and was superior to the 3-Item Stroke Scale (c-statistic: 0.73 versus 0.53, P<0.001) scores. Improvements in discrimination were also observed for the other outcomes. Conclusions: We derived a model that utilizes clinical text from paramedic reports to identify stroke. Our results require validation but have the potential to improve prehospital routing protocols.
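A minimal sketch of the modeling pipeline described above (unigram features from report text fed to a support-vector machine with a 70/30 split), assuming a hypothetical input file and column names:

```python
# Minimal sketch: unigram text features + linear SVM, 70/30 train/test split.
import pandas as pd
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC
from sklearn.metrics import roc_auc_score

df = pd.read_csv("ems_reports.csv")  # hypothetical: columns "report_text", "stroke"
X_train, X_test, y_train, y_test = train_test_split(
    df["report_text"], df["stroke"], test_size=0.30, random_state=0
)

clf = make_pipeline(CountVectorizer(ngram_range=(1, 1)), LinearSVC())
clf.fit(X_train, y_train)

# Use the SVM decision function to compute a c-statistic comparable to the paper's.
auc = roc_auc_score(y_test, clf.decision_function(X_test))
print(f"c-statistic: {auc:.2f}")
```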


Subjects
Allied Health Personnel/standards, Emergency Medical Services/standards, Natural Language Processing, Research Report/standards, Stroke/diagnosis, Aged, Aged 80 and over, Chicago/epidemiology, Emergency Medical Services/methods, Female, Humans, Male, Middle Aged, Retrospective Studies, Stroke/epidemiology
10.
Kidney Int; 100(6): 1292-1302, 2021 Dec.
Article in English | MEDLINE | ID: mdl-34339746

ABSTRACT

Disordered iron and mineral homeostasis are interrelated complications of chronic kidney disease that may influence cardiovascular and kidney outcomes. In a prospective analysis of 3747 participants in the Chronic Renal Insufficiency Cohort Study, we investigated risks of mortality, heart failure, end-stage kidney disease (ESKD), and atherosclerotic cardiovascular disease according to iron status, and tested for mediation by C-terminal fibroblast growth factor 23 (FGF23), hemoglobin, and parathyroid hormone. Study participants were agnostically categorized based on quartiles of transferrin saturation and ferritin as "Iron Replete" (27.1% of participants; referent group for all outcomes analyses), "Iron Deficiency" (11.1%), "Functional Iron Deficiency" (7.6%), "Mixed Iron Deficiency" (iron indices between the Iron Deficiency and Functional Iron Deficiency groups; 6.3%), "High Iron" (9.2%), or "Non-Classified" (the remaining 38.8% of participants). In multivariable-adjusted Cox models, Iron Deficiency independently associated with mortality (hazard ratio 1.28, 95% confidence interval 1.04-1.58) and heart failure (1.34, 1.05-1.72). Mixed Iron Deficiency associated with mortality (1.61, 1.27-2.04) and ESKD (1.33, 1.02-1.73). High Iron associated with mortality (1.54, 1.24-1.91), heart failure (1.58, 1.21-2.05), and ESKD (1.41, 1.13-1.77). Functional Iron Deficiency did not significantly associate with any outcome, and no iron group significantly associated with atherosclerotic cardiovascular disease. Among the candidate mediators, FGF23 most significantly mediated the risks of mortality and heart failure conferred by Iron Deficiency. Thus, alterations in iron homeostasis were associated with adverse cardiovascular and kidney outcomes in patients with chronic kidney disease.
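As an illustration of the multivariable-adjusted Cox models described above, a minimal sketch using the lifelines package, with hypothetical file and column names:

```python
# Minimal sketch of a multivariable-adjusted Cox proportional hazards model.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("cric_iron.csv")  # hypothetical analysis file
# Indicator columns for iron groups (referent: "Iron Replete"), adjustment
# covariates, and the time-to-event / event columns.
cols = ["time_years", "died", "iron_deficiency", "mixed_deficiency",
        "functional_deficiency", "high_iron", "age", "egfr", "hemoglobin"]

cph = CoxPHFitter()
cph.fit(df[cols], duration_col="time_years", event_col="died")
cph.print_summary()  # hazard ratios (exp(coef)) with 95% confidence intervals
```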


Subjects
Fibroblast Growth Factor 23/metabolism, Iron/analysis, Chronic Renal Insufficiency, Cohort Studies, Humans, Kidney, Chronic Renal Insufficiency/diagnosis, Chronic Renal Insufficiency/epidemiology
11.
Clin Transplant; 35(6): e14316, 2021 Jun.
Article in English | MEDLINE | ID: mdl-33844367

ABSTRACT

Deceased organ donor intervention research aims to increase organ quality and quantity for transplantation. We assessed the proportion of kidney transplant candidates who would accept "intervention organs" or participate in organ intervention research, and the factors influencing acceptance. Kidney transplant candidates were presented with 12 hypothetical scenarios, which varied three attributes: donor age, predicted waiting time to receive another organ offer, and research risk to the organ. Candidates were also randomly assigned to one of two conditions varying recipient risk. For each scenario, candidates agreed to accept the intervention organ or remain waitlisted. We fit a multivariable logit model to determine the association between scenario attributes and the acceptance decision. Of 249 participants, most (96%) accepted intervention organs under some or all conditions. Factors independently associated with candidates' greater likelihood of accepting an intervention organ included: low risk to the kidney from the intervention (OR 20.53 [95% Confidence Interval (CI), 13.91-30.29]); younger donor age (OR 3.72 [95% CI, 2.83-4.89]); longer time until the next organ offer (OR 3.48 [95% CI, 2.65-4.57]); and greater trust in their transplant physician (OR 1.03 [95% CI, 1.00-1.06]). Candidates with a lower likelihood of acceptance had been waitlisted longer (OR 0.97 per month [95% CI, 0.96-0.99]) and were Black (OR 0.21 [95% CI, 0.08-0.55]). Most candidates would accept an intervention organ, which should encourage transplant leaders to conduct deceased donor organ intervention trials.


Subjects
Kidney Transplantation, Tissue and Organ Procurement, Transplants, Humans, Tissue Donors, Transplant Recipients, Waiting Lists
12.
Ann Emerg Med; 78(5): 674-681, 2021 Nov.
Article in English | MEDLINE | ID: mdl-34598828

ABSTRACT

STUDY OBJECTIVE: Acute stroke patients often require interfacility transfer from primary stroke centers to comprehensive stroke centers. Given the time-sensitive benefits of endovascular reperfusion, reducing door-in-door-out time at the primary stroke center is a target for quality improvement. We sought to identify modifiable predictors of door-in-door-out times at 3 Chicago-region primary stroke centers. METHODS: We performed a retrospective analysis of consecutive patients with acute stroke from February 1, 2018 to January 31, 2020 who required transfer from 1 of 3 primary stroke centers to 1 of 3 affiliated comprehensive stroke centers in the Chicago region. Stroke coordinators at each primary stroke center abstracted data on type of transport, medical interventions and treatments prior to transfer, and relevant time intervals from patient arrival to departure. We evaluated predictors of door-in-door-out time using median regression models. RESULTS: Of 191 total patients, 67.9% arrived by emergency medical services and 57.4% during off-hours. Telestroke was performed in 84.2%, 30.5% received alteplase, and 48.4% underwent a computed tomography (CT) angiography at the primary stroke center. The median door-in-door-out time was 148.5 (interquartile range 106 to 207.8) minutes. The largest contributors to door-in-door-out time, in minutes, were CT to CT angiography time (22 [7 to 73.5]), transfer center contact to ambulance request time (20 [8 to 53.3]), ambulance request to arrival time (20.5 [14 to 36]), and transfer ambulance time at primary stroke center (26 [21 to 35]). Factors associated with door-in-door-out time were (adjusted median differences, in minutes [95% confidence intervals]): CT angiography performed at primary stroke center (+39 [12.3 to 65.7]), walk-in arrival mode (+53 [4.1 to 101.9]), administration of intravenous alteplase (-29 [-31.3 to -26.7]), intubation at primary stroke center (+23 [7.3 to 38.7]), and ambulance request by primary stroke center (-20 [-34.3 to -5.7]). CONCLUSION: Door-in-door-out times at Chicago-area primary stroke centers average nearly 150 minutes. Reducing time to CT angiography, receipt of alteplase, and ambulance request are likely important modifiable targets for interventions to decrease door-in-door-out times at primary stroke centers.
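Median regression models like those described above can be fit as quantile regression at the 0.5 quantile; a minimal sketch with hypothetical file and column names:

```python
# Minimal sketch: median (quantile) regression for door-in-door-out time.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("transfers.csv")  # hypothetical: one row per transferred patient

model = smf.quantreg(
    "door_in_door_out_min ~ cta_at_psc + walk_in + alteplase_given"
    " + intubated + psc_requested_ambulance",
    data=df,
)
fit = model.fit(q=0.5)
# Coefficients are adjusted median differences in minutes, as reported above.
print(fit.params)
print(fit.conf_int())
```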


Subjects
Emergency Medical Services/statistics & numerical data, Patient Transfer/statistics & numerical data, Stroke/diagnostic imaging, Stroke/therapy, Time-to-Treatment/statistics & numerical data, Tissue Plasminogen Activator/administration & dosage, Chicago, Fibrinolytic Agents/administration & dosage, Humans, Retrospective Studies, Time Factors, X-Ray Computed Tomography
13.
J Biomed Inform; 117: 103749, 2021 May.
Article in English | MEDLINE | ID: mdl-33766780

ABSTRACT

OBJECTIVE: Secure mobile communication technologies are being implemented at an increasing rate across health care organizations, though providers' use of these tools can remain limited by a perceived lack of other users to communicate with. Enabling acceptance and driving provider utilization of these tools throughout an organization requires attention to the interplay between perceived peer usage (i.e., perceived critical mass) and local user needs within the social context of the care team (e.g., inpatient nursing access to the mobile app). To explain these influences, we developed and tested a consolidated model that shows how mobile health care communication technology acceptance and utilization are influenced by the moderating effects of social context on perceptions about the technology. METHODS: The theoretical model and questionnaire were derived from selected technology acceptance models and frameworks. Survey respondents (n = 1254) completed items measuring perceived critical mass, perceived usefulness, perceived ease of use, personal innovativeness in information technology, behavioral intent, and actual use of a recently implemented secure mobile communication tool. Actual use was additionally measured by logged usage data. Use group was defined by whether a hospital's nurses had access to the tool (expanded use group) or not (limited use group). RESULTS: The model accounted for 61% and 72% of the variance in intent to use the communication tool in the limited and expanded use groups, respectively, which in turn accounted for 53% and 33% of actual use. The total effects coefficient of perceived critical mass on behavioral intent was 0.57 in the limited use group (95% CI 0.51-0.63) and 0.70 in the expanded use group (95% CI 0.61-0.80). CONCLUSION: Our model fit the data well and explained the majority of variance in acceptance of the tool amongst participants. The overall influence of perceived critical mass on intent to use the tool was similarly large in both groups. However, the strength of multiple model pathways varied unexpectedly by use group, suggesting that combining sociotechnical moderators with traditional technology acceptance models may produce greater insights than traditional technology acceptance models alone. Practically, our results suggest that health care institutions can drive acceptance by promoting the recruitment of early adopters through liberal access policies and by making these users and the technology highly visible to others.


Subjects
Mobile Applications, Telemedicine, Attitude of Health Personnel, Biomedical Technology, Communication, Humans
14.
J Clin Rheumatol; 27(8): e440-e445, 2021 Dec 1.
Article in English | MEDLINE | ID: mdl-32815908

ABSTRACT

BACKGROUND/OBJECTIVE: Sleep disturbance is common among adults with osteoarthritis (OA), but little is known about patterns over time. In this cohort study, we identified restless sleep trajectories and associated factors in adults with or at high risk for knee OA. METHODS: Longitudinal (2004-2014) restless sleep (≥3 nights/week) annual reports over 8 years from 4359 Osteoarthritis Initiative participants were analyzed. Group-based trajectory modeling identified heterogeneous temporal patterns. Logistic regression identified baseline health and behavioral predictors of trajectory membership. RESULTS: Four restless sleep trajectory groups were identified: good (69.7%, persistently low restless sleep probabilities), worsening (9.1%), improving (11.7%), and poor (9.5%, persistently high). Among 2 groups initially having low restless sleep prevalence, the worsening trajectory group had an increased likelihood of baseline cardiovascular disease (odds ratio [OR], 1.53; 95% confidence interval [CI], 1.01-2.33), pulmonary disease (OR, 1.48; 95% CI, 1.07-2.05), lower physical activity (OR, 1.29; 95% CI, 1.03-1.61), knee pain (OR, 1.04; 95% CI, 1.00-1.07), depressive symptoms (OR, 1.03; 95% CI, 1.01-1.06), and a decreased likelihood of better mental health (OR, 0.97; 95% CI, 0.95-0.98) at baseline. Among 2 groups initially having high restless sleep prevalence, the poor group had an increased likelihood of baseline depressive symptoms (OR, 1.03; 95% CI, 1.00-1.05). CONCLUSIONS: Four trajectories of restless sleep over 8 years were identified using data collected from over 4000 older adults aged 45 to 79 years with or at higher risk for knee OA. The presence of depressive symptoms, less physical activity, knee pain, poor mental health, cardiovascular disease, or pulmonary disease was each associated with unfavorable trajectories.
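Group-based trajectory modeling is usually fit with dedicated latent-class growth software; as a rough, illustrative stand-in only, the sketch below clusters each participant's annual restless-sleep indicators and then relates hypothetical baseline factors to group membership:

```python
# Rough stand-in for group-based trajectory modeling (not a latent-class fit):
# cluster annual restless-sleep indicators, then model predictors of membership.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("oai_sleep.csv")  # hypothetical wide file: sleep_y1 ... sleep_y8 (0/1)
sleep_cols = [f"sleep_y{i}" for i in range(1, 9)]

df["trajectory_group"] = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(
    df[sleep_cols]
)

# e.g., baseline predictors of membership in one unfavorable group vs. the rest
baseline = ["depressive_symptoms", "physical_activity", "knee_pain", "cvd"]
lr = LogisticRegression(max_iter=1000).fit(
    df[baseline], (df["trajectory_group"] == 3).astype(int)
)
print(dict(zip(baseline, lr.coef_[0].round(3))))
```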


Subjects
Knee Osteoarthritis, Sleep Wake Disorders, Aged, Cohort Studies, Humans, Knee Joint, Knee Osteoarthritis/diagnosis, Knee Osteoarthritis/epidemiology, Sleep, Sleep Wake Disorders/diagnosis, Sleep Wake Disorders/epidemiology, Sleep Wake Disorders/etiology
15.
Am J Transplant; 20(2): 474-492, 2020 Feb.
Article in English | MEDLINE | ID: mdl-31550422

ABSTRACT

Deceased donor organ intervention research holds promise for increasing the quantity and quality of organs for transplantation by minimizing organ injury and optimizing function. Such research will not progress until ethical, regulatory, and legal issues are resolved regarding whether and how to obtain informed consent from transplant candidates offered intervention organs, given the time constraints intrinsic to distribution. This multi-center, mixed-methods study involved semi-structured interviews using open- and closed-ended questions to assess waitlisted candidates' preferences for informed consent processes if offered an organ after undergoing intervention. Data were analyzed thematically. Sixty-one candidates participated (47% participation rate). Most were male (57%) and white (61%), with a mean age of 56 years. Most candidates (79%) desired being informed that the organ offered was an intervention organ before accepting it, and were likely to accept an intervention organ if organ quality was good (defined as donor age 30) (81%), but fewer candidates would accept an intervention organ if quality was moderate (i.e., donor age 50) (26%). Most perceived informed consent as important for decision-making, while others considered it unnecessary given the medical necessity to accept an organ and trust in their physician. Our findings suggest that most candidates desire an informed consent process before accepting an intervention organ, as well as posttransplant data collection.


Subjects
Informed Consent/psychology, Patient Acceptance of Health Care/psychology, Tissue and Organ Procurement/methods, Adolescent, Adult, Aged, Aged 80 and over, Female, Health Knowledge, Attitudes, Practice, Humans, Informed Consent/ethics, Interviews as Topic, Male, Middle Aged, Perception, Qualitative Research, Tissue and Organ Procurement/ethics, Waiting Lists, Young Adult
16.
Am J Kidney Dis; 75(6): 908-918, 2020 Jun.
Article in English | MEDLINE | ID: mdl-31864822

ABSTRACT

RATIONALE & OBJECTIVE: Studies using a single measurement of fibroblast growth factor 23 (FGF-23) suggest that elevated FGF-23 levels are associated with increased risk for requirement for kidney replacement therapy (KRT) in patients with chronic kidney disease. However, the data do not account for changes in FGF-23 levels as kidney disease progresses. STUDY DESIGN: Case-cohort study. SETTING & PARTICIPANTS: To evaluate the association between serial FGF-23 levels and risk for requiring KRT, our primary analysis included 1,597 individuals in the Chronic Renal Insufficiency Cohort Study who had up to 5 annual measurements of carboxy-terminal FGF-23. There were 1,135 randomly selected individuals, of whom 266 initiated KRT, and 462 individuals who initiated KRT outside the random subcohort. EXPOSURE: Serial FGF-23 measurements and FGF-23 trajectory group membership. OUTCOMES: Incident KRT. ANALYTICAL APPROACH: To handle time-dependent confounding, our primary analysis of time-updated FGF-23 levels used time-varying inverse probability weighting in a discrete time failure model. To compare our results with prior data, we used baseline and time-updated FGF-23 values in weighted Cox regression models. To examine the association of FGF-23 trajectory subgroups with risk for incident KRT, we used weighted Cox models with FGF-23 trajectory groups derived from group-based trajectory modeling as the exposure. RESULTS: In our primary analysis, the HR for the KRT outcome per 1 SD increase in the mean of natural log-transformed (ln)FGF-23 in the past was 1.94 (95% CI, 1.51-2.49). In weighted Cox models using baseline and time-updated values, elevated FGF-23 level was associated with increased risk for incident KRT (HRs per 1 SD ln[FGF-23] of 1.18 [95% CI, 1.02-1.37] for baseline and 1.66 [95% CI, 1.49-1.86] for time-updated). Membership in the slowly and rapidly increasing FGF-23 trajectory groups was associated with ∼3- and ∼21-fold higher risk for incident KRT compared to membership in the stable FGF-23 trajectory group. LIMITATIONS: Residual confounding and lack of intact FGF-23 values. CONCLUSIONS: Increasing FGF-23 levels are independently associated with increased risk for incident KRT.
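The primary analysis described above (time-varying inverse probability weighting in a discrete time failure model) can be approximated as a weighted pooled logistic regression over person-period records; a minimal sketch with hypothetical file and column names, and with the period effect simplified to a linear term:

```python
# Minimal sketch: weighted pooled logistic regression on person-period data as a
# discrete-time failure model. ipw = time-varying inverse probability weight.
import numpy as np
import pandas as pd
import statsmodels.api as sm

pp = pd.read_csv("person_periods.csv")  # hypothetical person-period (discrete-time) file
X = sm.add_constant(pp[["mean_ln_fgf23_past", "period"]])
fit = sm.GLM(pp["krt"], X, family=sm.families.Binomial(),
             freq_weights=pp["ipw"]).fit()

# Odds ratio per 1 SD of past mean ln(FGF-23), analogous to the reported HR.
print(np.exp(fit.params["mean_ln_fgf23_past"]))
```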


Subjects
Fibroblast Growth Factors/analysis, Chronic Kidney Failure, Kidney Transplantation/statistics & numerical data, Chronic Renal Insufficiency, Renal Replacement Therapy, Biomarkers/analysis, Cohort Studies, Disease Progression, Female, Fibroblast Growth Factor 23, Humans, Kaplan-Meier Estimate, Chronic Kidney Failure/blood, Chronic Kidney Failure/epidemiology, Chronic Kidney Failure/therapy, Kidney Transplantation/methods, Male, Middle Aged, Proportional Hazards Models, Chronic Renal Insufficiency/blood, Chronic Renal Insufficiency/diagnosis, Chronic Renal Insufficiency/epidemiology, Chronic Renal Insufficiency/physiopathology, Renal Replacement Therapy/methods, Renal Replacement Therapy/statistics & numerical data, Risk Assessment/methods, Risk Factors, United States/epidemiology
17.
Am J Kidney Dis; 75(2): 235-244, 2020 Feb.
Article in English | MEDLINE | ID: mdl-31668375

ABSTRACT

RATIONALE & OBJECTIVE: The pathogenesis of disordered mineral metabolism in chronic kidney disease (CKD) is largely informed by cross-sectional studies of humans and longitudinal animal studies. We sought to characterize the longitudinal evolution of disordered mineral metabolism during the course of CKD. STUDY DESIGN: Retrospective analysis nested in a cohort study. SETTING & PARTICIPANTS: Participants in the Chronic Renal Insufficiency Cohort (CRIC) Study who had up to 5 serial annual measurements of estimated glomerular filtration rate, fibroblast growth factor 23 (FGF-23), parathyroid hormone (PTH), serum phosphate, and serum calcium and who subsequently reached end-stage kidney disease (ESKD) during follow-up (n = 847). EXPOSURE: Years before ESKD. OUTCOMES: Serial FGF-23, PTH, serum phosphate, and serum calcium levels. ANALYTICAL APPROACH: To assess longitudinal dynamics of disordered mineral metabolism in human CKD, we used "ESKD-anchored longitudinal analyses" to express time as years before ESKD, enabling assessments of mineral metabolites spanning 8 years of CKD progression before ESKD. RESULTS: Mean FGF-23 levels increased markedly as time before ESKD decreased, while PTH and phosphate levels increased modestly and calcium levels declined minimally. Compared with other mineral metabolites, FGF-23 levels demonstrated the highest rate of change (velocity: first derivative of the function of concentration over time) and magnitude of acceleration (second derivative). These changes became evident approximately 5 years before ESKD and persisted without deceleration through ESKD onset. Rates of changes in PTH and phosphate levels increased modestly and without marked acceleration around the same time, with modest deceleration immediately before ESKD, when use of active vitamin D and phosphate binders increased. LIMITATIONS: Individuals who entered the CRIC Study at early stages of CKD and who did not progress to ESKD were not studied. CONCLUSIONS: Among patients with progressive CKD, FGF-23 levels begin to increase 5 years before ESKD and continue to rapidly accelerate until transition to ESKD.
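The velocity and acceleration described above are simply the first and second derivatives of the concentration-versus-time curve; an illustrative numerical computation with made-up values:

```python
# Illustrative first/second derivatives of FGF-23 against years before ESKD.
# The values below are placeholders, not study data.
import numpy as np

years_before_eskd = np.array([-8, -7, -6, -5, -4, -3, -2, -1, 0], dtype=float)
fgf23 = np.array([150, 160, 175, 200, 260, 350, 520, 800, 1300], dtype=float)

velocity = np.gradient(fgf23, years_before_eskd)          # change per year
acceleration = np.gradient(velocity, years_before_eskd)   # change in velocity per year

for t, v, a in zip(years_before_eskd, velocity, acceleration):
    print(f"{t:+.0f} y: velocity {v:7.1f}, acceleration {a:7.1f}")
```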


Subjects
Bone Density/physiology, Calcium/blood, Phosphates/blood, Chronic Renal Insufficiency/blood, Adult, Aged, Biomarkers/blood, Cross-Sectional Studies, Disease Progression, Female, Fibroblast Growth Factor 23, Fibroblast Growth Factors/blood, Follow-Up Studies, Humans, Male, Middle Aged, Minerals/metabolism, Parathyroid Hormone/blood, Prospective Studies, Young Adult
18.
Epilepsy Behav; 111: 107162, 2020 Oct.
Article in English | MEDLINE | ID: mdl-32575009

ABSTRACT

OBJECTIVE: The objective of the study was to describe the effect of the vaginal ring and transdermal patch on lamotrigine serum levels in women with epilepsy. BACKGROUND: Previous studies demonstrate that oral hormonal contraceptives containing synthetic estrogen increase lamotrigine clearance through induction of glucuronidation. This leads to variable lamotrigine serum concentrations throughout monthly cycles in women who are on combined oral contraceptives (COCs). The effects of estrogen-containing nonoral hormonal contraceptive methods, including the vaginal ring and transdermal patch, on lamotrigine pharmacokinetics are not well described. METHODS: Retrospective chart review was performed to identify serum lamotrigine levels drawn from women with epilepsy while on the active phase of vaginal ring or transdermal patch and while off contraception. Wilcoxon signed-rank tests for paired data were used to compare the difference in dose-corrected lamotrigine concentration in plasma between values while on hormonal contraception to those while off contraception in patients using a vaginal ring. RESULTS: Six patients were using the vaginal ring, and one patient was using the transdermal patch. Lamotrigine dose-corrected concentrations were decreased during the active phase of the vaginal ring compared with concentrations during the period off contraception (p = .04). There was one patient without a decrease in concentration, but the other five patients on the vaginal ring had a decrease in dose-corrected lamotrigine concentration ranging from 36 to 70% while on the vaginal ring. Similarly, one patient using the transdermal patch had a decrease of 37% in dose-corrected lamotrigine concentration while on the patch. CONCLUSIONS: The findings support that the vaginal ring contraceptive method decreases lamotrigine concentrations during the active phase of treatment. This has important implications for contraceptive counseling and maintaining therapeutic levels in women of childbearing age with epilepsy.
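As an illustration of the paired Wilcoxon signed-rank comparison described above, a minimal sketch with placeholder values (not study data):

```python
# Minimal sketch: Wilcoxon signed-rank test on paired dose-corrected
# lamotrigine concentrations, on vs. off the vaginal ring.
from scipy.stats import wilcoxon

# One dose-corrected concentration per patient in each condition (paired).
on_ring = [1.1, 0.9, 1.4, 0.8, 1.0, 1.2]
off_contraception = [2.0, 1.5, 1.3, 1.9, 1.8, 2.4]

stat, p_value = wilcoxon(on_ring, off_contraception)
print(f"Wilcoxon statistic = {stat}, p = {p_value:.3f}")
```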


Subjects
Anticonvulsants/blood, Female Contraceptive Agents/blood, Female Contraceptive Devices/trends, Epilepsy/blood, Lamotrigine/blood, Transdermal Patch/trends, Adult, Anticonvulsants/therapeutic use, Female Contraceptive Agents/therapeutic use, Drug Interactions/physiology, Epilepsy/drug therapy, Female, Humans, Lamotrigine/therapeutic use, Middle Aged, Retrospective Studies, Young Adult
19.
Ear Hear; 41(2): 461-464, 2020.
Article in English | MEDLINE | ID: mdl-31261213

ABSTRACT

OBJECTIVES: Traditionally, elevated hearing thresholds have been considered to be the main contributors to difficulty understanding speech in noise; yet, patients will often report difficulties with speech understanding in noise despite having audiometrically normal hearing. The purpose of this cross-sectional study was to critically evaluate the relationship of various metrics of auditory function (behavioral thresholds and otoacoustic emissions) to speech understanding in noise in a large sample of audiometrically normal-hearing individuals. DESIGN: Behavioral hearing thresholds, distortion product otoacoustic emission (DPOAE) levels, stimulus-frequency otoacoustic emission levels, and physiological noise (quantified using OAE noise floors) were measured from 921 individuals between 10 and 68 years of age with normal pure-tone averages. The quick speech-in-noise (QuickSIN) test outcome, quantified as the signal-to-noise ratio (SNR) loss, was used as the metric of speech understanding in noise. Principal component analysis (PCA) and linear regression modeling were used to evaluate the relationship between the measures of auditory function and speech-in-noise performance. RESULTS: Over 25% of participants exhibited a mild or worse degree of SNR loss. PCA revealed DPOAE levels at 12.5 to 16 kHz to be significantly correlated with the variation in QuickSIN scores, although correlations were weak (R = 0.017). Out of all the metrics evaluated, higher levels of self-generated physiological noise accounted for the most variance in QuickSIN performance (R = 0.077). CONCLUSIONS: Higher levels of physiological noise were associated with worse QuickSIN performance in listeners with normal hearing sensitivity. We propose that elevated physiological noise levels in poorer speech-in-noise performers could diminish the effective SNR, thereby negatively impacting performance as seen in poorer QuickSIN scores.
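A minimal sketch of the analysis pattern described above (principal component analysis of the auditory measures followed by linear regression against QuickSIN SNR loss), with hypothetical file and column names:

```python
# Minimal sketch: PCA over auditory measures, then linear regression on SNR loss.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

df = pd.read_csv("auditory_measures.csv")  # hypothetical analysis file
feature_cols = ["dpoae_12_5k", "dpoae_16k", "sfoae_level", "oae_noise_floor",
                "pta_0_5_4k"]

pca = PCA(n_components=3)
components = pca.fit_transform(df[feature_cols])
print("explained variance ratio:", pca.explained_variance_ratio_)

reg = LinearRegression().fit(components, df["quicksin_snr_loss"])
print("R^2:", reg.score(components, df["quicksin_snr_loss"]))
```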


Subjects
Speech Perception, Speech, Auditory Threshold, Cross-Sectional Studies, Humans, Noise, Spontaneous Otoacoustic Emissions
20.
Ann Rheum Dis; 78(10): 1412-1419, 2019 Oct.
Article in English | MEDLINE | ID: mdl-31243017

ABSTRACT

OBJECTIVES: Disability prevention strategies are more achievable before osteoarthritis drives impairment. It is critical to identify high-risk groups for strategy implementation and trial eligibility. Gait speed, an established measure, is associated with disability and mortality. We sought to develop and validate risk stratification trees for incident slow gait in persons at high risk for knee osteoarthritis that are feasible in community and clinical settings. METHODS: Osteoarthritis Initiative (derivation cohort) and Multicenter Osteoarthritis Study (validation cohort) participants at high risk for knee osteoarthritis were included. The outcome was incident slow gait over up to 10-year follow-up. Derivation cohort classification and regression tree analysis identified predictors from easily assessed variables and developed risk stratification models, which were then applied to the validation cohort. Logistic regression compared risk group predictive values; area under the receiver operating characteristic curves (AUCs) summarised discrimination ability. RESULTS: 1870 (derivation) and 1279 (validation) persons were included. The most parsimonious tree identified three risk groups, from stratification based on age and WOMAC Function. A 7-risk-group tree also included education, strenuous sport/recreational activity, obesity, and depressive symptoms; the outcome occurred in 11% overall, varying from 0% to 29% (derivation) and 2% to 23% (validation) depending on risk group. AUCs were comparable in the two cohorts (7-risk-group tree, 0.75, 95% CI 0.72 to 0.78 (derivation); 0.72, 95% CI 0.68 to 0.76 (validation)). CONCLUSIONS: In persons at high risk for knee osteoarthritis, easily acquired data can be used to identify those at high risk of incident functional impairment. Outcome risk varied greatly depending on tree-based risk group membership. These trees can inform individual awareness of risk for impaired function and define eligibility for prevention trials.
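As an illustration of a classification-tree risk stratification with held-out validation, similar in spirit to the analysis above, a minimal sketch with hypothetical file and column names:

```python
# Minimal sketch: classification tree for risk stratification, validated by AUC.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text
from sklearn.metrics import roc_auc_score

derivation = pd.read_csv("oai.csv")    # hypothetical derivation cohort
validation = pd.read_csv("most.csv")   # hypothetical validation cohort
predictors = ["age", "womac_function", "education_years", "strenuous_activity",
              "obese", "depressive_symptoms"]

tree = DecisionTreeClassifier(max_leaf_nodes=7, min_samples_leaf=50, random_state=0)
tree.fit(derivation[predictors], derivation["incident_slow_gait"])
print(export_text(tree, feature_names=predictors))  # the risk-group splits

auc = roc_auc_score(validation["incident_slow_gait"],
                    tree.predict_proba(validation[predictors])[:, 1])
print(f"validation AUC: {auc:.2f}")
```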


Subjects
Decision Trees, Disability Evaluation, Neurologic Gait Disorders/complications, Knee Osteoarthritis/etiology, Risk Assessment/standards, Aged, Area Under the Curve, Feasibility Studies, Female, Neurologic Gait Disorders/physiopathology, Humans, Incidence, Logistic Models, Male, Middle Aged, Knee Osteoarthritis/epidemiology, ROC Curve, Reproducibility of Results, Risk Assessment/methods, Risk Factors, Walking Speed