ABSTRACT
BACKGROUND: The US Preventive Services Task Force recommends providers offer individualized healthy behavior interventions for all adults, independent of their risk of cardiovascular disease. While strong evidence exists to support disease-specific programs designed to improve multiple lifestyle behaviors, approaches to adapting these interventions for a broader population are not well established. Digital behavior change interventions (DBCIs) hold promise as a more generalizable and scalable approach to overcome the resource and time limitations that traditional behavioral intervention programs face, especially within an occupational setting. OBJECTIVE: We aimed to evaluate the efficacy of a multimodal DBCI on (1) self-reported behaviors of physical activity, nutrition, sleep, and mindfulness; (2) cardiometabolic biomarkers; and (3) chronic disease-related medical expenditure. METHODS: We conducted a 2-arm randomized controlled trial for 12 months among employees of an academic health care facility in the United States. The intervention arm received a scale, a smartphone app, an activity tracker, a video library for healthy behavior recommendations, and an on-demand health coach. The control arm received standard employer-provided health and wellness benefits. The primary outcomes of the study included changes in self-reported lifestyle behaviors, cardiometabolic biomarkers, and chronic disease-related medical expenditure. We collected health behavior data via baseline and quarterly web-based surveys, biometric measures via clinic visits at baseline and 12 months, and identified relevant costs through claims datasets. RESULTS: A total of 603 participants were enrolled and randomized to the intervention (n=300, 49.8%) and control arms (n=303, 50.2%). 
The average age was 46.7 (SD 11.2) years, and the majority of participants were female (80.3%, n=484), White (85.4%, n=504), and non-Hispanic (90.7%, n=547), with no systematic differences in baseline characteristics observed between the study arms. We observed retention rates of 86.1% (n=519) for completing the final survey and 77.9% (n=490) for attending the exit visit. CONCLUSIONS: This study represents the largest and most comprehensive evaluation of DBCIs among participants who were not selected based on an underlying condition, assessing their impact on behavior, cardiometabolic biomarkers, and medical expenditure. TRIAL REGISTRATION: ClinicalTrials.gov NCT04712383; https://clinicaltrials.gov/study/NCT04712383. INTERNATIONAL REGISTERED REPORT IDENTIFIER (IRRID): RR1-10.2196/50378.
Subjects
Biomarkers , Humans , Female , Male , Adult , Life Style , Health Expenditures/statistics & numerical data , Health Behavior/physiology , Behavior Therapy/methods , Middle Aged , Exercise , Cardiovascular Diseases/prevention & control , Mobile Applications
ABSTRACT
BACKGROUND AND AIMS: Alcohol-associated hepatitis (AH) is a clinically severe, acute disease that afflicts only a fraction of patients with alcohol use disorder (AUD). Genomic studies of alcohol-associated cirrhosis (AC) have identified several genes of large effect, but the genetic and environmental factors that lead to AH and AC, and their degree of genetic overlap, remain largely unknown. This study aims to identify genes and genetic variation that contribute to the development of AH. APPROACH AND RESULTS: Exome sequencing of patients with AH (N=784) and heavy-drinking controls (N=951) identified exome-wide significant association for AH at PNPLA3, as previously observed for AC in GWAS, although with a much smaller effect size. SNPs of large effect size at ICOSLG (Chr 21) and TOX4/RAB2B (Chr 14) were also exome-wide significant. ICOSLG encodes a co-stimulatory signal for T-cell proliferation and cytokine secretion and induces B-cell proliferation and differentiation. TOX4 was previously implicated in diabetes and immune system function. Other genes previously implicated in AC did not strongly contribute to AH, and the only prominently implicated (but not exome-wide significant) gene overlapping with AUD was ADH1B. Polygenic signals for AH were observed in both common and rare variant analyses and identified genes with roles associated with inflammation. CONCLUSIONS: This study identified two new genes of large effect size with a previously unknown contribution to ALD, and highlights both the overlap in etiology between liver diseases and the unique origins of AH.
ABSTRACT
Background: Understanding changes in diagnostic performance after symptom onset and severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) exposure within different populations is crucial to guide the use of diagnostics for SARS-CoV-2. Methods: The Test Us at Home study was a longitudinal cohort study that enrolled individuals across the United States between October 2021 and February 2022. Participants performed paired antigen-detection rapid diagnostic tests (Ag-RDTs) and reverse-transcriptase polymerase chain reaction (RT-PCR) tests at home every 48 hours for 15 days and self-reported symptoms and known coronavirus disease 2019 exposures immediately before testing. The percent positivity for Ag-RDTs and RT-PCR tests was calculated each day after symptom onset and exposure and stratified by vaccination status, variant, age category, and sex. Results: The highest percent positivity occurred 2 days after symptom onset (RT-PCR, 91.2%; Ag-RDT, 71.1%) and 6 days after exposure (RT-PCR, 91.8%; Ag-RDT, 86.2%). RT-PCR and Ag-RDT performance did not differ by vaccination status, variant, age category, or sex. The percent positivity for Ag-RDTs was lower among exposed asymptomatic individuals than among symptomatic individuals: 37.5% (95% confidence interval [CI], 13.7%-69.4%) vs 90.3% (95% CI, 75.1%-96.7%). Cumulatively, Ag-RDTs detected 84.9% (95% CI, 78.2%-89.8%) of infections within 4 days of symptom onset. For exposed participants, Ag-RDTs detected 94.0% (95% CI, 86.7%-97.4%) of RT-PCR-confirmed infections within 6 days of exposure. Conclusions: The percent positivity for Ag-RDTs and RT-PCR tests was highest 2 days after symptom onset and 6 days after exposure, and performance increased with serial testing. The percent positivity of Ag-RDTs was lowest among asymptomatic individuals but did not differ by sex, variant, vaccination status, or age category.
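The daily percent positivity described above is a simple ratio of positive results to all tests performed on a given day relative to symptom onset or exposure. A minimal sketch (not the study's code; the function name and data below are illustrative):

```python
from collections import defaultdict

def percent_positivity(results):
    """Percent of tests that are positive on each day after symptom onset.

    `results` is a list of (day, is_positive) pairs, where `day` counts
    days since symptom onset (or exposure) at the time of the test.
    """
    counts = defaultdict(lambda: [0, 0])  # day -> [positives, total]
    for day, positive in results:
        counts[day][1] += 1
        if positive:
            counts[day][0] += 1
    return {day: 100.0 * pos / total for day, (pos, total) in sorted(counts.items())}

# Hypothetical paired results: (day after symptom onset, test positive?)
rt_pcr = [(2, True), (2, True), (2, False), (4, True), (4, False)]
print(percent_positivity(rt_pcr))  # → {2: 66.66666666666667, 4: 50.0}
```

Stratifying by vaccination status, variant, age category, or sex amounts to running the same computation on the corresponding subsets of `results`.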
ABSTRACT
Diagnostic challenges continue to impede development of effective therapies for successful management of alcohol-associated hepatitis (AH), creating an unmet need to identify noninvasive biomarkers for AH. In murine models, complement contributes to ethanol-induced liver injury. Therefore, we hypothesized that complement proteins could be rational diagnostic/prognostic biomarkers in AH. Here, we performed a comparative analysis of data derived from the human hepatic and serum proteomes to identify and characterize complement protein signatures in severe AH (sAH). The quantity of multiple complement proteins was perturbed in the liver and serum proteomes of patients with sAH. Multiple complement proteins differentiated patients with sAH from those with alcohol-associated cirrhosis (AC) or alcohol use disorder (AUD) and healthy controls (HCs). Serum collectin 11 and C1q binding protein were strongly associated with sAH and exhibited good discriminatory performance among patients with sAH, AC, or AUD and HCs. Furthermore, complement component receptor 1-like protein was negatively associated with pro-inflammatory cytokines. Additionally, lower serum MBL-associated serine protease 1 and coagulation factor II levels independently predicted 90-day mortality. In summary, meta-analysis of proteomic profiles from liver and circulation revealed complement protein signatures of sAH, highlighting a complex perturbation of complement and identifying potential diagnostic and prognostic biomarkers for patients with sAH.
Subjects
Biomarkers , Complement System Proteins , Alcoholic Hepatitis , Proteomics , Humans , Alcoholic Hepatitis/blood , Alcoholic Hepatitis/mortality , Alcoholic Hepatitis/diagnosis , Proteomics/methods , Male , Female , Complement System Proteins/metabolism , Biomarkers/blood , Middle Aged , Adult , Liver/metabolism , Liver/pathology , Alcoholism/blood , Alcoholism/complications , Proteome/metabolism , Prognosis , Aged
ABSTRACT
BACKGROUND & AIMS: Severe alcohol-associated hepatitis (SAH) is associated with high 90-day mortality. Glucocorticoid therapy for 28 days improves 30- but not 90-day survival. We assessed the efficacy and safety of a combination of anakinra, an IL-1 antagonist, plus zinc (A+Z) compared to prednisone using the Day-7 Lille score as a stopping rule in patients with SAH. METHODS: In this phase IIb double-blind randomized trial in adults with SAH and MELD scores of 20-35, participants were randomized to receive either daily anakinra 100 mg subcutaneously for 14 days plus daily zinc sulfate 220 mg orally for 90 days, or daily prednisone 40 mg orally for 30 days. Prednisone or prednisone placebo was stopped if Day-7 Lille score was >0.45. All study drugs were stopped for uncontrolled infection or ≥5 point increase in MELD score. The primary endpoint was overall survival at 90 days. RESULTS: Seventy-three participants were randomized to prednisone and 74 to A+Z. The trial was stopped early after a prespecified interim analysis showed prednisone was associated with higher 90-day overall survival (90% vs. 70%; hazard ratio for death = 0.34, 95% CI 0.14-0.83, p = 0.018) and transplant-free survival (88% vs. 64%; hazard ratio for transplant or death = 0.30, 95% CI 0.13-0.69, p = 0.004) than A+Z. Acute kidney injury was more frequent with A+Z (45%) than prednisone (22%) (p = 0.001), but rates of infection were similar (31% in A+Z vs. 27% in prednisone, p = 0.389). CONCLUSIONS: Participants with SAH treated with prednisone using the Day-7 Lille score as a stopping rule had significantly higher overall and transplant-free 90-day survival and lower incidence of acute kidney injury than those treated with A+Z. IMPACT AND IMPLICATIONS: There is no approved treatment for severe alcohol-associated hepatitis (SAH). 
In this double-blind randomized trial, patients with SAH treated with prednisone using the Lille stopping rule on Day 7 had higher 90-day overall and transplant-free survival and lower rates of acute kidney injury compared to patients treated with a combination of anakinra and zinc. The data support continued use of glucocorticoids for patients with SAH, with treatment discontinuation for those with a Lille score >0.45 on Day 7. TRIAL REGISTRATION: NCT04072822.
Subjects
Acute Kidney Injury , Alcoholic Hepatitis , Adult , Humans , Prednisone/adverse effects , Interleukin 1 Receptor Antagonist Protein/adverse effects , Zinc/therapeutic use , Alcoholic Hepatitis/drug therapy , Double-Blind Method , Acute Kidney Injury/chemically induced , Acute Kidney Injury/drug therapy , Treatment Outcome
ABSTRACT
BACKGROUND: Systemic thromboxane A2 generation, assessed by quantifying the concentration of stable thromboxane B2 metabolites (TXB2-M) in the urine adjusted for urinary creatinine, is strongly associated with mortality risk. We sought to define optimal TXB2-M cutpoints for aspirin users and nonusers and determine if adjusting TXB2-M for estimated glomerular filtration rate (eGFR) in addition to urinary creatinine improved mortality risk assessment. METHODS: Urinary TXB2-M were measured by competitive ELISA in 1363 aspirin users and 1681 nonusers participating in the Framingham Heart Study. Cutpoints were determined for TXB2-M and TXB2-M/eGFR using log-rank statistics and used to assess mortality risk by Cox proportional hazard modeling and restricted mean survival time. Multivariable models were compared using the Akaike Information Criterion (AIC). A cohort of 105 aspirin users with heart failure was used for external validation. RESULTS: Optimized cutpoints of TXB2-M were 1291 and 5609 pg/mg creatinine and of TXB2-M/eGFR were 16.6 and 62.1 filtered prostanoid units (defined as pg·min/creatinine·mL·1.73 m2), for aspirin users and nonusers, respectively. TXB2-M/eGFR cutpoints provided more robust all-cause mortality risk discrimination than TXB2-M cutpoints, with a larger unadjusted hazard ratio (2.88 vs 2.16, AIC P < 0.0001) and greater differences in restricted mean survival time between exposure groups (1.46 vs 1.10 years), findings that were confirmed in the external validation cohort of aspirin users. TXB2-M/eGFR cutpoints also provided better cardiovascular/stroke mortality risk discrimination than TXB2-M cutpoints (unadjusted hazard ratio 3.31 vs 2.13, AIC P < 0.0001). CONCLUSION: Adjustment for eGFR strengthens the association of urinary TXB2-M with long-term mortality risk irrespective of aspirin use.
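The "filtered prostanoid units" above are consistent with dividing TXB2-M (pg/mg creatinine) by eGFR (mL/min/1.73 m2). Assuming that reading, a minimal sketch using the abstract's cutpoints (the function names and example values are illustrative, not from the study):

```python
def txb2m_per_egfr(txb2m, egfr):
    """TXB2-M (pg/mg creatinine) divided by eGFR (mL/min/1.73 m2).

    Assumes the eGFR adjustment is this simple ratio, which matches the
    'filtered prostanoid units' described in the abstract.
    """
    return txb2m / egfr

def above_cutpoint(txb2m, egfr, aspirin_user):
    """Compare against the abstract's TXB2-M/eGFR cutpoints (16.6 / 62.1)."""
    cutpoint = 16.6 if aspirin_user else 62.1
    return txb2m_per_egfr(txb2m, egfr) > cutpoint

# Illustrative values: an aspirin user with TXB2-M of 2000 pg/mg
# creatinine and eGFR of 60 falls above the 16.6 cutpoint.
print(above_cutpoint(2000, 60, aspirin_user=True))  # → True
```

Note how a reduced eGFR raises the adjusted value, which is one plausible reason the eGFR-adjusted cutpoints discriminate mortality risk better.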
Subjects
Aspirin , Thromboxanes , Humans , Prognosis , Creatinine/urine , Aspirin/therapeutic use , Thromboxane B2/metabolism , Kidney/metabolism
ABSTRACT
INTRODUCTION: This study aimed to evaluate the safety and pharmacokinetics (PK) of larsucosterol (DUR-928 or 25HC3S) in subjects with alcohol-associated hepatitis (AH), a devastating acute illness without US Food and Drug Administration-approved therapies. METHODS: This phase 2a, multicenter, open-label, dose escalation study evaluated the safety, PK, and efficacy signals of larsucosterol in 19 clinically diagnosed subjects with AH. Based on the model for end-stage liver disease (MELD) score, 7 subjects were considered to have moderate AH and 12 to have severe AH. All subjects received 1 or 2 intravenous infusions (72 hours apart) of larsucosterol at a dose of 30, 90, or 150 mg and were followed up for 28 days. Efficacy signals from a subgroup of subjects with severe AH were compared with those from 2 matched arms of patients with severe AH treated with standard of care (SOC), including corticosteroids, from a contemporaneous study. RESULTS: All 19 larsucosterol-treated subjects survived the 28-day study. Fourteen (74%) of all subjects, including 8 (67%) of the subjects with severe AH, were discharged ≤72 hours after receiving a single infusion. There were no drug-related serious adverse events or early terminations due to treatment. PK profiles were not affected by disease severity. Biochemical parameters improved in most subjects. Serum bilirubin levels declined notably from baseline to day 7 and day 28, and MELD scores were reduced at day 28. The efficacy signals compared favorably with those from 2 matched groups treated with SOC. Lille scores at day 7 were <0.45 in 16 of the 18 (89%) subjects with day 7 samples. Lille scores from 8 subjects with severe AH who received 30 or 90 mg larsucosterol (doses used in the phase 2b trial) were statistically significantly lower ( P < 0.01) than those from subjects with severe AH treated with SOC from the contemporaneous study. 
DISCUSSION: Larsucosterol was well tolerated at all 3 doses in subjects with AH without safety concerns. Data from this pilot study showed promising efficacy signals in subjects with AH. Larsucosterol is being evaluated in a phase 2b multicenter, randomized, double-blinded, placebo-controlled (AHFIRM) trial.
Subjects
End Stage Liver Disease , Alcoholic Hepatitis , Humans , Pilot Projects , Severity of Illness Index , Alcoholic Hepatitis/drug therapy , Alcoholic Hepatitis/diagnosis
ABSTRACT
Background: Increasing ownership of smartphones among Americans provides an opportunity to use these technologies to manage medical conditions. We examined the influence of baseline smartwatch ownership on changes in self-reported anxiety, patient engagement, and health-related quality of life when participants were prescribed a smartwatch for AF detection. Methods: We performed a post-hoc secondary analysis of the Pulsewatch study (NCT03761394), a clinical trial in which 120 participants were randomized to receive a smartwatch-smartphone app dyad and ECG patch monitor compared to an ECG patch monitor alone to establish the accuracy of the smartwatch-smartphone app dyad for detection of AF. At baseline, 14 days, and 44 days, participants completed the Generalized Anxiety Disorder-7 survey, the Health Survey SF-12, and the Consumer Health Activation Index. Mixed-effects linear regression models using repeated measures with anxiety, patient activation, and physical and mental health status as outcomes were used to examine their association with smartwatch ownership at baseline. Results: Ninety-six participants, primarily White with high income and tertiary education, were randomized to receive a study smartwatch-smartphone dyad. Twenty-four (25%) participants previously owned a smartwatch. Compared to those who did not previously own a smartwatch, smartwatch owners reported a significantly greater increase in self-reported physical health (β = 5.07, P < 0.05), with no differences in anxiety (β = 0.92, P = 0.33), mental health (β = -2.42, P = 0.16), or patient activation (β = 1.86, P = 0.54). Conclusions: Participants who owned a smartwatch at baseline reported a greater positive change in self-reported physical health, but not in anxiety, patient activation, or self-reported mental health, over the study period.
ABSTRACT
TP53 aberrations constitute the highest risk subset of myelodysplastic neoplasms (MDS) and acute myeloid leukemia (AML). The International Consensus Classification questions the blast threshold between MDS and AML. In this study, we assess the distinction between MDS and AML for 76 patients with TP53 aberrations. We observed no significant differences between MDS and AML regarding TP53 genomics. Median overall survival (OS) was 223 days for the entire group, but prognostic discrimination within subgroups showed the most inferior OS (46 days) for AML with multihit allelic state plus TP53 variant allele frequency (VAF) > 50%. In multivariate analysis, unadjusted Cox models revealed the following variables as independent risk factors for mortality: AML (vs. MDS) (hazard ratio [HR]: 2.50, confidence interval [CI]: 1.4-4.4, p = 0.001), complex karyotype (HR: 3.00, CI: 1.4-6.1, p = 0.003), multihit status (HR: 2.30, CI 1.3-4.2, p = 0.005), and absence of hematopoietic cell transplant (HCT) (HR: 3.90, CI: 1.8-8.9, p = 0.0009). Clonal dynamic modeling showed a significant reduction in TP53 VAF with front-line hypomethylating agents. These findings clarify the impact of specific covariates on outcomes of TP53-aberrant myeloid neoplasms, irrespective of the diagnosis of MDS versus AML, and may influence HCT decisions.
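Median overall survival figures like the 223 days reported above are conventionally read off a Kaplan-Meier curve. A minimal sketch of the estimator (not the study's code; the cohort below is hypothetical):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve.

    `times` are follow-up times (e.g. days); `events` are 1 if death was
    observed, 0 if censored. Returns (time, survival probability) steps.
    """
    data = sorted(zip(times, events))
    n = len(data)
    surv, curve, i = 1.0, [], 0
    while i < n:
        t = data[i][0]
        deaths, j = 0, i
        while j < n and data[j][0] == t:
            deaths += data[j][1]
            j += 1
        if deaths:
            surv *= 1 - deaths / (n - i)  # n - i subjects still at risk at t
            curve.append((t, surv))
        i = j
    return curve

def median_survival(curve):
    """First time the curve drops to 50% survival or below (None if never)."""
    for t, s in curve:
        if s <= 0.5:
            return t
    return None

# Hypothetical cohort: follow-up in days, 1 = death observed, 0 = censored
times = [46, 60, 100, 200, 223, 400, 500]
events = [1, 1, 1, 1, 1, 0, 0]
print(median_survival(kaplan_meier(times, events)))  # → 200
```

The Cox models and clonal dynamic modeling in the study go well beyond this sketch, but the survival curve is the common starting point for all the quoted OS numbers.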
ABSTRACT
Cutaneous T-cell lymphoma (CTCL) is an uncommon type of lymphoma involving malignant skin-resident or skin-homing T cells. Canine epitheliotropic lymphoma (EL) is the most common form of CTCL in dogs, and it also spontaneously arises from T lymphocytes in the mucosa and skin. Clinically, it can be difficult to distinguish early-stage CTCL from other forms of benign interface dermatitis (ID) in both dogs and people. Our objectives were to identify novel biomarkers that can distinguish EL from other forms of ID and to perform comparative transcriptomics of human CTCL and canine EL. Here, we present a retrospective gene expression study that employed archival tissue from biorepositories. We analyzed a discovery cohort of 6 dogs and a validation cohort of 8 dogs with EL that occurred spontaneously in client-owned companion animals. We performed comparative targeted transcriptomics studies using NanoString to assess 160 genes from lesional skin biopsies in the discovery cohort and 800 genes in the validation cohort to identify any significant differences that may reflect oncogenesis and immunopathogenesis. We further sought to determine whether gene expression in EL and CTCL is conserved across humans and dogs by comparing our data to previously published human datasets. Similar chemokine profiles were observed in canine EL and human CTCL, and analyses were performed to validate potential biomarkers and drivers of disease. In dogs, we found enrichment of T cell gene signatures, with upregulation of IFNG, TNF, PRF1, IL15, CD244, CXCL10, and CCL5 in EL compared to healthy controls. Importantly, CTSW, TRAT1, and KLRK1 distinguished EL from all other forms of interface dermatitis we studied, providing much-needed biomarkers for the veterinary field. XCL1/XCL2 were also highly specific to EL in our validation cohort. Future studies exploring the oncogenesis of spontaneous lymphomas in companion animals will expand our understanding of these disorders. 
Biomarkers may be useful for predicting disease prognosis and treatment responses. We plan to use our data to inform future development of targeted therapies, as well as for repurposing drugs for both veterinary and human medicine.
ABSTRACT
BACKGROUND: Severe alcoholic hepatitis (AH) has a high short-term mortality rate. The MELD assesses disease severity and mortality; however, it is not specific for AH. We screened plasma samples from patients with severe AH for biomarkers of multiple pathological processes and identified predictors of short-term mortality. METHODS: Plasma was collected at baseline from 85 patients with severe AH (MELD≥20, Maddrey's discriminant function≥32) enrolled in the Defeat Alcoholic Steatohepatitis clinical trial (investigating IL-1 receptor antagonist+pentoxifylline+zinc vs. methylprednisolone+placebo). Samples were analyzed for 43 biomarkers and the markers' association with 28- and 90-day mortalities was assessed. RESULTS: Thirty-one (36.5%) patients died during the 90-day follow-up with similar ratios in the treatment groups. Eight biomarkers showed an association with mortality. IL-6, IL-22, interferon-α2, soluble TNF receptor 1, lipocalin-2, and α-fetoprotein levels were associated with 28-day mortality, while IL-6, IL-13, and endotoxin levels with 90-day mortality. In multivariable Cox regression, encephalopathy, lipocalin-2, and α-fetoprotein levels were independent predictors of 28-day mortality, and IL-6, IL-13, international normalized ratio levels, and age were independent predictors of 90-day mortality. The combination of IL-13 and age had superior performance in predicting 90-day mortality compared with MELD in the total cohort and the individual treatment groups. CONCLUSIONS: We identified predictors of short-term mortality in a cohort exclusively involving patients with severe AH. We created a composite score of IL-13 and age that predicts 90-day mortality regardless of the treatment type with a performance superior to MELD in severe AH.
Subjects
Age Factors , Alcoholic Hepatitis , Interleukin-13 , Humans , alpha-Fetoproteins , Biomarkers/blood , Alcoholic Hepatitis/mortality , Interleukin-13/blood , Interleukin-6 , Lipocalin-2
ABSTRACT
BACKGROUND: Atrial fibrillation (AF) is a common cause of stroke, and timely diagnosis is critical for secondary prevention. Little is known about smartwatches for AF detection among stroke survivors. We aimed to examine accuracy, usability, and adherence to a smartwatch-based AF monitoring system designed by older stroke survivors and their caregivers. OBJECTIVE: This study aims to examine the feasibility of smartwatches for AF detection in older stroke survivors. METHODS: Pulsewatch is a randomized controlled trial (RCT) in which stroke survivors received either a smartwatch-smartphone dyad for AF detection (Pulsewatch system) plus an electrocardiogram patch or the patch alone for 14 days to assess the accuracy and usability of the system (phase 1). Participants were subsequently rerandomized to potentially 30 additional days of system use to examine adherence to watch wear (phase 2). Participants were aged 50 years or older, had survived an ischemic stroke, and had no major contraindications to oral anticoagulants. The accuracy for AF detection was determined by comparing it to cardiologist-overread electrocardiogram patch, and the usability was assessed with the System Usability Scale (SUS). Adherence was operationalized as daily watch wear time over the 30-day monitoring period. RESULTS: A total of 120 participants were enrolled (mean age 65 years; 50/120, 41% female; 106/120, 88% White). The Pulsewatch system demonstrated 92.9% (95% CI 85.3%-97.4%) accuracy for AF detection. Mean usability score was 65 out of 100, and on average, participants wore the watch for 21.2 (SD 8.3) of the 30 days. CONCLUSIONS: Our findings demonstrate that a smartwatch system designed by and for stroke survivors is a viable option for long-term arrhythmia detection among older adults at risk for AF, though it may benefit from strategies to enhance adherence to watch wear. TRIAL REGISTRATION: ClinicalTrials.gov NCT03761394; https://clinicaltrials.gov/study/NCT03761394. 
INTERNATIONAL REGISTERED REPORT IDENTIFIER (IRRID): RR2-10.1016/j.cvdhj.2021.07.002.
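Point estimates such as the 92.9% accuracy above are typically reported with a binomial confidence interval; the abstract does not state which method was used. A Wilson score interval is one common choice (the counts below are illustrative, not the study's):

```python
import math

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# Illustrative: 93 correct AF classifications out of 100 comparisons
lo, hi = wilson_ci(93, 100)
print(f"{93 / 100:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```

Unlike the naive Wald interval, the Wilson interval stays inside [0, 1] and behaves well for proportions near 1, which matters for high accuracy values like this one.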
ABSTRACT
BACKGROUND: Chronic alcohol consumption impairs gut barrier function and perturbs the gut microbiome. Although shifts in bacterial communities in patients with alcohol-associated liver disease (ALD) have been characterized, less is known about the interactions between host metabolism and circulating microbe-derived metabolites during the progression of ALD. METHODS: A large panel of gut microbiome-derived metabolites of aromatic amino acids was quantified by stable isotope dilution liquid chromatography with online tandem mass spectrometry in plasma from healthy controls (n = 29), heavy drinkers (n = 10), patients with moderate (n = 16) or severe alcohol-associated hepatitis (n = 40), and alcohol-associated cirrhosis (n = 10). RESULTS: The tryptophan metabolites, serotonin and indole-3-propionic acid, and tyrosine metabolites, p-cresol sulfate, and p-cresol glucuronide, were decreased in patients with ALD. Patients with severe alcohol-associated hepatitis and alcohol-associated cirrhosis had the largest decrease in concentrations of tryptophan- and tyrosine-derived metabolites compared to healthy controls. Western blot analysis and interrogation of bulk RNA sequencing data from patients with various liver pathologies revealed perturbations in hepatic expression of phase II metabolism enzymes involved in sulfonation and glucuronidation in patients with severe forms of ALD. CONCLUSIONS: We identified several metabolites decreased in ALD and disruptions of hepatic phase II metabolism. These results indicate that patients with more advanced stages of ALD, including severe alcohol-associated hepatitis and alcohol-associated cirrhosis, had complex perturbations in metabolite concentrations that likely reflect both changes in the composition of the gut microbiome community and the ability of the host to enzymatically modify the gut-derived metabolites.
Subjects
Aromatic Amino Acids , Gastrointestinal Microbiome , Alcoholic Liver Diseases , Liver , Humans , Aromatic Amino Acids/metabolism , Hepatitis/metabolism , Hepatitis/physiopathology , Alcoholic Liver Cirrhosis/metabolism , Alcoholic Liver Cirrhosis/physiopathology , Alcoholic Liver Diseases/metabolism , Alcoholic Liver Diseases/physiopathology , Tryptophan/metabolism , Tyrosine , Gastrointestinal Microbiome/physiology , Alcoholic Hepatitis/metabolism , Alcoholic Hepatitis/physiopathology , Liver/metabolism , Liver/physiopathology
ABSTRACT
BACKGROUND: Many interventions for widescale distribution of rapid antigen tests for COVID-19 have utilized online, direct-to-consumer (DTC) ordering systems; however, little is known about the sociodemographic characteristics of home-test users. We aimed to characterize the patterns of online orders for rapid antigen tests and determine geospatial and temporal associations with neighborhood characteristics and community incidence of COVID-19, respectively. METHODS: This observational study analyzed online, DTC orders for rapid antigen test kits from beneficiaries of the Say Yes! Covid Test program from March to November 2021 in five communities: Louisville, Kentucky; Indianapolis, Indiana; Fulton County, Georgia; O'ahu, Hawaii; and Ann Arbor/Ypsilanti, Michigan. Using spatial autoregressive models, we assessed the geospatial associations of test kit distribution with Census block-level education, income, age, population density, and racial distribution and Census tract-level Social Vulnerability Index. Lag association analyses were used to measure the association between online rapid antigen kit orders and community-level COVID-19 incidence. RESULTS: In total, 164,402 DTC test kits were ordered during the intervention. Distribution of tests at all sites were significantly geospatially clustered at the block-group level (Moran's I: p < 0.001); however, education, income, age, population density, race, and social vulnerability index were inconsistently associated with test orders across sites. In Michigan, Georgia, and Kentucky, there were strong associations between same-day COVID-19 incidence and test kit orders (Michigan: r = 0.89, Georgia: r = 0.85, Kentucky: r = 0.75). The incidence of COVID-19 during the current day and the previous 6-days increased current DTC orders by 9.0 (95% CI = 1.7, 16.3), 3.0 (95% CI = 1.3, 4.6), and 6.8 (95% CI = 3.4, 10.2) in Michigan, Georgia, and Kentucky, respectively. 
There was no same-day or 6-day lagged correlation between test kit orders and COVID-19 incidence in Indiana. CONCLUSIONS: Our findings suggest that online ordering is not associated with geospatial clustering based on sociodemographic characteristics. Observed temporal preferences for DTC ordering can guide public health messaging around DTC testing programs.
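The same-day and lagged associations reported above amount to correlating the daily incidence series with the order series shifted by a chosen number of days. A minimal sketch (pure Python; the data are hypothetical, not the study's):

```python
def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def lagged_correlation(incidence, orders, lag):
    """Correlate incidence with test-kit orders `lag` days later."""
    if lag:  # align each incidence value with the order count `lag` days after it
        incidence, orders = incidence[:-lag], orders[lag:]
    return pearson_r(incidence, orders)

# Hypothetical daily series: orders echo incidence with a one-day delay
incidence = [10, 40, 90, 40, 10, 5]
orders = [2, 11, 39, 88, 41, 12]
print(round(lagged_correlation(incidence, orders, 0), 2))
print(round(lagged_correlation(incidence, orders, 1), 2))
```

Scanning `lag` over a window (the study used the current day plus the previous 6) and comparing the resulting correlations is the essence of a lag association analysis.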
Subjects
COVID-19 , Humans , COVID-19/diagnosis , COVID-19/epidemiology , Sociodemographic Factors , Educational Status , Censuses , Cluster Analysis
ABSTRACT
Background: The detection of atrial fibrillation (AF) in stroke survivors is critical to decreasing the risk of recurrent stroke. Smartwatches have emerged as a convenient and accurate means of AF diagnosis; however, the impact on critical patient-reported outcomes, including anxiety, engagement, and quality of life, remains ill defined. Objectives: To examine the association between smartwatch prescription for AF detection and the patient-reported outcomes of anxiety, patient activation, and self-reported health. Methods: We used data from the Pulsewatch trial, a 2-phase randomized controlled trial that included participants aged 50 years or older with a history of ischemic stroke. Participants were randomized to use either a proprietary smartphone-smartwatch app for 30 days of AF monitoring or no cardiac rhythm monitoring. Validated surveys were deployed before and after the 30-day study period to assess anxiety, patient activation, and self-rated physical and mental health. Logistic regression and generalized estimation equations were used to examine the association between smartwatch prescription for AF monitoring and changes in the patient-reported outcomes. Results: A total of 110 participants (mean age 64 years, 41% female, 91% non-Hispanic White) were studied. Seventy percent of intervention participants were novice smartwatch users, as opposed to 84% of controls, and there was no significant difference in baseline rates of anxiety, activation, or self-rated health between the 2 groups. The incidence of new AF among smartwatch users was 6%. Participants who were prescribed smartwatches did not have a statistically significant change in anxiety, activation, or self-reported health as compared to those who were not prescribed smartwatches. The results held even after removing participants who received an AF alert on the watch. 
Conclusion: The prescription of smartwatches to stroke survivors for AF monitoring does not adversely affect key patient-reported outcomes. Further research is needed to better inform the successful deployment of smartwatches in clinical practice.
ABSTRACT
BACKGROUND: The performance of rapid antigen tests (Ag-RDTs) for screening asymptomatic and symptomatic persons for SARS-CoV-2 is not well established. OBJECTIVE: To evaluate the performance of Ag-RDTs for detection of SARS-CoV-2 among symptomatic and asymptomatic participants. DESIGN: This prospective cohort study enrolled participants between October 2021 and January 2022. Participants completed Ag-RDTs and reverse transcriptase polymerase chain reaction (RT-PCR) testing for SARS-CoV-2 every 48 hours for 15 days. SETTING: Participants were enrolled digitally throughout the mainland United States. They self-collected anterior nasal swabs for Ag-RDTs and RT-PCR testing. Nasal swabs for RT-PCR were shipped to a central laboratory, whereas Ag-RDTs were done at home. PARTICIPANTS: Of 7361 participants in the study, 5353 who were asymptomatic and negative for SARS-CoV-2 on study day 1 were eligible. In total, 154 participants had at least 1 positive RT-PCR result. MEASUREMENTS: The sensitivity of Ag-RDTs was measured on the basis of testing once (same-day), twice (after 48 hours), and thrice (after a total of 96 hours). The analysis was repeated for different days past index PCR positivity (DPIPPs) to approximate real-world scenarios where testing initiation may not always coincide with DPIPP 0. Results were stratified by symptom status. RESULTS: Among 154 participants who tested positive for SARS-CoV-2, 97 were asymptomatic and 57 had symptoms at infection onset. Serial testing with Ag-RDTs twice 48 hours apart resulted in an aggregated sensitivity of 93.4% (95% CI, 90.4% to 95.9%) among symptomatic participants on DPIPPs 0 to 6. When singleton positive results were excluded, the aggregated sensitivity on DPIPPs 0 to 6 for 2-time serial testing among asymptomatic participants was lower at 62.7% (CI, 57.0% to 70.5%), but it improved to 79.0% (CI, 70.1% to 87.4%) with testing 3 times at 48-hour intervals. 
LIMITATION: Participants tested every 48 hours; therefore, these data cannot support conclusions about serial testing intervals shorter than 48 hours. CONCLUSION: The performance of Ag-RDTs was optimized when asymptomatic participants tested 3 times at 48-hour intervals and when symptomatic participants tested 2 times separated by 48 hours. PRIMARY FUNDING SOURCE: National Institutes of Health RADx Tech program.
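The improvement in aggregated sensitivity with repeated testing can be sketched with simple probability arithmetic. The sketch below is illustrative only: it assumes each test is an independent detection event with a constant per-test sensitivity, which is a simplifying assumption and not the study's estimator; the per-test sensitivity value is hypothetical, not a figure from the study.

```python
# Illustrative sketch: how aggregated sensitivity grows with serial testing,
# assuming independent tests with constant per-test sensitivity.
# The value 0.60 below is hypothetical, not an estimate from this study.

def serial_sensitivity(per_test_sensitivity: float, n_tests: int) -> float:
    """Probability that at least one of n_tests independent tests is positive."""
    return 1 - (1 - per_test_sensitivity) ** n_tests

per_test = 0.60  # hypothetical single-test sensitivity
for n in (1, 2, 3):
    print(f"{n} test(s): aggregated sensitivity = {serial_sensitivity(per_test, n):.3f}")
```

Under these assumptions, two tests raise a 60% single-test sensitivity to 84%, and three tests to about 94%, which mirrors the qualitative pattern reported above; real-world gains are smaller when consecutive results are correlated (e.g., early low viral load), which is why the study measured aggregated sensitivity empirically.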
Subjects
COVID-19, Humans, COVID-19/diagnosis, Prospective Studies, SARS-CoV-2, Polymerase Chain Reaction, Cognition, Sensitivity and Specificity
ABSTRACT
OBJECTIVE: This study aimed to determine whether an infiltrative block with liposomal bupivacaine was associated with less rescue analgesia administration and lower pain scores than a bupivacaine splash block after ovariohysterectomy in dogs. ANIMALS: Eligible dogs included those that were spayed as part of a veterinary teaching laboratory. Dogs were up to 7 years old and otherwise healthy. A total of 136 dogs were analyzed. METHODS: All dogs underwent ovariohysterectomy performed by veterinary students. Dogs received hydromorphone and acepromazine premedication, propofol induction, isoflurane maintenance, and an NSAID. Dogs were randomly allocated to receive either a splash block with standard bupivacaine or an infiltrative block with liposomal bupivacaine for incisional analgesia. Postoperatively, all dogs were assessed by a blinded evaluator using the Colorado State University-Canine Acute Pain Scale (CSU-CAPS) and the Glasgow Composite Measures Pain Scale-Short Form (GCPS-SF). Dogs received rescue analgesia with buprenorphine if they scored ≥ 2 on the CSU-CAPS. RESULTS: Dogs that received liposomal bupivacaine had a significantly lower incidence of rescue analgesia administration (P = .04) and a longer time to administration (P = .03). There was an overall time-averaged significant difference between groups in CSU-CAPS (P = .049) and GCPS-SF scores (P = .015), with dogs in the bupivacaine group more likely to have an elevated pain score at some point on both scales. CLINICAL RELEVANCE: The use of liposomal bupivacaine in an infiltrative block may decrease the need for rescue analgesia in dogs undergoing ovariohysterectomy compared to a bupivacaine splash block.
Subjects
Analgesia, Dog Diseases, Pain, Postoperative, Animals, Dogs, Female, Analgesia/veterinary, Anesthetics, Local/therapeutic use, Bupivacaine/therapeutic use, Dog Diseases/drug therapy, Dog Diseases/prevention & control, Hysterectomy/veterinary, Ovariectomy/veterinary, Pain, Postoperative/drug therapy, Pain, Postoperative/prevention & control, Pain, Postoperative/veterinary, Random Allocation
ABSTRACT
Background: Rapid antigen detection tests (Ag-RDT) for SARS-CoV-2 with emergency use authorization generally include a condition of authorization to evaluate the test's performance in asymptomatic individuals when used serially. We aim to describe a novel study design that was used to generate regulatory-quality data to evaluate the serial use of Ag-RDT in detecting SARS-CoV-2 virus among asymptomatic individuals. Methods: This prospective cohort study used a siteless, digital approach to assess longitudinal performance of Ag-RDT. Individuals over 2 years old from across the USA with no reported COVID-19 symptoms in the 14 days prior to study enrollment were eligible to enroll in this study. Participants throughout the mainland USA were enrolled through a digital platform between October 18, 2021, and February 15, 2022. Participants were asked to test using Ag-RDT and molecular comparators every 48 hours for 15 days. Enrollment demographics, geographic distribution, and SARS-CoV-2 infection rates are reported. Key Results: A total of 7361 participants enrolled in the study, and 492 participants tested positive for SARS-CoV-2, including 154 who were asymptomatic and tested negative to start the study. This exceeded the initial enrollment goal of 60 positive participants. We enrolled participants from 44 US states, and the geographic distribution of participants shifted in accordance with the changing COVID-19 prevalence nationwide. Conclusions: The digital siteless approach employed in the "Test Us At Home" study enabled rapid, efficient, and rigorous evaluation of rapid diagnostics for COVID-19 and can be adapted across research disciplines to optimize study enrollment and accessibility.
ABSTRACT
Importance: Pharmacologic agents are often used to treat newborns with prenatal opioid exposure (POE) despite known adverse effects on neurodevelopment. Alternative nonpharmacologic interventions are needed. Objective: To examine the efficacy of a vibrating crib mattress for treating newborns with POE. Design, Setting, and Participants: In this dual-site randomized clinical trial, 208 term newborns with POE, enrolled from March 9, 2017, to March 10, 2020, were studied at their bedside throughout hospitalization. Interventions: Half the cohort received treatment as usual (TAU) and half received standard care plus low-level stochastic (random) vibrotactile stimulation (SVS) using a uniquely constructed crib mattress with a 3-hour on-off cycle. The study was initiated in the newborn unit, where newborns were randomized to TAU or SVS within 48 hours of birth. All infants whose symptoms met clinical criteria for pharmacologic treatment received morphine in the neonatal intensive care unit (NICU) per standard care. Main Outcomes and Measures: The a priori primary outcomes analyzed were pharmacotherapy (administration of morphine treatment [AMT], the first-line medication at both study sites [number of infants treated], and cumulative morphine dose) and hospital length of stay. Intention-to-treat analysis was conducted. Results: Analyses were performed on 181 newborns who completed hospitalization at the study sites (mean [SD] gestational age, 39.0 [1.2] weeks; mean [SD] birth weight, 3076 [489] g; 100 [55.2%] were female). Of the 181 analyzed infants, 121 (66.9%) were discharged without medication and 60 (33.1%) were transferred to the NICU for morphine treatment (31 [51.7%] TAU and 29 [48.3%] SVS). The treatment rate was not significantly different between the 2 groups: 35.6% (31 of 87 infants who received TAU) vs 30.9% (29 of 94 infants who received SVS) (P = .60).
Adjusting for site, sex, birth weight, opioid exposure, and feed type, infant duration on the vibrating mattress in the newborn unit was associated with reduction in AMT (adjusted odds ratio, 0.88 hours per day; 95% CI, 0.81-0.93 hours per day). This translated to a 50% relative reduction in AMT for infants who received SVS on average 6 hours per day. Among 32 infants transferred to the neonatal intensive care unit for morphine treatment who completed treatment within 3 weeks, those assigned to SVS finished treatment nearly twice as fast (hazard ratio, 1.96; 95% CI, 1.01-3.81), resulting in 3.18 fewer treatment days (95% CI, -0.47 to -0.04 days) and receiving a mean 1.76 mg/kg less morphine (95% CI, -3.02 to -0.50 mg/kg) than the TAU cohort. No effects of condition were observed among infants treated for more than 3 weeks (n = 28). Conclusions and Relevance: The findings of this clinical trial suggest that SVS may serve as a complementary nonpharmacologic intervention for newborns with POE. Reducing pharmacotherapy with SVS has implications for reduced hospitalization stays and costs, and possibly improved infant outcomes given the known adverse effects of morphine on neurodevelopment. Trial Registration: ClinicalTrials.gov Identifier: NCT02801331.
Subjects
Analgesics, Opioid, Morphine, Infant, Pregnancy, Infant, Newborn, Humans, Female, Adult, Male, Analgesics, Opioid/adverse effects, Birth Weight, Morphine/adverse effects, Intensive Care Units, Neonatal, Gestational Age
ABSTRACT
INTRODUCTION: We investigated the effect of daily oral Lactobacillus rhamnosus GG (LGG) in reducing liver injury/severity and drinking in patients with alcohol use disorder and moderately severe alcohol-associated hepatitis. METHODS: Forty-six male and female individuals with alcohol use disorder and moderate alcohol-associated hepatitis (12 ≤ model for end-stage liver disease score < 20, aged 21-67 years) received either LGG (n = 24) or placebo (n = 22). Data were collected/assessed at baseline and at 1, 3, and 6 months. RESULTS: LGG treatment was associated with a significant reduction in liver injury after 1 month. Six months of LGG treatment reduced heavy drinking to social-drinking or abstinence levels. DISCUSSION: LGG treatment was associated with an improvement in both liver injury and drinking.