Results 1 - 20 of 28
1.
Stat Med ; 43(15): 2894-2927, 2024 Jul 10.
Article in English | MEDLINE | ID: mdl-38738397

ABSTRACT

Estimating causal effects from large experimental and observational data has become increasingly prevalent in both industry and research. The bootstrap is an intuitive and powerful technique used to construct standard errors and confidence intervals of estimators. Its application however can be prohibitively demanding in settings involving large data. In addition, modern causal inference estimators based on machine learning and optimization techniques exacerbate the computational burden of the bootstrap. The bag of little bootstraps has been proposed in non-causal settings for large data but has not yet been applied to evaluate the properties of estimators of causal effects. In this article, we introduce a new bootstrap algorithm called causal bag of little bootstraps for causal inference with large data. The new algorithm significantly improves the computational efficiency of the traditional bootstrap while providing consistent estimates and desirable confidence interval coverage. We describe its properties, provide practical considerations, and evaluate the performance of the proposed algorithm in terms of bias, coverage of the true 95% confidence intervals, and computational time in a simulation study. We apply it in the evaluation of the effect of hormone therapy on the average time to coronary heart disease using a large observational data set from the Women's Health Initiative.
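The bag of little bootstraps the abstract builds on draws small subsets of size b = n^γ and, within each subset, resamples to the full size n via multinomial weights, so the estimator is only ever evaluated on b points. A minimal non-causal sketch (the causal variant introduced in the paper is not reproduced here), assuming a generic weighted estimator:

```python
import numpy as np

def bag_of_little_bootstraps(data, estimator, n_subsets=10, subset_exp=0.7,
                             n_boot=100, alpha=0.05, seed=None):
    """Kleiner-style bag of little bootstraps for a weighted estimator."""
    rng = np.random.default_rng(seed)
    n = len(data)
    b = int(n ** subset_exp)                  # each subset holds b << n points
    lo, hi = [], []
    for _ in range(n_subsets):
        subset = data[rng.choice(n, size=b, replace=False)]
        stats = []
        for _ in range(n_boot):
            # resample to full size n via multinomial counts over the b points
            counts = rng.multinomial(n, np.full(b, 1.0 / b))
            stats.append(estimator(subset, counts))
        lo.append(np.quantile(stats, alpha / 2))
        hi.append(np.quantile(stats, 1 - alpha / 2))
    return np.mean(lo), np.mean(hi)           # average per-subset CI endpoints

def weighted_mean(x, w):
    return np.average(x, weights=w)

rng = np.random.default_rng(0)
x = rng.normal(5.0, 2.0, size=100_000)        # synthetic data, true mean 5
ci = bag_of_little_bootstraps(x, weighted_mean, seed=1)
```

Because each bootstrap replicate touches only b points plus a weight vector, the cost per replicate is O(b) rather than O(n), which is where the computational savings come from.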


Subject(s)
Algorithms , Causality , Computer Simulation , Humans , Female , Confidence Intervals , Coronary Disease/epidemiology , Models, Statistical , Data Interpretation, Statistical , Bias , Observational Studies as Topic/methods , Observational Studies as Topic/statistics & numerical data
2.
Ann Neurol ; 2022 Mar 02.
Article in English | MEDLINE | ID: mdl-35233819

ABSTRACT

OBJECTIVE: To identify the rates of neurological events following administration of mRNA (Pfizer, Moderna) or adenovirus vector (Janssen) vaccines in the U.S. METHODS: We utilized publicly available data from the U.S. Vaccine Adverse Event Reporting System (VAERS) collected between January 1, 2021 and June 14, 2021. All free-text symptoms reported within 42 days of vaccine administration were manually reviewed and grouped into 36 individual neurological diagnostic categories. Post-vaccination neurological event rates were compared between vaccine types, to age-matched baseline incidence rates in the U.S., and to rates of neurological events following COVID. RESULTS: Of 306,907,697 COVID vaccine doses administered during the study timeframe, 314,610 (0.1%) people reported any adverse event and 105,214 (0.03%) reported neurological adverse events, in a median of 1 day (IQR 0-3) from inoculation. Guillain-Barré syndrome (GBS) and cerebral venous thrombosis (CVT) each occurred in fewer than 1 per 1,000,000 doses. Significantly more neurological adverse events were reported following Janssen (Ad26.COV2.S) vaccination than following either Pfizer-BioNTech (BNT162b2) or Moderna (mRNA-1273) vaccination (0.15% vs. 0.03% vs. 0.03% of doses, respectively; P < 0.0001). The observed-to-expected ratios for GBS, CVT, and seizure following Janssen vaccination were ≥1.5-fold higher than background rates. However, the rate of neurological events after acute SARS-CoV-2 infection was up to 617-fold higher than after COVID vaccination. INTERPRETATION: Reports of serious neurological events following COVID vaccination are rare. GBS, CVT, and seizure may occur at higher-than-background rates following Janssen vaccination. Even so, rates of neurological complications following acute SARS-CoV-2 infection are up to 617-fold higher than after COVID vaccination.
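The observed-to-expected comparison used above divides the reported event count by the count expected from background incidence over the post-vaccination risk window. A sketch with entirely hypothetical inputs (these are not VAERS figures):

```python
def observed_to_expected(observed_events, doses, risk_window_days,
                         background_annual_rate_per_100k):
    """Observed-to-expected ratio for a post-vaccination risk window.

    Expected count = person-time accrued in the window * background rate.
    All numbers passed in below are illustrative, not study data.
    """
    person_years = doses * risk_window_days / 365.25
    expected = person_years * background_annual_rate_per_100k / 100_000
    return observed_events / expected

# hypothetical example: 20 GBS cases within 42 days of 10 million doses,
# against a background rate of 2 per 100,000 person-years
ratio = observed_to_expected(20, 10_000_000, 42, 2.0)
```

A ratio well above 1 flags more events than the background rate would predict; reporting biases in passive surveillance mean such a signal is hypothesis-generating rather than confirmatory.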

3.
Cephalalgia ; 42(11-12): 1207-1217, 2022 10.
Article in English | MEDLINE | ID: mdl-35514199

ABSTRACT

BACKGROUND: Delayed onset of headache appears to be a specific feature of cerebrovascular events after COVID-19 vaccines. METHODS: All consecutive events reported to the United States Vaccine Adverse Event Reporting System following COVID-19 vaccines (1 January to 24 June 2021) were assessed. The timing of headache onset post-vaccination was analysed in subjects with and without concomitant cerebrovascular events, including cerebral venous thrombosis, ischemic stroke, and intracranial haemorrhage. The diagnostic accuracy of the guideline-proposed threshold of three days from vaccination to headache onset in predicting concurrent cerebrovascular events was evaluated. RESULTS: There were 314,610 events following 306,907,697 COVID-19 vaccine doses, including 41,700 headaches, of which 178 (0.4%) were accompanied by cerebrovascular events. The median time between vaccination and headache onset was shorter for isolated headache (1 day) than for headache with cerebral venous thrombosis (4 days), ischemic stroke (3 days), or intracranial hemorrhage (10 days) (all P < 0.001). Delayed onset of headache had an area under the curve of 0.83 (95% CI: 0.75-0.97) for cerebral venous thrombosis, 0.70 (95% CI: 0.63-0.76) for ischemic stroke, and 0.76 (95% CI: 0.67-0.84) for intracranial hemorrhage, with a negative predictive value above 99% in each case. CONCLUSION: Headache following COVID-19 vaccination typically occurs within 1 day and is rarely associated with cerebrovascular events. Onset of headache three or more days post-vaccination was an accurate diagnostic biomarker for a concomitant cerebrovascular event.
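Evaluating a fixed onset-day cutoff as a diagnostic test, as done above, reduces to computing sensitivity, specificity, and negative predictive value at that threshold. A toy illustration with synthetic data (not the study's data):

```python
import numpy as np

def threshold_diagnostics(days_to_onset, has_event, threshold=3):
    """Sensitivity, specificity and NPV of flagging headaches whose onset
    is `threshold` or more days after vaccination (toy illustration)."""
    days = np.asarray(days_to_onset)
    truth = np.asarray(has_event, dtype=bool)
    flagged = days >= threshold
    tp = np.sum(flagged & truth)       # late onset, real cerebrovascular event
    fn = np.sum(~flagged & truth)      # early onset, event missed by the rule
    tn = np.sum(~flagged & ~truth)     # early onset, isolated headache
    fp = np.sum(flagged & ~truth)      # late onset, isolated headache
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    npv = tn / (tn + fn)
    return sensitivity, specificity, npv

# synthetic pattern mimicking the abstract: isolated headaches start early,
# headaches with cerebral venous thrombosis tend to start late
days = [1, 1, 2, 1, 0, 2, 1, 1, 2, 4, 5]
cvt  = [0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1]
sens, spec, npv = threshold_diagnostics(days, cvt)
```

The very high NPV reported in the abstract reflects the rarity of cerebrovascular events among headaches: almost any early-onset headache is correctly classed as isolated.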


Subject(s)
COVID-19 , Ischemic Stroke , Vaccines , Venous Thrombosis , Adverse Drug Reaction Reporting Systems , Biomarkers , COVID-19/epidemiology , COVID-19/prevention & control , COVID-19 Vaccines/adverse effects , Headache/chemically induced , Headache/etiology , Humans , Intracranial Hemorrhages/chemically induced , United States , Vaccines/adverse effects
4.
Stat Med ; 40(10): 2305-2320, 2021 05 10.
Article in English | MEDLINE | ID: mdl-33665870

ABSTRACT

Inverse probability of treatment weighting (IPTW), which has been used to estimate average treatment effects (ATE) using observational data, tenuously relies on the positivity assumption and the correct specification of the treatment assignment model, both of which are problematic assumptions in many observational studies. Various methods have been proposed to overcome these challenges, including truncation, covariate-balancing propensity scores, and stable balancing weights. Motivated by an observational study in spine surgery, in which positivity is violated and the true treatment assignment model is unknown, we present the use of optimal balancing by kernel optimal matching (KOM) to estimate ATE. By uniformly controlling the conditional mean squared error of a weighted estimator over a class of models, KOM simultaneously mitigates issues of possible misspecification of the treatment assignment model and is able to handle practical violations of the positivity assumption, as shown in our simulation study. Using data from a clinical registry, we apply KOM to compare two spine surgical interventions and demonstrate how the result matches the conclusions of clinical trials that IPTW estimates spuriously refute.
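For contrast with KOM, the IPTW estimator the abstract starts from can be written in a few lines; the truncation mentioned above is one common (if crude) response to positivity violations. A sketch assuming the propensity scores are given rather than estimated:

```python
import numpy as np

def iptw_ate(y, t, propensity, truncate=(0.05, 0.95)):
    """Hajek-style IPTW estimate of the ATE, with optional propensity
    truncation to blunt positivity violations (a sketch, not KOM)."""
    e = np.clip(propensity, *truncate)
    w1 = t / e                     # weights for the treated
    w0 = (1 - t) / (1 - e)         # weights for the controls
    return np.sum(w1 * y) / np.sum(w1) - np.sum(w0 * y) / np.sum(w0)

# simulated check with a known treatment effect of 2
rng = np.random.default_rng(0)
n = 20_000
x = rng.normal(size=n)
e_true = 1 / (1 + np.exp(-x))              # true propensity
t = rng.binomial(1, e_true)
y = 2.0 * t + x + rng.normal(size=n)       # confounded outcome, true ATE = 2
ate = iptw_ate(y, t, e_true)
```

When the propensity model is misspecified or propensities pile up near 0 or 1, these weights explode, which is exactly the failure mode the balancing-weight methods in the abstract are designed to avoid.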


Subject(s)
Models, Statistical , Computer Simulation , Humans , Propensity Score
5.
Neurosurg Focus ; 49(5): E18, 2020 11.
Article in English | MEDLINE | ID: mdl-33130616

ABSTRACT

OBJECTIVE: Spine surgery is especially susceptible to malpractice claims. Critics of the US medical liability system argue that it drives up costs, whereas proponents argue it deters negligence. Here, the authors study the relationship between malpractice claim density and outcomes. METHODS: The following methods were used: 1) the National Practitioner Data Bank was used to determine the number of malpractice claims per 100 physicians, by state, between 2005 and 2010; 2) the Nationwide Inpatient Sample was queried for spinal fusion patients; and 3) the Area Resource File was queried to determine the density of physicians, by state. States were categorized into 4 quartiles regarding the frequency of malpractice claims per 100 physicians. To evaluate the association between malpractice claims and death, discharge disposition, length of stay (LOS), and total costs, an inverse-probability-weighted regression-adjustment estimator was used. The authors controlled for patient and hospital characteristics. Covariates were used to train machine learning models to predict death, discharge disposition not to home, LOS, and total costs. RESULTS: Overall, 549,775 discharges following spinal fusions were identified, with 495,640 yielding state-level information about medical malpractice claim frequency per 100 physicians. Of these, 124,425 (25.1%), 132,613 (26.8%), 130,929 (26.4%), and 107,673 (21.7%) were from the lowest, second-lowest, second-highest, and highest quartile states, respectively, for malpractice claims per 100 physicians. Compared to the states with the fewest claims (lowest quartile), surgeries in states with the most claims (highest quartile) showed a statistically significantly higher odds of a nonhome discharge (OR 1.169, 95% CI 1.139-1.200), longer LOS (mean difference 0.304, 95% CI 0.256-0.352), and higher total charges (mean difference [log scale] 0.288, 95% CI 0.281-0.295) with no significant associations for mortality. 
For the machine learning models, which included medical malpractice claim density as a covariate, the areas under the curve for death and discharge disposition were 0.94 and 0.87, and the R2 values for LOS and total charge were 0.55 and 0.60, respectively. CONCLUSIONS: Spinal fusion procedures in states with a higher frequency of malpractice claims were associated with increased odds of nonhome discharge, longer LOS, and higher total charges. This suggests that the medicolegal climate may alter practice patterns for a given spine surgeon and may have important implications for medical liability reform. Machine learning models that included medical malpractice claim density as a feature predicted outcomes satisfactorily and may be helpful for patients, surgeons, hospitals, and payers.
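The inverse-probability-weighted regression-adjustment estimator named in the methods combines a propensity-score weight with an outcome regression in each treatment arm, then averages the fitted predictions over the whole sample. A simplified linear-model sketch on simulated data (not the NIS analysis, and with the propensity taken as known):

```python
import numpy as np

def ipwra_ate(y, t, X, e):
    """Inverse-probability-weighted regression adjustment: weighted linear
    outcome model per arm, predictions averaged over everyone (a sketch
    of the doubly robust estimator referenced in the abstract)."""
    Xd = np.column_stack([np.ones(len(y)), X])   # add intercept

    def wls(mask, w):
        # weighted least squares on the subsample selected by `mask`
        XW = Xd[mask] * w[mask][:, None]
        return np.linalg.solve(XW.T @ Xd[mask], XW.T @ y[mask])

    beta1 = wls(t == 1, 1.0 / e)          # treated arm, weights 1/e
    beta0 = wls(t == 0, 1.0 / (1.0 - e))  # control arm, weights 1/(1-e)
    return np.mean(Xd @ beta1) - np.mean(Xd @ beta0)

# simulated check with a known treatment effect of 1.5
rng = np.random.default_rng(1)
n = 20_000
x = rng.normal(size=n)
e = 1 / (1 + np.exp(-x))                  # true propensity
t = rng.binomial(1, e)
y = 1.5 * t + 0.5 * x + rng.normal(size=n)
ate = ipwra_ate(y, t, x, e)
```

The appeal of this construction is double robustness: the estimate stays consistent if either the propensity model or the outcome model is correct.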


Subject(s)
Malpractice , Spinal Fusion , Humans , Length of Stay , Machine Learning , Patient Discharge , Spinal Fusion/adverse effects , United States
7.
Stat Med ; 38(10): 1891-1902, 2019 05 10.
Article in English | MEDLINE | ID: mdl-30592073

ABSTRACT

Marginal structural Cox models have been used to estimate the causal effect of a time-varying treatment on a survival outcome in the presence of time-dependent confounders. These methods rely on the positivity assumption, which states that the propensity scores are bounded away from zero and one. Practical violations of this assumption are common in longitudinal studies, resulting in extreme weights that may yield erroneous inferences. Truncation, which consists of replacing outlying weights with less extreme ones, is the most common approach to control for extreme weights to date. While truncation reduces the variability in the weights and the consequent sampling variability of the estimator, it can also introduce bias. Instead of truncated weights, we propose using optimal probability weights, defined as those that have a specified variance and the smallest Euclidean distance from the original, untruncated weights. The set of optimal weights is obtained by solving a constrained quadratic optimization problem. The proposed weights are evaluated in a simulation study and applied to the assessment of the effect of treatment on time to death among people in Sweden who live with human immunodeficiency virus and inject drugs.
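The optimal probability weights described above have a convenient closed form when only the variance is constrained: the minimum-Euclidean-distance solution keeps the mean and shrinks every weight toward it by a common factor. A sketch under that simplification (the paper's full quadratic program also handles additional constraints, such as non-negativity, omitted here):

```python
import numpy as np

def optimal_probability_weights(w0, target_sd):
    """Weights closest in Euclidean distance to w0 with a specified
    standard deviation: uniform shrinkage of deviations toward the mean.

    Minimizing ||w - w0||^2 subject to sd(w) = target_sd (and mean fixed)
    is solved exactly by rescaling the deviations from the mean.
    """
    m = w0.mean()
    sd = w0.std()
    if sd <= target_sd:
        return w0.copy()              # constraint already satisfied
    alpha = target_sd / sd            # uniform shrinkage factor
    return m + alpha * (w0 - m)

rng = np.random.default_rng(0)
w0 = np.exp(rng.normal(0, 1.2, size=1000))   # heavy-tailed IPT-like weights
w = optimal_probability_weights(w0, target_sd=0.5)
```

Unlike truncation, which clips only the outlying weights and can introduce bias, this shrinkage perturbs every weight a little, staying as close as possible to the original weights under the variance budget.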


Subject(s)
HIV Infections/mortality , Proportional Hazards Models , Substance Abuse, Intravenous/mortality , HIV Infections/drug therapy , Humans , Longitudinal Studies , Observational Studies as Topic/statistics & numerical data , Probability , Propensity Score , Prospective Studies , Registries , Sweden/epidemiology
8.
Harm Reduct J ; 14(1): 57, 2017 08 16.
Article in English | MEDLINE | ID: mdl-28814336

ABSTRACT

BACKGROUND: People who inject drugs (PWID) frequently engage in injection risk behaviours that expose them to blood-borne infections. Understanding the underlying causes that drive various types and levels of risk behaviour is important for better targeting preventive interventions. METHODS: A total of 2150 PWID in Swedish remand prisons were interviewed between 2002 and 2012. Questions on socio-demographic and drug-related variables were asked in relation to the following outcomes: having shared injection drug solution, and having lent out or received already-used drug injection equipment, within a 12-month recall period. RESULTS: Women shared solutions more than men (odds ratio (OR) 1.51, 95% confidence interval (CI) 1.03; 2.21). Those who had begun to inject drugs before age 17 had a higher risk (OR 1.43, 95% CI 0.99; 2.08) of having received used equipment compared to those who started at ages 17-19. Amphetamine injectors shared solutions more than heroin injectors (OR 2.43, 95% CI 1.64; 3.62). A housing contract lowered the risk of unsafe injection by 37-59% compared to being homeless. CONCLUSIONS: Women, people with an early drug debut, amphetamine injectors, and homeless people had significantly higher levels of injection risk behaviour and need special attention and tailored prevention to successfully combat hepatitis C and HIV transmission among PWID. TRIAL REGISTRATION: ClinicalTrials.gov Identifier, NCT02234167.


Subject(s)
Drug Users/psychology , Risk-Taking , Substance Abuse, Intravenous/psychology , Adolescent , Adult , Age Factors , Age of Onset , Amphetamine-Related Disorders/psychology , Female , HIV Infections/prevention & control , Hepatitis C/prevention & control , Heroin Dependence/psychology , Ill-Housed Persons , Humans , Male , Needle Sharing , Prisons , Sex Factors , Socioeconomic Factors , Sweden , Young Adult
9.
BMC Infect Dis ; 16(1): 759, 2016 12 16.
Article in English | MEDLINE | ID: mdl-27986077

ABSTRACT

BACKGROUND: The effect of peer support on virologic and immunologic treatment outcomes among HIV-infected patients receiving antiretroviral therapy (ART) was assessed in a cluster randomized controlled trial in Vietnam. METHODS: Seventy-one clusters (communes) were randomized to intervention or control, and a total of 640 patients initiating ART were enrolled. The intervention group received peer support with weekly home visits. Both groups received first-line ART regimens according to the National Treatment Guidelines. Viral load (VL) (ExaVir™ Load) and CD4 counts were analyzed every 6 months. The primary endpoint was virologic failure (VL >1000 copies/ml). Patients were followed for 24 months. Intention-to-treat analysis was used. Clustered longitudinal and survival analyses were used to study time to virologic failure and CD4 trends. RESULTS: Of 640 patients, 71% were males, the mean age was 32 years, and 83% started on a stavudine/lamivudine/nevirapine regimen. After a mean of 20.8 months, 78% completed the study, and the median CD4 increase was 286 cells/µl. Cumulative virologic failure risk was 7.2%. There was no significant difference between the intervention and control groups in risk for and time to virologic failure or in CD4 trends. Risk factors for virologic failure were ART-non-naïve status [aHR 6.9; (95% CI 3.2-14.6); p < 0.01], baseline VL ≥100,000 copies/ml [aHR 2.3; (95% CI 1.2-4.3); p < 0.05], and incomplete adherence (self-reported missing more than one dose during 24 months) [aHR 3.1; (95% CI 1.1-8.9); p < 0.05]. Risk factors associated with slower increases in CD4 counts were baseline VL ≥100,000 copies/ml [adj.sq.Coeff (95% CI): -0.9 (-1.5; -0.3); p < 0.01] and baseline CD4 count <100 cells/µl [adj.sq.Coeff (95% CI): -5.7 (-6.3; -5.4); p < 0.01]. Having an HIV-infected family member was also significantly associated with gains in CD4 counts [adj.sq.Coeff (95% CI): 1.3 (0.8; 1.9); p < 0.01]. CONCLUSION: There was a low risk of virologic failure during the first 2 years of ART follow-up in a rural low-income setting in Vietnam. Peer support did not show any impact on virologic or immunologic outcomes after 2 years of follow-up. TRIAL REGISTRATION: NCT01433601.


Subject(s)
Anti-HIV Agents/therapeutic use , HIV Infections/drug therapy , HIV Infections/virology , Lamivudine/therapeutic use , Nevirapine/therapeutic use , Peer Group , Social Support , Stavudine/therapeutic use , Adult , CD4 Lymphocyte Count , Cluster Analysis , Counseling , Female , HIV Infections/immunology , HIV Infections/psychology , Humans , Male , Treatment Outcome , Vietnam/epidemiology , Viral Load/drug effects
10.
Article in English | MEDLINE | ID: mdl-38520200

ABSTRACT

OBJECTIVE: To examine patient characteristics that impact serial observation adherence among vestibular schwannoma (VS) patients. STUDY DESIGN: Retrospective chart review. SETTING: Single tertiary care center. METHODS: We selected VS patients from 201 to 2020 who elected serial observation as initial management. Patients under 18 and those with previous management, bilateral or intralabyrinthine VS, or neurofibromatosis type 2 were excluded. Demographics, tumor characteristics, and follow-up status were extracted. Single and multiple logistic regression were used to identify patient characteristics impacting follow-up. RESULTS: We identified 507 VS patients who chose serial observation as initial management. Most were female (56.0%), white (73.0%), and married (72.8%). The mean age was 59.3 years, and most had private insurance (56.4%). The median Charlson Comorbidity Index was 2.00. The mean pure-tone average (PTA) was 41.7 dB. The average tumor size was 9.04 mm. Of 507 patients, 358 (70.6%) returned for at least one follow-up. On multiple logistic regression analysis, patients with private insurance (odds ratio [OR]: 0.39, confidence interval [CI]: 0.22-0.68; P = .001), a racial minority background (OR: 0.54, CI: 0.35-0.83; P = .005), worse PTAs (OR: 0.99, CI: 0.98-1.00; P = .044), and older age at diagnosis (OR: 0.97, CI: 0.95-1.00; P = .038) were less likely to follow up. CONCLUSION: Private health insurance, racial minority background, worse PTA, and older age were associated with decreased follow-up among adult VS patients electing serial observation. Patients with these characteristics may require additional support to ensure serial observation adherence.

11.
Laryngoscope ; 134(5): 2236-2242, 2024 May.
Article in English | MEDLINE | ID: mdl-37937735

ABSTRACT

OBJECTIVE: To investigate the impact of adjuvant radiotherapy in isolated locally advanced oral cavity cancers (pT3N0M0) without adverse features. METHODS: We selected all patients from the National Cancer Database (2004-2019) who underwent surgical treatment where the final pathology was T3N0M0 with negative margins. Demographics, details of treatment, and outcomes were abstracted. The impact of radiotherapy on survival was assessed with univariable, multivariable, and propensity score-matched analyses. RESULTS: We identified 571 patients in our survival cohort. Most were male (348, 60.9%), and median age was 65. Less than one-third (176, 30.8%) received adjuvant radiotherapy. The median length of follow-up was 29 months. Overall, adjuvant radiotherapy was associated with improved survival (87.2% vs. 77.7%, at 2 years, p < 0.01). On multivariable analysis controlling for age and comorbidities, this survival difference persisted (HR: 0.62, 95% CI: 0.43-0.90, p = 0.01). In a propensity score-matched population of 278 patients matched on age and comorbidities, adjuvant radiotherapy was still associated with longer survival (87.4% vs. 78.5%, p = 0.014). CONCLUSION: In our study, adjuvant radiotherapy was associated with improved survival in completely excised locally advanced oral cavity tumors (T3N0M0). However, a significant proportion of patients do not receive adjuvant radiotherapy. These findings highlight the need for continued efforts to promote guideline-recommended care. LEVEL OF EVIDENCE: 3 Laryngoscope, 134:2236-2242, 2024.
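The propensity score matching referred to above is often implemented as greedy 1:1 nearest-neighbor matching without replacement, with a caliper to forbid poor matches. A minimal sketch (the caliper and scores below are illustrative, not those of the NCDB analysis):

```python
import numpy as np

def nearest_neighbor_match(ps_treated, ps_control, caliper=0.05):
    """Greedy 1:1 nearest-neighbor matching on the propensity score,
    without replacement. Returns (treated_index, control_index) pairs."""
    available = list(range(len(ps_control)))
    pairs = []
    for i, p in enumerate(ps_treated):
        if not available:
            break
        # closest still-unmatched control for this treated patient
        j = min(available, key=lambda k: abs(ps_control[k] - p))
        if abs(ps_control[j] - p) <= caliper:
            pairs.append((i, j))
            available.remove(j)       # each control is used at most once
    return pairs

treated = np.array([0.30, 0.55, 0.90])
control = np.array([0.28, 0.52, 0.50, 0.10])
pairs = nearest_neighbor_match(treated, control)
```

Treated patients with no control inside the caliper (the 0.90 score above) are simply left unmatched, which is why matched cohorts, like the 278 patients in the abstract, are smaller than the source population.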


Subject(s)
Mouth Neoplasms , Humans , Male , Aged , Female , Radiotherapy, Adjuvant , Mouth Neoplasms/radiotherapy , Mouth Neoplasms/surgery , Retrospective Studies
12.
Stat Methods Med Res ; 32(3): 524-538, 2023 03.
Article in English | MEDLINE | ID: mdl-36632733

ABSTRACT

Covariate balance is crucial in obtaining unbiased estimates of treatment effects in observational studies. Methods that target covariate balance have been successfully proposed and widely applied to estimate treatment effects on continuous outcomes. However, in many medical and epidemiological applications, the interest lies in estimating treatment effects on time-to-event outcomes. With this type of data, one of the most common estimands of interest is the marginal hazard ratio of the Cox proportional hazards model. In this article, we start by presenting robust orthogonality weights, a set of weights obtained by solving a quadratic constrained optimization problem that maximizes precision while constraining covariate balance, defined as the correlation between confounders and treatment. By doing so, robust orthogonality weights optimally deal with both binary and continuous treatments. We then evaluate the performance of the proposed weights in estimating marginal hazard ratios of binary and continuous treatments with time-to-event outcomes in a simulation study. We finally apply robust orthogonality weights to evaluate the effect of hormone therapy on time to coronary heart disease and the effect of red meat consumption on time to colon cancer among 24,069 postmenopausal women enrolled in the Women's Health Initiative observational study.
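A stylized version of the orthogonality idea above: find the weights closest to uniform such that the weighted covariance between each confounder and the (possibly continuous) treatment is exactly zero. With equality constraints this is a small quadratic program with a closed-form solution; the paper's robust variant, which bounds rather than zeroes the correlations, is not reproduced here:

```python
import numpy as np

def orthogonality_weights(X, t):
    """Weights closest to uniform (hence minimum-variance) that zero the
    weighted covariance between every confounder column of X and the
    treatment t, while summing to n. A stylized, equality-constrained
    sketch; weights are not constrained to be non-negative."""
    n = len(t)
    tc = t - t.mean()
    A = np.vstack([(X * tc[:, None]).T,   # one balance constraint per confounder
                   np.ones((1, n))])       # weights sum to n
    b = np.concatenate([np.zeros(X.shape[1]), [float(n)]])
    w0 = np.ones(n)
    # Euclidean projection of the uniform weights onto {w : A w = b}
    lam = np.linalg.solve(A @ A.T, b - A @ w0)
    return w0 + A.T @ lam

rng = np.random.default_rng(0)
n = 500
X = rng.normal(size=(n, 3))
t = X @ np.array([0.5, -0.3, 0.2]) + rng.normal(size=n)  # continuous treatment
w = orthogonality_weights(X, t)
```

Because the objective is distance from the uniform weights, the solution trades a little balance-enforcing distortion for as little added variance as possible, which is the precision-balance trade-off the abstract describes.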


Subject(s)
Coronary Disease , Female , Humans , Proportional Hazards Models , Computer Simulation , Treatment Outcome
13.
Kidney Int Rep ; 8(11): 2385-2394, 2023 Nov.
Article in English | MEDLINE | ID: mdl-38025214

ABSTRACT

Introduction: Excessive dialytic potassium (K) and acid removal are risk factors for arrhythmias; however, treatment-to-treatment dialysate modification is rarely performed. We conducted a multicenter, pilot randomized study to test the safety, feasibility, and efficacy of 4 point-of-care (POC) chemistry-guided protocols for adjusting dialysate K and bicarbonate (HCO3) in outpatient hemodialysis (HD) clinics. Methods: Participants received implantable cardiac loop monitors and crossed over to four 4-week periods with adjustment of dialysate K or HCO3 at each treatment according to pre-HD POC values: (i) K-removal minimization, (ii) K-removal maximization, (iii) acidosis avoidance, and (iv) alkalosis avoidance. The primary end point was the percentage of treatments adhering to the intervention algorithm. Secondary end points included pre-HD K and HCO3 variability, adverse events, and rates of clinically significant arrhythmias (CSAs). Results: Nineteen subjects were enrolled in the study. HD staff completed POC testing and correctly adjusted the dialysate in 604 of 708 (85%) available HD treatments. There were 1 K ≤3 mEq/l, 29 HCO3 <20 mEq/l, and 2 HCO3 >32 mEq/l values, and no serious adverse events related to study interventions. Although there were no significant differences between POC results and conventional laboratory measures drawn concurrently, intertreatment K and HCO3 variability was high. There were 45 CSA events; most were transient atrial fibrillation (AF), with numerically fewer events during the alkalosis avoidance period (8) and the K-removal maximization period (3) compared to the other intervention periods (17). There were no significant differences in CSA duration among interventions. Conclusion: Algorithm-guided K/HCO3 adjustment based on POC testing is feasible. The variability of intertreatment K and HCO3 suggests that a POC-laboratory-guided algorithm could markedly alter dialysate-serum chemistry gradients. Definitive, end point-powered trials should be considered.
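An algorithm-guided adjustment of this kind is, at its core, a lookup from pre-HD point-of-care values to a dialysate prescription. The cutoffs and bath concentrations in this sketch are invented placeholders for illustration only; the study's actual protocols are not given in the abstract:

```python
def adjust_dialysate(pre_hd_k, pre_hd_hco3, protocol):
    """Map pre-HD point-of-care chemistry to a dialysate order.

    All thresholds and bath concentrations below are hypothetical,
    chosen only to illustrate the shape of such an algorithm.
    """
    if protocol == "k_removal_minimization":
        # smaller serum-to-bath K gradient when pre-HD K is already low
        return {"dialysate_k_meq_l": 3 if pre_hd_k < 4.5 else 2}
    if protocol == "acidosis_avoidance":
        # deliver more base when the patient arrives acidotic
        return {"dialysate_hco3_meq_l": 35 if pre_hd_hco3 < 22 else 30}
    raise ValueError(f"unknown protocol: {protocol}")

order_k = adjust_dialysate(4.1, 24, "k_removal_minimization")
order_hco3 = adjust_dialysate(5.2, 20, "acidosis_avoidance")
```

Framing the protocol as a pure function of the POC values is what makes per-treatment adherence auditable, which is how the study's 85% primary end point could be computed.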

14.
Otol Neurotol ; 44(5): 453-461, 2023 06 01.
Article in English | MEDLINE | ID: mdl-37167445

ABSTRACT

OBJECTIVE: Children with high-frequency severe-to-profound hearing loss and low-frequency residual hearing who do not derive significant benefit from hearing aids are now being considered for cochlear implantation. Previous research shows that hearing preservation is possible and may be desirable for the use of electroacoustic stimulation (EAS) in adults, but this topic remains underexplored in children. The goal of this study was to explore factors relating to hearing preservation, acceptance, and benefits of EAS for children. STUDY DESIGN: Retrospective review. SETTING: Tertiary academic medical center. PATIENTS: Forty children (48 ears) with preoperative low-frequency pure-tone averages of 75 dB HL or less at 250 and 500 Hz (n = 48). INTERVENTION: All patients underwent cochlear implantation with a standard-length electrode. MAIN OUTCOME MEASURE: Low-frequency audiometric thresholds, speech perception, and EAS usage were measured at initial stimulation, and 3 and 12 months postoperatively. Outcomes were compared between children with and without hearing preservation, and between EAS users and nonusers. RESULTS: Hearing was preserved at similar rates as adults but worse for children with an enlarged vestibular aqueduct. Fewer than half of children who qualified to use EAS chose to do so, citing a variety of audiologic and nonaudiologic reasons. No differences were detected in speech perception scores across the groups for words, sentences, or sentences in noise tests. CONCLUSIONS: Neither hearing preservation nor EAS use resulted in superior speech perception in children with preoperative residual hearing; rather, all children performed well after implantation.


Subject(s)
Cochlear Implantation , Cochlear Implants , Hearing Loss, Sensorineural , Speech Perception , Adult , Humans , Child , Acoustic Stimulation/methods , Treatment Outcome , Auditory Threshold/physiology , Hearing Loss, Sensorineural/surgery , Cochlear Implantation/methods , Speech Perception/physiology , Audiometry, Pure-Tone
15.
Otol Neurotol ; 43(9): 980-986, 2022 10 01.
Article in English | MEDLINE | ID: mdl-36047686

ABSTRACT

OBJECTIVE: To review the current literature regarding cochlear implantation in patients with retrocochlear pathologies and extract speech perception scores between 6 months and 1 year after surgery. DATABASES REVIEWED: PubMed/MEDLINE, Embase and Cochrane CENTRAL via Ovid, CINAHL Complete via Ebsco, and Web of Science. METHODS: The review was conducted according to Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Search strategies included keywords and subject headings to maximize retrieval and reflect cochlear implants and retrocochlear pathologies. Patients with previously resected vestibular schwannoma (VS) were excluded. RESULTS: There were 2,524 abstracts screened against inclusion criteria, and 53 studies were included, with individual data available for 171 adult patients. Pathologies included were either observed or irradiated VS (previously operated tumors were excluded) (n = 99, 57.9%), superficial siderosis (n = 39, 22.8%), neurosarcoidosis (n = 11, 6.4%), and previous central nervous system or skull base radiation (n = 22, 12.9%). Mean (standard deviation) postoperative consonant-nucleus-consonant (CNC) word scores were 45.4% (24.2) for observed VS, 44.4% (20.8) for irradiated VS, 43.6% (21.0) for superficial siderosis, 89.5% (3.0) for neurosarcoidosis, and 30.0% (30.2) in patients with previous central nervous system or skull base irradiation. Irradiated compared with observed VS had similar postoperative CNC word scores (effect size, 0.06; p = 0.71). Age, sex, maximal tumor dimension, and neurofibromatosis type 2 status did not significantly impact cochlear implant performance in patients with VS. Eighty-two percent of patients with reported device usage were daily users, and overall, 82% of cases benefitted from cochlear implantation. CONCLUSION: Cochlear implantation in patients with concomitant retrocochlear pathology generally results in improved speech discrimination scores sustained over time.


Subject(s)
Cochlear Implantation , Cochlear Implants , Neuroma, Acoustic , Sarcoidosis , Siderosis , Speech Perception , Adult , Central Nervous System Diseases , Cochlear Implantation/methods , Humans , Neuroma, Acoustic/complications , Neuroma, Acoustic/surgery , Sarcoidosis/complications , Sarcoidosis/surgery , Siderosis/complications , Treatment Outcome
16.
Vaccine ; 40(42): 6133-6140, 2022 10 06.
Article in English | MEDLINE | ID: mdl-36117003

ABSTRACT

Well-regulated clinical trials have shown FDA-approved COVID-19 vaccines to be immunogenic and highly efficacious. We evaluated seroconversion rates in adults reporting ≥1 dose of an mRNA COVID-19 vaccine in a cohort study of nearly 8000 adults residing in North Carolina, validating immunogenicity with a novel approach: at-home, participant-administered point-of-care testing. Overall, 91.4% had documented seroconversion within 75 days of first vaccination (median: 31 days). Older participants and males were less likely to seroconvert (adults aged 41-65: adjusted hazard ratio [aHR] 0.69 [95% confidence interval (CI): 0.64, 0.73] and adults aged 66-95: aHR 0.55 [95% CI: 0.50, 0.60], compared to those aged 18-40; males: aHR 0.92 [95% CI: 0.87, 0.98], compared to females). Participants with evidence of prior infection were more likely to seroconvert than those without (aHR 1.50 [95% CI: 1.19, 1.88]), and those receiving BNT162b2 were less likely to seroconvert than those receiving mRNA-1273 (aHR 0.84 [95% CI: 0.79, 0.90]). Reporting at least one new symptom after the first vaccination did not affect time to seroconversion, but participants reporting at least one new symptom after the second vaccination were more likely to seroconvert (aHR 1.11 [95% CI: 1.05, 1.17]). These data demonstrate the high community-level immunogenicity of COVID-19 vaccines, albeit with notable differences in older adults, and the feasibility of using at-home, participant-administered point-of-care testing for community cohort monitoring. Trial registration: ClinicalTrials.gov NCT04342884.


Subject(s)
COVID-19 , Vaccines , Aged , Antibodies, Viral , BNT162 Vaccine , COVID-19/prevention & control , COVID-19 Vaccines/adverse effects , Cohort Studies , Female , Humans , Immunogenicity, Vaccine , Male , North Carolina/epidemiology , RNA, Messenger , Seroconversion
17.
Prev Med Rep ; 28: 101857, 2022 Aug.
Article in English | MEDLINE | ID: mdl-35706687

ABSTRACT

Wearing a facemask can help to decrease the transmission of COVID-19. We investigated self-reported mask use among subjects aged 18 years and older participating in the COVID-19 Community Research Partnership (CRP), a prospective longitudinal COVID-19 surveillance study in the mid-Atlantic and southeastern United States. We included those participants who completed ≥5 daily surveys each month from December 1, 2020 through August 31, 2021. Mask use was defined as self-reported use of a face mask or face covering on every interaction with others outside the household within a distance of less than 6 feet. Participants were considered vaccinated if they reported receiving ≥1 COVID-19 vaccine dose. Participants (n = 17,522) were 91% non-Hispanic White and 68% female, with a median age of 57 years; 26% were healthcare workers, and 95% self-reported receiving ≥1 COVID-19 vaccine dose through August 2021; the mean daily survey response rate was 85%. Mask use was higher among vaccinated than unvaccinated participants across the study period, regardless of the month of the first dose. Mask use remained relatively stable from December 2020 through April 2021 (range 71-80% unvaccinated; 86-93% vaccinated) and declined in both groups beginning in mid-May 2021, to 34% and 42%, respectively, in June 2021; mask use increased again from July 2021 onward. Mask use by all was lower during weekends and on Christmas and Easter, regardless of vaccination status. Independent predictors of higher mask use were vaccination, age ≥65 years, female sex, racial or ethnic minority group, and healthcare worker occupation, whereas a history of self-reported prior COVID-19 illness was associated with lower use.

18.
Article in English | MEDLINE | ID: mdl-35162131

ABSTRACT

The double burden of HIV/AIDS and tuberculosis (TB), coupled with endemic and problematic food insecurity in Africa, can interact to negatively impact health outcomes, creating a syndemic. For people living with HIV/AIDS (PWH), food insecurity is a significant risk factor for acquiring TB due to the strong nutritional influences and co-occurring contextual barriers. We aim to synthesize evidence on the syndemic relationship between HIV/AIDS and TB co-infection and food insecurity in Africa. We conducted a scoping review of studies in Africa that included co-infected adults and children with evidence of food insecurity, characterized by insufficient access to, or lack of, macronutrients. We sourced information from major public health databases. Qualitative, narrative analysis was used to synthesize the data. Of 1072 articles screened, 18 discussed the syndemic effect of HIV/AIDS and TB co-infection and food insecurity. Reporting of food insecurity was inconsistent; however, five studies estimated it using a validated scale. Food-insecure co-infected adults had an average BMI of 16.5-18.5 kg/m2. Negative outcomes included death (n = 6 studies), depression (n = 1 study), treatment non-adherence, weight loss, wasting, opportunistic infections, TB-related lung diseases, and lethargy. Food insecurity was a precursor to co-infection, especially with the onset or increased incidence of TB in PWH. Economic, social, and facility-level factors influenced the negative impact of food insecurity on the health of co-infected individuals. Nutritional support, economic relief, and psychosocial support minimized the harmful effects of food insecurity in HIV-TB populations. Interventions that tackle one or more components of a syndemic interaction can have beneficial effects on the health outcomes and experiences of PWH with TB in Africa.


Subject(s)
HIV Infections, Tuberculosis, Adult, Africa/epidemiology, Child, Food Insecurity, Food Supply, HIV Infections/epidemiology, HIV Infections/psychology, Humans, Syndemic, Tuberculosis/epidemiology
19.
Vaccines (Basel) ; 10(11)2022 Nov 13.
Article in English | MEDLINE | ID: mdl-36423018

ABSTRACT

We characterize the overall incidence of, and risk factors for, breakthrough infection among fully vaccinated participants in the North Carolina COVID-19 Community Research Partnership cohort. Among 15,808 eligible participants, 638 reported a positive SARS-CoV-2 test after vaccination. Factors associated with a lower risk of breakthrough in the time-to-event analysis included older age, prior SARS-CoV-2 infection, higher rates of face mask use, and receipt of a booster vaccination. Higher rates of breakthrough were reported by participants vaccinated with BNT162b2 or Ad26.COV2.S compared to mRNA-1273, in suburban or rural counties compared to urban counties, and during circulation of the Delta and Omicron variants.

20.
PLoS One ; 17(3): e0260574, 2022.
Article in English | MEDLINE | ID: mdl-35302997

ABSTRACT

INTRODUCTION: The COVID-19 Community Research Partnership is a population-based longitudinal syndromic and sero-surveillance study. The study includes over 17,000 participants from six healthcare systems in North Carolina who submitted over 49,000 serology results. The purpose of this study is to use these serology data to estimate the cumulative proportion of the North Carolina population that has either been infected with SARS-CoV-2 or developed a measurable humoral response to vaccination. METHODS: Adult community residents were invited to participate in the study between April 2020 and February 2021. Demographic information was collected and a daily symptom screen was completed using a secure, HIPAA-compliant, online portal. A portion of participants were mailed kits containing a lateral flow assay to be used in-home to test for the presence of anti-SARS-CoV-2 IgM or IgG antibodies. The cumulative proportion of participants who tested positive at least once during the study was estimated. A standard Cox proportional hazards model was constructed to illustrate the probability of seroconversion over time up to December 20, 2020 (before vaccines were available). A separate analysis was performed to describe the influence of vaccines through February 15, 2021. RESULTS: 17,688 participants contributed at least one serology result. 68.7% of the population were female, and 72.2% were between 18 and 59 years of age. The average number of serology results submitted per participant was 3.0 (±1.9). By December 20, 2020, the overall probability of seropositivity in the CCRP population was 32.6%. By February 15, 2021, the probability among healthcare workers and non-healthcare workers was 83% and 49%, respectively. An inflection upward in the probability of seropositivity was demonstrated around the end of December, suggesting an influence of vaccinations, especially for healthcare workers. Among healthcare workers, those in the oldest age category (60+ years) were 38% less likely to have seroconverted by February 15, 2021. CONCLUSIONS: Results of this study suggest more North Carolina residents may have been infected with SARS-CoV-2 than the number of documented cases as determined by positive RNA or antigen tests. The influence of vaccinations on seropositivity among North Carolina residents is also demonstrated. Additional research is needed to fully characterize the impact of seropositivity on immunity and the ultimate course of the pandemic.
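The abstract above reports cumulative seropositivity from a survival (time-to-event) model fit to right-censored serology data. As a minimal illustration of the underlying idea — not the study's actual Cox model or data — the sketch below uses synthetic toy data and the simpler Kaplan-Meier estimator, whose complement 1 - S(t) gives the cumulative probability of seroconversion by time t:

```python
# Minimal sketch (synthetic toy data, NOT the CCRP dataset): estimate the
# cumulative probability of seroconversion as 1 - S(t), where S(t) is the
# Kaplan-Meier survival estimate accounting for censored follow-up.

def kaplan_meier(times, events):
    """Return a list of (time, S(t)) pairs at each observed event time.

    times:  follow-up time for each participant (e.g., days in study)
    events: 1 if seroconversion was observed, 0 if the participant
            was censored (never tested positive during follow-up)
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = 0  # seroconversions at time t
        c = 0  # censorings at time t (dropped from risk set after t)
        while i < len(data) and data[i][0] == t:
            if data[i][1] == 1:
                d += 1
            else:
                c += 1
            i += 1
        if d > 0:
            surv *= 1.0 - d / n_at_risk  # KM product-limit step
            curve.append((t, surv))
        n_at_risk -= d + c
    return curve

# Toy cohort of 6 participants: follow-up days and seroconversion indicator.
times = [30, 45, 45, 60, 90, 90]
events = [1, 1, 0, 1, 1, 0]
curve = kaplan_meier(times, events)
# Cumulative seroconversion probability by the last observed event time:
cumulative = 1.0 - curve[-1][1]
```

A Cox model, as used in the study, extends this by relating covariates (e.g., healthcare worker status, age) to the hazard of seroconversion; the Kaplan-Meier curve here shows only the aggregate time-to-event estimate.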


Subject(s)
Antibodies, Viral/analysis, COVID-19/epidemiology, Health Personnel/statistics & numerical data, SARS-CoV-2/immunology, Adult, Age Factors, Community Participation, Female, Humans, Longitudinal Studies, Male, Middle Aged, North Carolina/epidemiology, Seroconversion, Young Adult