ABSTRACT
Artificial intelligence (AI) and machine learning (ML) are important tools across many fields of health and medical research. Pharmacoepidemiologists can bring essential methodological rigor and study design expertise to the design and use of these technologies within healthcare settings. AI/ML-based tools also play a role in pharmacoepidemiology research, as we may apply them to answer our own research questions, take responsibility for evaluating medical devices with AI/ML components, or participate in interdisciplinary research to create new AI/ML algorithms. While epidemiologic expertise is essential to deploying AI/ML responsibly and ethically, the rapid advancement of these technologies in the past decade has resulted in a knowledge gap for many in the field. This article provides a brief overview of core AI/ML concepts, discusses potential applications of AI/ML in pharmacoepidemiology research, and closes with a review of important concepts that cut across application areas, including interpretability and fairness. This review is intended to provide an accessible, practical overview of AI/ML for pharmacoepidemiology research, with references to more detailed resources on fundamental topics.
Subject(s)
Artificial Intelligence, Machine Learning, Pharmacoepidemiology, Pharmacoepidemiology/methods, Humans, Health Services Research, Research Design, Algorithms
ABSTRACT
OBJECTIVE: To assess whether National Football League (NFL) players diagnosed with a concussion have an increased risk of injury after return to football. METHODS: A retrospective cohort study analysed the hazard of subsequent time-loss lower extremity (LEX) or any musculoskeletal injury among NFL players diagnosed with a concussion in 2015-2021 preseason or regular season games compared with: (1) all non-concussed players participating in the same game and (2) players with time-loss upper extremity injury. Cox proportional hazards models were adjusted for number of injuries and concussions in the prior year, player tenure and roster position. Additional models accounted for time lost from participation after concussion. RESULTS: There was no statistical difference in the hazards of LEX injury or any musculoskeletal injury among concussed players compared with non-concussed players, though concussed players had a slightly elevated hazard of injury (LEX injury: HR=1.12, 95% CI: 0.90 to 1.41; any musculoskeletal injury: HR=1.08, 95% CI: 0.89 to 1.31). When compared with players with upper extremity injuries, the hazard of injury for concussed players was not statistically different, though HRs suggested a lower injury risk among concussed players (LEX injury: HR=0.78, 95% CI: 0.60 to 1.02; any musculoskeletal injury: HR=0.82, 95% CI: 0.65 to 1.04). CONCLUSION: We found no statistical difference in the risk of subsequent injury among NFL players returning from concussion compared with non-concussed players in the same game or players returning from upper extremity injury. These results suggest that deconditioning or other factors associated with lost participation time may explain the elevated subsequent injury risk observed among concussed players in some settings after return to play.
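As a rough illustration of the adjusted Cox proportional hazards approach described above, the sketch below fits such a model with the lifelines library. The data file and all column names are hypothetical placeholders, not the NFL surveillance data.

```python
# Hedged sketch: a Cox proportional hazards model of subsequent-injury risk,
# adjusted for prior-year injuries/concussions, tenure, and roster position.
# `player_followup.csv` and every column name are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("player_followup.csv")

cph = CoxPHFitter()
cph.fit(
    df[["days_to_injury", "injured", "concussed",
        "prior_injuries", "prior_concussions", "tenure_years", "lineman"]],
    duration_col="days_to_injury",  # follow-up time after return to play
    event_col="injured",            # 1 = subsequent time-loss LEX injury
)
cph.print_summary()  # hazard ratio and 95% CI for `concussed` vs comparators
```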
Subject(s)
Athletic Injuries, Brain Concussion, Football, Return to Sport, Humans, Brain Concussion/epidemiology, Football/injuries, Retrospective Studies, Athletic Injuries/epidemiology, Male, Proportional Hazards Models, Musculoskeletal System/injuries, Risk Factors, Upper Extremity/injuries, Young Adult
ABSTRACT
OBJECTIVES: This manuscript presents a comprehensive framework for the assessment of the value of real-world evidence (RWE) in healthcare decision-making. While RWE has been proposed to overcome some limitations of traditional, one-off studies, no systematic framework exists to measure whether RWE actually lowers the burden of evidence generation. This framework aims to fill that gap by providing conceptual approaches for evaluating the time and cost efficiencies of RWE, thus guiding strategic investments in RWE infrastructure. METHODS: The framework consists of four components: (1) identification of stakeholders using and producing RWE, (2) understanding value propositions on how RWE can benefit stakeholders, (3) defining key performance indicators (KPIs), and (4) establishing metrics and case studies to assess value. KPIs are categorized as 'better,' 'faster,' or 'cheaper' as indicators of value: 'better' focusing on high-quality, actionable evidence; 'faster' denoting time savings in evidence generation; and 'cheaper' emphasizing the cost-efficiency of decisions compared with methodologies that do not involve data routinely collected in clinical practice. Metrics and relevant case studies are tailored based on stakeholder value propositions and selected KPIs that can be used to assess what value has been created by using RWE compared to traditional evidence-generation approaches and by comparing different RWE sources. RESULTS: Operationalized through metrics and case studies drawn from the literature, the value of RWE is documented as improving treatment effect heterogeneity evaluation, expanding medical product labels, and expediting post-market compliance. RWE is also shown to reduce the cost and time required to produce evidence compared to traditional one-off approaches. An original example of a metric that measures the time saved by RWE methods to detect a signal of a product failure is presented based on analysis of the National Cardiovascular Disease Registry. CONCLUSIONS: The framework presented in this manuscript offers a comprehensive approach for evaluating the value of RWE, applicable to all stakeholders engaged in leveraging RWE for healthcare decision-making. Through the proposed metrics and illustrated case studies, valuable insights are provided into the heightened efficiency, cost-effectiveness, and improved decision-making within clinical and regulatory domains facilitated by RWE. While this framework is primarily focused on medical devices, it could potentially inform the determination of RWE value in other medical products. By discerning the variations in cost, time, and data utility among various evidence-generation methods, stakeholders are empowered to invest strategically in RWE infrastructure and shape future research endeavors.
Subject(s)
Registries, Stakeholder Participation, Humans, Cost-Benefit Analysis, Decision Making, United States
ABSTRACT
BACKGROUND: Understanding the epidemiology of injuries to athletes is essential to informing injury prevention efforts. HYPOTHESIS: The incidence and impact of basketball-related injuries among National Basketball Association (NBA) players from 2013-2014 through 2018-2019 were relatively stable over time. STUDY DESIGN: Descriptive epidemiology study. LEVEL OF EVIDENCE: Level 3. METHODS: Injuries from 2013-2014 through 2018-2019 were analyzed using the NBA Injury and Illness Database, drawn from an electronic medical record system. Descriptive statistics were calculated for injuries by season, game-loss, and onset. Incidence rates were estimated using Poisson models and linear trend tests. RESULTS: Between 552 and 606 players participated in ≥1 game per season during the study. Annual injury incidence ranged from 1550 to 1892, with 33.6% to 38.5% resulting in a missed NBA game. Game-loss injury rates ranged from 5.6 to 7.0 injuries per 10,000 player-minutes from 2014-2015 through 2018-2019 (P = 0.19); the rate was lower in 2013-2014 (5.0 injuries per 10,000 player-minutes), partly due to increased preseason injury rates and the transition of reporting processes. The 6-year game-loss injury rate was 6.9 (95% CI 6.0, 8.0) injuries per 10,000 player-minutes in preseason games and 6.2 (95% CI 6.0, 6.5) in regular season games; the rate in playoff games was lower (P < 0.01) at 2.8 (95% CI 2.2, 3.6). Most (73%) game-loss injuries had acute onset; 44.4% to 52.5% of these involved contact with another player. CONCLUSION: From 2013-2014 through 2018-2019, over one-third of injuries resulted in missed NBA games, with the highest rates of game-loss injuries in preseason games and the lowest rates in playoff games. Most game-loss injuries had acute onset, and half of those involved contact with another player. CLINICAL RELEVANCE: These findings, based on reliable data reporting by team medical staff in an audited system, can guide evidence-based injury reduction strategies and inform player health priorities.
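The Poisson modeling and linear trend testing mentioned in METHODS can be sketched as follows; the season-level counts here are illustrative placeholders rather than the NBA Injury and Illness Database.

```python
# Hedged sketch: season-level game-loss injury rates per 10,000 player-minutes
# from a Poisson GLM with a log-exposure offset; the coefficient on
# `season_index` provides a linear trend test. All counts are made up.
import numpy as np
import pandas as pd
import statsmodels.api as sm

seasons = pd.DataFrame({
    "season_index": range(6),                    # 2013-14 ... 2018-19
    "injuries": [450, 560, 540, 600, 580, 620],  # game-loss injuries (toy)
    "player_minutes": [9.0e5, 8.8e5, 9.1e5, 9.0e5, 8.9e5, 9.2e5],
})

X = sm.add_constant(seasons[["season_index"]].astype(float))
fit = sm.GLM(
    seasons["injuries"], X,
    family=sm.families.Poisson(),
    offset=np.log(seasons["player_minutes"]),    # exposure offset
).fit()

print(fit.summary())  # the season_index p-value is the linear trend test
print(np.exp(fit.params["const"]) * 1e4, "injuries per 10,000 player-minutes (baseline season)")
```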
ABSTRACT
AIM: Hypertension and diabetes mellitus (DM) are major causes of morbidity and mortality, with growing burdens in low-income countries where they are underdiagnosed and undertreated. Advances in machine learning may provide opportunities to enhance diagnostics in settings with limited medical infrastructure. MATERIALS AND METHODS: A non-interventional study was conducted to develop and validate a machine learning algorithm to estimate cardiovascular clinical and laboratory parameters. At two sites in Kenya, digital retinal fundus photographs were collected alongside blood pressure (BP), laboratory measures and medical history. The performance of machine learning models, originally trained using data from the UK Biobank, was evaluated for the ability to estimate BP, glycated haemoglobin, estimated glomerular filtration rate and diagnoses from fundus images. RESULTS: In total, 301 participants were enrolled. Compared with the UK Biobank population used for algorithm development, participants from Kenya were younger, were more likely to report Black/African ethnicity, and had a higher body mass index and prevalence of DM and hypertension. The mean absolute error was comparable to or slightly greater than in the UK Biobank for systolic BP, diastolic BP, glycated haemoglobin and estimated glomerular filtration rate. The model trained to identify DM had an area under the receiver operating curve of 0.762 (0.818 in the UK Biobank) and the hypertension model had an area under the receiver operating curve of 0.765 (0.738 in the UK Biobank). CONCLUSIONS: In a Kenyan population, machine learning models estimated cardiovascular parameters with comparable or slightly lower accuracy than in the population where they were trained, suggesting model recalibration may be appropriate. This study represents an incremental step toward leveraging machine learning to make early cardiovascular screening more accessible, particularly in resource-limited settings.
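For context, the external-validation metrics reported above (mean absolute error for continuous estimates; area under the receiver operating curve for diagnoses) can be computed as in this sketch; the arrays are illustrative stand-ins for the study data.

```python
# Hedged sketch: MAE for a continuous target and AUROC for a diagnosis,
# as used to compare performance in Kenya vs the UK Biobank. Toy data only.
import numpy as np
from sklearn.metrics import mean_absolute_error, roc_auc_score

sbp_measured = np.array([128.0, 141.0, 119.0, 135.0, 152.0])   # mmHg
sbp_estimated = np.array([124.5, 138.2, 125.0, 131.8, 144.9])  # from fundus image
print("Systolic BP MAE:", mean_absolute_error(sbp_measured, sbp_estimated))

dm_status = np.array([0, 1, 0, 1, 1])                  # confirmed diabetes
dm_probability = np.array([0.2, 0.7, 0.4, 0.9, 0.6])   # model output
print("DM AUROC:", roc_auc_score(dm_status, dm_probability))
```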
Subject(s)
Cardiovascular Diseases, Deep Learning, Heart Disease Risk Factors, Humans, Kenya/epidemiology, Male, Female, Middle Aged, Prospective Studies, Adult, Cardiovascular Diseases/epidemiology, Cardiovascular Diseases/diagnosis, Cardiovascular Diseases/etiology, Hypertension/epidemiology, Hypertension/complications, Hypertension/diagnosis, Algorithms, Photography, Fundus Oculi, Aged, Diabetes Mellitus/epidemiology, Risk Factors, Diabetic Retinopathy/epidemiology, Diabetic Retinopathy/diagnosis
ABSTRACT
SARS-CoV-2 antibody levels may serve as a correlate for immunity and could inform optimal booster timing. The relationship between antibody levels and protection from infection was evaluated in vaccinated individuals from the US National Basketball Association who had antibody levels measured at a single time point from September 12, 2021, to December 31, 2021. Cox proportional hazards models were used to estimate the risk of infection within 90 days of serologic testing by antibody level (<250, 250-800, and >800 AU/mL), adjusting for age, time since last vaccine dose, and history of SARS-CoV-2 infection. Individuals were censored on the date of booster receipt. The analytic cohort comprised 2323 individuals and was 78.2% male, 68.1% aged ≤40 years, and 56.4% vaccinated (primary series) with the Pfizer-BioNTech mRNA vaccine. Among the 2248 (96.8%) individuals not yet boosted at antibody testing, 77% completed their primary vaccine series 4-6 months before testing, and the median antibody level was 293.5 (interquartile range: 121.0-740.5) AU/mL. Those with levels <250 AU/mL (adjusted hazard ratio [HR]: 2.4; 95% confidence interval [CI]: 1.5-3.7) and 250-800 AU/mL (adjusted HR: 1.5; 95% CI: 0.98-2.4) had greater infection risk compared with those with levels >800 AU/mL. Antibody levels could inform individual COVID-19 risk and booster scheduling.
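One concrete piece of this design is constructing the 90-day follow-up with censoring at booster receipt. A minimal sketch, assuming hypothetical field names, follows:

```python
# Hedged sketch: follow-up ends at infection, booster receipt, or 90 days
# after serologic testing, whichever comes first. Columns are hypothetical.
import pandas as pd

df = pd.read_csv(
    "serology_cohort.csv",
    parse_dates=["test_date", "infection_date", "booster_date"],
)

window_end = df["test_date"] + pd.Timedelta(days=90)
stop = pd.concat(
    [df["infection_date"], df["booster_date"], window_end], axis=1
).min(axis=1)  # missing (NaT) dates are ignored by the row-wise min

df["time_days"] = (stop - df["test_date"]).dt.days
df["event"] = (df["infection_date"] == stop).astype(int)  # infection observed
# `time_days`/`event` then feed the Cox model, with antibody category
# (<250, 250-800, >800 AU/mL) as the exposure of interest.
```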
Subject(s)
Basketball, COVID-19, Vaccines, Humans, Male, Female, COVID-19/prevention & control, SARS-CoV-2, Antibodies, Viral
ABSTRACT
BACKGROUND: This exploratory study compares self-reported COVID-19 vaccine side effects and breakthrough infections in people who described themselves as having diabetes with those who did not identify as having diabetes. OBJECTIVE: The study uses person-reported data to evaluate differences in the perception of COVID-19 vaccine side effects between adults with diabetes and those who did not report having diabetes. METHODS: This is a retrospective cohort study conducted using data provided online by adults aged 18 years and older residing in the United States. The participants, who voluntarily self-enrolled between March 19, 2021, and July 16, 2022, in the IQVIA COVID-19 Active Research Experience project, reported clinical and demographic information, COVID-19 vaccinations, any side effects experienced, and test-confirmed infections, and consented to linkage with prescription claims. No distinction was made for this study to differentiate prediabetes or type 1 and type 2 diabetes, nor to verify reports of positive COVID-19 tests. Person-reported medication use was validated using pharmacy claims, and a subset of the linked data was used for a sensitivity analysis of medication effects. Multivariate logistic regression was used to estimate the adjusted odds ratios of vaccine side effects or breakthrough infections by diabetes status, adjusting for age, gender, education, race, ethnicity (Hispanic or Latino), BMI, smoking status, receipt of an influenza vaccine, vaccine manufacturer, and all medical conditions. Evaluations of diabetes medication-specific vaccine side effects are illustrated graphically to support the examination of the magnitude of side effect differences for various medications and combinations of medications used to manage diabetes. RESULTS: People with diabetes (n=724) reported experiencing fewer side effects within 2 weeks of vaccination for COVID-19 than those without diabetes (n=6417; mean 2.7, SD 2.0 vs mean 3.1, SD 2.0). The adjusted risk of having a specific side effect or any side effect was lower among those with diabetes, with significant reductions in fatigue and headache but no differences in breakthrough infections over participants' maximum follow-up time. Diabetes medication use did not consistently affect the risk of specific side effects, whether using self-reported medication use or only diabetes medications confirmed by pharmacy health insurance claims for people who also reported having diabetes. CONCLUSIONS: People with diabetes reported fewer vaccine side effects than participants not reporting having diabetes, with a similar risk of breakthrough infection. TRIAL REGISTRATION: ClinicalTrials.gov NCT04368065; https://clinicaltrials.gov/study/NCT04368065.
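A minimal sketch of the multivariate logistic regression described in METHODS, with statsmodels and hypothetical column names standing in for the registry analytic file:

```python
# Hedged sketch: adjusted odds ratio for any side effect by diabetes status.
# `care_registry.csv` and all variable names are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("care_registry.csv")

fit = smf.logit(
    "any_side_effect ~ diabetes + age + C(gender) + C(education) + C(race)"
    " + hispanic + bmi + smoker + flu_vaccine + C(manufacturer)",
    data=df,
).fit()

odds_ratios = np.exp(fit.params)        # adjusted ORs
conf_int = np.exp(fit.conf_int())       # 95% CIs on the OR scale
print(odds_ratios["diabetes"], conf_int.loc["diabetes"].tolist())
```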
ABSTRACT
BACKGROUND: Transient traumatic neuropraxia of either the brachial plexus or cervical nerve root(s) is commonly described as a "stinger" or "burner" by the athlete. Stingers in American football commonly occur acutely as isolated injuries; however, concomitant injuries, including cervical spine pathologies, have also been reported. HYPOTHESIS: Among National Football League (NFL) athletes, the incidence rate of stingers is higher during the regular season than during the preseason and among positions with high-velocity impacts such as running backs, linebackers, defensive backs, and receivers. STUDY DESIGN: Retrospective epidemiology study. LEVEL OF EVIDENCE: Level 4. METHODS: All in-game injuries with a clinical impression of "neck brachial plexus stretch" or "neck brachial plexus compression" entered into the NFL injury surveillance database through the centralized league-wide electronic medical record system over 5 years (2015-2019 seasons) were aggregated. Incidence rates per player-play were calculated and reported. RESULTS: A total of 691 in-game stingers occurred during the study period, with a mean of 138.2 per year. Average single-season injury risk for an incident stinger was 3.74% (95% CI, 3.46%-4.05%). The incidence rate was higher during regular season games than during preseason games (12.26 per 100,000 player-plays [11.30-13.31] vs 8.87 [7.31-10.76]; P < 0.01). The highest reported stinger incidence rates were among running backs and linebackers (both >15 per 100,000 player-plays). Among stingers, 76.41% did not result in missed time. Of those that resulted in time lost from football activities, the mean time missed due to injury was 4.79 days (range, 3.17-6.41 days). Concomitant injuries were relatively infrequent (7.09%). CONCLUSION: In-game stinger incidence was stable across the study period, and stingers occurred most frequently in running backs and linebackers. Stingers were more common during the regular season, and most players did not miss time. Concomitant injuries were relatively rare. CLINICAL RELEVANCE: An improved understanding of the expected time loss due to stinger and concomitant injuries may provide insight for medical personnel in managing these injuries.
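The per-player-play incidence rates above can be reproduced in form (not in data) with an exact Poisson confidence interval, as in this sketch; the player-play denominator is an assumed illustrative figure.

```python
# Hedged sketch: incidence rate per 100,000 player-plays with an exact
# (Garwood) Poisson 95% CI. The event count comes from the abstract; the
# denominator is a made-up illustration.
from scipy.stats import chi2

events = 691               # in-game stingers, 2015-2019
player_plays = 5_600_000   # hypothetical total player-plays at risk

rate = events / player_plays * 1e5
lower = chi2.ppf(0.025, 2 * events) / 2 / player_plays * 1e5
upper = chi2.ppf(0.975, 2 * (events + 1)) / 2 / player_plays * 1e5
print(f"{rate:.2f} per 100,000 player-plays (95% CI {lower:.2f}-{upper:.2f})")
```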
Subject(s)
Football, Humans, Incidence, Retrospective Studies, Football/injuries, United States/epidemiology, Brachial Plexus/injuries, Athletic Injuries/epidemiology, Male
ABSTRACT
In January 2021, 999 COVID-19-positive adults in the US enrolled in an online, direct-to-patient registry to describe daily symptom severity and progression over 28 days. The most commonly reported and persistent symptoms were fatigue, headache, decreased sense of taste, decreased sense of smell, and cough. Fast-resolving symptoms included gastrointestinal symptoms (nausea, vomiting, diarrhea) and those related to fever and chills. While more than half (56%) of patients reported overall symptom improvement during the 28-day study period, 60% of patients were still reporting at least 1 COVID-19 symptom at the end of 28 days. Risk factors for experiencing symptoms for a longer duration included at least one of the following: older age (>60 years), higher BMI, lung disease, and receiving medication for hypertension. The study demonstrates the value of patient-reported data to provide important and timely insights into COVID-19 disease and symptom progression and the potential of using real-world data to inform clinical trial design and endpoints.
Subject(s)
COVID-19, Gastrointestinal Diseases, Adult, Humans, SARS-CoV-2, Symptom Assessment, Registries
ABSTRACT
Background: Shoulder instability encompasses a spectrum of glenohumeral pathology ranging from subluxation to dislocation. While dislocation frequently leads to removal from play, athletes are often able to play through subluxation. Previous research on glenohumeral instability among athletes has largely focused on missed-time injuries, which has likely disproportionately excluded subluxation injuries and underestimated the overall incidence of shoulder instability. Purpose: To describe the epidemiology of shoulder instability injuries resulting in no missed time beyond the date of injury (non-missed time injuries) among athletes in the National Football League (NFL). Study Design: Descriptive epidemiology study. Methods: The NFL's electronic medical record was retrospectively reviewed to identify non-missed time shoulder instability injuries during the 2015 through 2019 seasons. For each injury, player age, player position, shoulder laterality, instability type, instability direction, injury timing, injury setting, and injury mechanism were recorded. For injuries that occurred during games, incidence rates were calculated based on time during the season as well as player position. The influence of player position on instability direction was also investigated. Results: Of the 546 shoulder instability injuries documented during the study period, 162 were non-missed time injuries. The majority of non-missed time injuries were subluxations (97.4%), occurred during games (70.7%), and resulted from a contact mechanism (91.2%). The overall incidence rate of game-related instability was 1.6 injuries per 100,000 player-plays and was highest during the postseason (3.5 per 100,000 player-plays). The greatest proportion of non-missed time injuries occurred in defensive secondary players (28.4%) and offensive linemen (19.8%), while kickers/punters and defensive secondary players had the highest game incidence rates (5.5 and 2.1 per 100,000 player-plays, respectively). In terms of direction, 54.3% of instability events were posterior, 31.9% anterior, 8.5% multidirectional, and 5.3% inferior. Instability events were most often anterior among linebackers and wide receivers (50% and 100%, respectively), while posterior instability was most common in defensive linemen (66.7%), defensive secondary players (58.6%), quarterbacks (100.0%), running backs (55.6%), and tight ends (75.0%). Conclusion: The majority of non-missed time shoulder instability injuries (97.4%) were subluxations, which were likely excluded from or underreported in previous shoulder instability studies due to the inherent difficulty of detecting and diagnosing shoulder subluxation.
ABSTRACT
Real-world evidence (RWE) is being used to provide information on diverse groups of patients who may be highly impacted by disease but are not typically studied in traditional randomized clinical trials (RCTs) and to obtain insights from everyday care settings and real-world adherence to inform clinical practice. RWE is derived from so-called real-world data (RWD), ie, information generated by clinicians in the course of everyday patient care, and is sometimes coupled with systematic input from patients in the form of patient-reported outcomes or from wearable biosensors. Studies using RWD are conducted to evaluate how well medical interventions, services, and diagnostics perform under conditions of real-world use, and may include long-term follow-up. Here, we describe the main types of studies used to generate RWE and offer pointers for clinicians interested in study design and execution. Our tactical guidance addresses (1) opportunistic study designs, (2) considerations about the representativeness of study participants, (3) expectations for transparency about data provenance, handling, and quality assessments, and (4) considerations for strengthening studies using record linkage and/or randomization in pragmatic clinical trials. We also discuss likely sources of bias and suggest mitigation strategies. We see a future where clinical records, patient-generated data, and other RWD are brought together and harnessed by robust study design with efficient data capture and strong data curation. Traditional RCTs will remain the mainstay of drug development, but RWE will play a growing role in clinical, regulatory, and payer decision-making. The most meaningful RWE will come from collaboration with astute clinicians with deep practice experience and questioning minds, working closely with patients and researchers experienced in the development of RWE.
ABSTRACT
The impact of a prior SARS-CoV-2 infection on the progression of subsequent infections has been unclear. Using a convenience sample of 94,812 longitudinal RT-qPCR measurements from anterior nares and oropharyngeal swabs, we identified 71 individuals with two well-sampled SARS-CoV-2 infections between March 11th, 2020, and July 28th, 2022. We compared the SARS-CoV-2 viral kinetics of first vs. second infections in this group, adjusting for viral variant, vaccination status, and age. Relative to first infections, second infections usually featured a faster clearance time. Furthermore, a person's relative (rank-order) viral clearance time, compared to others infected with the same variant, was roughly conserved across first and second infections, so that individuals who had a relatively fast clearance time in their first infection also tended to have a relatively fast clearance time in their second infection (Spearman correlation coefficient: 0.30, 95% credible interval (0.12, 0.46)). These findings provide evidence that, like vaccination, immunity from a prior SARS-CoV-2 infection shortens the duration of subsequent acute SARS-CoV-2 infections principally by reducing viral clearance time. Additionally, there appears to be an inherent element of the immune response, or some other host factor, that shapes a person's relative ability to clear SARS-CoV-2 infection that persists across sequential infections.
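The conserved rank-order finding rests on a Spearman correlation between first- and second-infection clearance times. A sketch on simulated data, with a percentile bootstrap interval standing in for the study's Bayesian credible interval:

```python
# Hedged sketch: Spearman correlation of clearance times across sequential
# infections, with a percentile bootstrap 95% interval. Data are simulated.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n = 71
first = rng.gamma(shape=5.0, scale=2.0, size=n)        # clearance days (toy)
second = 0.7 * first + rng.normal(0.0, 3.0, size=n)    # faster, correlated

rho, _ = spearmanr(first, second)

boot = [
    spearmanr(first[i], second[i])[0]
    for i in (rng.integers(0, n, size=n) for _ in range(2000))
]
low, high = np.percentile(boot, [2.5, 97.5])
print(f"Spearman rho = {rho:.2f} (bootstrap 95% interval {low:.2f}, {high:.2f})")
```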
Subject(s)
COVID-19, Humans, SARS-CoV-2, COVID-19 Testing, Research Design, Kinetics
ABSTRACT
BACKGROUND: The National Basketball Association (NBA) suspended operations in response to the COVID-19 pandemic in March 2020. To safely complete the 2019-20 season, the NBA created a closed campus in Orlando, Florida, known as the NBA "Bubble." More than 5000 individuals lived, worked, and played basketball at a time of high local prevalence of SARS-CoV-2. METHODS: Stringent protocols governed campus life to protect NBA and support personnel from contracting COVID-19. Participants quarantined before departure and upon arrival. Medical and social protocols required that participants remain on campus, test regularly, physically distance, mask, use hand hygiene, and more. Cleaning, disinfection, and air filtration were enhanced. Campus residents were screened daily, and confirmed cases of COVID-19 were investigated. RESULTS: In the Bubble population, 148,043 COVID-19 reverse transcriptase PCR (RT-PCR) tests were performed across approximately 5000 individuals; Orlando had a 4% to 15% test positivity rate in this timeframe. Forty-four COVID-19 cases were diagnosed, either among persons during arrival quarantine or among non-team personnel working on campus who had been tested but had not yet received a positive result. No cases of COVID-19 were identified among NBA players or NBA team staff living in the Bubble once cleared from quarantine. CONCLUSIONS: Drivers of success included the requirement for players and team staff to reside and remain on campus, well-trained compliance monitors, unified communication, layers of protection between teams and the outside, activation of high-quality laboratory diagnostics, and available mental health services. An emphasis on data management, evidence-based decision-making, and the willingness to evolve protocols was instrumental to successful operations. These lessons hold broad applicability for future pandemic preparedness efforts.
Subject(s)
Basketball, COVID-19, Humans, Pandemics, Seasons, SARS-CoV-2
ABSTRACT
For many bacterial proteins, specific localizations within the cell have been demonstrated, but enzymes involved in central metabolism are usually considered to be homogeneously distributed within the cytoplasm. Here, we provide an example for a spatially defined localization of a unique enzyme complex found in actinobacteria, the hybrid pyruvate/2-oxoglutarate dehydrogenase complex (PDH-ODH). In non-actinobacterial cells, PDH and ODH form separate multienzyme complexes of megadalton size composed of three different subunits, E1, E2, and E3. The actinobacterial PDH-ODH complex is composed of four subunits, AceE (E1p), AceF (E2p), Lpd (E3), and OdhA (E1oE2o). Using fluorescence microscopy, we observed that in Corynebacterium glutamicum, all four subunits are co-localized in distinct spots at the cell poles, and in larger cells, additional spots are present at mid-cell. These results further confirm the existence of the hybrid complex. The unphosphorylated OdhI protein, which binds to OdhA and inhibits ODH activity, was co-localized with OdhA at the poles, whereas phosphorylated OdhI, which does not bind OdhA, was distributed in the entire cytoplasm. Isocitrate dehydrogenase and glutamate dehydrogenase, both metabolically linked to ODH, were evenly distributed in the cytoplasm. Based on the available structural data for individual PDH-ODH subunits, a novel supramolecular architecture of the hybrid complex differing from classical PDH and ODH complexes has to be postulated. Our results suggest that localization at the poles or at mid-cell is most likely caused by nucleoid exclusion and results in a spatially organized metabolism in actinobacteria, with consequences yet to be studied. IMPORTANCE Enzymes involved in the central metabolism of bacteria are usually considered to be distributed within the entire cytoplasm. Here, we provide an example for a spatially defined localization of a unique enzyme complex of actinobacteria, the hybrid pyruvate dehydrogenase/2-oxoglutarate dehydrogenase (PDH-ODH) complex composed of four different subunits. Using fusions with mVenus or mCherry and fluorescence microscopy, we show that all four subunits are co-localized in distinct spots at the cell poles, and in larger cells, additional spots were observed at mid-cell. These results clearly support the presence of the hybrid PDH-ODH complex and suggest a similar localization in other actinobacteria. The observation of a defined spatial localization of an enzyme complex catalyzing two key reactions of central metabolism poses questions regarding possible consequences for the availability of substrates and products within the cell and other bacterial enzyme complexes showing similar behavior.
ABSTRACT
This study (1) determined the association of time since the initial vaccine regimen, booster dose receipt, and COVID-19 history with antibody titer, as well as change in titer levels over a defined period, and (2) determined the risk of COVID-19 associated with low titer levels. This observational study used data from staff participating in the National Football League COVID-19 Monitoring Program. A cohort of staff consented to an antibody-focused sub-study, during which detailed longitudinal data were collected. Among all staff in the program who received antibody testing, COVID-19 incidence following antibody testing was determined. Five hundred eighty-six sub-study participants completed initial antibody testing; 80% (469) completed follow-up testing 50-101 days later. Among 389 individuals who were not boosted at initial testing, the odds of a titer < 1000 AU/mL (vs. ≥1000 AU/mL) increased 44% (odds ratio [OR] = 1.44, 95% confidence interval [CI]: 1.18-1.75) for every 30 days since the final dose. Among 126 participants boosted before initial testing with no COVID-19 history, 125 (99%) had a value > 2500 AU/mL; of the 90 who were tested at follow-up and did not develop COVID-19 in the interim, 86 (96%) remained at that value. One thousand fifty-seven fully vaccinated individuals (330 [29%] boosted at antibody testing) participating in the monitoring program were followed to determine COVID-19 status. Individuals with a titer value < 1000 AU/mL had twice the risk of COVID-19 as those with > 2500 AU/mL (HR = 2.02, 95% CI: 1.28-3.18). Antibody levels decrease postvaccination; boosting increases titer values. While antibody level is not a clear proxy for infection immunity, lower titer values are associated with higher COVID-19 incidence, suggesting increased protection from boosters.
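The "per 30 days" odds ratio above is a rescaling of a per-day logistic regression coefficient. A minimal sketch, with a hypothetical input file:

```python
# Hedged sketch: fit odds of low titer (<1000 AU/mL) against days since final
# dose, then rescale the per-day coefficient to a 30-day increment.
# `titer_cohort.csv` and its columns are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("titer_cohort.csv")  # columns: low_titer (0/1), days_since_dose

fit = smf.logit("low_titer ~ days_since_dose", data=df).fit()
or_per_30_days = np.exp(30 * fit.params["days_since_dose"])
print(round(or_per_30_days, 2))       # reported analogue: 1.44
```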
Subject(s)
COVID-19, Humans, Cohort Studies, COVID-19/epidemiology, COVID-19/prevention & control, Immunologic Tests, Odds Ratio, Vaccination, Antibodies, Viral
ABSTRACT
Sport-related concussion remains an area of high concern for contact sport athletes and their families, as well as for the medical and scientific communities. The National Football League (NFL), along with the NFL Players Association and experts in the field, has developed protocols for the detection and management of sport-related concussions. This article reviews the NFL's most recent concussion protocol, including preseason education and baseline testing for players; concussion surveillance by gameday medical teams, neurotrauma consultants, and athletic trainers; gameday concussion protocol and procedures; and return to participation guidelines.
ABSTRACT
Background: Anterior cruciate ligament (ACL) tears are a high-frequency injury requiring a lengthy recovery in professional American football players. Concomitant pathology associated with ACL tears as identified on magnetic resonance imaging (MRI) is not well understood in these athletes. Purpose: To describe the MRI findings of concomitant injuries associated with ACL tears among athletes in the National Football League (NFL). Study Design: Cross-sectional study; Level of evidence, 3. Methods: Of 314 ACL injuries in NFL athletes from 2015 through 2019, 191 complete MRI scans from the time of primary ACL injury were identified and reviewed by 2 fellowship-trained musculoskeletal radiologists. Data were collected on ACL tear type and location, as well as presence and location of bone bruises, meniscal tears, articular cartilage pathology, and concomitant ligament pathology. Mechanism data from video review were linked with imaging data to assess association between injury mechanism (contact vs noncontact) and presence of concomitant pathology. Results: Bone bruises were evident in 94.8% of ACL tears in this cohort, most often in the lateral tibial plateau (81%). Meniscal, additional ligamentous, and/or cartilage injury was present in 89% of these knees. Meniscal tears were present in 70% of knees, lateral (59%) more than medial (41%). Additional ligamentous injury was present in 71% of all MRI scans, more often a grade 1/2 sprain (67%) rather than a grade 3 tear (33%), and most often involving the medial collateral ligament (MCL) (57%) and least often the posterior cruciate ligament (10%). Chondral damage was evident in 49% of all MRI scans, with ≥1 full-thickness defect in 25% of all MRI scans, most often lateral. Most (79%) ACL tears did not involve direct contact to the injured lower extremity. Direct contact injuries (21%) were more likely to have a concomitant MCL tear and/or medial patellofemoral ligament injury and less likely to have a medial meniscal tear. Conclusion: ACL tears were rarely isolated injuries in this cohort of professional American football athletes. Bone bruises were almost always present, and additional meniscal, ligamentous, and chondral injuries were also common. MRI findings varied by injury mechanism.
ABSTRACT
OBJECTIVE: To describe cognitive symptoms in people not hospitalised at study enrolment for SARS-CoV-2 infection and associated demographics, medical history, other neuropsychiatric symptoms and SARS-CoV-2 vaccination. DESIGN: Longitudinal observational study. SETTING: Direct-to-participant registry with community-based recruitment via email and social media, including Google, Facebook and Reddit, targeting adult US residents. Demographics, medical history, COVID-19-like symptoms, tests and vaccinations were collected through enrolment and follow-up surveys. PARTICIPANTS: Participants who reported positive COVID-19 test results between 15 December 2020 and 13 December 2021. Those with cognitive symptoms were compared with those not reporting such symptoms. MAIN OUTCOME MEASURE: Self-reported cognitive symptoms (defined as 'feeling disoriented or having trouble thinking' from listed options or related written-in symptoms). RESULTS: Of 3908 participants with a positive COVID-19 test result, 1014 (25.9%) reported cognitive symptoms at any time point during enrolment or follow-up, with approximately half reporting moderate/severe symptoms. Cognitive symptoms were associated with other neuropsychiatric symptoms, including dysgeusia, anosmia, trouble waking up, insomnia, headache, anxiety and depression. In multivariate analyses, female sex (OR: 1.7, 95% CI: 1.3 to 2.2), age 40-49 years (OR: 1.5 (1.2-1.9) compared with 18-29 years), history of autoimmune disease (OR: 1.5 (1.2-2.1)), lung disease (OR: 1.7 (1.3-2.2)) and depression (OR: 1.4 (1.1-1.7)) were associated with cognitive symptoms. Conversely, black race (OR: 0.6 (0.5-0.9)) and COVID-19 vaccination before infection (OR: 0.6 (0.4-0.7)) were associated with reduced occurrence of cognitive symptoms. CONCLUSIONS: In this study, cognitive symptoms among COVID-19-positive participants were associated with female gender, age, autoimmune disorders, lung disease and depression. Vaccination and black race were associated with lower occurrence of cognitive symptoms. A constellation of neuropsychiatric and psychological symptoms occurred with cognitive symptoms. Our findings suggest COVID-19's full health and economic burden may be underestimated. TRIAL REGISTRATION NUMBER: NCT04368065.
Subject(s)
COVID-19, Adult, Humans, Female, Middle Aged, COVID-19/diagnosis, COVID-19/epidemiology, COVID-19 Vaccines, SARS-CoV-2, Anxiety/epidemiology, Cognition
ABSTRACT
BACKGROUND: Lower extremity (LEX) strains, including hamstring, quadriceps, adductor, and calf strains, are among the most common injuries in sports. These injuries lead to a high burden, resulting in significant missed participation time. PURPOSE: To describe the incidence of LEX strains in professional American football. STUDY DESIGN: Descriptive epidemiology study. METHODS: This study included all players who played in ≥1 National Football League (NFL) game or sustained a LEX strain during participation in the 2015-2019 seasons. LEX strains were identified in the standardized leaguewide electronic health record (n = 32 teams). LEX strain frequency was calculated by setting (game, practice, conditioning), timing in season (offseason, preseason, regular season, postseason), and roster position. Game incidence rates were calculated by season, roster position, and play type. RESULTS: Across 5 years, 5780 LEX strains were reported among 2769 players (1-year risk, 26.7%; 95% CI, 26.0%-27.3%); 69% (n = 4015) resulted in time loss. Among all LEX strains, 54.7% were hamstring (n = 3163), 24.1% adductor (n = 1393), 12.6% calf (n = 728), 8.3% quadriceps (n = 477), and 0.3% multiple muscle groups (n = 19). Most were reported during preseason practices (n = 1076; 27%) and regular season games (n = 1060; 26%). The 2-week period of training camp practices accounted for 19% of all time-loss strains. Among game injuries, preseason games had the highest rate of LEX strain (2.9/10,000 player-plays; 95% CI, 2.6-3.2). Defensive secondary players accounted for the highest proportion of time-loss LEX strains (27%; n = 1082). In games, punt plays had nearly twice the injury rate of kickoff plays (14.9/1000 plays [95% CI, 13.1-17.0] vs 7.5/1000 plays [95% CI, 6.2-8.9]) and >3 times the rate of pass plays (4.3/1000 plays; 95% CI, 4.0-4.7) and run plays (2.6/1000 plays; 95% CI, 2.3-2.9). In aggregate, LEX strains led to an estimated 16,748 participation days missed each year and a median of 12 days missed per injury. CONCLUSION: LEX strains affected 1 in 4 NFL players each year, resulting in a high burden of injury in terms of time lost from practice and competition. Safe return to the NFL season during training camp and reduction of injuries during regular season games are key focuses for future injury reduction.
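The 1-year risk figure is a binomial proportion; a sketch of the point estimate and a 95% confidence interval follows (the interval method and counts here are assumptions for illustration):

```python
# Hedged sketch: annual LEX strain risk as a binomial proportion with a
# normal-approximation 95% CI. The counts are illustrative player-season
# figures, not the surveillance denominators behind the published CI.
from statsmodels.stats.proportion import proportion_confint

injured = 739    # players with >=1 LEX strain in a season (illustrative)
at_risk = 2769   # players at risk that season (illustrative)

risk = injured / at_risk
low, high = proportion_confint(injured, at_risk, alpha=0.05, method="normal")
print(f"1-year risk {risk:.1%} (95% CI {low:.1%}-{high:.1%})")
```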
Subject(s)
Football, Sprains and Strains, Humans, Football/injuries, Lower Extremity/injuries, Sprains and Strains/epidemiology, Quadriceps Muscle/injuries
ABSTRACT
In this work, we aim to accurately predict the number of hospitalizations during the COVID-19 pandemic by developing a spatiotemporal prediction model. We propose HOIST, an Ising dynamics-based deep learning model for spatiotemporal COVID-19 hospitalization prediction. By drawing an analogy between locations and lattice sites in statistical mechanics, we use Ising dynamics to guide the model to extract and utilize spatial relationships across locations and to model the complex influence of granular information from real-world clinical evidence. By leveraging rich linked databases, including insurance claims, census information, and hospital resource usage data across the U.S., we evaluate the HOIST model on the large-scale spatiotemporal COVID-19 hospitalization prediction task for 2299 counties in the U.S. In the 4-week hospitalization prediction task, HOIST achieves, on average, a mean absolute error of 368.7, an R² of 0.6, and a concordance correlation coefficient of 0.89. Our detailed number needed to treat (NNT) and cost analysis suggests that future COVID-19 vaccination efforts may be most impactful in rural areas. This model may serve as a resource for future county- and state-level vaccination efforts.
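The concordance correlation coefficient reported for HOIST measures agreement between forecast and observed counts; below is a standard textbook implementation of Lin's CCC with toy data, not the authors' code.

```python
# Hedged sketch: Lin's concordance correlation coefficient,
# CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2).
import numpy as np

def lins_ccc(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    mx, my = y_true.mean(), y_pred.mean()
    vx, vy = y_true.var(), y_pred.var()    # population variances
    cov = ((y_true - mx) * (y_pred - my)).mean()
    return 2.0 * cov / (vx + vy + (mx - my) ** 2)

observed = np.array([120.0, 95.0, 210.0, 60.0, 140.0])  # weekly admissions (toy)
forecast = np.array([110.0, 100.0, 190.0, 75.0, 150.0])
print(round(lins_ccc(observed, forecast), 3))
```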