Results 1 - 20 of 542
1.
Nature ; 614(7946): 102-107, 2023 02.
Article in English | MEDLINE | ID: mdl-36697827

ABSTRACT

Living amphibians (Lissamphibia) include frogs and salamanders (Batrachia) and the limbless worm-like caecilians (Gymnophiona). The estimated Palaeozoic era gymnophionan-batrachian molecular divergence [1] suggests a major gap in the record of crown lissamphibians prior to their earliest fossil occurrences in the Triassic period [2-6]. Recent studies find a monophyletic Batrachia within dissorophoid temnospondyls [7-10], but the absence of pre-Jurassic period caecilian fossils [11,12] has made their relationships to batrachians and affinities to Palaeozoic tetrapods controversial [1,8,13,14]. Here we report the geologically oldest stem caecilian, a crown lissamphibian from the Late Triassic epoch of Arizona, USA, extending the caecilian record by around 35 million years. These fossils illuminate the tempo and mode of early caecilian morphological and functional evolution, demonstrating a delayed acquisition of musculoskeletal features associated with fossoriality in living caecilians, including the dual jaw closure mechanism [15,16], reduced orbits [17] and the tentacular organ [18]. The provenance of these fossils suggests a Pangaean equatorial origin for caecilians, implying that living caecilian biogeography reflects conserved aspects of caecilian function and physiology [19], in combination with vicariance patterns driven by plate tectonics [20]. These fossils reveal a combination of features that is unique to caecilians alongside features that are shared with batrachian and dissorophoid temnospondyls, providing new and compelling evidence supporting a single origin of living amphibians within dissorophoid temnospondyls.


Subject(s)
Amphibians , Anura , Fossils , Phylogeny , Urodela , Animals , Amphibians/anatomy & histology , Anura/anatomy & histology , Arizona , Urodela/anatomy & histology , Orbit/anatomy & histology , Jaw/anatomy & histology , Musculoskeletal System/anatomy & histology
2.
Ecol Lett ; 27(5): e14427, 2024 May.
Article in English | MEDLINE | ID: mdl-38698677

ABSTRACT

Tree diversity can promote both predator abundance and diversity. However, whether this translates into increased predation and top-down control of herbivores across predator taxonomic groups and contrasting environmental conditions remains unresolved. We used a global network of tree diversity experiments (TreeDivNet) spread across three continents and three biomes to test the effects of tree species richness on predation across varying climatic conditions of temperature and precipitation. We recorded bird and arthropod predation attempts on plasticine caterpillars in monocultures and tree species mixtures. Both tree species richness and temperature increased predation by birds but not by arthropods. Furthermore, the effects of tree species richness on predation were consistent across the studied climatic gradient. Our findings provide evidence that tree diversity strengthens top-down control of insect herbivores by birds, underscoring the need to implement conservation strategies that safeguard tree diversity to sustain ecosystem services provided by natural enemies in forests.


Subject(s)
Arthropods , Biodiversity , Birds , Climate , Predatory Behavior , Trees , Animals , Arthropods/physiology , Birds/physiology , Food Chain , Larva/physiology
3.
Am J Transplant ; 2024 Sep 26.
Article in English | MEDLINE | ID: mdl-39341343

ABSTRACT

In the US liver allocation system, non-standardized MELD exceptions (NSEs) increase the waitlist priority of candidates whose MELD scores are felt to underestimate their true medical urgency. We determined whether NSEs accurately depict pre-transplant mortality risk by performing mixed-effects Cox proportional hazards models and estimating concordance indices. We also studied the change in frequency of NSEs after the National Liver Review Board (NLRB) was implemented in May 2019. Between June 2016 and April 2022, 60,322 adult candidates were listed, of whom 10,280 (17.0%) received an NSE at least once. The mean allocation MELD was 23.9, an increase of 12.0 points from the mean laboratory MELD of 11.9 (p < 0.001). A one-point increase in allocation MELD score due to an NSE was associated with, on average, a 2% reduction in the hazard of pre-transplant death (cause-specific HR 0.98, 95% CI [0.96, 1.00], p = 0.02) compared to candidates with the same laboratory MELD. Laboratory MELD was more accurate than allocation MELD with NSEs in rank-ordering candidates (c-index 0.889 vs 0.857). The proportion of candidates with NSEs decreased significantly after the NLRB was introduced, from 21.5% to 12.8% (p < 0.001). NSEs substantially increase the waitlist priority of candidates with objectively low medical urgency.
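A toy illustration of the rank-ordering comparison described above: the sketch below computes Harrell's concordance index for a laboratory-based MELD score versus an exception-inflated allocation MELD on simulated waitlist data, using the lifelines package. All variable names, effect sizes, and distributions are invented for demonstration; this is not the study's code or data, and the mixed-effects Cox models are not reproduced.

```python
# Synthetic comparison of laboratory vs. allocation MELD as rank-ordering
# scores for pre-transplant death, using Harrell's c-index from lifelines.
import numpy as np
from lifelines.utils import concordance_index

rng = np.random.default_rng(0)
n = 1000
lab_meld = rng.integers(6, 41, size=n)                    # laboratory MELD (6-40)
nse_points = rng.choice([0, 10, 22], size=n, p=[0.83, 0.10, 0.07])
alloc_meld = np.minimum(lab_meld + nse_points, 40)        # allocation MELD with NSE points

# Simulate waitlist survival driven by laboratory MELD only, mimicking the
# finding that exception points add little prognostic information.
days_to_death = rng.exponential(scale=2000 / lab_meld)
censor_day = rng.uniform(30, 720, size=n)
observed_day = np.minimum(days_to_death, censor_day)
died = days_to_death <= censor_day

# concordance_index expects higher scores to predict longer survival,
# so the MELD scores are negated before scoring.
c_lab = concordance_index(observed_day, -lab_meld, died)
c_alloc = concordance_index(observed_day, -alloc_meld, died)
print(f"c-index, laboratory MELD: {c_lab:.3f}")
print(f"c-index, allocation MELD: {c_alloc:.3f}")
```

In a run like this, the laboratory score typically yields the higher c-index, which is the direction of the difference (0.889 vs 0.857) reported in the abstract.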

4.
Am J Transplant ; 2024 Sep 16.
Article in English | MEDLINE | ID: mdl-39293517

ABSTRACT

Donation after circulatory death (DCD) is driving the increase in deceased organ donors in the United States. Normothermic regional perfusion (NRP) and ex situ machine perfusion (es-MP) have been instrumental in improving liver transplant outcomes and graft utilization. This study examines the current landscape of liver utilization from cardiac DCD donors in the United States. Using the United Network for Organ Sharing Standard Transplant Analysis and Research file, all adult (≥18 years old) DCD donors in the United States from which the heart was used for transplantation from October 1, 2020, to September 30, 2023, were compared by procurement technique (NRP versus super rapid recovery [SRR]) and storage strategy (es-MP versus static cold storage). One hundred eighty-eight livers were transplanted from 309 thoracoabdominal NRP donors (61% utilization) versus 305 (56%) liver transplants from 544 SRR donors. es-MP was used in 20% (n = 38) of NRP cases versus 32% (n = 98) of SRR cases. Of the liver grafts, 281 (59%) were exposed to NRP, es-MP, or both. While there is widespread utilization of machine perfusion, more research is needed to determine optimal graft management strategies, particularly concerning the use of multiple technologies in complementary ways. More complete data collection is necessary at a national level to address these important research questions.

5.
J Urol ; 211(4): 552-562, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38299570

ABSTRACT

PURPOSE: Excess body and visceral fat increase the risk of death from prostate cancer (PCa). This phase II study aimed to test whether weight reduction by >5% of total body weight counteracts obesity-driven PCa biomarkers. MATERIALS AND METHODS: Forty men scheduled for prostatectomy were randomized into intervention (n = 20) or control (n = 20) arms. Intervention participants followed a weight management program for 4 to 16 weeks before and 6 months after surgery. Control participants received standardized educational materials. All participants attended visits at baseline, 1 week before surgery, and 6 months after surgery. Circulating immune cells, cytokines, and chemokines were evaluated. Weight loss, body composition/distribution, quality of life, and nutrition literacy were assessed. Prostate tissue samples obtained from biopsy and surgery were analyzed. RESULTS: From baseline to surgery (mean = 5 weeks), the intervention group achieved a 5.5% weight loss (95% CI, 4%-7%). Compared with the control group, the intervention also reduced insulin, total cholesterol, LDL cholesterol, leptin, the leptin:adiponectin ratio, and visceral adipose tissue. The intervention group had reduced C-peptide, plasminogen activator inhibitor-1, and T cell count from baseline to surgery. Myeloid-derived suppressor cells were not statistically different by group. Intervention group anthropometrics improved, including visceral and overall fat loss. No prostate tissue markers changed significantly. Quality of life measures of general and emotional health improved in the intervention group. The intervention group maintained their weight loss or continued losing, reaching a net loss of 11% of initial body weight (95% CI, 8%-14%) by the end of the study. CONCLUSIONS: Our study demonstrated improvements in body composition, PCa biomarkers, and quality of life with a weight management intervention.


Subject(s)
Leptin , Prostatic Neoplasms , Male , Humans , Prostate , Quality of Life , Adipose Tissue , Obesity/complications , Obesity/therapy , Biomarkers , Body Weight , Prostatic Neoplasms/therapy , Weight Loss
6.
Am J Kidney Dis ; 84(4): 416-426, 2024 Oct.
Article in English | MEDLINE | ID: mdl-38636649

ABSTRACT

RATIONALE & OBJECTIVE: The US Kidney Allocation System (KAS) prioritizes candidates with a ≤20% estimated posttransplant survival (EPTS) score to receive high-longevity kidneys, defined by a ≤20% Kidney Donor Profile Index (KDPI). Use of EPTS in the KAS deprioritizes candidates with older age, diabetes, and longer dialysis durations. We assessed whether this use also disadvantages race and ethnicity minority candidates, who are younger but more likely to have diabetes and longer durations of kidney failure requiring dialysis. STUDY DESIGN: Observational cohort study. SETTING & PARTICIPANTS: Adult candidates for and recipients of kidney transplantation represented in the Scientific Registry of Transplant Recipients from January 2015 through December 2020. EXPOSURE: Race and ethnicity. OUTCOME: Age-adjusted assignment to ≤20% EPTS, transplantation of a ≤20% KDPI kidney, and posttransplant survival in longevity-matched recipients, by race and ethnicity. ANALYTIC APPROACH: Multivariable logistic regression, Fine-Gray competing risks survival analysis, and Kaplan-Meier and Cox proportional hazards methods. RESULTS: The cohort included 199,444 candidates (7% Asian, 29% Black, 19% Hispanic or Latino, and 43% White) listed for deceased donor kidney transplantation. Non-White candidates had significantly higher rates of diabetes and longer dialysis durations, and were younger than White candidates. Adjusted for age, Asian, Black, and Hispanic or Latino candidates had significantly lower odds of having an EPTS score of ≤20% (odds ratio, 0.86 [95% CI, 0.81-0.91], 0.52 [95% CI, 0.50-0.54], and 0.49 [95% CI, 0.47-0.51], respectively) and were less likely to receive a ≤20% KDPI kidney (sub-hazard ratio, 0.70 [0.66-0.75], 0.89 [0.87-0.92], and 0.73 [0.71-0.76]) compared with White candidates. Among recipients with ≤20% EPTS scores transplanted with a ≤20% KDPI deceased donor kidney, Asian and Hispanic recipients had lower posttransplant mortality (HR, 0.45 [0.27-0.77] and 0.63 [0.47-0.86], respectively) and Black recipients had higher, but not statistically significant, posttransplant mortality (HR, 1.22 [0.99-1.52]) compared with White recipients. LIMITATIONS: Provider-reported race and ethnicity data and a 5-year posttransplant follow-up period. CONCLUSIONS: The US Kidney Allocation System is less likely to identify race and ethnicity minority candidates as having a ≤20% EPTS score, which triggers allocation of high-longevity deceased donor kidneys. These findings should inform the Organ Procurement and Transplantation Network about how to remedy the race and ethnicity disparities introduced through the KAS's current approach of allocating allografts with longer predicted longevity to recipients with longer estimated posttransplant survival. PLAIN-LANGUAGE SUMMARY: The US Kidney Allocation System prioritizes giving high-longevity, high-quality kidneys to patients on the waiting list who have a high estimated posttransplant survival (EPTS) score. EPTS is calculated based on the patient's age, whether the patient has diabetes, whether the patient has a history of organ transplantation, and the number of years spent on dialysis. Our analyses show that Asian, Black or African American, and Hispanic or Latino patients were less likely to receive high-longevity kidneys compared with White patients, despite having similar or better posttransplant survival outcomes.
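The age-adjusted odds of being assigned a ≤20% EPTS score were estimated with multivariable logistic regression; the sketch below shows, on entirely synthetic data, how such a model can be fit with statsmodels using White candidates as the reference group. The column names, simulated effect sizes, and the reduced covariate set (age only) are illustrative assumptions; the Fine-Gray and Cox analyses from the study are not reproduced here.

```python
# Synthetic example of an age-adjusted logistic regression for the odds of
# having an EPTS score <= 20%, with odds ratios relative to White candidates.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 5000
df = pd.DataFrame({
    "race": rng.choice(["White", "Black", "Hispanic", "Asian"], size=n,
                       p=[0.43, 0.29, 0.19, 0.09]),
    "age": rng.normal(52, 13, size=n).clip(18, 85),
})
# Synthetic outcome: younger candidates are more likely to have EPTS <= 20%.
logit_p = 4.0 - 0.09 * df["age"]
df["low_epts"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

model = smf.logit("low_epts ~ C(race, Treatment('White')) + age", data=df).fit(disp=False)
odds_ratios = np.exp(model.params).rename("OR")     # age-adjusted odds ratios
conf_int = np.exp(model.conf_int())                 # 95% confidence intervals
print(pd.concat([odds_ratios, conf_int], axis=1))
```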


Subject(s)
Kidney Transplantation , Tissue and Organ Procurement , Humans , Male , Female , Middle Aged , United States/epidemiology , Adult , Cohort Studies , Tissue Donors , Kidney Failure, Chronic/surgery , Kidney Failure, Chronic/ethnology , Kidney Failure, Chronic/mortality , Graft Survival , Aged , Ethnicity , Longevity , Registries , Racial Groups
7.
New Phytol ; 243(3): 1205-1219, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38855965

ABSTRACT

Decades of studies have demonstrated links between biodiversity and ecosystem functioning, yet the generality of the relationships and the underlying mechanisms remain unclear, especially for forest ecosystems. Using 11 tree-diversity experiments, we tested tree species richness-community productivity relationships and the role of arbuscular (AM) or ectomycorrhizal (ECM) fungal-associated tree species in these relationships. Tree species richness had a positive effect on community productivity across experiments, modified by the diversity of tree mycorrhizal associations. In communities with both AM and ECM trees, species richness showed positive effects on community productivity, which could have resulted from complementarity between AM and ECM trees. Moreover, both AM and ECM trees were more productive in mixed communities with both AM and ECM trees than in communities assembled by their own mycorrhizal type of trees. In communities containing only ECM trees, species richness had a significant positive effect on productivity, whereas species richness did not show any significant effects on productivity in communities containing only AM trees. Our study provides novel explanations for variations in diversity-productivity relationships by suggesting that tree-mycorrhiza interactions can shape productivity in mixed-species forest ecosystems.


Subject(s)
Biodiversity , Mycorrhizae , Trees , Mycorrhizae/physiology , Trees/microbiology , Species Specificity
8.
JAMA ; 331(6): 500-509, 2024 02 13.
Article in English | MEDLINE | ID: mdl-38349372

ABSTRACT

Importance: The US heart allocation system prioritizes medically urgent candidates with a high risk of dying without transplant. The current therapy-based 6-status system is susceptible to manipulation and has limited rank-ordering ability. Objective: To develop and validate a candidate risk score that incorporates current clinical, laboratory, and hemodynamic data. Design, Setting, and Participants: A registry-based observational study of adult heart transplant candidates (aged ≥18 years) from the US heart allocation system listed between January 1, 2019, and December 31, 2022, split by center into training (70%) and test (30%) datasets. Main Outcomes and Measures: A US candidate risk score (US-CRS) model was developed by adding a predefined set of predictors to the current French Candidate Risk Score (French-CRS) model. Sensitivity analyses were performed, which included intra-aortic balloon pumps (IABP) and percutaneous ventricular assist devices (VAD) in the definition of short-term mechanical circulatory support (MCS) for the US-CRS. Performance of the US-CRS model, French-CRS model, and 6-status model in the test dataset was evaluated by the time-dependent area under the receiver operating characteristic curve (AUC) for death without transplant within 6 weeks and by overall survival concordance (c-index) with integrated AUC. Results: A total of 16 905 adult heart transplant candidates were listed (mean [SD] age, 53 [13] years; 73% male; 58% White); 796 patients (4.7%) died without a transplant. The final US-CRS contained time-varying short-term MCS (venoarterial extracorporeal membrane oxygenation or temporary surgical VAD), the log of bilirubin, estimated glomerular filtration rate, the log of B-type natriuretic peptide, albumin, sodium, and durable left ventricular assist device. In the test dataset, the AUC for death within 6 weeks of listing was 0.79 (95% CI, 0.75-0.83) for the US-CRS model, 0.72 (95% CI, 0.67-0.76) for the French-CRS model, and 0.68 (95% CI, 0.62-0.73) for the 6-status model. The overall c-index was 0.76 (95% CI, 0.73-0.80) for the US-CRS model, 0.69 (95% CI, 0.65-0.73) for the French-CRS model, and 0.67 (95% CI, 0.63-0.71) for the 6-status model. Classifying IABP and percutaneous VAD as short-term MCS reduced the effect size by 54%. Conclusions and Relevance: In this registry-based study of US heart transplant candidates, a continuous multivariable allocation score outperformed the 6-status system in rank-ordering heart transplant candidates by medical urgency and may be useful for the medical urgency component of heart allocation.
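The study's headline metric is a time-dependent AUC for death without transplant within 6 weeks of listing, which properly accounts for censoring. As a simplified stand-in, the sketch below computes a plain fixed-horizon AUC on synthetic candidates with complete 6-week follow-up; the score, event rates, and data are invented for illustration, and the US-CRS itself is not implemented.

```python
# Simplified fixed-horizon AUC: how well does a risk score separate candidates
# who die within 6 weeks of listing from those who survive that window?
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 3000
risk_score = rng.normal(size=n)                     # stand-in for a US-CRS-like score
# Synthetic outcome: higher score -> higher chance of death within 6 weeks.
p_death_6wk = 1 / (1 + np.exp(-(risk_score - 3.0)))
died_within_6_weeks = rng.random(n) < p_death_6wk

auc = roc_auc_score(died_within_6_weeks, risk_score)
print(f"AUC for death within 6 weeks of listing: {auc:.2f}")
```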


Subject(s)
Heart Failure , Heart Transplantation , Tissue and Organ Procurement , Adult , Female , Humans , Male , Middle Aged , Bilirubin , Clinical Laboratory Services , Heart , Risk Factors , Risk Assessment , Heart Failure/mortality , Heart Failure/surgery , United States , Health Care Rationing/methods , Predictive Value of Tests , Tissue and Organ Procurement/methods , Tissue and Organ Procurement/organization & administration
9.
Can Assoc Radiol J ; : 8465371241266785, 2024 Jul 27.
Article in English | MEDLINE | ID: mdl-39066637

ABSTRACT

Purpose: This study evaluates the efficacy of a commercial medical Named Entity Recognition (NER) model combined with a post-processing protocol in identifying incidental pulmonary nodules from CT reports. Methods: We analyzed 9165 anonymized CT reports and classified them into 3 categories: no nodules, nodules present, and nodules >6 mm. For each report, a generic medical NER model annotated entities and their relations, which were then filtered through inclusion/exclusion criteria selected to identify pulmonary nodules. Ground truth was established by manual review. To better understand the relationship between model performance and nodule prevalence, a subset of the data was programmatically balanced to equalize the number of reports in each class category. Results: In the unbalanced subset of the data, the model achieved a sensitivity of 97%, specificity of 99%, and accuracy of 99% in detecting pulmonary nodules mentioned in the reports. For nodules >6 mm, sensitivity was 95%, specificity was 100%, and accuracy was 100%. In the balanced subset of the data, sensitivity was 99%, specificity 96%, and accuracy 97% for nodule detection; for larger nodules, sensitivity was 94%, specificity 99%, and accuracy 98%. Conclusions: The NER model demonstrated high sensitivity and specificity in detecting pulmonary nodules reported in CT scans, including those >6 mm, which are potentially clinically significant. The results were consistent across both unbalanced and balanced datasets, indicating that model performance is independent of nodule prevalence. Implementing this technology in hospital systems could automate the identification of at-risk patients, ensuring timely follow-up and potentially reducing missed or late-stage cancer diagnoses.
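The headline metrics above are standard binary-classification quantities. The short sketch below (toy labels, not the study's reports or pipeline) shows how sensitivity, specificity, and accuracy are derived from a confusion matrix once each report has been labeled by the NER-plus-filtering step and by manual review.

```python
# Toy calculation of sensitivity, specificity, and accuracy for a
# report-level nodule screen checked against manually reviewed ground truth.
from sklearn.metrics import confusion_matrix

# 1 = report describes a pulmonary nodule, 0 = no nodule (made-up labels)
ground_truth  = [1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 1]
ner_predicted = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1, 0, 1]

tn, fp, fn, tp = confusion_matrix(ground_truth, ner_predicted).ravel()
sensitivity = tp / (tp + fn)        # fraction of nodule reports flagged
specificity = tn / (tn + fp)        # fraction of nodule-free reports cleared
accuracy = (tp + tn) / (tp + tn + fp + fn)
print(f"sensitivity={sensitivity:.2f}  specificity={specificity:.2f}  accuracy={accuracy:.2f}")
```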

10.
Yale J Biol Med ; 97(2): 253-263, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38947109

ABSTRACT

Environmental mismatches are defined as changes in the environment that induce public health crises. Well known mismatches leading to chronic disease include the availability of technologies that facilitate unhealthy diets and sedentary lifestyles, both factors that adversely affect cardiovascular health. This commentary puts these mismatches in context with biota alteration, an environmental mismatch involving hygiene-related technologies necessary for avoidance of infectious disease. Implementation of hygiene-related technologies causes a loss of symbiotic helminths and protists, profoundly affecting immune function and facilitating a variety of chronic conditions, including allergic disorders, autoimmune diseases, and several inflammation-associated neuropsychiatric conditions. Unfortunately, despite an established understanding of the biology underpinning this and other environmental mismatches, public health agencies have failed to stem the resulting tide of increased chronic disease burden. Both biomedical research and clinical practice continue to focus on an ineffective and reactive pharmaceutical-based paradigm. It is argued that the healthcare of the future could take into account the biology of today, effectively and proactively dealing with environmental mismatch and the resulting chronic disease burden.


Subject(s)
Immune System Diseases , Humans , Chronic Disease , Animals , Environment
11.
Crit Care Med ; 51(8): 1012-1022, 2023 08 01.
Article in English | MEDLINE | ID: mdl-36995088

ABSTRACT

OBJECTIVES: A unilateral do-not-resuscitate (UDNR) order is a do-not-resuscitate order placed using clinician judgment which does not require consent from a patient or surrogate. This study assessed how UDNR orders were used during the COVID-19 pandemic. DESIGN: We analyzed a retrospective cross-sectional study of UDNR use at two academic medical centers between April 2020 and April 2021. SETTING: Two academic medical centers in the Chicago metropolitan area. PATIENTS: Patients admitted to an ICU between April 2020 and April 2021 who received vasopressor or inotropic medications to select for patients with high severity of illness. INTERVENTIONS: None. MEASUREMENTS AND MAIN RESULTS: The 1,473 patients meeting inclusion criteria were 53% male, median age 64 (interquartile range, 54-73), and 38% died during admission or were discharged to hospice. Clinicians placed do-not-resuscitate orders for 41% of patients (n = 604/1,473) and UDNR orders for 3% of patients (n = 51/1,473). The absolute rate of UDNR orders was higher for patients who were primary Spanish speaking (10% Spanish vs 3% English; p ≤ 0.0001), were Hispanic or Latinx (7% Hispanic/Latinx vs 3% Black vs 2% White; p = 0.003), were positive for COVID-19 (9% vs 3%; p ≤ 0.0001), or were intubated (5% vs 1%; p = 0.001). In the base multivariable logistic regression model including age, race/ethnicity, primary language spoken, and hospital location, Black race (adjusted odds ratio [aOR], 2.5; 95% CI, 1.3-4.9) and primary Spanish language (aOR, 4.4; 95% CI, 2.1-9.4) were associated with higher odds of a UDNR order. After adjusting the base model for severity of illness, primary Spanish language remained associated with higher odds of a UDNR order (aOR, 2.8; 95% CI, 1.7-4.7). CONCLUSIONS: In this multihospital study, UDNR orders were used more often for primary Spanish-speaking patients during the COVID-19 pandemic, which may be related to communication barriers that Spanish-speaking patients and families experience. Further study is needed to assess UDNR use across hospitals and enact interventions to improve potential disparities.


Subject(s)
COVID-19 , Humans , Male , Middle Aged , Female , Resuscitation Orders , Retrospective Studies , Cross-Sectional Studies , Pandemics
12.
J Card Fail ; 29(4): 517-526, 2023 04.
Article in English | MEDLINE | ID: mdl-36632933

ABSTRACT

Heart failure (HF) is a clinical syndrome that is divided into 3 subtypes based on the left ventricular ejection fraction. Every subtype has specific clinical characteristics and concomitant diseases, substantially increasing risk of thromboembolic complications, such as stroke, peripheral embolism and pulmonary embolism. Despite the annual prevalence of 1% and devastating clinical consequences, thromboembolic complications are not typically recognized as the leading problem in patients with HF, representing an underappreciated clinical challenge. Although the currently available data do not support routine anticoagulation in patients with HF and sinus rhythm, initial reports suggest that such strategy might be beneficial in a subset of patients at especially high thromboembolic risk. Considering the existing evidence gap, we aimed to review the currently available data regarding coagulation disorders in acute and chronic HF based on the insight from preclinical and clinical studies, to summarize the evidence regarding anticoagulation in HF in special-case scenarios and to outline future research directions so as to establish the optimal patient-tailored strategies for antiplatelet and anticoagulant therapy in HF. In summary, we highlight the top 10 pearls in the management of patients with HF and no other specific indications for oral anticoagulation therapy. Further studies are urgently needed to shed light on the pathophysiological role of platelet activation in HF and to evaluate whether antiplatelet or antithrombotic therapy could be beneficial in patients with HF. LAY SUMMARY: Heart failure (HF) is a clinical syndrome divided into 3 subtypes on the basis of the left ventricular systolic function. Every subtype has specific clinical characteristics and concomitant diseases, substantially increasing the risk of thromboembolic complications, such as stroke, peripheral embolism and pulmonary embolism. Despite the annual prevalence of 1% and devastating clinical consequences, thromboembolic complications are not typically recognized as the leading problem in patients with HF, representing an underappreciated clinical challenge. Although the currently available data do not support routine anticoagulation in patients with HF and no atrial arrhythmia, initial reports suggest that such a strategy might be beneficial in a subset of patients at especially high risk of thrombotic complications. Considering the existing evidence gap, we aimed to review the currently available data regarding coagulation problems in stable and unstable patients with HF based on the insight from preclinical and clinical studies, to summarize the evidence regarding anticoagulation in HF in specific patient groups and to outline future research directions to establish the optimal strategies for antiplatelet and anticoagulant therapy in HF, tailored to the needs of an individual patient. In summary, we highlight the top 10 pearls in the management of patients with HF and no other specific indications for oral anticoagulation therapy.


Subject(s)
Atrial Fibrillation , Blood Coagulation Disorders , Heart Failure , Pulmonary Embolism , Stroke , Thromboembolism , Humans , Stroke Volume , Heart Failure/complications , Heart Failure/drug therapy , Heart Failure/epidemiology , Ventricular Function, Left , Anticoagulants/therapeutic use , Thromboembolism/drug therapy , Thromboembolism/epidemiology , Thromboembolism/etiology , Stroke/etiology , Blood Coagulation Disorders/complications , Blood Coagulation Disorders/drug therapy , Arrhythmias, Cardiac , Atrial Fibrillation/complications
13.
J Surg Oncol ; 128(2): 280-288, 2023 Aug.
Article in English | MEDLINE | ID: mdl-37073788

ABSTRACT

BACKGROUND: Outcomes for pancreatic adenocarcinoma (PDAC) remain difficult to prognosticate. Multiple models attempt to predict survival following the resection of PDAC, but their utility in the neoadjuvant population is unknown. We aimed to assess their accuracy among patients that received neoadjuvant chemotherapy (NAC). METHODS: We performed a multi-institutional retrospective analysis of patients who received NAC and underwent resection of PDAC. Two prognostic systems were evaluated: the Memorial Sloan Kettering Cancer Center Pancreatic Adenocarcinoma Nomogram (MSKCCPAN) and the American Joint Committee on Cancer (AJCC) staging system. Discrimination between predicted and actual disease-specific survival was assessed using the Uno C-statistic and Kaplan-Meier method. Calibration of the MSKCCPAN was assessed using the Brier score. RESULTS: A total of 448 patients were included. There were 232 (51.8%) females, and the mean age was 64.1 years (±9.5). Most had AJCC Stage I or II disease (77.7%). For the MSKCCPAN, the Uno C-statistic at 12-, 24-, and 36-month time points was 0.62, 0.63, and 0.62, respectively. The AJCC system demonstrated similarly mediocre discrimination. The Brier score for the MSKCCPAN was 0.15 at 12 months, 0.26 at 24 months, and 0.30 at 36 months, demonstrating modest calibration. CONCLUSIONS: Current survival prediction models and staging systems for patients with PDAC undergoing resection after NAC have limited accuracy.


Subject(s)
Adenocarcinoma , Carcinoma, Pancreatic Ductal , Pancreatic Neoplasms , Female , Humans , Male , Middle Aged , Adenocarcinoma/surgery , Carcinoma, Pancreatic Ductal/drug therapy , Carcinoma, Pancreatic Ductal/surgery , Neoadjuvant Therapy , Neoplasm Staging , Nomograms , Pancreatic Neoplasms/drug therapy , Pancreatic Neoplasms/surgery , Prognosis , Retrospective Studies , Pancreatic Neoplasms
14.
Platelets ; 34(1): 2154330, 2023 Dec.
Article in English | MEDLINE | ID: mdl-36524601

ABSTRACT

Chronic kidney disease (CKD) is a global health problem and an independent risk factor for cardiovascular morbidity and mortality. Despite evidence-based therapies significantly improving cardiovascular mortality outcomes in the general population and those with non-dialysis-dependent CKD, this risk reduction has not translated to patients with end-stage kidney disease (ESKD). Because patients with ESKD have been absent from all major antiplatelet trials, safety data for P2Y12 inhibitor prescriptions are insufficient, and treatment in this subpopulation remains inequitable. This review article presents an overview of the progression of research in understanding antiplatelet therapy for ischaemic heart disease in patients with advanced CKD (defined as eGFR <30 mL/min/1.73 m2). Beyond trial recruitment strategies, new approaches should focus on registry documentation by CKD stage, risk stratification with biomarkers associated with inflammation and haemorrhage, and building a knowledge base on optimal duration of dual and single antiplatelet therapies.


What is the context? Patients with kidney disease are more likely to experience a heart attack than those without. Those with advanced kidney disease have a higher risk of death following a heart attack. Over the past two decades, advances in treatment following a heart attack have reduced the risk of death; however, this has not translated to those with advanced kidney disease. Progression of kidney disease influences antiplatelet (e.g., clopidogrel) treatment efficacy.
What is new? This contemporary review analyses registry and trial data to highlight some of the issues surrounding treatment inequity in patients with advanced kidney disease. This article describes potential mechanisms by which progression of kidney disease can influence clotting, bleeding and antiplatelet treatments.
What is the impact? Further research into antiplatelet therapy for patients with advanced kidney disease is required. Registry and trial data can improve upon classification of kidney disease for future research. Future trials in antiplatelet therapy for advanced kidney disease are anticipated.


Subject(s)
Coronary Artery Disease , Myocardial Ischemia , Renal Insufficiency, Chronic , Humans , Platelet Aggregation Inhibitors/pharmacology , Platelet Aggregation Inhibitors/therapeutic use , Vacuum , Renal Insufficiency, Chronic/complications , Renal Insufficiency, Chronic/drug therapy , Hemorrhage/complications , Coronary Artery Disease/complications , Myocardial Ischemia/complications , Myocardial Ischemia/drug therapy , Myocardial Ischemia/chemically induced
15.
Clin Oral Implants Res ; 34(1): 13-19, 2023 Jan.
Article in English | MEDLINE | ID: mdl-36245313

ABSTRACT

AIM: The aim of the present study was to evaluate soft and hard tissue alterations around implants with a modified marginal portion placed in a healed, sloped ridge over 3 years of follow-up. MATERIAL AND METHODS: Sixty-five patients with a single recipient implant site in an alveolar ridge with a lingual-buccal sloped configuration were recruited. Implants with a modified geometry in the marginal portion were installed in such a way that the sloped part of the device was located at the buccal and most apical position of the osteotomy preparation. Crowns were placed 21 weeks after implant placement. Radiologic examinations were performed at implant installation and at 1 and 3 years of follow-up. Bleeding on probing (BoP), probing pocket depth (PPD), and clinical attachment level (CAL; from the crown margin) were recorded at the insertion of the prosthesis and after 1 and 3 years. RESULTS: Fifty-seven patients with 57 implant-supported restorations attended the 3-year follow-up examination. The radiographic analysis revealed a mean marginal bone loss of 0.57 mm during the 3-year period. While the average bone loss between 1 and 3 years amounted to 0.30 mm, approximately 50% of the implants showed no bone loss during this period. The results from the clinical examinations showed a CAL gain of 0.11 ± 0.85 mm between baseline and 3 years of follow-up. About 65% of the implants showed no loss of attachment between 1 and 3 years. BoP and PPD ≥5 mm were identified at <10% of implants at the 3-year examination. CONCLUSION: Hard and soft tissues formed around dental implants that were designed to match the morphology of an alveolar ridge with a lingual-buccal sloped configuration remained stable over 3 years.


Subject(s)
Alveolar Bone Loss , Dental Implants, Single-Tooth , Dental Implants , Humans , Dental Implantation, Endosseous/methods , Prospective Studies , Alveolar Process/diagnostic imaging , Alveolar Process/surgery , Crowns , Alveolar Bone Loss/diagnostic imaging , Alveolar Bone Loss/surgery , Follow-Up Studies , Dental Prosthesis, Implant-Supported
16.
Can Assoc Radiol J ; 74(3): 548-556, 2023 Aug.
Article in English | MEDLINE | ID: mdl-36542834

ABSTRACT

PURPOSE: To develop and assess the performance of a machine learning model which screens chest radiographs for 14 labels, and to determine whether fine-tuning the model on local data improves its performance. Generalizability at different institutions has been an obstacle to machine learning model implementation. We hypothesized that the performance of a model trained on an open-source dataset will improve at our local institution after being fine-tuned on local data. METHODS: In this retrospective, institutional review board approved study, an ensemble of neural networks was trained on open-source datasets of chest radiographs for the detection of 14 labels. This model was then fine-tuned using 4510 local radiograph studies, using radiologists' reports as the gold standard to evaluate model performance. Both the open-source and fine-tuned models' accuracy were tested on 802 local radiographs. Receiver-operator characteristic curves were calculated, and statistical analysis was completed using DeLong's method and Wilcoxon signed-rank test. RESULTS: The fine-tuned model identified 12 of 14 pathology labels with area under the curves greater than .75. After fine-tuning with local data, the model performed statistically significantly better overall, and specifically in detecting six pathology labels (P < .01). CONCLUSIONS: A machine learning model able to accurately detect 14 labels simultaneously on chest radiographs was developed using open-source data, and its performance was improved after fine-tuning on local site data. This simple method of fine-tuning existing models on local data could improve the generalizability of existing models across different institutions to further improve their local performance.
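A rough sketch of the fine-tuning workflow the abstract describes, written in PyTorch: start from a 14-label chest radiograph classifier trained on open-source data and continue training on locally labeled studies. The checkpoint path, data loader, and hyperparameters are hypothetical placeholders rather than the authors' code, and the ensemble and DeLong comparison are not reproduced.

```python
# Sketch: fine-tune an open-source 14-label chest radiograph classifier on
# local data labeled from radiologists' reports (multi-label classification).
import torch
import torch.nn as nn
from torchvision import models

NUM_LABELS = 14

model = models.densenet121(weights=None)
model.classifier = nn.Linear(model.classifier.in_features, NUM_LABELS)
# Hypothetical checkpoint trained on open-source chest radiograph datasets.
model.load_state_dict(torch.load("open_source_cxr_model.pt"))

criterion = nn.BCEWithLogitsLoss()                          # one sigmoid per finding
optimizer = torch.optim.Adam(model.parameters(), lr=1e-5)   # small LR for fine-tuning

def finetune_one_epoch(model, local_loader):
    """local_loader yields (images, labels): images [B, 3, H, W], labels [B, 14] in {0, 1}."""
    model.train()
    for images, labels in local_loader:
        optimizer.zero_grad()
        logits = model(images)
        loss = criterion(logits, labels.float())
        loss.backward()
        optimizer.step()
```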


Subject(s)
Deep Learning , Humans , Retrospective Studies , Radiography , Machine Learning , Neural Networks, Computer
17.
Am J Transplant ; 22(6): 1683-1690, 2022 06.
Article in English | MEDLINE | ID: mdl-34951528

ABSTRACT

The Organ Procurement and Transplant Network (OPTN) implemented a new heart allocation policy on October 18, 2018. Published estimates of lower posttransplant survival under the new policy in cohorts with limited follow-up may be biased by informative censoring. Using the Scientific Registry of Transplant Recipients, we used the Kaplan-Meier method to estimate 1-year posttransplant survival for pre-policy (November 1, 2016, to October 31, 2017) and post-policy cohorts (November 1, 2018, to October 31, 2019) with follow-up through March 2, 2021. We adjusted for changes in recipient population over time with a multivariable Cox proportional hazards model. To demonstrate the effect of inadequate follow-up on post-policy survival estimates, we repeated the analysis but only included follow-up through October 31, 2019. Transplant programs transplanted 2594 patients in the pre-policy cohort and 2761 patients in the post-policy cohort. With follow-up through March 2, 2021, unadjusted 1-year posttransplant survival was 90.6% (89.5%-91.8%) in the pre-policy cohort and 90.8% (89.7%-91.9%) in the post-policy cohort (adjusted HR = 0.93 [0.77-1.12]). Ignoring follow-up after October 31, 2019, the post-policy estimate was biased downward (1-year: 82.2%). When estimated with adequate follow-up, 1-year posttransplant survival under the new heart allocation policy was not significantly different.
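The bias the authors describe arises when deaths are captured promptly but living patients' follow-up lags, so an early data cut censors survivors informatively. The deliberately simplified simulation below (synthetic data, not SRTR) contrasts a Kaplan-Meier 1-year survival estimate computed with a full year of follow-up against one computed when survivors are only followed to an early last-visit date, using the lifelines package.

```python
# Toy demonstration of downward bias in Kaplan-Meier 1-year survival when
# deaths are fully ascertained but survivors' follow-up is cut short.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(3)
n = 3000
true_death_day = rng.exponential(scale=3500, size=n)   # roughly 90% alive at 1 year

# Scenario A: adequate follow-up -- everyone observed for a full year.
time_a = np.minimum(true_death_day, 365)
event_a = true_death_day <= 365

# Scenario B: early data cut -- deaths within the year are still reported,
# but surviving patients are censored at their last recorded visit.
last_visit = rng.uniform(30, 365, size=n)
event_b = true_death_day <= 365
time_b = np.where(event_b, np.minimum(true_death_day, 365), last_visit)

for label, t, e in [("adequate follow-up", time_a, event_a),
                    ("early data cut", time_b, event_b)]:
    km = KaplanMeierFitter().fit(t, event_observed=e)
    print(f"{label}: estimated 1-year survival = {km.predict(365):.3f}")
```

With more complete follow-up, as in the abstract, the apparent survival deficit disappears.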


Subject(s)
Heart Transplantation , Tissue and Organ Procurement , Humans , Policy , Registries , Tissue Donors , Transplant Recipients
18.
Crit Care Med ; 50(2): 212-223, 2022 02 01.
Article in English | MEDLINE | ID: mdl-35100194

ABSTRACT

OBJECTIVES: Body temperature trajectories of infected patients are associated with specific immune profiles and survival. We determined the association between temperature trajectories and distinct manifestations of coronavirus disease 2019. DESIGN: Retrospective observational study. SETTING: Four hospitals within an academic healthcare system from March 2020 to February 2021. PATIENTS: All adult patients hospitalized with coronavirus disease 2019. INTERVENTIONS: Using a validated group-based trajectory model, we classified patients into four previously defined temperature trajectory subphenotypes using oral temperature measurements from the first 72 hours of hospitalization. Clinical characteristics, biomarkers, and outcomes were compared between subphenotypes. MEASUREMENTS AND MAIN RESULTS: The 5,903 hospitalized coronavirus disease 2019 patients were classified into four subphenotypes: hyperthermic slow resolvers (n = 1,452, 25%), hyperthermic fast resolvers (1,469, 25%), normothermics (2,126, 36%), and hypothermics (856, 15%). Hypothermics had abnormal coagulation markers, with the highest D-dimer and fibrin monomers (p < 0.001) and the highest prevalence of cerebrovascular accidents (10%, p = 0.001). The prevalence of venous thromboembolism was significantly different between subphenotypes (p = 0.005), with the highest rate in hypothermics (8.5%) and lowest in hyperthermic slow resolvers (5.1%). Hyperthermic slow resolvers had abnormal inflammatory markers, with the highest C-reactive protein, ferritin, and interleukin-6 (p < 0.001). Hyperthermic slow resolvers had increased odds of mechanical ventilation, vasopressors, and 30-day inpatient mortality (odds ratio, 1.58; 95% CI, 1.13-2.19) compared with hyperthermic fast resolvers. Over the course of the pandemic, we observed a drastic decrease in the prevalence of hyperthermic slow resolvers, from representing 53% of admissions in March 2020 to less than 15% by 2021. We found that dexamethasone use was associated with a significant reduction in the probability of hyperthermic slow resolver membership (27% reduction; 95% CI, 23-31%; p < 0.001). CONCLUSIONS: Hypothermics had abnormal coagulation markers, suggesting a hypercoagulable subphenotype. Hyperthermic slow resolvers had elevated inflammatory markers and the highest odds of mortality, suggesting a hyperinflammatory subphenotype. Future work should investigate whether temperature subphenotypes benefit from targeted antithrombotic and anti-inflammatory strategies.
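The subphenotypes above were derived with a previously validated group-based trajectory model. As a rough, plainly substituted stand-in, the sketch below clusters synthetic first-72-hour temperature curves with k-means to show the general idea of grouping patients by the shape of a vital-sign trajectory; it is not the authors' method and uses made-up data.

```python
# Illustrative stand-in: cluster 72-hour temperature curves with k-means
# to form four trajectory-like groups (synthetic data).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
n_patients, n_timepoints = 500, 12                  # e.g., a reading every 6 hours
baseline = rng.normal(37.0, 0.6, size=(n_patients, 1))
slope = rng.normal(0.0, 0.05, size=(n_patients, 1))
temps = baseline + slope * np.arange(n_timepoints) + rng.normal(0, 0.2, size=(n_patients, n_timepoints))

clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(temps)
for k in range(4):
    curve = temps[clusters == k].mean(axis=0)
    print(f"group {k}: mean start {curve[0]:.1f} °C, mean end {curve[-1]:.1f} °C")
```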


Subject(s)
Body Temperature , COVID-19/pathology , Hyperthermia/pathology , Hypothermia/pathology , Phenotype , Academic Medical Centers , Aged , Anti-Inflammatory Agents/therapeutic use , Biomarkers/blood , Blood Coagulation , Cohort Studies , Dexamethasone/therapeutic use , Female , Humans , Inflammation , Male , Middle Aged , Organ Dysfunction Scores , Retrospective Studies , SARS-CoV-2
19.
J Anat ; 240(1): 1-10, 2022 01.
Article in English | MEDLINE | ID: mdl-34346066

ABSTRACT

Snake venom is produced, transported and delivered by the sophisticated venom delivery system (VDS). When snakes bite, the venom travels from the venom gland through the venom duct into needle-like fangs that inject it into their prey. To counteract breakages, fangs are continuously replaced throughout life. Currently, the anatomy of the connection between the duct and the fang has not been described, and the mechanism by which the duct is reconnected to the replacement fang has not been identified. We examined the VDS in 3D in representative species from two families and one subfamily (Elapidae, Viperidae, Atractaspidinae) using contrast-enhanced microCT (diceCT), followed by dissection and histology. We observed that the venom duct bifurcates immediately anterior to the fangs so that both the original and replacement fangs are separately connected and functional in delivering venom. When a fang is absent, the canal leading to the empty position is temporarily closed. We found that elapid snakes have a crescent-shaped venom reservoir where venom likely pools before it enters the fang. These findings form the final piece of the puzzle of VDS anatomy in front-fanged venomous snakes. Additionally, they provide further evidence for independent evolution of the VDS in these three snake taxa.


Subject(s)
Tooth , Viperidae , Animals , Humans , Snake Venoms , Snakes/anatomy & histology , Tooth/anatomy & histology
20.
Eur J Pediatr ; 181(5): 1835-1857, 2022 May.
Article in English | MEDLINE | ID: mdl-35175416

ABSTRACT

Although widely believed by pediatricians and parents to be safe for use in infants and children when used as directed, increasing evidence indicates that early life exposure to paracetamol (acetaminophen) may cause long-term neurodevelopmental problems. Furthermore, recent studies in animal models demonstrate that cognitive development is exquisitely sensitive to paracetamol exposure during early development. In this study, evidence for the claim that paracetamol is safe was evaluated using a systematic literature search. Publications on PubMed between 1974 and 2017 that contained the keywords "infant" and either "paracetamol" or "acetaminophen" were considered. Of those initial 3096 papers, 218 were identified that made claims that paracetamol was safe for use with infants or children. From these 218, a total of 103 papers were identified as sources of authority for the safety claim. Conclusion: A total of 52 papers contained actual experiments designed to test safety, and had a median follow-up time of 48 h. None monitored neurodevelopment. Furthermore, no trial considered total exposure to drug since birth, eliminating the possibility that the effects of drug exposure on long-term neurodevelopment could be accurately assessed. On the other hand, abundant and sufficient evidence was found to conclude that paracetamol does not induce acute liver damage in babies or children when used as directed. What is Known: • Paracetamol (acetaminophen) is widely thought by pediatricians and parents to be safe when used as directed in the pediatric population, and is the most widely used drug in that population, with more than 90% of children exposed to the drug in some reports. • Paracetamol is known to cause liver damage in adults under conditions of oxidative stress or when used in excess, but increasing evidence from studies in humans and in laboratory animals indicates that the target organ for paracetamol toxicity during early development is the brain, not the liver. What is New: • This study finds hundreds of published reports in the medical literature asserting that paracetamol is safe when used as directed, providing a foundation for the widespread belief that the drug is safe. • This study shows that paracetamol was proven to be safe by approximately 50 short-term studies demonstrating the drug's safety for the pediatric liver, but the drug was never shown to be safe for neurodevelopment. Paracetamol is widely believed to be safe for infants and children when used as directed, despite mounting evidence in humans and in laboratory animals indicating that the drug is not safe for neurodevelopment. An exhaustive search of published work cited for safe use of paracetamol in the pediatric population revealed 52 experimental studies pointing toward safety, but the median follow-up time was only 48 h, and neurodevelopment was never assessed.


Subject(s)
Acetaminophen , Analgesics, Non-Narcotic , Acetaminophen/adverse effects , Analgesics, Non-Narcotic/adverse effects , Child , Humans