Results 1 - 20 of 544
1.
Nature ; 614(7946): 102-107, 2023 02.
Article in English | MEDLINE | ID: mdl-36697827

ABSTRACT

Living amphibians (Lissamphibia) include frogs and salamanders (Batrachia) and the limbless worm-like caecilians (Gymnophiona). The estimated Palaeozoic era gymnophionan-batrachian molecular divergence1 suggests a major gap in the record of crown lissamphibians prior to their earliest fossil occurrences in the Triassic period2-6. Recent studies find a monophyletic Batrachia within dissorophoid temnospondyls7-10, but the absence of pre-Jurassic period caecilian fossils11,12 has made their relationships to batrachians and affinities to Palaeozoic tetrapods controversial1,8,13,14. Here we report the geologically oldest stem caecilian-a crown lissamphibian from the Late Triassic epoch of Arizona, USA-extending the caecilian record by around 35 million years. These fossils illuminate the tempo and mode of early caecilian morphological and functional evolution, demonstrating a delayed acquisition of musculoskeletal features associated with fossoriality in living caecilians, including the dual jaw closure mechanism15,16, reduced orbits17 and the tentacular organ18. The provenance of these fossils suggests a Pangaean equatorial origin for caecilians, implying that living caecilian biogeography reflects conserved aspects of caecilian function and physiology19, in combination with vicariance patterns driven by plate tectonics20. These fossils reveal a combination of features that is unique to caecilians alongside features that are shared with batrachian and dissorophoid temnospondyls, providing new and compelling evidence supporting a single origin of living amphibians within dissorophoid temnospondyls.


Subjects
Amphibians , Anura , Fossils , Phylogeny , Urodela , Animals , Amphibians/anatomy & histology , Anura/anatomy & histology , Arizona , Urodela/anatomy & histology , Orbit/anatomy & histology , Jaw/anatomy & histology , Musculoskeletal System/anatomy & histology
2.
Ecol Lett ; 27(5): e14427, 2024 May.
Article in English | MEDLINE | ID: mdl-38698677

ABSTRACT

Tree diversity can promote both predator abundance and diversity. However, whether this translates into increased predation and top-down control of herbivores across predator taxonomic groups and contrasting environmental conditions remains unresolved. We used a global network of tree diversity experiments (TreeDivNet) spread across three continents and three biomes to test the effects of tree species richness on predation across varying climatic conditions of temperature and precipitation. We recorded bird and arthropod predation attempts on plasticine caterpillars in monocultures and tree species mixtures. Both tree species richness and temperature increased predation by birds but not by arthropods. Furthermore, the effects of tree species richness on predation were consistent across the studied climatic gradient. Our findings provide evidence that tree diversity strengthens top-down control of insect herbivores by birds, underscoring the need to implement conservation strategies that safeguard tree diversity to sustain ecosystem services provided by natural enemies in forests.


Subjects
Arthropods , Biodiversity , Birds , Climate , Predatory Behavior , Trees , Animals , Arthropods/physiology , Birds/physiology , Food Chain , Larva/physiology
3.
Am J Transplant ; 2024 Sep 26.
Article in English | MEDLINE | ID: mdl-39341343

ABSTRACT

In the US liver allocation system, nonstandardized model for end-stage liver disease (MELD) exceptions (NSEs) increase the waitlist priority of candidates whose MELD scores are felt to underestimate their true medical urgency. We determined whether NSEs accurately depict pretransplant mortality risk by performing mixed-effects Cox proportional hazards models and estimating concordance indices. We also studied the change in frequency of NSEs after the National Liver Review Board's implementation in May 2019. Between June 2016 and April 2022, 60,322 adult candidates were listed, of whom 10,280 (17.0%) received an NSE at least once. The mean allocation MELD was 23.9, an increase of 12.0 points from the mean laboratory MELD of 11.9 (P < .001). A 1-point increase in allocation MELD score due to an NSE was associated with, on average, a 2% reduction in hazard of pretransplant death (cause-specific hazard ratio: 0.98; 95% CI: 0.96, 1.00; P = .02) compared with those with the same laboratory MELD. Laboratory MELD was more accurate than allocation MELD with NSEs in rank-ordering candidates (c-index: 0.889 vs 0.857). The proportion of candidates with NSEs decreased significantly after the National Liver Review Board from 21.5% to 12.8% (P < .001). NSEs substantially increase the waitlist priority of candidates with objectively low medical urgency.
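A minimal sketch, in Python with the lifelines package, of the kind of analysis described above: fitting a Cox proportional hazards model for pretransplant death and comparing how well laboratory versus allocation MELD rank-order candidates via the concordance index. The synthetic cohort, column names, and effect sizes are assumptions for illustration only, not the study's data or code.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.utils import concordance_index

rng = np.random.default_rng(0)
n = 2000
# Synthetic waitlist cohort: higher laboratory MELD -> higher hazard of pretransplant death.
lab_meld = rng.integers(6, 40, n)
nse_bump = rng.choice([0, 10, 15], n, p=[0.83, 0.10, 0.07])   # hypothetical exception points
alloc_meld = np.minimum(lab_meld + nse_bump, 40)
hazard = 0.002 * np.exp(0.12 * (lab_meld - 20))                # death risk tracks laboratory MELD
time_to_death = rng.exponential(1 / hazard)
follow_up = rng.uniform(30, 720, n)
df = pd.DataFrame({
    "time": np.minimum(time_to_death, follow_up),
    "died": (time_to_death <= follow_up).astype(int),
    "lab_meld": lab_meld,
    "alloc_meld": alloc_meld,
})

# Cox model on laboratory MELD (allocation MELD would be fit the same way).
cph = CoxPHFitter().fit(df[["time", "died", "lab_meld"]], "time", "died")
cph.print_summary()

# Concordance: how well does each score rank-order candidates by pretransplant mortality risk?
c_lab = concordance_index(df["time"], -df["lab_meld"], df["died"])
c_alloc = concordance_index(df["time"], -df["alloc_meld"], df["died"])
print(f"c-index laboratory MELD: {c_lab:.3f}, allocation MELD with exception points: {c_alloc:.3f}")
```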

4.
Am J Transplant ; 2024 Sep 16.
Article in English | MEDLINE | ID: mdl-39293517

ABSTRACT

Donation after circulatory death (DCD) is driving the increase in deceased organ donors in the United States. Normothermic regional perfusion (NRP) and ex situ machine perfusion (es-MP) have been instrumental in improving liver transplant outcomes and graft utilization. This study examines the current landscape of liver utilization from cardiac DCD donors in the United States. Using the United Network for Organ Sharing Standard Transplant Analysis and Research file, all adult (≥18 years old) DCD donors in the United States from which the heart was used for transplantation from October 1, 2020, to September 30, 2023, were compared by procurement technique (NRP versus super rapid recovery [SRR]) and storage strategy (es-MP versus static cold storage). One hundred eighty-eight livers were transplanted from 309 thoracoabdominal NRP donors (61% utilization) versus 305 (56%) liver transplants from 544 SRR donors. es-MP was used in 20% (n = 38) of NRP cases versus 32% (98) of SRR cases. Of the liver grafts, 281 (59%) were exposed to NRP, es-MP, or both. While there is widespread utilization of machine perfusion, more research is needed to determine optimal graft management strategies, particularly concerning the use of multiple technologies in complementary ways. More complete data collection is necessary at a national level to address these important research questions.

5.
J Urol ; 211(4): 552-562, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38299570

ABSTRACT

PURPOSE: Excess body and visceral fat increase the risk of death from prostate cancer (PCa). This phase II study aimed to test whether weight reduction of >5% of total body weight counteracts obesity-driven PCa biomarkers. MATERIALS AND METHODS: Forty men scheduled for prostatectomy were randomized into intervention (n = 20) or control (n = 20) arms. Intervention participants followed a weight management program for 4 to 16 weeks before and 6 months after surgery. Control participants received standardized educational materials. All participants attended visits at baseline, 1 week before surgery, and 6 months after surgery. Circulating immune cells, cytokines, and chemokines were evaluated. Weight loss, body composition/distribution, quality of life, and nutrition literacy were assessed. Prostate tissue samples obtained from biopsy and surgery were analyzed. RESULTS: From baseline to surgery (mean = 5 weeks), the intervention group achieved 5.5% weight loss (95% CI, 4%-7%). Compared with the control group, the intervention also reduced insulin, total cholesterol, LDL cholesterol, leptin, the leptin:adiponectin ratio, and visceral adipose tissue. The intervention group had reduced C-peptide, plasminogen activator inhibitor-1, and T cell count from baseline to surgery. Myeloid-derived suppressor cells were not statistically different by group. Intervention group anthropometrics improved, including visceral and overall fat loss. No prostate tissue markers changed significantly. Quality-of-life measures of general and emotional health improved in the intervention group. By the end of the study, the intervention group had maintained or extended its weight loss to a net loss of 11% of initial body weight (95% CI, 8%-14%). CONCLUSIONS: Our study demonstrated improvements in body composition, PCa biomarkers, and quality of life with a weight management intervention.


Subjects
Leptin , Prostatic Neoplasms , Male , Humans , Prostate , Quality of Life , Adipose Tissue , Obesity/complications , Obesity/therapy , Biomarkers , Body Weight , Prostatic Neoplasms/therapy , Weight Loss
6.
Am J Kidney Dis ; 84(4): 416-426, 2024 Oct.
Article in English | MEDLINE | ID: mdl-38636649

ABSTRACT

RATIONALE & OBJECTIVE: The US Kidney Allocation System (KAS) prioritizes candidates with a≤20% estimated posttransplant survival (EPTS) to receive high-longevity kidneys defined by a≤20% Kidney Donor Profile Index (KDPI). Use of EPTS in the KAS deprioritizes candidates with older age, diabetes, and longer dialysis durations. We assessed whether this use also disadvantages race and ethnicity minority candidates, who are younger but more likely to have diabetes and longer durations of kidney failure requiring dialysis. STUDY DESIGN: Observational cohort study. SETTING & PARTICIPANTS: Adult candidates for and recipients of kidney transplantation represented in the Scientific Registry of Transplant Recipients from January 2015 through December 2020. EXPOSURE: Race and ethnicity. OUTCOME: Age-adjusted assignment to≤20% EPTS, transplantation of a≤20% KDPI kidney, and posttransplant survival in longevity-matched recipients by race and ethnicity. ANALYTIC APPROACH: Multivariable logistic regression, Fine-Gray competing risks survival analysis, and Kaplan-Meier and Cox proportional hazards methods. RESULTS: The cohort included 199,444 candidates (7% Asian, 29% Black, 19% Hispanic or Latino, and 43% White) listed for deceased donor kidney transplantation. Non-White candidates had significantly higher rates of diabetes, longer dialysis duration, and were younger than White candidates. Adjusted for age, Asian, Black, and Hispanic or Latino candidates had significantly lower odds of having a ETPS score of≤20% (odds ratio, 0.86 [95% CI, 0.81-0.91], 0.52 [95% CI, 0.50-0.54], and 0.49 [95% CI, 0.47-0.51]), and were less likely to receive a≤20% KDPI kidney (sub-hazard ratio, 0.70 [0.66-0.75], 0.89 [0.87-0.92], and 0.73 [0.71-0.76]) compared with White candidates. Among recipients with≤20% EPTS scores transplanted with a≤20% KDPI deceased donor kidney, Asian and Hispanic recipients had lower posttransplant mortality (HR, 0.45 [0.27-0.77] and 0.63 [0.47-0.86], respectively) and Black recipients had higher but not statistically significant posttransplant mortality (HR, 1.22 [0.99-1.52]) compared with White recipients. LIMITATIONS: Provider reported race and ethnicity data and 5-year post transplant follow-up period. CONCLUSIONS: The US kidney allocation system is less likely to identify race and ethnicity minority candidates as having a≤20% EPTS score, which triggers allocation of high-longevity deceased donor kidneys. These findings should inform the Organ Procurement and Transplant Network about how to remedy the race and ethnicity disparities introduced through KAS's current approach of allocating allografts with longer predicted longevity to recipients with longer estimated posttransplant survival. PLAIN-LANGUAGE SUMMARY: The US Kidney Allocation System prioritizes giving high-longevity, high-quality kidneys to patients on the waiting list who have a high estimated posttransplant survival (EPTS) score. EPTS is calculated based on the patient's age, whether the patient has diabetes, whether the patient has a history of organ transplantation, and the number of years spent on dialysis. Our analyses show that Asian, Black or African American, and Hispanic or Latino patients were less likely to receive high-longevity kidneys compared with White patients, despite having similar or better posttransplant survival outcomes.


Subjects
Kidney Transplantation , Tissue and Organ Procurement , Humans , Male , Female , Middle Aged , United States/epidemiology , Adult , Cohort Studies , Tissue Donors , Kidney Failure, Chronic/surgery , Kidney Failure, Chronic/ethnology , Kidney Failure, Chronic/mortality , Graft Survival , Aged , Ethnicity , Longevity , Registries , Racial Groups
7.
New Phytol ; 243(3): 1205-1219, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38855965

ABSTRACT

Decades of studies have demonstrated links between biodiversity and ecosystem functioning, yet the generality of the relationships and the underlying mechanisms remain unclear, especially for forest ecosystems. Using 11 tree-diversity experiments, we tested tree species richness-community productivity relationships and the role of arbuscular (AM) or ectomycorrhizal (ECM) fungal-associated tree species in these relationships. Tree species richness had a positive effect on community productivity across experiments, modified by the diversity of tree mycorrhizal associations. In communities with both AM and ECM trees, species richness showed positive effects on community productivity, which could have resulted from complementarity between AM and ECM trees. Moreover, both AM and ECM trees were more productive in mixed communities with both AM and ECM trees than in communities assembled by their own mycorrhizal type of trees. In communities containing only ECM trees, species richness had a significant positive effect on productivity, whereas species richness did not show any significant effects on productivity in communities containing only AM trees. Our study provides novel explanations for variations in diversity-productivity relationships by suggesting that tree-mycorrhiza interactions can shape productivity in mixed-species forest ecosystems.


Subjects
Biodiversity , Mycorrhizae , Trees , Mycorrhizae/physiology , Trees/microbiology , Species Specificity
8.
JAMA ; 331(6): 500-509, 2024 02 13.
Article in English | MEDLINE | ID: mdl-38349372

ABSTRACT

Importance: The US heart allocation system prioritizes medically urgent candidates with a high risk of dying without transplant. The current therapy-based 6-status system is susceptible to manipulation and has limited rank ordering ability. Objective: To develop and validate a candidate risk score that incorporates current clinical, laboratory, and hemodynamic data. Design, Setting, and Participants: A registry-based observational study of adult heart transplant candidates (aged ≥18 years) from the US heart allocation system listed between January 1, 2019, and December 31, 2022, split by center into training (70%) and test (30%) datasets. Main Outcomes and Measures: A US candidate risk score (US-CRS) model was developed by adding a predefined set of predictors to the current French Candidate Risk Score (French-CRS) model. Sensitivity analyses were performed, which included intra-aortic balloon pumps (IABP) and percutaneous ventricular assist devices (VAD) in the definition of short-term mechanical circulatory support (MCS) for the US-CRS. Performance of the US-CRS model, French-CRS model, and 6-status model in the test dataset was evaluated by time-dependent area under the receiver operating characteristic curve (AUC) for death without transplant within 6 weeks and overall survival concordance (c-index) with integrated AUC. Results: A total of 16 905 adult heart transplant candidates were listed (mean [SD] age, 53 [13] years; 73% male; 58% White); 796 patients (4.7%) died without a transplant. The final US-CRS contained time-varying short-term MCS (ventricular assist-extracorporeal membrane oxygenation or temporary surgical VAD), the log of bilirubin, estimated glomerular filtration rate, the log of B-type natriuretic peptide, albumin, sodium, and durable left ventricular assist device. In the test dataset, the AUC for death within 6 weeks of listing was 0.79 (95% CI, 0.75-0.83) for the US-CRS model, 0.72 (95% CI, 0.67-0.76) for the French-CRS model, and 0.68 (95% CI, 0.62-0.73) for the 6-status model. The overall c-index was 0.76 (95% CI, 0.73-0.80) for the US-CRS model, 0.69 (95% CI, 0.65-0.73) for the French-CRS model, and 0.67 (95% CI, 0.63-0.71) for the 6-status model. Classifying IABP and percutaneous VAD as short-term MCS reduced the effect size by 54%. Conclusions and Relevance: In this registry-based study of US heart transplant candidates, a continuous multivariable allocation score outperformed the 6-status system in rank ordering heart transplant candidates by medical urgency and may be useful for the medical urgency component of heart allocation.
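A simplified sketch of comparing a continuous score against a coarse status ranking by discrimination for death within 6 weeks. For brevity this reduces the problem to a plain ROC AUC on a binary 6-week outcome (the study used time-dependent AUC, which additionally handles censoring); all names, effect sizes, and data are assumptions.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 5000
# Synthetic candidate risk: a continuous multivariable score tracks true risk more closely
# than a coarse 6-level status assignment.
true_risk = rng.normal(size=n)
continuous_score = true_risk + rng.normal(scale=0.6, size=n)
status_cutpoints = np.quantile(true_risk, [0.1, 0.3, 0.6, 0.8, 0.95])
six_status = np.digitize(true_risk + rng.normal(scale=1.2, size=n), status_cutpoints)
died_6wk = rng.uniform(size=n) < 1 / (1 + np.exp(-(true_risk - 2.5)))   # ~5% event rate

print("AUC, continuous score :", round(roc_auc_score(died_6wk, continuous_score), 3))
print("AUC, 6-status ranking :", round(roc_auc_score(died_6wk, six_status), 3))
```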


Subjects
Heart Failure , Heart Transplantation , Tissue and Organ Procurement , Adult , Female , Humans , Male , Middle Aged , Bilirubin , Clinical Laboratory Services , Heart , Risk Factors , Risk Assessment , Heart Failure/mortality , Heart Failure/surgery , United States , Health Care Rationing/methods , Predictive Value of Tests , Tissue and Organ Procurement/methods , Tissue and Organ Procurement/organization & administration
9.
Can Assoc Radiol J ; : 8465371241266785, 2024 Jul 27.
Article in English | MEDLINE | ID: mdl-39066637

ABSTRACT

Purpose: This study evaluates the efficacy of a commercial medical Named Entity Recognition (NER) model combined with a post-processing protocol in identifying incidental pulmonary nodules from CT reports. Methods: We analyzed 9165 anonymized CT reports and classified them into 3 categories: no nodules, nodules present, and nodules >6 mm. For each report, a generic medical NER model annotated entities and their relations, which were then filtered through inclusion/exclusion criteria selected to identify pulmonary nodules. Ground truth was established by manual review. To better understand the relationship between model performance and nodule prevalence, a subset of the data was programmatically balanced to equalize the number of reports in each class category. Results: In the unbalanced subset of the data, the model achieved a sensitivity of 97%, specificity of 99%, and accuracy of 99% in detecting pulmonary nodules mentioned in the reports. For nodules >6 mm, sensitivity was 95%, specificity was 100%, and accuracy was 100%. In the balanced subset of the data, sensitivity was 99%, specificity 96%, and accuracy 97% for nodule detection; for larger nodules, sensitivity was 94%, specificity 99%, and accuracy 98%. Conclusions: The NER model demonstrated high sensitivity and specificity in detecting pulmonary nodules reported in CT scans, including those >6 mm which are potentially clinically significant. The results were consistent across both unbalanced and balanced datasets indicating that the model performance is independent of nodule prevalence. Implementing this technology in hospital systems could automate the identification of at-risk patients, ensuring timely follow-up and potentially reducing missed or late-stage cancer diagnoses.
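A toy sketch of the post-processing idea described above: generic NER output filtered through inclusion/exclusion rules to flag reports mentioning pulmonary nodules, then scored against a manual ground truth. The entity format, keyword rules, and labels here are assumptions for illustration, not the commercial model's actual output.

```python
import re

# Hypothetical NER output: one list of (entity_text, label) per report.
reports_entities = [
    [("6 mm nodule", "FINDING"), ("right upper lobe", "ANATOMY")],
    [("no pulmonary nodule", "NEGATED_FINDING")],
    [("granuloma", "FINDING")],
]
ground_truth = [1, 0, 0]  # manual review: nodule present?

INCLUDE = re.compile(r"\bnodule", re.IGNORECASE)   # inclusion keyword
EXCLUDE_LABELS = {"NEGATED_FINDING"}               # exclusion rule: negated mentions

def has_nodule(entities):
    """Apply inclusion/exclusion criteria over the annotated entities of one report."""
    return any(INCLUDE.search(text) and label not in EXCLUDE_LABELS
               for text, label in entities)

preds = [int(has_nodule(e)) for e in reports_entities]
tp = sum(p == 1 and t == 1 for p, t in zip(preds, ground_truth))
tn = sum(p == 0 and t == 0 for p, t in zip(preds, ground_truth))
fp = sum(p == 1 and t == 0 for p, t in zip(preds, ground_truth))
fn = sum(p == 0 and t == 1 for p, t in zip(preds, ground_truth))
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
accuracy = (tp + tn) / len(ground_truth)
print(sensitivity, specificity, accuracy)
```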

10.
Yale J Biol Med ; 97(2): 253-263, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38947109

ABSTRACT

Environmental mismatches are defined as changes in the environment that induce public health crises. Well known mismatches leading to chronic disease include the availability of technologies that facilitate unhealthy diets and sedentary lifestyles, both factors that adversely affect cardiovascular health. This commentary puts these mismatches in context with biota alteration, an environmental mismatch involving hygiene-related technologies necessary for avoidance of infectious disease. Implementation of hygiene-related technologies causes a loss of symbiotic helminths and protists, profoundly affecting immune function and facilitating a variety of chronic conditions, including allergic disorders, autoimmune diseases, and several inflammation-associated neuropsychiatric conditions. Unfortunately, despite an established understanding of the biology underpinning this and other environmental mismatches, public health agencies have failed to stem the resulting tide of increased chronic disease burden. Both biomedical research and clinical practice continue to focus on an ineffective and reactive pharmaceutical-based paradigm. It is argued that the healthcare of the future could take into account the biology of today, effectively and proactively dealing with environmental mismatch and the resulting chronic disease burden.


Subjects
Immune System Diseases , Humans , Chronic Disease , Animals , Environment
11.
Crit Care Med ; 51(8): 1012-1022, 2023 08 01.
Article in English | MEDLINE | ID: mdl-36995088

ABSTRACT

OBJECTIVES: A unilateral do-not-resuscitate (UDNR) order is a do-not-resuscitate order placed using clinician judgment which does not require consent from a patient or surrogate. This study assessed how UDNR orders were used during the COVID-19 pandemic. DESIGN: We analyzed a retrospective cross-sectional study of UDNR use at two academic medical centers between April 2020 and April 2021. SETTING: Two academic medical centers in the Chicago metropolitan area. PATIENTS: Patients admitted to an ICU between April 2020 and April 2021 who received vasopressor or inotropic medications to select for patients with high severity of illness. INTERVENTIONS: None. MEASUREMENTS AND MAIN RESULTS: The 1,473 patients meeting inclusion criteria were 53% male, median age 64 (interquartile range, 54-73), and 38% died during admission or were discharged to hospice. Clinicians placed do not resuscitate orders for 41% of patients ( n = 604/1,473) and UDNR orders for 3% of patients ( n = 51/1,473). The absolute rate of UDNR orders was higher for patients who were primary Spanish speaking (10% Spanish vs 3% English; p ≤ 0.0001), were Hispanic or Latinx (7% Hispanic/Latinx vs 3% Black vs 2% White; p = 0.003), positive for COVID-19 (9% vs 3%; p ≤ 0.0001), or were intubated (5% vs 1%; p = 0.001). In the base multivariable logistic regression model including age, race/ethnicity, primary language spoken, and hospital location, Black race (adjusted odds ratio [aOR], 2.5; 95% CI, 1.3-4.9) and primary Spanish language (aOR, 4.4; 95% CI, 2.1-9.4) had higher odds of UDNR. After adjusting the base model for severity of illness, primary Spanish language remained associated with higher odds of UDNR order (aOR, 2.8; 95% CI, 1.7-4.7). CONCLUSIONS: In this multihospital study, UDNR orders were used more often for primary Spanish-speaking patients during the COVID-19 pandemic, which may be related to communication barriers Spanish-speaking patients and families experience. Further study is needed to assess UDNR use across hospitals and enact interventions to improve potential disparities.


Subjects
COVID-19 , Humans , Male , Middle Aged , Female , Resuscitation Orders , Retrospective Studies , Cross-Sectional Studies , Pandemics
12.
J Card Fail ; 29(4): 517-526, 2023 04.
Article in English | MEDLINE | ID: mdl-36632933

ABSTRACT

Heart failure (HF) is a clinical syndrome that is divided into 3 subtypes based on the left ventricular ejection fraction. Every subtype has specific clinical characteristics and concomitant diseases, substantially increasing risk of thromboembolic complications, such as stroke, peripheral embolism and pulmonary embolism. Despite the annual prevalence of 1% and devastating clinical consequences, thromboembolic complications are not typically recognized as the leading problem in patients with HF, representing an underappreciated clinical challenge. Although the currently available data do not support routine anticoagulation in patients with HF and sinus rhythm, initial reports suggest that such strategy might be beneficial in a subset of patients at especially high thromboembolic risk. Considering the existing evidence gap, we aimed to review the currently available data regarding coagulation disorders in acute and chronic HF based on the insight from preclinical and clinical studies, to summarize the evidence regarding anticoagulation in HF in special-case scenarios and to outline future research directions so as to establish the optimal patient-tailored strategies for antiplatelet and anticoagulant therapy in HF. In summary, we highlight the top 10 pearls in the management of patients with HF and no other specific indications for oral anticoagulation therapy. Further studies are urgently needed to shed light on the pathophysiological role of platelet activation in HF and to evaluate whether antiplatelet or antithrombotic therapy could be beneficial in patients with HF. LAY SUMMARY: Heart failure (HF) is a clinical syndrome divided into 3 subtypes on the basis of the left ventricular systolic function. Every subtype has specific clinical characteristics and concomitant diseases, substantially increasing the risk of thromboembolic complications, such as stroke, peripheral embolism and pulmonary embolism. Despite the annual prevalence of 1% and devastating clinical consequences, thromboembolic complications are not typically recognized as the leading problem in patients with HF, representing an underappreciated clinical challenge. Although the currently available data do not support routine anticoagulation in patients with HF and no atrial arrhythmia, initial reports suggest that such a strategy might be beneficial in a subset of patients at especially high risk of thrombotic complications. Considering the existing evidence gap, we aimed to review the currently available data regarding coagulation problems in stable and unstable patients with HF based on the insight from preclinical and clinical studies, to summarize the evidence regarding anticoagulation in HF in specific patient groups and to outline future research directions to establish the optimal strategies for antiplatelet and anticoagulant therapy in HF, tailored to the needs of an individual patient. In summary, we highlight the top 10 pearls in the management of patients with HF and no other specific indications for oral anticoagulation therapy.


Subjects
Atrial Fibrillation , Blood Coagulation Disorders , Heart Failure , Pulmonary Embolism , Stroke , Thromboembolism , Humans , Stroke Volume , Heart Failure/complications , Heart Failure/drug therapy , Heart Failure/epidemiology , Ventricular Function, Left , Anticoagulants/therapeutic use , Thromboembolism/drug therapy , Thromboembolism/epidemiology , Thromboembolism/etiology , Stroke/etiology , Blood Coagulation Disorders/complications , Blood Coagulation Disorders/drug therapy , Arrhythmias, Cardiac , Atrial Fibrillation/complications
13.
J Surg Oncol ; 128(2): 280-288, 2023 Aug.
Article in English | MEDLINE | ID: mdl-37073788

ABSTRACT

BACKGROUND: Outcomes for pancreatic adenocarcinoma (PDAC) remain difficult to prognosticate. Multiple models attempt to predict survival following the resection of PDAC, but their utility in the neoadjuvant population is unknown. We aimed to assess their accuracy among patients that received neoadjuvant chemotherapy (NAC). METHODS: We performed a multi-institutional retrospective analysis of patients who received NAC and underwent resection of PDAC. Two prognostic systems were evaluated: the Memorial Sloan Kettering Cancer Center Pancreatic Adenocarcinoma Nomogram (MSKCCPAN) and the American Joint Committee on Cancer (AJCC) staging system. Discrimination between predicted and actual disease-specific survival was assessed using the Uno C-statistic and Kaplan-Meier method. Calibration of the MSKCCPAN was assessed using the Brier score. RESULTS: A total of 448 patients were included. There were 232 (51.8%) females, and the mean age was 64.1 years (±9.5). Most had AJCC Stage I or II disease (77.7%). For the MSKCCPAN, the Uno C-statistic at 12-, 24-, and 36-month time points was 0.62, 0.63, and 0.62, respectively. The AJCC system demonstrated similarly mediocre discrimination. The Brier score for the MSKCCPAN was 0.15 at 12 months, 0.26 at 24 months, and 0.30 at 36 months, demonstrating modest calibration. CONCLUSIONS: Current survival prediction models and staging systems for patients with PDAC undergoing resection after NAC have limited accuracy.
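The Brier score used above has a simple form: the mean squared difference between the predicted probability and the observed 0/1 outcome at a given time point (0 is perfect; about 0.25 is uninformative). A minimal numeric illustration, ignoring censoring, with invented values:

```python
import numpy as np

# Predicted probability of disease-specific survival at 12 months (hypothetical nomogram output)
predicted_12m = np.array([0.90, 0.75, 0.60, 0.40, 0.85])
# Observed status at 12 months: 1 = disease-specific survivor, 0 = died of disease
observed_12m = np.array([1, 1, 0, 0, 1])

brier_12m = np.mean((predicted_12m - observed_12m) ** 2)
print(f"Brier score at 12 months: {brier_12m:.3f}")  # 0 = perfect calibration/accuracy
```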


Subjects
Adenocarcinoma , Carcinoma, Pancreatic Ductal , Pancreatic Neoplasms , Female , Humans , Male , Middle Aged , Adenocarcinoma/surgery , Carcinoma, Pancreatic Ductal/drug therapy , Carcinoma, Pancreatic Ductal/surgery , Neoadjuvant Therapy , Neoplasm Staging , Nomograms , Pancreatic Neoplasms/drug therapy , Pancreatic Neoplasms/surgery , Prognosis , Retrospective Studies , Pancreatic Neoplasms
14.
Platelets ; 34(1): 2154330, 2023 Dec.
Article in English | MEDLINE | ID: mdl-36524601

ABSTRACT

Chronic kidney disease (CKD) is a global health problem and an independent risk factor for cardiovascular morbidity and mortality. Despite evidence-based therapies significantly improving cardiovascular mortality outcomes in the general population and in those with non-dialysis-dependent CKD, this risk reduction has not translated to patients with end-stage kidney disease (ESKD). Because patients with ESKD have been absent from all major antiplatelet trials, safety data to guide P2Y12 inhibitor prescribing are insufficient, creating treatment inequity in this subpopulation. This review article presents an overview of the progression of research in understanding antiplatelet therapy for ischaemic heart disease in patients with advanced CKD (defined as eGFR <30 mL/min/1.73 m2). Beyond trial recruitment strategies, new approaches should focus on registry documentation by CKD stage, risk stratification with biomarkers associated with inflammation and haemorrhage, and building a knowledge base on the optimal duration of dual and single antiplatelet therapies.


What is the context?
• Patients with kidney disease are more likely to experience a heart attack than those without.
• Those with advanced kidney disease have a higher risk of death following a heart attack.
• Over the past two decades, advances in treatment following a heart attack have reduced the risk of death; however, this has not translated to those with advanced kidney disease.
• Progression of kidney disease influences antiplatelet (e.g., clopidogrel) treatment efficacy.
What is new?
• This contemporary review analyses registry and trial data to highlight some of the issues surrounding treatment inequity in patients with advanced kidney disease.
• This article describes potential mechanisms by which progression of kidney disease can influence clotting, bleeding and antiplatelet treatments.
What is the impact?
• Further research into antiplatelet therapy for patients with advanced kidney disease is required.
• Registry and trial data can improve upon classification of kidney disease for future research.
• Future trials in antiplatelet therapy for advanced kidney disease are anticipated.


Subjects
Coronary Artery Disease , Myocardial Ischemia , Renal Insufficiency, Chronic , Humans , Platelet Aggregation Inhibitors/pharmacology , Platelet Aggregation Inhibitors/therapeutic use , Vacuum , Renal Insufficiency, Chronic/complications , Renal Insufficiency, Chronic/drug therapy , Hemorrhage/complications , Coronary Artery Disease/complications , Myocardial Ischemia/complications , Myocardial Ischemia/drug therapy , Myocardial Ischemia/chemically induced
15.
Clin Oral Implants Res ; 34(1): 13-19, 2023 Jan.
Article in English | MEDLINE | ID: mdl-36245313

ABSTRACT

AIM: The aim of the present study was to evaluate soft and hard tissue alterations around implants with a modified marginal portion placed in a healed, sloped ridge over 3 years of follow-up. MATERIAL AND METHODS: Sixty-five patients with a single recipient implant site in an alveolar ridge with a lingual-buccal sloped configuration were recruited. Implants with a modified geometry in the marginal portion were installed in such a way that the sloped part of the device was located at the buccal and most apical position of the osteotomy preparation. Crowns were placed 21 weeks after implant placement. Radiologic examinations were performed at implant installation and at 1 and 3 years of follow-up. Bleeding on probing (BoP), probing pocket depth (PPD), and clinical attachment level (CAL; from the crown margin) were recorded at the insertion of the prosthesis and after 1 and 3 years. RESULTS: Fifty-seven patients with 57 implant-supported restorations attended the 3-year follow-up examination. The radiographic analysis revealed a mean marginal bone loss of 0.57 mm during the 3-year period. While the average bone loss between 1 and 3 years amounted to 0.30 mm, approximately 50% of the implants showed no bone loss during this period. The results from the clinical examinations showed a CAL gain of 0.11 ± 0.85 mm between baseline and 3 years of follow-up. About 65% of the implants showed no loss of attachment between 1 and 3 years. BoP and PPD ≥5 mm were identified at <10% of implants at the 3-year examination. CONCLUSION: Hard and soft tissues formed around dental implants that were designed to match the morphology of an alveolar ridge with a lingual-buccal sloped configuration remained stable over 3 years.


Subjects
Alveolar Bone Loss , Dental Implants, Single-Tooth , Dental Implants , Humans , Dental Implantation, Endosseous/methods , Prospective Studies , Alveolar Process/diagnostic imaging , Alveolar Process/surgery , Crowns , Alveolar Bone Loss/diagnostic imaging , Alveolar Bone Loss/surgery , Follow-Up Studies , Dental Prosthesis, Implant-Supported
16.
Can Assoc Radiol J ; 74(3): 548-556, 2023 Aug.
Article in English | MEDLINE | ID: mdl-36542834

ABSTRACT

PURPOSE: To develop and assess the performance of a machine learning model which screens chest radiographs for 14 labels, and to determine whether fine-tuning the model on local data improves its performance. Generalizability at different institutions has been an obstacle to machine learning model implementation. We hypothesized that the performance of a model trained on an open-source dataset will improve at our local institution after being fine-tuned on local data. METHODS: In this retrospective, institutional review board approved study, an ensemble of neural networks was trained on open-source datasets of chest radiographs for the detection of 14 labels. This model was then fine-tuned using 4510 local radiograph studies, using radiologists' reports as the gold standard to evaluate model performance. Both the open-source and fine-tuned models' accuracy were tested on 802 local radiographs. Receiver operating characteristic curves were calculated, and statistical analysis was completed using DeLong's method and the Wilcoxon signed-rank test. RESULTS: The fine-tuned model identified 12 of 14 pathology labels with areas under the curve greater than .75. After fine-tuning with local data, the model performed statistically significantly better overall, and specifically in detecting six pathology labels (P < .01). CONCLUSIONS: A machine learning model able to accurately detect 14 labels simultaneously on chest radiographs was developed using open-source data, and its performance was improved after fine-tuning on local site data. This simple method of fine-tuning existing models on local data could improve the generalizability of existing models across different institutions to further improve their local performance.
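A condensed sketch of the fine-tuning step described above: continuing training of a pretrained multi-label classifier on local studies with a small learning rate, written with PyTorch/torchvision. The backbone, label count, and data loading are assumptions standing in for the study's ensemble and pipeline, which are not described in the abstract.

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_LABELS = 14

# ImageNet-pretrained backbone standing in for the open-source chest radiograph model.
model = models.densenet121(weights=models.DenseNet121_Weights.DEFAULT)
model.classifier = nn.Linear(model.classifier.in_features, NUM_LABELS)

criterion = nn.BCEWithLogitsLoss()                          # multi-label: one sigmoid per finding
optimizer = torch.optim.Adam(model.parameters(), lr=1e-5)   # small learning rate for fine-tuning

def fine_tune(model, local_loader, epochs=3):
    """Continue training on local radiographs labelled from radiology reports."""
    model.train()
    for _ in range(epochs):
        for images, labels in local_loader:    # labels: float tensor of shape (batch, 14)
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()
    return model
```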


Subjects
Deep Learning , Humans , Retrospective Studies , Radiography , Machine Learning , Neural Networks, Computer
17.
Am J Transplant ; 22(6): 1683-1690, 2022 06.
Article in English | MEDLINE | ID: mdl-34951528

ABSTRACT

The Organ Procurement and Transplant Network (OPTN) implemented a new heart allocation policy on October 18, 2018. Published estimates of lower posttransplant survival under the new policy in cohorts with limited follow-up may be biased by informative censoring. Using the Scientific Registry of Transplant Recipients, we used the Kaplan-Meier method to estimate 1-year posttransplant survival for pre-policy (November 1, 2016, to October 31, 2017) and post-policy cohorts (November 1, 2018, to October 31, 2019) with follow-up through March 2, 2021. We adjusted for changes in recipient population over time with a multivariable Cox proportional hazards model. To demonstrate the effect of inadequate follow-up on post-policy survival estimates, we repeated the analysis but only included follow-up through October 31, 2019. Transplant programs transplanted 2594 patients in the pre-policy cohort and 2761 patients in the post-policy cohort. With follow-up through March 2, 2021, unadjusted 1-year posttransplant survival was 90.6% (89.5%-91.8%) in the pre-policy cohort and 90.8% (89.7%-91.9%) in the post-policy cohort (adjusted HR = 0.93 [0.77-1.12]). Ignoring follow-up after October 31, 2019, the post-policy estimate was biased downward (1-year: 82.2%). When estimated with adequate follow-up, 1-year posttransplant survival under the new heart allocation policy was not significantly different.
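A deliberately exaggerated toy illustration, using lifelines on synthetic data, of the informative-censoring point made above: when deaths are reported promptly but "alive" status updates lag, a Kaplan-Meier 1-year estimate computed at an early cutoff is biased downward relative to the estimate with fuller follow-up. The cohort, the reporting-lag mechanism, and all numbers are assumptions, not the registry's actual behavior.

```python
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(3)
n = 2500
death_time = rng.exponential(3000, n)               # days post-transplant; true 1-yr survival ~0.885
days_before_cutoff = rng.uniform(0, 365, n)         # staggered transplant dates within the policy year

def km_1yr(extra_follow_up, alive_report_lag):
    """KM 1-year survival when deaths are reported promptly but alive follow-up lags."""
    potential = days_before_cutoff + extra_follow_up
    died = death_time <= potential
    # Survivors are only 'known alive' up to their last (lagged) status update.
    observed_time = np.where(died, death_time, np.maximum(potential - alive_report_lag, 0))
    kmf = KaplanMeierFitter().fit(observed_time, event_observed=died)
    return float(kmf.survival_function_at_times(365).iloc[0])

print("early cutoff, lagged alive reports:", round(km_1yr(60, 180), 3))   # biased downward
print("late cutoff, reports caught up    :", round(km_1yr(850, 0), 3))    # close to the true value
```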


Subjects
Heart Transplantation , Tissue and Organ Procurement , Humans , Policies , Registries , Tissue Donors , Transplant Recipients
18.
Crit Care Med ; 50(2): 212-223, 2022 02 01.
Article in English | MEDLINE | ID: mdl-35100194

ABSTRACT

OBJECTIVES: Body temperature trajectories of infected patients are associated with specific immune profiles and survival. We determined the association between temperature trajectories and distinct manifestations of coronavirus disease 2019. DESIGN: Retrospective observational study. SETTING: Four hospitals within an academic healthcare system from March 2020 to February 2021. PATIENTS: All adult patients hospitalized with coronavirus disease 2019. INTERVENTIONS: Using a validated group-based trajectory model, we classified patients into four previously defined temperature trajectory subphenotypes using oral temperature measurements from the first 72 hours of hospitalization. Clinical characteristics, biomarkers, and outcomes were compared between subphenotypes. MEASUREMENTS AND MAIN RESULTS: The 5,903 hospitalized coronavirus disease 2019 patients were classified into four subphenotypes: hyperthermic slow resolvers (n = 1,452, 25%), hyperthermic fast resolvers (1,469, 25%), normothermics (2,126, 36%), and hypothermics (856, 15%). Hypothermics had abnormal coagulation markers, with the highest d-dimer and fibrin monomers (p < 0.001) and the highest prevalence of cerebrovascular accidents (10%, p = 0.001). The prevalence of venous thromboembolism was significantly different between subphenotypes (p = 0.005), with the highest rate in hypothermics (8.5%) and lowest in hyperthermic slow resolvers (5.1%). Hyperthermic slow resolvers had abnormal inflammatory markers, with the highest C-reactive protein, ferritin, and interleukin-6 (p < 0.001). Hyperthermic slow resolvers had increased odds of mechanical ventilation, vasopressors, and 30-day inpatient mortality (odds ratio, 1.58; 95% CI, 1.13-2.19) compared with hyperthermic fast resolvers. Over the course of the pandemic, we observed a drastic decrease in the prevalence of hyperthermic slow resolvers, from representing 53% of admissions in March 2020 to less than 15% by 2021. We found that dexamethasone use was associated with significant reduction in probability of hyperthermic slow resolvers membership (27% reduction; 95% CI, 23-31%; p < 0.001). CONCLUSIONS: Hypothermics had abnormal coagulation markers, suggesting a hypercoagulable subphenotype. Hyperthermic slow resolvers had elevated inflammatory markers and the highest odds of mortality, suggesting a hyperinflammatory subphenotype. Future work should investigate whether temperature subphenotypes benefit from targeted antithrombotic and anti-inflammatory strategies.
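A simplified sketch of assigning patients to temperature-trajectory groups. The study used a validated group-based trajectory model on oral temperatures from the first 72 hours of hospitalization; here k-means clustering of synthetic 72-hour temperature curves stands in as an illustrative proxy, with the subphenotype shapes and all data invented.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
hours = np.arange(0, 72, 4)                           # oral temperature every 4 h for 72 h

def make_patient(kind):
    """Synthetic temperature curve for one of four assumed subphenotypes."""
    if kind == "hyperthermic_slow":
        base = 38.5 - 0.010 * hours                   # febrile, slow to resolve
    elif kind == "hyperthermic_fast":
        base = 38.5 - 0.025 * hours                   # febrile, fast to resolve
    elif kind == "normothermic":
        base = np.full_like(hours, 37.0, dtype=float)
    else:                                             # hypothermic
        base = np.full_like(hours, 36.2, dtype=float)
    return base + rng.normal(0, 0.2, len(hours))

kinds = rng.choice(["hyperthermic_slow", "hyperthermic_fast", "normothermic", "hypothermic"], 400)
X = np.vstack([make_patient(k) for k in kinds])

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
# Cluster mean curves approximate the four trajectory subphenotypes.
for c in range(4):
    print(c, np.round(km.cluster_centers_[c][[0, -1]], 2), "(start, end temperature)")
```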


Subjects
Body Temperature , COVID-19/pathology , Hyperthermia/pathology , Hypothermia/pathology , Phenotype , Academic Medical Centers , Aged , Anti-Inflammatory Agents/therapeutic use , Biomarkers/blood , Blood Coagulation , Cohort Studies , Dexamethasone/therapeutic use , Female , Humans , Inflammation , Male , Middle Aged , Organ Dysfunction Scores , Retrospective Studies , SARS-CoV-2
19.
J Anat ; 240(1): 1-10, 2022 01.
Article in English | MEDLINE | ID: mdl-34346066

ABSTRACT

Snake venom is produced, transported and delivered by the sophisticated venom delivery system (VDS). When snakes bite, the venom travels from the venom gland through the venom duct into needle-like fangs that inject it into their prey. To counteract breakages, fangs are continuously replaced throughout life. Currently, the anatomy of the connection between the duct and the fang has not been described, and the mechanism by which the duct is reconnected to the replacement fang has not been identified. We examined the VDS in 3D in representative species from two families and one subfamily (Elapidae, Viperidae, Atractaspidinae) using contrast-enhanced microCT (diceCT), followed by dissection and histology. We observed that the venom duct bifurcates immediately anterior to the fangs so that both the original and replacement fangs are separately connected and functional in delivering venom. When a fang is absent, the canal leading to the empty position is temporarily closed. We found that elapid snakes have a crescent-shaped venom reservoir where venom likely pools before it enters the fang. These findings form the final piece of the puzzle of VDS anatomy in front-fanged venomous snakes. Additionally, they provide further evidence for independent evolution of the VDS in these three snake taxa.


Subjects
Tooth , Viperidae , Animals , Humans , Snake Venoms , Snakes/anatomy & histology , Tooth/anatomy & histology
20.
Eur J Pediatr ; 181(5): 1835-1857, 2022 May.
Article in English | MEDLINE | ID: mdl-35175416

ABSTRACT

Although widely believed by pediatricians and parents to be safe for use in infants and children when used as directed, increasing evidence indicates that early life exposure to paracetamol (acetaminophen) may cause long-term neurodevelopmental problems. Furthermore, recent studies in animal models demonstrate that cognitive development is exquisitely sensitive to paracetamol exposure during early development. In this study, evidence for the claim that paracetamol is safe was evaluated using a systematic literature search. Publications on PubMed between 1974 and 2017 that contained the keywords "infant" and either "paracetamol" or "acetaminophen" were considered. Of those initial 3096 papers, 218 were identified that made claims that paracetamol was safe for use with infants or children. From these 218, a total of 103 papers were identified as sources of authority for the safety claim.
Conclusion: A total of 52 papers contained actual experiments designed to test safety, with a median follow-up time of 48 h. None monitored neurodevelopment. Furthermore, no trial considered total exposure to the drug since birth, eliminating the possibility that the effects of drug exposure on long-term neurodevelopment could be accurately assessed. On the other hand, abundant and sufficient evidence was found to conclude that paracetamol does not induce acute liver damage in babies or children when used as directed.
What is Known:
• Paracetamol (acetaminophen) is widely thought by pediatricians and parents to be safe when used as directed in the pediatric population, and is the most widely used drug in that population, with more than 90% of children exposed to the drug in some reports.
• Paracetamol is known to cause liver damage in adults under conditions of oxidative stress or when used in excess, but increasing evidence from studies in humans and in laboratory animals indicates that the target organ for paracetamol toxicity during early development is the brain, not the liver.
What is New:
• This study finds hundreds of published reports in the medical literature asserting that paracetamol is safe when used as directed, providing a foundation for the widespread belief that the drug is safe.
• This study shows that paracetamol was shown to be safe by approximately 50 short-term studies demonstrating the drug's safety for the pediatric liver, but the drug was never shown to be safe for neurodevelopment.
In summary, paracetamol is widely believed to be safe for infants and children when used as directed, despite mounting evidence in humans and in laboratory animals indicating that the drug is not safe for neurodevelopment. An exhaustive search of published work cited for safe use of paracetamol in the pediatric population revealed 52 experimental studies pointing toward safety, but the median follow-up time was only 48 h, and neurodevelopment was never assessed.


Subjects
Acetaminophen , Analgesics, Non-Narcotic , Acetaminophen/adverse effects , Analgesics, Non-Narcotic/adverse effects , Child , Humans