Results 1 - 20 of 470
1.
J Biol Chem ; 299(8): 105065, 2023 Aug.
Article in English | MEDLINE | ID: mdl-37468098

ABSTRACT

Pancreatic beta cells maintain glucose homeostasis by secreting pulses of insulin in response to a rise in plasma glucose. Pulsatile insulin secretion occurs as a result of glucose-induced oscillations in beta-cell cytosolic Ca2+. The endoplasmic reticulum (ER) helps regulate beta-cell cytosolic Ca2+, and ER stress can lead to ER Ca2+ reduction, beta-cell dysfunction, and an increased risk of type 2 diabetes. However, the mechanistic effects of ER stress on individual calcium channels are not well understood. To determine the effects of tunicamycin-induced ER stress on ER inositol 1,4,5-trisphosphate receptors (IP3Rs) and ryanodine receptors (RyRs) and their involvement in subsequent Ca2+ dysregulation, we treated INS-1 832/13 cells and primary mouse islets with the ER stress inducer tunicamycin (TM). We showed that TM treatment increased RyR1 mRNA without affecting RyR2 mRNA and decreased both IP3R1 and IP3R3 mRNA. Furthermore, we found that TM-induced stress reduced ER Ca2+ levels, triggered oscillations in cytosolic Ca2+ under subthreshold glucose conditions, and increased apoptosis, and that these changes were prevented by cotreatment with the RyR1 inhibitor dantrolene. In addition, we demonstrated that silencing RyR1 suppressed TM-induced subthreshold cytosolic Ca2+ oscillations, whereas silencing RyR2 did not affect them. In contrast, inhibiting IP3Rs with xestospongin-C neither suppressed the TM-induced cytosolic Ca2+ oscillations nor protected beta cells from TM-induced apoptosis, although it did prevent ER Ca2+ reduction. Taken together, these results show that changes in RyR1 play a critical role in ER stress-induced Ca2+ dysfunction and beta-cell apoptosis.


Subject(s)
Calcium Signaling; Endoplasmic Reticulum Stress; Insulin-Secreting Cells; Ryanodine Receptor Calcium Release Channel; Animals; Mice; Apoptosis; Diabetes Mellitus, Type 2/metabolism; Glucose/metabolism; Homeostasis; Insulin-Secreting Cells/metabolism; Ryanodine Receptor Calcium Release Channel/genetics; Ryanodine Receptor Calcium Release Channel/metabolism; Tunicamycin; Rats; Cell Line
2.
Br J Nutr ; 131(3): 482-488, 2024 Feb 14.
Article in English | MEDLINE | ID: mdl-37694547

ABSTRACT

Retinol binding protein (RBP) is used as a proxy for retinol in population-based assessments of vitamin A deficiency (VAD) for reasons of cost-effectiveness and feasibility. When the cut-off of < 0·7 µmol/l for retinol is applied to RBP to define VAD, an equivalence of the two biomarkers is assumed. Evidence suggests that the relationship between retinol and RBP is not 1:1, particularly in populations with a high burden of infection or inflammation. The goal of this analysis was to longitudinally evaluate the retinol:RBP ratio over 1 month of follow-up among fifty-two individuals exposed to norovirus (n 26 infected, n 26 uninfected); to test whether inflammation (measured as α-1-acid glycoprotein (AGP) and C-reactive protein (CRP)) affects retinol, RBP, and the ratio between the two; and to assess whether adjusting vitamin A biomarkers for AGP or CRP improves the equivalence of retinol and RBP. We found that the median molar ratio between retinol and RBP was the same among infected (0·68) and uninfected (0·68) individuals. AGP was associated with the ratio and with RBP individually, controlling for CRP, and CRP was associated with both retinol and RBP individually, controlling for AGP, over 1 month of follow-up. Adjusting for inflammation led to a slight increase in the ratio among infected individuals (0·71), but the ratio remained significantly different from the expected value of one. These findings highlight the need for updated recommendations from the WHO on a cut-off value for RBP and an appropriate method for measuring and adjusting for inflammation when using RBP in population assessments of VAD.
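
A minimal sketch of the equivalence problem described above: if RBP is read against the retinol cut-off while the true retinol:RBP molar ratio sits below one, deficiency is undercounted. The paired values below are invented for illustration (chosen so the ratio is ~0·68, as in the study), not study data.

```python
import numpy as np

# Invented paired measurements, both in umol/L.
retinol = np.array([0.55, 0.65, 0.80, 0.95, 1.20])
rbp     = np.array([0.80, 0.95, 1.18, 1.40, 1.76])

print("median retinol:RBP ratio:", round(float(np.median(retinol / rbp)), 2))

cutoff = 0.7  # umol/L, the retinol cut-off for VAD
print("VAD prevalence by retinol:", float((retinol < cutoff).mean()))
print("VAD prevalence if RBP is treated as equivalent:", float((rbp < cutoff).mean()))
```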


Subject(s)
Norovirus; Vitamin A Deficiency; Humans; Vitamin A; C-Reactive Protein/analysis; Orosomucoid/metabolism; Biomarkers; Vitamin A Deficiency/epidemiology; Retinol-Binding Proteins/metabolism; Inflammation; Norovirus/metabolism
3.
Transpl Infect Dis ; 26(1): e14166, 2024 Feb.
Article in English | MEDLINE | ID: mdl-37846848

ABSTRACT

BACKGROUND: Heart transplantation is the therapy of choice in patients with advanced heart failure refractory to other medical or surgical management. However, heart transplants are associated with complications that increase posttransplant morbidity and mortality, and infections are among the most important of these complications. Therefore, we evaluated infections in the first year after heart transplantation. METHODS: A retrospective cohort study of infections after heart transplant was conducted in a teaching hospital in Colombia between 2011 and 2019. Patients registered in the institutional heart transplant database (RETRAC) were included in the study. Microbiological isolates and infectious serological data were matched with the identities of heart transplant recipients, and data from the clinical records of individuals registered in the RETRAC were analyzed. Cumulative incidences of events by type of microorganism isolated were estimated using Kaplan-Meier survival analyses. RESULTS: Seventy-nine patients were included in the study. Median age was 49 years (37.4-56.3), and 26.58% of patients were women. Eighty-seven infections were documented, of which 55.17% (48) were bacterial, 22.99% (20) were viral, and 12.64% (11) were fungal. Bacterial infections predominated in the first month. In the first year, infections caused 38.96% of hospital admissions and were the second most common cause of death after heart transplant (25.0%). CONCLUSION: Posttransplant infections in the first year of follow-up were frequent. Bacterial infections predominated in the early posttransplant period. Infections, mainly bacterial, were the second most common cause of death and the most common cause of hospitalization in the first year after heart transplantation.


Subject(s)
Bacterial Infections; Heart Failure; Heart Transplantation; Humans; Female; Middle Aged; Male; Retrospective Studies; Latin America/epidemiology; Heart Transplantation/adverse effects; Heart Failure/epidemiology; Heart Failure/surgery; Bacterial Infections/epidemiology
4.
Kidney Blood Press Res ; 49(1): 165-172, 2024.
Article in English | MEDLINE | ID: mdl-38359802

ABSTRACT

INTRODUCTION: Arterial hypertension is one of the main comorbidities observed in patients with heart failure (HF) and one of the main risk factors for its development. Despite this, studies assessing this hypertensive etiology are scarce in Latin America. Our objective was to analyze the prevalence of HF of hypertensive etiology and evaluate its prognosis in patients enrolled in the Colombian Heart Failure Registry (RECOLFACA, by its Spanish acronym). METHODS: RECOLFACA recruited adult patients diagnosed with HF in 60 centers in Colombia between 2017 and 2019. The primary outcome was all-cause mortality. A Cox proportional hazards regression model was used to assess factors associated with the primary outcome in patients with hypertensive HF. A p value <0.05 was considered significant. All statistical tests were two-tailed. RESULTS: Of the patients evaluated in RECOLFACA (n = 2,514), 804 had a diagnosis of HF with hypertensive etiology (31.9%). These patients were less frequently male, were significantly older, and had a lower prevalence of comorbidities than those with HF of other etiologies. Additionally, patients with hypertensive HF had a higher prevalence of HF with preserved ejection fraction (HFpEF) (34.1% vs. 28.3%; p = 0.004). Finally, type 2 diabetes mellitus, a diagnosis of chronic obstructive pulmonary disease, and NYHA class IV were identified as independent risk factors for mortality. CONCLUSIONS: Hypertensive HF represents about one-third of all patients with HF in RECOLFACA. Compared with HF of other etiologies, it presents a differential clinical profile: older age and a higher prevalence of HFpEF. RECOLFACA has become a useful tool for characterizing patients with HF in Colombia, enabling a more specific search for and diagnosis of this pathology in our population, and it has served as an example to stimulate HF registries in other countries in the region.
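
To make the modeling step named above concrete, here is a minimal Cox proportional hazards fit using the lifelines library. The column names and toy follow-up data are invented for illustration, not RECOLFACA variables, and the covariates merely echo the risk factors reported in the abstract.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Invented follow-up data: time to death or censoring plus three covariates.
df = pd.DataFrame({
    "followup_days": [120, 365, 90, 365, 200, 365, 30, 365, 150, 300],
    "died":          [1,   0,   1,  0,   1,   0,   1,  0,   1,   0],
    "age":           [78,  65,  80, 72,  60,  58,  85, 62,  70,  76],
    "nyha_iv":       [1,   0,   1,  0,   0,   0,   1,  0,   1,   1],
    "diabetes":      [1,   0,   0,  1,   1,   0,   1,  0,   0,   0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_days", event_col="died")
cph.print_summary()  # hazard ratios, 95% CIs, two-tailed p-values
```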


Subject(s)
Heart Failure; Hypertension; Registries; Humans; Heart Failure/etiology; Heart Failure/epidemiology; Male; Female; Hypertension/epidemiology; Colombia/epidemiology; Aged; Middle Aged; Prevalence; Prognosis; Risk Factors; Aged, 80 and over; Comorbidity
5.
Emerg Infect Dis ; 29(7): 1349-1356, 2023 Jul.
Article in English | MEDLINE | ID: mdl-37347494

ABSTRACT

The effect of norovirus dose on outcomes such as virus shedding and symptoms after initial infection is not well understood. We performed a secondary analysis of a human challenge study by using Bayesian mixed-effects models. As the dose increased from 4.8 to 4,800 reverse transcription PCR units, the total amount of shed virus in feces increased from 4.5 × 1011 to 3.4 × 1012 genomic equivalent copies; in vomit, virus increased from 6.4 × 105 to 3.0 × 107 genomic equivalent copies. Onset time of viral shedding in feces decreased from 1.4 to 0.8 days, and time of peak viral shedding decreased from 2.3 to 1.5 days. Time to symptom onset decreased from 1.5 to 0.8 days. One type of symptom score increased with dose. An increase in norovirus dose was associated with more rapid shedding and symptom onset and possibly increased severity. However, the effect on virus load and shedding was inconclusive.
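
A toy version of the dose-trend estimates reported above: regress an outcome such as shedding onset day on log10 dose. The study used Bayesian mixed-effects models; plain least squares and invented data are used here purely as a sketch of the direction of the analysis.

```python
import numpy as np
from scipy.stats import linregress

# Three challenge doses in RT-PCR units, five invented subjects per dose.
log_dose = np.log10(np.repeat([4.8, 480.0, 4800.0], 5))
onset_day = np.array([1.5, 1.3, 1.6, 1.4, 1.2,    # low dose
                      1.1, 1.2, 1.0, 1.3, 0.9,    # mid dose
                      0.9, 0.7, 0.8, 1.0, 0.6])   # high dose

fit = linregress(log_dose, onset_day)
print(f"slope: {fit.slope:.2f} days per log10 dose, p = {fit.pvalue:.3f}")
```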


Subject(s)
Caliciviridae Infections; Gastroenteritis; Norovirus; Humans; Norovirus/genetics; Bayes Theorem; Kinetics; Time Factors; Feces; Virus Shedding
6.
Crit Care Med ; 51(12): 1716-1726, 2023 Dec 01.
Article in English | MEDLINE | ID: mdl-37548506

ABSTRACT

OBJECTIVES: To determine whether multisite versus single-site dual-lumen (SSDL) cannulation is associated with outcomes for COVID-19 patients requiring venovenous extracorporeal membrane oxygenation (VV-ECMO). DESIGN: Retrospective analysis of the Extracorporeal Life Support Organization Registry. Propensity score matching (2:1 multisite vs SSDL) was used to control for confounders. PATIENTS: The matched cohort included 2,628 patients (1,752 multisite, 876 SSDL) from 170 centers. The mean (SD) age in the entire cohort was 48 (11) years, and 3,909 (71%) were male. Patients were supported with mechanical ventilation for a median (interquartile range) of 79 (113) hours before VV-ECMO support. INTERVENTIONS: None. MEASUREMENTS: The primary outcome was 90-day survival. Secondary outcomes included survival to hospital discharge, duration of ECMO support, days free of ECMO support at 90 days, and complication rates. MAIN RESULTS: There was no difference in 90-day survival (49.4% vs 48.9%, p = 0.66), survival to hospital discharge (49.8% vs 48.2%, p = 0.44), duration of ECMO support (17.9 vs 17.1 d, p = 0.82), or hospital length of stay after cannulation (28 vs 27.4 d, p = 0.37) between multisite and SSDL groups. More SSDL patients were extubated within 24 hours (4% vs 1.9%, p = 0.001). Multisite patients had higher ECMO flows at 24 hours (4.5 vs 4.1 L/min, p < 0.001) and more ECMO-free days at 90 days (3.1 vs 2.0 d, p = 0.02). SSDL patients had higher rates of pneumothorax (13.9% vs 11%, p = 0.03). Cannula site bleeding (6.4% vs 4.7%, p = 0.03), oxygenator failure (16.7% vs 13.4%, p = 0.03), and circuit clots (5.5% vs 3.4%, p = 0.02) were more frequent in multisite patients. CONCLUSIONS: In this retrospective study of COVID-19 patients requiring VV-ECMO, 90-day survival did not differ between patients treated with a multisite versus SSDL cannulation strategy, and there were only modest differences in major complication rates. These findings do not support the superiority of either cannulation strategy in this setting.
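
A compact sketch of the 2:1 propensity-score matching step described above: fit a logistic model for cannulation strategy, then match each SSDL patient to its two nearest multisite neighbours on the estimated score. The covariates and data are invented, and matching here is with replacement for brevity; the registry analysis may differ in these details.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 3))      # e.g., age, vent hours, severity (invented)
multisite = rng.random(300) < 1 / (1 + np.exp(-X[:, 0]))  # strategy indicator

# Propensity score: probability of multisite cannulation given covariates.
ps = LogisticRegression().fit(X, multisite).predict_proba(X)[:, 1]
multi_idx = np.where(multisite)[0]
ssdl_idx = np.where(~multisite)[0]

nn = NearestNeighbors(n_neighbors=2).fit(ps[multi_idx].reshape(-1, 1))
_, nbrs = nn.kneighbors(ps[ssdl_idx].reshape(-1, 1))
matched_multi = multi_idx[nbrs]    # two multisite matches per SSDL patient
print("matched cohort size:", ssdl_idx.size + matched_multi.size)
```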


Subject(s)
COVID-19; Extracorporeal Membrane Oxygenation; Respiratory Insufficiency; Adult; Humans; Male; Middle Aged; Female; Extracorporeal Membrane Oxygenation/adverse effects; Retrospective Studies; Catheterization; Respiratory Insufficiency/therapy
7.
Appl Environ Microbiol ; 89(7): e0012823, 2023 Jul 26.
Article in English | MEDLINE | ID: mdl-37310232

ABSTRACT

Essential food workers experience elevated risks of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) infection due to prolonged occupational exposures in food production and processing areas, shared transportation (car or bus), and employer-provided shared housing. Our goal was to quantify the daily cumulative risk of SARS-CoV-2 infection for healthy susceptible produce workers and to evaluate the relative reduction in risk attributable to food industry interventions and vaccination. We simulated daily SARS-CoV-2 exposures of indoor and outdoor produce workers through six linked quantitative microbial risk assessment (QMRA) model scenarios. For each scenario, the infectious viral dose emitted by a symptomatic worker was calculated across aerosol, droplet, and fomite-mediated transmission pathways. Standard industry interventions (2-m physical distancing, handwashing, surface disinfection, universal masking, ventilation) were simulated to assess relative risk reductions from baseline risk (no interventions, 1-m distance). Implementation of industry interventions reduced an indoor worker's relative infection risk by 98.0% (0.020; 95% uncertainty interval [UI], 0.005 to 0.104) from baseline risk (1.00; 95% UI, 0.995 to 1.00) and an outdoor worker's relative infection risk by 94.5% (0.027; 95% UI, 0.013 to 0.055) from baseline risk (0.487; 95% UI, 0.257 to 0.825). Integrating these interventions with two-dose mRNA vaccination (86 to 99% efficacy), representing a worker's protective immunity to infection, reduced the relative infection risk from baseline for indoor workers by 99.9% (0.001; 95% UI, 0.0002 to 0.005) and for outdoor workers by 99.6% (0.002; 95% UI, 0.0003 to 0.005). Consistent implementation of combined industry interventions, paired with vaccination, effectively mitigates the elevated risk of occupationally acquired SARS-CoV-2 infection faced by produce workers. IMPORTANCE: This is the first study to estimate the daily risk of SARS-CoV-2 infection across a variety of indoor and outdoor environmental settings relevant to food workers (e.g., shared transportation [car or bus], an enclosed produce processing facility and accompanying breakroom, an outdoor produce harvesting field, and a shared housing facility) through a linked quantitative microbial risk assessment framework. Our model demonstrates that the elevated daily SARS-CoV-2 infection risk experienced by indoor and outdoor produce workers can be reduced below 1% when vaccination (optimal vaccine efficacy, 86 to 99%) is implemented with recommended infection control strategies (e.g., handwashing, surface disinfection, universal masking, physical distancing, and increased ventilation). Our findings provide scenario-specific infection risk estimates that food industry managers can use to target high-risk scenarios with effective infection mitigation strategies, informed by realistic, context-driven modeling of the daily infection risk faced by essential food workers. Bundled interventions, particularly if they include vaccination, yield significant reductions (>99%) in daily SARS-CoV-2 infection risk for essential food workers in enclosed and open-air environments.
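
A minimal Monte Carlo sketch of the linked QMRA logic summarized above: sample pathway-specific doses, apply multiplicative intervention reductions, and push the total dose through a dose-response function. The dose distributions, intervention efficacies, and exponential parameter k below are placeholders, not the study's fitted inputs.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # Monte Carlo iterations

# Hypothetical per-shift doses (viral particles) by pathway, lognormal.
dose_droplet = rng.lognormal(mean=2.0, sigma=1.0, size=n)
dose_aerosol = rng.lognormal(mean=0.5, sigma=1.0, size=n)
dose_fomite  = rng.lognormal(mean=1.0, sigma=1.0, size=n)

# Stacked interventions as multiplicative dose reductions (placeholders):
# distancing, universal masking, ventilation, disinfection + handwashing.
reduction = (1 - 0.8) * (1 - 0.5) * (1 - 0.4) * (1 - 0.3)
total_dose = (dose_droplet + dose_aerosol + dose_fomite) * reduction

k = 0.00246  # assumed exponential dose-response parameter
risk = 1.0 - np.exp(-k * total_dose)

print(f"median daily risk: {np.median(risk):.2e}")
print(f"90% UI: {np.percentile(risk, 5):.2e} - {np.percentile(risk, 95):.2e}")
```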


Subject(s)
COVID-19; Occupational Exposure; Humans; SARS-CoV-2; COVID-19/prevention & control; Respiratory Aerosols and Droplets; Occupational Exposure/prevention & control; Infection Control
8.
Artif Organs ; 47(8): 1371-1385, 2023 Aug.
Article in English | MEDLINE | ID: mdl-37042612

ABSTRACT

BACKGROUND: Controlled donation after circulatory determination of death (cDCD) seems to be an effective way to mitigate the critical worldwide shortage of organs available for transplant. As a recently developed procedure for organ retrieval, some questions remain unresolved, such as the uncertainty regarding the effect of functional warm ischemia time (FWIT) on organ viability. METHODS: We developed a multicenter prospective cohort study collecting data on all organs evaluated during cDCD from 2017 to 2020. All cDCD procedures were performed with normothermic regional perfusion. The analysis included organ retrieval as the endpoint and FWIT as the exposure of interest. The effect of FWIT on the likelihood of organ retrieval was evaluated with relative distribution analysis. RESULTS: Information on a total of 507 organs from 95 donors was analyzed. Median donor age was 62 years, and 63% of donors were male. Stroke was the most common diagnosis before withdrawal of life-sustaining therapy (61%), followed by anoxic encephalopathy (21%). This analysis showed that the length of FWIT was inversely associated with organ retrieval rates for liver, kidneys, and pancreas. No statistically significant association was found for lungs. CONCLUSIONS: Results showed an inverse association between FWIT and retrieval rate. We also postulated optimal FWIT thresholds for organ retrieval: between 6 and less than 11 min for liver, and 6 to 12 min for kidneys and pancreas. These results could be valuable for improving organ utilization and for future analyses.


Subject(s)
Extracorporeal Membrane Oxygenation; Tissue and Organ Procurement; Humans; Male; Middle Aged; Female; Warm Ischemia; Prospective Studies; Organ Preservation/methods; Perfusion/methods; Death; Graft Survival
9.
Appl Opt ; 62(10): 2560-2568, 2023 Apr 01.
Article in English | MEDLINE | ID: mdl-37132804

ABSTRACT

When experimental photoelasticity images are acquired, the spectral interaction between the light source and the sensor affects the visual information of the fringe patterns in the produced images. Such interaction can lead to fringe patterns of overall high quality, but it can also produce images with indistinguishable fringes and poor stress-field reconstruction. We introduce a strategy to evaluate this interaction that relies on measuring four handcrafted descriptors: contrast, an image descriptor that accounts simultaneously for blur and noise, a Fourier-based descriptor of image quality, and image entropy. The utility of the proposed strategy was validated by measuring the selected descriptors on computational photoelasticity images, along with the fringe orders achieved when evaluating the stress field, across 240 spectral configurations: 24 light sources and 10 sensors. We found that high values of the selected descriptors can be related to spectral configurations that lead to better stress-field reconstruction. Overall, the results show that the selected descriptors can be useful for identifying good and bad spectral interactions, which could help in designing better protocols for acquiring photoelasticity images.
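
As a sketch of two of the four handcrafted descriptors named above, here are plain-numpy implementations of contrast (the Michelson form, one common choice; the paper's exact definition may differ) and histogram entropy, applied to a synthetic fringe image. The blur/noise and Fourier-based descriptors are study-specific and omitted.

```python
import numpy as np

def michelson_contrast(img):
    """(max - min) / (max + min); assumes intensities in [0, 1]."""
    lo, hi = float(img.min()), float(img.max())
    return (hi - lo) / (hi + lo + 1e-12)

def histogram_entropy(img, bins=256):
    """Shannon entropy of the intensity histogram, in bits."""
    hist, _ = np.histogram(img, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Synthetic isochromatic-like fringes: a cosine intensity pattern.
x = np.linspace(0.0, 40.0 * np.pi, 512)
img = np.tile((1.0 + np.cos(x)) / 2.0, (512, 1))

print("contrast:", round(michelson_contrast(img), 3))
print("entropy :", round(histogram_entropy(img), 3), "bits")
```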

10.
BMC Public Health ; 23(1): 1533, 2023 Aug 12.
Article in English | MEDLINE | ID: mdl-37568075

ABSTRACT

BACKGROUND: Despite Colombia's robust well-child visit program, Colombian children and mothers still suffer from anemia, especially in populations of lower socioeconomic status. In this study, we aimed to quantify the prevalence of anemia and its risk factors among mothers and their children attending well-child visits in Apartadó, a municipality in the Urabá region of the Colombian Caribbean. METHODS: One hundred mother-child pairs were enrolled in this secondary data-analysis study from a health facility in the municipality of Apartadó, Urabá, Colombia, during well-child visits. Self-reported data included child illnesses in the past two weeks (diarrheal, fever, or respiratory symptoms), child feeding practices (breastfeeding, complementary feeding), child vaccinations, demographic characteristics (mother's and child's age, mother's education, marital status, race, and child sex), and socioeconomic status. Mother and child anthropometry data were collected via standardized weight and height measurements. Mother and child anemia status was determined via a blood test. Chi-squared tests and multivariable logistic regression were used to assess associations between risk factors and anemia. RESULTS: The anemia prevalence in children (74%) and mothers (47%) was higher than the Colombian national prevalence. Reported child comorbidities in the preceding two weeks were not significantly associated with child anemia and included respiratory illnesses (60%), fever (46%), and diarrhea (30%). Stunting (8%) was not significantly associated with anemia. Wasting (0%) was not observed in this study. Reported child breastfeeding and complementary feeding were also not significantly associated with child anemia. In adjusted models, significant risk factors for child anemia included the mother's "Mestiza" race (OR: 4.681; 95% CI: 1.258, 17.421) versus the Afro-Colombian race. Older children (25-60 months) were less likely to develop anemia than younger children (6-24 months) (OR: 0.073; 95% CI: 0.015, 0.360). CONCLUSIONS: The finding of high anemia prevalence in this study advances our understanding of child and maternal anemia in populations of low socioeconomic status where health care is regularly accessed through well-child programs.
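
To make the adjusted-odds-ratio step above concrete, here is a minimal multivariable logistic regression with statsmodels. The covariates and simulated data are invented for illustration, not the Apartadó study data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 200
older_child = rng.integers(0, 2, n)   # 25-60 months (1) vs 6-24 months (0)
mestiza = rng.integers(0, 2, n)       # invented maternal-race indicator

# Simulate anemia with a protective age effect and a harmful race effect.
lin = -0.3 - 1.5 * older_child + 1.2 * mestiza
anemia = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(float)

X = sm.add_constant(pd.DataFrame({"older_child": older_child, "mestiza": mestiza}))
fit = sm.Logit(anemia, X).fit(disp=0)
print(np.exp(fit.params))       # adjusted odds ratios
print(np.exp(fit.conf_int()))   # 95% confidence intervals
```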


Subject(s)
Anemia; Mothers; Infant; Female; Humans; Child; Adolescent; Colombia/epidemiology; Prevalence; Risk Factors; Caribbean Region/epidemiology; Anemia/epidemiology; Mother-Child Relations; Socioeconomic Factors
11.
Cancer ; 128(4): 697-707, 2022 Feb 15.
Article in English | MEDLINE | ID: mdl-34674226

ABSTRACT

BACKGROUND: A high frequency of primary central nervous system (CNS) sarcomas was observed in Peru. This article describes the clinical characteristics, biological characteristics, and outcomes of 70 pediatric patients. METHODS: Data from 70 pediatric patients with primary CNS sarcomas diagnosed between January 2005 and June 2018 were analyzed. DNA methylation profiling was available for 28 tumors and gene panel sequencing for 27 tumors. RESULTS: The median age of the patients was 6 years (range, 2-17.5 years), and 66 of 70 patients had supratentorial tumors. DNA methylation profiling classified 28 of 28 tumors as primary CNS sarcoma, DICER1 mutant. DICER1 mutations were found in 26 of 27 cases, TP53 mutations in 22 of 27 cases, and RAS-pathway gene mutations (NF1, KRAS, and NRAS) in 19 of 27 tumors, all of which were somatic (a germline control was available in 19 cases). The estimated incidence in Peru was 0.19 cases per 100,000 children (<18 years old) per year, which is significantly higher than the estimated incidence in Germany (0.007 cases per 100,000 children [<18 years] per year; P < .001). Patients with nonmetastatic disease (n = 46) who were treated with combination therapy had a 2-year progression-free survival (PFS) rate of 58% (95% CI, 44%-76%) and a 2-year overall survival rate of 71% (95% CI, 57%-87%). PFS was highest in patients treated with ifosfamide, carboplatin, and etoposide (ICE) chemotherapy after upfront surgery, followed by radiotherapy and further ICE (2-year PFS, 79% [59%-100%]; n = 18). CONCLUSIONS: Primary CNS sarcoma with DICER1 mutation has an aggressive clinical course. A combination of surgery, chemotherapy, and radiotherapy seems beneficial. An underlying cancer predisposition syndrome explaining the increased incidence in Peruvian patients has not been identified so far. LAY SUMMARY: A high incidence of primary pediatric central nervous system sarcomas in the Peruvian population is described. Sequencing technologies and DNA methylation profiling confirm that these tumors molecularly belong to the recently proposed entity "primary central nervous system sarcoma, DICER1 mutant." Unexpectedly, the DICER1 mutations, as well as all other defining tumor mutations (TP53 and RAS-pathway mutations), were not inherited in any of the 19 patients in whom analysis was possible. These tumors have an aggressive clinical course. Multimodal combination therapy based on surgery, ifosfamide, carboplatin, and etoposide chemotherapy, and local radiotherapy leads to superior outcomes.


Subject(s)
Central Nervous System Neoplasms; Sarcoma; Adolescent; Central Nervous System/pathology; Central Nervous System Neoplasms/drug therapy; Central Nervous System Neoplasms/genetics; Child; Child, Preschool; DEAD-box RNA Helicases/genetics; Humans; Mutation; Peru/epidemiology; Ribonuclease III/genetics; Sarcoma/drug therapy; Sarcoma/genetics
12.
Pediatr Blood Cancer ; 69(10): e29770, 2022 Oct.
Article in English | MEDLINE | ID: mdl-35593532

ABSTRACT

BACKGROUND: Medulloblastoma is the most common malignant brain tumor in children. While survival has improved in high-income countries (HIC), outcomes for patients in low-to-middle-income countries (LMIC) are unclear. We therefore sought to determine the survival of children with medulloblastoma treated at the Instituto Nacional de Enfermedades Neoplasicas (INEN) in Peru between 1997 and 2013. METHODS: Data from 103 children older than 3 years with medulloblastoma diagnosed between 1997 and 2013 were analyzed; fourteen patients were excluded. The patients were split into two cohorts, 1997-2008 and 2009-2013, corresponding to changes in the chemotherapy regimen. Event-free survival (EFS) and overall survival (OS) were calculated using the Kaplan-Meier method, and prognostic factors were assessed by univariate analysis (log-rank test). RESULTS: Eighty-nine patients were included; median age was 8.1 years (range: 3-13.9 years). The 5-year OS was 62% (95% CI: 53%-74%), and EFS was 57% (95% CI: 48%-69%). The variables adversely affecting survival were anaplastic histology (compared to desmoplastic; OS: HR = 3.4, p = .03), metastasis (OS: HR = 3.5, p = .01; EFS: HR = 4.3, p = .004), delay in radiation therapy of 31-60 days (compared to ≤30 days; EFS: HR = 2.1, p = .04), and treatment in the 2009-2013 cohort (OS: HR = 2.2, p = .02; EFS: HR = 2.0, p = .03). CONCLUSIONS: Outcomes for medulloblastoma at INEN were poor compared with those in HIC. Anaplastic subtype, metastasis at diagnosis, delayed radiation therapy, and treatment in the period 2009-2013 negatively affected outcomes in our study. Multidisciplinary teamwork, timely delivery of treatment, and partnerships with loco-regional groups and colleagues in HIC are likely beneficial.
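
A brief sketch of the Kaplan-Meier and log-rank workflow named in the methods, using the lifelines library with invented follow-up times in months, not INEN data.

```python
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Invented cohorts: times to event/censoring in months, 1 = event observed.
t_a = [12, 24, 60, 60, 36, 60, 48, 60]; e_a = [1, 1, 0, 0, 1, 0, 1, 0]
t_b = [6, 10, 24, 18, 60, 12, 30, 8];   e_b = [1, 1, 1, 1, 0, 1, 1, 1]

km = KaplanMeierFitter().fit(t_a, e_a, label="cohort A")
print(km.survival_function_.iloc[-1])   # estimated survival at last time point

res = logrank_test(t_a, t_b, event_observed_A=e_a, event_observed_B=e_b)
print("log-rank p =", round(res.p_value, 3))
```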


Subject(s)
Brain Neoplasms; Cerebellar Neoplasms; Medulloblastoma; Adolescent; Cerebellar Neoplasms/pathology; Child; Child, Preschool; Disease-Free Survival; Humans; Medulloblastoma/pathology; Peru/epidemiology; Prognosis; Risk Factors
13.
J Thromb Thrombolysis ; 53(1): 103-112, 2022 Jan.
Article in English | MEDLINE | ID: mdl-34272635

ABSTRACT

Coagulopathy is a key feature of COVID-19, and D-dimer has been reported as a predictor of severity. However, because D-dimer test results vary considerably among assays, resolving harmonization issues is fundamental to translating findings into clinical practice. In this retrospective multicenter study (BIOCOVID study), we aimed to analyze the value of harmonized D-dimer levels upon admission for the prediction of in-hospital mortality in COVID-19 patients. All-cause in-hospital mortality was defined as the endpoint. For harmonization of D-dimer levels, we designed a model based on the transformation of method-specific regression lines to a reference regression line. The ability of D-dimer to predict death was explored by receiver operating characteristic curve analysis, and the association with the endpoint by Cox regression analysis. The study population included 2,663 patients. The in-hospital mortality rate was 14.3%. Harmonized D-dimer upon admission yielded an area under the curve of 0.66, with an optimal cut-off value of 0.945 mg/L FEU. Patients with harmonized D-dimer ≥ 0.945 mg/L FEU had a higher mortality rate (22.4% vs. 9.2%; p < 0.001). D-dimer was an independent predictor of in-hospital mortality, with an adjusted hazard ratio of 1.709. This is the first study in which a harmonization approach was performed to assure comparability of D-dimer levels measured by different assays. Elevated D-dimer levels upon admission were associated with a greater risk of in-hospital mortality among COVID-19 patients but had limited performance as a prognostic test.
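
A sketch of the harmonization idea described above: map an assay's method-specific regression line onto the reference line, then check discrimination with ROC analysis. The slopes, intercepts, and toy outcome data are invented; only the 0.945 mg/L FEU cut-off comes from the abstract.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def harmonize(values, slope, intercept, ref_slope=1.0, ref_intercept=0.0):
    """Map assay-specific values back to the latent scale, then onto the
    reference regression line (a linear recalibration sketch)."""
    latent = (values - intercept) / slope
    return ref_slope * latent + ref_intercept

rng = np.random.default_rng(1)
latent = rng.lognormal(mean=0.0, sigma=0.8, size=500)  # "true" D-dimer, mg/L FEU
assay_a = 1.3 * latent + 0.1                            # a hypothetical assay
harmonized = harmonize(assay_a, slope=1.3, intercept=0.1)

# Toy mortality outcome loosely driven by the latent level.
died = rng.random(500) < 1.0 / (1.0 + np.exp(-(latent - 1.5)))

print("AUC:", round(roc_auc_score(died, harmonized), 2))
print("share above 0.945 mg/L FEU:", round(float((harmonized >= 0.945).mean()), 2))
```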


Subject(s)
COVID-19; Fibrin Fibrinogen Degradation Products/analysis; Biomarkers/blood; COVID-19/diagnosis; Humans; Prognosis; Registries; Retrospective Studies; Severity of Illness Index; Spain/epidemiology
14.
Appl Opt ; 61(7): D50-D62, 2022 Mar 01.
Article in English | MEDLINE | ID: mdl-35297828

ABSTRACT

Quantifying the stress field induced in a part under load is important in engineering because it makes it possible to characterize mechanical behavior and stress-induced failures. For this task, digital photoelasticity stands out for its visual capability of representing stress information through images with isochromatic fringe patterns. Unfortunately, demodulating such fringes remains a complicated process that, in some cases, depends on several acquisitions or conditions, e.g., pixel-by-pixel comparisons, dynamic load application, inconsistency corrections, user dependence, and fringe unwrapping. Given these drawbacks, and taking advantage of the powerful results reported for deep learning, such as in fringe unwrapping, this paper develops a deep convolutional neural network for recovering the stress field wrapped in the color fringe patterns acquired in digital photoelasticity studies. Our model relies on an untrained convolutional neural network to accurately demodulate stress maps from a single input photoelasticity image. We demonstrate that the proposed method faithfully recovers the stress field of complex fringe distributions on simulated images, with an average performance of 92.41% according to the SSIM metric. Experimental cases of a disk and a ring under compression were then evaluated, achieving an average performance of 85% in the SSIM metric. These results, on the one hand, are in concordance with new tendencies in the optics community to tackle complicated problems through machine-learning strategies; on the other hand, they create a new perspective in digital photoelasticity toward demodulating the stress field for a wider range of fringe distributions from a single acquisition.
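
Since the reconstruction quality above is scored with SSIM, here is a minimal check of that metric using scikit-image on toy arrays (not photoelasticity data); the signal and noise level are arbitrary.

```python
import numpy as np
from skimage.metrics import structural_similarity as ssim

rng = np.random.default_rng(3)

# A toy "ground-truth" stress map and a noisy "reconstruction" of it.
x = np.linspace(0.0, 8.0 * np.pi, 256)
truth = np.outer(np.sin(x), np.cos(x))
recon = truth + 0.05 * rng.normal(size=truth.shape)

score = ssim(truth, recon, data_range=float(truth.max() - truth.min()))
print("SSIM:", round(float(score), 3))
```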

15.
Food Control ; 133: 108632, 2022 Mar.
Article in English | MEDLINE | ID: mdl-34703082

ABSTRACT

The SARS-CoV-2 global pandemic poses significant health risks to workers who are essential to maintaining the food supply chain. Using a quantitative risk assessment model, this study characterized the impact of risk reduction strategies for controlling SARS-CoV-2 transmission (droplet, aerosol, fomite-mediated) among front-line workers in a representative indoor fresh fruit and vegetable manufacturing facility. We simulated: 1) individual and cumulative SARS-CoV-2 infection risks from close contact (droplet and aerosols at 1-3 m), aerosol, and fomite-mediated exposures to a susceptible worker following exposure to an infected worker during an 8-h shift; and 2) the relative reduction in SARS-CoV-2 infection risk attributed to infection control interventions (physical distancing, mask use, ventilation, surface disinfection, hand hygiene, vaccination). Without mitigation measures, the SARS-CoV-2 infection risk was largest for close contact (droplet and aerosol) at 1 m (0.96, 5th-95th percentile: 0.67-1.0). In comparison, risk associated with fomite (0.26, 5th-95th percentile: 0.10-0.56) or aerosol exposure alone (0.05, 5th-95th percentile: 0.01-0.13) at 1 m distance was substantially lower (73-95% lower). At 1 m, droplet transmission predominated over aerosol and fomite-mediated transmission; however, this changed by 3 m, with aerosols comprising the majority of the exposure dose. Increasing physical distancing reduced risk by 84% (1-2 m) and 91% (1-3 m). Universal mask use reduced infection risk by 52-88%, depending on mask type. Increasing ventilation (from 0.1 to 2-8 air changes/hour) resulted in risk reductions of 14-54% (1 m) and 55-85% (2 m). Combining these strategies, together with handwashing and surface disinfection, resulted in <1% infection risk. Partial or full vaccination of the susceptible worker resulted in risk reductions of 73-92% (1 m risk range: 0.08-0.26). However, vaccination paired with other interventions (2 ACH, mask use, or distancing) was necessary to achieve infection risks <1%. Current industry SARS-CoV-2 risk reduction strategies, particularly when bundled, provide significant protection to essential food workers.

16.
Food Control ; 136: 108845, 2022 Jun.
Article in English | MEDLINE | ID: mdl-35075333

ABSTRACT

Countries continue to debate the need for decontamination of cold-chain food packaging to reduce possible severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) fomite transmission among frontline workers. While laboratory-based studies demonstrate persistence of SARS-CoV-2 on surfaces, the likelihood of fomite-mediated transmission under real-life conditions is uncertain. Using a quantitative microbial risk assessment model of a frozen food packaging facility, we simulated 1) SARS-CoV-2 fomite-mediated infection risks following worker exposure to contaminated plastic packaging; and 2) reductions in these risks from masking, handwashing, and vaccination. In a frozen food facility without interventions, the SARS-CoV-2 infection risk to a susceptible worker from contact with contaminated packaging was 1.5 × 10⁻³ per 1-h period (5th-95th percentile: 9.2 × 10⁻⁶, 1.2 × 10⁻²). Standard food industry infection control interventions, handwashing and masking, reduced the risk (by 99.4%) to 8.5 × 10⁻⁶ per 1-h period (5th-95th percentile: 2.8 × 10⁻⁸, 6.6 × 10⁻⁵). Vaccination of the susceptible worker (two doses of Pfizer/Moderna, vaccine effectiveness: 86-99%) combined with handwashing and masking reduced the risk to 5.2 × 10⁻⁷ per 1-h period (5th-95th percentile: 1.8 × 10⁻⁹, 5.4 × 10⁻⁶). Simulating the increased transmissibility of current and future variants (Delta, Omicron; 2- to 10-fold viral shedding) among a fully vaccinated workforce, handwashing and masking continued to mitigate risk (1.4 × 10⁻⁶ - 8.8 × 10⁻⁶ per 1-h period). Additional decontamination of frozen food plastic packaging reduced infection risks to 1.2 × 10⁻⁸ per 1-h period (5th-95th percentile: 1.9 × 10⁻¹¹, 9.5 × 10⁻⁸). Given that standard infection control interventions reduced risks well below 1 × 10⁻⁴ (the World Health Organization water quality risk threshold), additional packaging decontamination suggests no marginal benefit in risk reduction. Consequences of this decontamination may include increased chemical exposures to workers, food quality and hazard risks to consumers, and unnecessary added costs to governments and the global food industry.

17.
Int J Mol Sci ; 23(10), 2022 May 12.
Article in English | MEDLINE | ID: mdl-35628216

ABSTRACT

Alzheimer's disease (AD) constitutes the most prominent form of dementia among elderly individuals worldwide. Disease modeling in transgenic mice was first initiated thanks to the discovery of heritable mutations in the amyloid precursor protein (APP) and presenilin (PS) genes. However, due to the repeated failure of translational applications from animal models to human patients, along with recent advances in genetic susceptibility and our current understanding of disease biology, these models have evolved over time in an attempt to better reproduce the complexity of this devastating disease and improve their applicability. In this review, we provide a comprehensive overview of the major pathological elements of human AD (plaques, tauopathy, synaptic damage, neuronal death, neuroinflammation, and glial dysfunction), discussing the knowledge that available mouse models have provided about the mechanisms underlying human disease. Moreover, we highlight the pros and cons of current models and the revolution offered by the concomitant use of transgenic mice and omics technologies, which may lead to more rapid improvement of the present modeling battery.


Subject(s)
Alzheimer Disease; Aged; Alzheimer Disease/metabolism; Amyloid beta-Protein Precursor/genetics; Animals; Disease Models, Animal; Humans; Mice; Mice, Transgenic; Plaque, Amyloid
18.
Emerg Infect Dis ; 27(4): 1229-1231, 2021 Apr.
Article in English | MEDLINE | ID: mdl-33755002

ABSTRACT

Severe acute respiratory syndrome coronavirus 2 can persist on surfaces, suggesting possible surface-mediated transmission of this pathogen. We found that fomites might be a substantial source of transmission risk, particularly in schools and child daycares. Combining surface cleaning and decontamination with mask wearing can help mitigate this risk.


Subject(s)
COVID-19; Disease Transmission, Infectious/prevention & control; Fomites/virology; Infection Control; SARS-CoV-2/isolation & purification; Aged; Basic Reproduction Number; COVID-19/epidemiology; COVID-19/prevention & control; COVID-19/transmission; COVID-19/virology; Child; Child Day Care Centers/standards; Decontamination/methods; Equipment Contamination/prevention & control; Hand Disinfection/methods; Humans; Infection Control/instrumentation; Infection Control/methods; Masks; Nursing Homes/standards; Schools/standards; United States/epidemiology
19.
J Med Virol ; 93(6): 3557-3563, 2021 Jun.
Article in English | MEDLINE | ID: mdl-33017074

ABSTRACT

Noroviruses (NoV) are a leading cause of epidemic gastroenteritis. Human challenge studies have been used to examine the infectivity and pathogenicity of NoV, the host immune response to it, and vaccine efficacy. The goal of this study was to conduct a meta-analysis of data from five previously completed human challenge trials and compare the response to the secondary Norwalk virus (NV) inoculum (8fIIb) with that to its precursor (8fIIa). We investigated a total of 158 subjects: 76 were experimentally challenged with NV inoculum 8fIIa, and 82 were challenged with 8fIIb. We compared demographic characteristics, infection, illness, mean severity score, blood types, and duration of viral shedding between the two groups. There were no statistically significant differences in overall infection and illness rates between subjects inoculated with 8fIIa and 8fIIb. However, individuals challenged with 8fIIa had significantly higher severity scores (5.05 vs. 3.22, p = .008) than those challenged with 8fIIb. We also observed that infection with 8fIIb was associated with a significantly longer duration of viral shedding than 8fIIa (11.0 days vs. 5.0 days, p = .0005). These results have serious implications for the development of new NoV inocula for human challenge studies to test candidate vaccine efficacy, where illness severity and duration of viral shedding are important outcomes.
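
The two-group contrasts above (severity score, shedding duration) are the kind of comparison a rank-based test handles; a minimal sketch with invented shedding durations follows. Whether the original analysis used this exact test is not stated in the abstract.

```python
from scipy.stats import mannwhitneyu

# Invented shedding durations (days) for the two inocula.
shed_8fIIa = [4, 5, 5, 6, 7, 5, 4, 6]
shed_8fIIb = [9, 11, 12, 10, 13, 11, 8, 14]

stat, p = mannwhitneyu(shed_8fIIa, shed_8fIIb, alternative="two-sided")
print(f"U = {stat}, p = {p:.4g}")
```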


Subject(s)
Caliciviridae Infections/virology; Norwalk virus/classification; Norwalk virus/pathogenicity; Virus Shedding; Adolescent; Adult; Caliciviridae Infections/immunology; Dose-Response Relationship, Immunologic; Female; Gastroenteritis/virology; Healthy Volunteers; Human Experimentation/statistics & numerical data; Humans; Male; Middle Aged; Norwalk virus/genetics; Norwalk virus/immunology; Severity of Illness Index; Young Adult
20.
Eur J Clin Invest ; 51(6): e13532, 2021 Jun.
Article in English | MEDLINE | ID: mdl-33660278

ABSTRACT

BACKGROUND: Myocardial injury is a common finding in COVID-19 and is strongly associated with severity. We analysed the prevalence and prognostic utility of myocardial injury, characterized by elevated cardiac troponin, in a large population of COVID-19 patients, and further evaluated separately the roles of troponin T and troponin I. METHODS: This is a multicentre, retrospective observational study enrolling patients with laboratory-confirmed COVID-19 who were hospitalized in 32 Spanish hospitals. Elevated troponin levels were defined as values above the sex-specific 99th percentile upper reference limit, as recommended by international guidelines. Thirty-day mortality was defined as the endpoint. RESULTS: A total of 1280 COVID-19 patients were included in this study, of whom 187 (14.6%) died during hospitalization. Using a non-sex-specific cut-off, elevated troponin levels were found in 344 patients (26.9%), increasing to 384 (30.0%) when a sex-specific cut-off was used. This prevalence was significantly higher (42.9% vs 21.9%; P < .001) in patients in whom troponin T was measured rather than troponin I. Sex-specific elevated troponin levels were significantly associated with 30-day mortality, with adjusted odds ratios (ORs) of 3.00 for the total population, 3.20 for cardiac troponin T, and 3.69 for cardiac troponin I. CONCLUSION: In this multicentre study, myocardial injury was a common finding in COVID-19 patients. Its prevalence increased when a sex-specific cut-off and cardiac troponin T were used. Elevated troponin was an independent predictor of 30-day mortality, irrespective of the cardiac troponin assay and the cut-off used to detect myocardial injury. Hence, early measurement of cardiac troponin may be useful for risk stratification in COVID-19.


Subject(s)
COVID-19/blood; Cardiomyopathies/blood; Mortality; Troponin I/blood; Troponin T/blood; Aged; Aged, 80 and over; Female; Humans; Male; Middle Aged; Odds Ratio; Prognosis; Retrospective Studies; SARS-CoV-2; Severity of Illness Index