ABSTRACT
BACKGROUND: Obesity in childhood has deleterious consequences for health, while improving physical fitness can significantly reduce health risks related to high body mass index. We aimed to examine the evolution of disparities in physical fitness based on weight status among 7-15-year-olds in Slovenia between 1989 and 2019 and to compare these trends across sex, age and socioeconomic status. METHODS: We used annual data collected within the SLOfit monitoring system in the period between 1989 and 2019, totalling 4,256,930 participants (about 137,000 per year). We examined cardiorespiratory fitness (600-m run test), muscular fitness (60-s sit-ups, bent arm hang, and standing broad jump test) and skill-related fitness (backwards obstacle course, 60-m dash, arm plate tapping). We grouped children according to the IOTF cut-offs for BMI into those living with normal weight or excess weight and estimated changes in physical fitness over time by fitting quantile regression models separately by sex and age group, and then using segmented regression to identify the patterns of trends over time. RESULTS: Weight-based disparities in physical fitness were large in 1989 and had further increased by 2019. The increase in disparities was generally around 5 percentiles larger in boys, and in 10-15-year-olds compared with younger children. It was particularly pronounced for body core strength and speed in boys (up to 15 and 19 percentiles, respectively) and for upper body strength and speed among girls (up to 13 percentiles). Most of the increase in disparities in health-related fitness accumulated during the 2010s, when the fitness of children generally improved, but much less so in children living with excess weight. CONCLUSIONS: Despite recent improvements in population fitness levels, children with excess weight seem to be left behind, which adds to existing health inequalities. Our results should encourage policymakers to redesign policies aimed at promoting physical activity and enhancing physical fitness to make them more equitable and ultimately reduce inequalities in fitness.
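The abstract does not include analysis code, so the following is only a minimal illustrative sketch of the two modelling steps it names: quantile regression involving weight group and a simple segmented (piecewise-linear) trend. The simulated data, all column names (run_600m, excess_weight, year) and the 2010 candidate breakpoint are assumptions for illustration, not the authors' pipeline or data.

```python
# Illustrative sketch only: quantile regression of a fitness test result on calendar
# year and weight group, followed by a simple segmented (piecewise-linear) trend.
# Column names and the breakpoint year are hypothetical, not taken from the study.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "year": rng.integers(1989, 2020, n),
    "excess_weight": rng.integers(0, 2, n),   # 1 = living with excess weight
})
# Simulated 600-m run times (s): slower with excess weight, widening gap after 2010
df["run_600m"] = (150 + 0.1 * (df.year - 1989)
                  + 10 * df.excess_weight
                  + 0.5 * df.excess_weight * np.maximum(df.year - 2010, 0)
                  + rng.normal(0, 15, n))

# Median (50th percentile) regression; other quantiles via q=0.1, 0.9, ...
median_fit = smf.quantreg("run_600m ~ year * excess_weight", df).fit(q=0.5)
print(median_fit.summary())

# Segmented trend: allow the slope to change after a candidate breakpoint year
df["post2010"] = np.maximum(df.year - 2010, 0)
seg_fit = smf.ols("run_600m ~ year + post2010 + excess_weight"
                  " + excess_weight:year + excess_weight:post2010", df).fit()
print(seg_fit.params)
```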
Subject(s)
Health Status Disparities, Physical Fitness, Humans, Slovenia/epidemiology, Male, Female, Child, Physical Fitness/physiology, Adolescent, Body Weight, Pediatric Obesity/epidemiology, Pediatric Obesity/prevention & control, Body Mass Index
ABSTRACT
OBJECTIVES: (1) To develop reference values for health-related fitness in European children and adolescents aged 6-18 years that are the foundation for the web-based, open-access and multilanguage fitness platform (FitBack); (2) to provide comparisons across European countries. METHODS: This study builds on a previous large fitness reference study in European youth by (1) widening the age demographic, (2) identifying the most recent and representative country-level data and (3) including national data from existing fitness surveillance and monitoring systems. We used the Assessing Levels of PHysical Activity and fitness at population level (ALPHA) test battery as it comprises tests with the highest test-retest reliability, criterion/construct validity and health-related predictive validity: the 20 m shuttle run (cardiorespiratory fitness); handgrip strength and standing long jump (muscular strength); and body height, body mass, body mass index and waist circumference (anthropometry). Percentile values were obtained using the generalised additive models for location, scale and shape (GAMLSS) method. RESULTS: A total of 7 966 693 test results from 34 countries (106 datasets) were used to develop sex-specific and age-specific percentile values. In addition, country-level rankings based on mean percentiles are provided for each fitness test, as well as an overall fitness ranking. Finally, an interactive fitness platform, including individual and group reporting and European fitness maps, is provided and freely available online (www.fitbackeurope.eu). CONCLUSION: This study discusses the major implications of fitness assessment in youth from health, educational and sport perspectives, and how the FitBack reference values and interactive web-based platform contribute to it. Fitness testing can be conducted in school and/or sport settings, and the interpreted results can be integrated into healthcare systems across Europe.
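GAMLSS models themselves are typically fitted with the R gamlss package, so no fitting code is shown here; the sketch below only illustrates, under assumed placeholder parameter values, how centile values and z-scores follow from fitted LMS (lambda, mu, sigma) parameters of the Box-Cox Cole-Green family commonly used for such reference curves.

```python
# Illustrative sketch: converting fitted LMS (lambda, mu, sigma) parameters, the
# parameterisation used by the GAMLSS BCCG family, into centile values and z-scores.
# The parameter values below are hypothetical placeholders, not FitBack estimates.
import numpy as np
from scipy.stats import norm

def lms_centile(p, L, M, S):
    """Value of the p-th centile given LMS parameters (Cole & Green formulas)."""
    z = norm.ppf(p)
    if abs(L) < 1e-8:
        return M * np.exp(S * z)
    return M * (1.0 + L * S * z) ** (1.0 / L)

def lms_zscore(y, L, M, S):
    """z-score of a measurement y relative to the LMS reference."""
    if abs(L) < 1e-8:
        return np.log(y / M) / S
    return ((y / M) ** L - 1.0) / (L * S)

# Hypothetical reference for one sex/age group (e.g. handgrip strength, kg)
L, M, S = -0.3, 26.0, 0.18
for p in (0.03, 0.10, 0.50, 0.90, 0.97):
    print(f"P{int(p * 100):02d}: {lms_centile(p, L, M, S):.1f}")
print("z-score for a result of 31 kg:", round(lms_zscore(31.0, L, M, S), 2))
```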
Subject(s)
Hand Strength, Physical Fitness, Male, Female, Humans, Adolescent, Child, Reference Values, Reproducibility of Results, Exercise, Exercise Test/methods, Body Mass Index
ABSTRACT
We study bias-reduced estimators of exponentially transformed parameters in generalized linear models (GLMs) and show how they can be used to obtain bias-reduced conditional (or unconditional) odds ratios in matched case-control studies. Two options are considered and compared: the explicit approach and the implicit approach. The implicit approach is based on the modified score function, where bias-reduced estimates are obtained by using iterative procedures to solve the modified score equations. The explicit approach is shown to be a one-step approximation of this iterative procedure. To apply these approaches to the conditional analysis of matched case-control studies, with potentially unmatched confounding and with several exposures, we utilize the relation between the conditional likelihood and the likelihood of the unconditional logit binomial GLM for matched pairs, and the Cox partial likelihood for matched sets with appropriately set up data. The properties of the estimators are evaluated in a large Monte Carlo simulation study, and an illustration on a real dataset is provided. Researchers reporting results on the exponentiated scale should use bias-reduced estimators, since otherwise the effects can be under- or overestimated; the magnitude of the bias is especially large in studies with smaller sample sizes.
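For readers unfamiliar with the "implicit" (modified score) approach, the sketch below implements a plain Firth-type bias-reduced logistic regression for independent data with NumPy and then exponentiates a coefficient. This is an assumed, simplified illustration only: the paper additionally targets bias reduction of the exponentiated parameter itself and handles conditional likelihoods for matched designs, neither of which is reproduced here.

```python
import numpy as np

def firth_logit(X, y, max_iter=50, tol=1e-8):
    """Firth-type bias-reduced logistic regression via the modified score equations."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(max_iter):
        pi = 1.0 / (1.0 + np.exp(-(X @ beta)))
        W = pi * (1.0 - pi)
        XtWX_inv = np.linalg.inv(X.T @ (X * W[:, None]))
        # leverages (diagonal of the weighted hat matrix)
        A = X * np.sqrt(W)[:, None]
        h = np.einsum("ij,jk,ik->i", A, XtWX_inv, A)
        score = X.T @ (y - pi + h * (0.5 - pi))   # Firth-modified score
        step = XtWX_inv @ score
        beta += step
        if np.max(np.abs(step)) < tol:
            break
    return beta

# Hypothetical small, sparse example with an intercept and one binary exposure
rng = np.random.default_rng(1)
x = rng.integers(0, 2, 30)
y = rng.binomial(1, 0.2 + 0.3 * x).astype(float)
X = np.column_stack([np.ones_like(x, dtype=float), x.astype(float)])
beta = firth_logit(X, y)
print("exp of bias-reduced coefficient for the exposure:", np.exp(beta[1]))
```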
Subject(s)
Odds Ratio, Bias, Computer Simulation, Case-Control Studies, Probability
ABSTRACT
ABSTRACT: Blagus, R, Jurak, G, Starc, G, and Leskosek, B. Centile reference curves of the SLOfit physical fitness tests for school-aged children and adolescents. J Strength Cond Res 37(2): 328-336, 2023-The study provides sex- and age-specific centile norms for Slovenian children and youth. Physical fitness was assessed using the SLOfit test battery on population data comprising 185,222 children aged 6-19 years, measured in April and May 2018. Centile curves for both sexes and 12 test items were constructed using generalized additive models for location, scale, and shape (GAMLSS). Boys generally achieved higher scores in most of the physical fitness tests, except in stand and reach, but this was not consistent throughout childhood and adolescence, nor did it pertain to the entire range of performance. Girls outperformed boys in the arm-plate tapping test throughout childhood; the poorest performing girls outperformed the poorest performing boys in the 600-m run, 60-m dash, backward obstacle course, and standing broad jump. The shapes and trends of the physical fitness curves adequately reflect the effects of growth and development on boys' and girls' physical performance. Comparison with existing reference fitness curves showed that Slovenian children and adolescents display higher fitness levels than their peers from other countries. This study provides the most up-to-date sex- and age-specific reference fitness centile curves for Slovenian children, which can be used as benchmark values for health and fitness monitoring and surveillance systems.
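As a usage illustration of such reference curves as benchmarks, the sketch below locates a raw test result within a set of hypothetical centile values; the numbers are placeholders, not the published SLOfit centiles.

```python
# Illustrative sketch: using centile reference values as benchmarks, i.e. locating a
# child's test result within hypothetical age- and sex-specific reference centiles.
# The reference numbers below are placeholders, not the published SLOfit centiles.
import numpy as np

# Hypothetical reference centiles for one test (standing broad jump, cm),
# boys aged 10: values at the 10th, 25th, 50th, 75th and 90th centile.
REF = {"centiles": np.array([10, 25, 50, 75, 90]),
       "values":   np.array([130.0, 142.0, 155.0, 168.0, 180.0])}

def centile_band(result_cm, ref=REF):
    """Return a coarse centile band for a raw result (higher = better for this test)."""
    idx = np.searchsorted(ref["values"], result_cm)
    if idx == 0:
        return "below P10"
    if idx == len(ref["values"]):
        return "above P90"
    return f"between P{ref['centiles'][idx - 1]} and P{ref['centiles'][idx]}"

print(centile_band(160.0))   # -> between P50 and P75
```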
Subject(s)
Exercise, Physical Fitness, Male, Female, Humans, Adolescent, Child, Exercise Test, Reference Values
ABSTRACT
AIMS/HYPOTHESIS: A large proportion of people with diabetes do not receive proper foot screening due to insufficiencies in healthcare systems. Introducing an effective risk prediction model into the screening protocol would potentially reduce the required screening frequency for those considered at low risk for diabetic foot complications. The main aim of the study was to investigate the value of individualised risk assignment for foot complications for optimising screening. METHODS: From 2015 to 2020, 11,878 routine follow-up foot investigations were performed in the tertiary diabetes clinic. From these, 4282 screening investigations with complete data on all 18 designated variables collected at regular clinical and foot screening visits were selected for the study sample. Penalised logistic regression models for the prediction of loss of protective sensation (LOPS) and loss of peripheral pulses (LPP) were developed and evaluated. RESULTS: Using leave-one-out cross-validation (LOOCV), the penalised regression model showed an AUC of 0.84 (95% CI 0.82, 0.85) for prediction of LOPS and 0.80 (95% CI 0.78, 0.83) for prediction of LPP. Calibration analysis (based on LOOCV) showed good agreement between predicted and observed probabilities, with a Brier score of 0.08 (intercept 0.01 [95% CI -0.09, 0.12], slope 1.00 [95% CI 0.92, 1.09]) for LOPS and a Brier score of 0.05 (intercept 0.01 [95% CI -0.12, 0.14], slope 1.09 [95% CI 0.95, 1.22]) for LPP. In a hypothetical follow-up period of 2 years, the regular screening interval was increased from 1 year to 2 years for individuals at low risk. In individuals with an International Working Group on the Diabetic Foot (IWGDF) risk 0, we could show a 40.5% reduction in the absolute number of screening examinations (3614 instead of 6074 screenings) when a 10% risk cut-off was used and a 26.5% reduction (4463 instead of 6074 screenings) when the risk cut-off was set to 5%. CONCLUSIONS/INTERPRETATION: Enhancement of the protocol for diabetic foot screening by inclusion of a prediction model allows differentiation of individuals with diabetes based on the likelihood of complications. This could potentially reduce the number of screenings needed in those considered at low risk of diabetic foot complications. The proposed model requires further refinement and external validation, but it shows the potential for improving compliance with screening guidelines.
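The sketch below mirrors the general workflow described (a penalised logistic model, leave-one-out cross-validated probabilities, AUC, Brier score, calibration intercept and slope) on simulated placeholder data; the penalty type, tuning and predictor set are assumptions, not the study's model.

```python
# Illustrative sketch (not the study code): an L2-penalised logistic model evaluated
# with leave-one-out cross-validated probabilities, discrimination (AUC), Brier score
# and a calibration intercept/slope. Features and outcome are simulated placeholders.
import numpy as np
import statsmodels.api as sm
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.metrics import roc_auc_score, brier_score_loss

rng = np.random.default_rng(2)
n, p = 300, 18                      # 18 predictors, mirroring the abstract's variable count
X = rng.normal(size=(n, p))
lin = -2.0 + X[:, :3] @ np.array([0.8, -0.6, 0.5])
y = rng.binomial(1, 1 / (1 + np.exp(-lin)))

model = LogisticRegression(penalty="l2", C=1.0, max_iter=1000)
p_loo = cross_val_predict(model, X, y, cv=LeaveOneOut(), method="predict_proba")[:, 1]

print("LOOCV AUC:  ", round(roc_auc_score(y, p_loo), 3))
print("Brier score:", round(brier_score_loss(y, p_loo), 3))

# Calibration intercept and slope: regress the outcome on the logit of the
# cross-validated probabilities (slope ~ 1 and intercept ~ 0 indicate good calibration).
lp = np.log(p_loo / (1 - p_loo))
cal = sm.Logit(y, sm.add_constant(lp)).fit(disp=0)
print("calibration intercept, slope:", np.round(cal.params, 2))
```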
Subject(s)
Diabetic Foot/diagnosis, Mass Screening/statistics & numerical data, Aged, Female, Follow-Up Studies, Humans, Male, Middle Aged, Models, Theoretical, Practice Guidelines as Topic, Probability, Prospective Studies, Risk Assessment
ABSTRACT
BACKGROUND: In binary logistic regression, data are 'separable' if there exists a linear combination of explanatory variables which perfectly predicts the observed outcome, leading to non-existence of some of the maximum likelihood coefficient estimates. A popular solution for obtaining finite estimates even with separable data is Firth's logistic regression (FL), which was originally proposed to reduce the bias in coefficient estimates. The question of convergence becomes more involved when analyzing clustered data as frequently encountered in clinical research, e.g. data collected in several study centers or when individuals contribute multiple observations, using marginal logistic regression models fitted by generalized estimating equations (GEE). From our experience we suspect that separable data are a sufficient, but not a necessary, condition for non-convergence of GEE. Thus, we expect that generalizations of approaches that can handle separable uncorrelated data may reduce but not fully remove the non-convergence issues of GEE. METHODS: We investigate one recently proposed and two new extensions of FL to GEE. With 'penalized GEE' the GEE are treated as score equations, i.e. as derivatives of a log-likelihood set to zero, which are then modified as in FL. We introduce two approaches motivated by the equivalence of FL and maximum likelihood estimation with iteratively augmented data. Specifically, we consider fully iterated and single-step versions of this 'augmented GEE' approach. We compare the three approaches with respect to convergence behavior, practical applicability and performance using simulated data and a real data example. RESULTS: Our simulations indicate that all three extensions of FL to GEE substantially improve convergence compared to ordinary GEE, while showing a similar or even better performance in terms of accuracy of coefficient estimates and predictions. Penalized GEE often slightly outperforms the augmented GEE approaches, but this comes at the cost of a higher burden of implementation. CONCLUSIONS: When fitting marginal logistic regression models using GEE on sparse data, we recommend applying penalized GEE if one has access to a suitable software implementation, and single-step augmented GEE otherwise.
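To make the "iteratively augmented data" equivalence concrete, the sketch below shows a single-step version of the data-augmentation idea for independent observations: compute leverages from an initial fit, add pseudo-observations carrying both outcomes with weight h_i/2 each, and refit by weighted maximum likelihood. The paper's augmented GEE replaces the final weighted fit with a weighted GEE fit on the same augmented data; this sketch is an assumption-laden simplification, not the authors' implementation.

```python
# Single-step augmented-data sketch for independent binary data (illustration only).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 40
x = rng.integers(0, 2, n)
y = rng.binomial(1, 0.15 + 0.5 * x).astype(float)
X = sm.add_constant(x.astype(float))

# Step 1: initial (ordinary) logistic fit and its leverages h_i
fit0 = sm.GLM(y, X, family=sm.families.Binomial()).fit()
pi = fit0.fittedvalues
W = pi * (1 - pi)
XtWX_inv = np.linalg.inv(X.T @ (X * W[:, None]))
A = X * np.sqrt(W)[:, None]
h = np.einsum("ij,jk,ik->i", A, XtWX_inv, A)

# Step 2: augment the data; each row is duplicated with y=1 and y=0, weight h_i/2
X_aug = np.vstack([X, X, X])
y_aug = np.concatenate([y, np.ones(n), np.zeros(n)])
w_aug = np.concatenate([np.ones(n), h / 2, h / 2])

# Step 3: one weighted ML refit on the augmented data (the single-step version)
fit1 = sm.GLM(y_aug, X_aug, family=sm.families.Binomial(), freq_weights=w_aug).fit()
print("ordinary ML coefficients:  ", np.round(fit0.params, 3))
print("single-step augmented fit: ", np.round(fit1.params, 3))
```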
Subject(s)
Models, Statistical, Bias, Computer Simulation, Humans, Likelihood Functions, Logistic Models
ABSTRACT
This repeat cross-sectional study investigated the impact of the Winter lockdown in Europe (January and February 2021) on children's and adolescents' physical activity (PA) and recreational screen time (RST), and compared PA to the lockdown in Spring 2020. An online survey was administered (n = 24 302; 6-18 years; 51.7% boys) in nine countries. PA and RST were assessed by 7-day recall. In total, 9.3% of children (95% confidence interval: 6.9-11.7) met the WHO PA recommendation, which was half of the proportion observed in Spring 2020 [19.0% (18.2-19.9)]. Sixty percent exceeded the RST recommendations. This suggests that the winter lockdown could have had a more negative impact on PA than the spring lockdown.
Subject(s)
COVID-19, Adolescent, COVID-19/epidemiology, Child, Communicable Disease Control, Cross-Sectional Studies, Europe/epidemiology, Exercise, Female, Humans, Male
ABSTRACT
BACKGROUND: To date, few data on the quality and quantity of online physical education (P.E.) during the COVID-19 pandemic have been published. We assessed activity in online classes and reported allocated curriculum time for P.E. in a multi-national sample of European children (6-18 years). METHODS: Data from two online surveys were analysed. A total of 8395 children were included in the first round (May-June 2020) and 24 302 in the second round (January-February 2021). RESULTS: Activity levels during P.E. classes were low in spring 2020, particularly among the youngest children and in certain countries. 27.9% of students did not do any online P.E. and 15.7% were hardly ever very active. Only 18.4% were always very active and 14.9% reported being very active quite often. In winter 2020, we observed a large variability in the allocated curriculum time for P.E. In many countries, this was lower than the compulsory requirements. Only 65.7% of respondents had the same number of P.E. lessons as before the pandemic, while 23.8% had fewer P.E. lessons and 6.8% claimed to have no P.E. lessons at all. Rates of no P.E. were especially high among secondary school students, and in large cities and megapolises. CONCLUSIONS: During the COVID-19 pandemic, European children were provided with much less P.E., in both quantity and quality, than before the pandemic. Countermeasures are needed to ensure that these changes do not become permanent. Particular attention is needed in large cities and megapolises. The critical role of P.E. for students' health and development must be strengthened in the school system.
Subject(s)
COVID-19, Education, Distance, Child, Humans, Pandemics, Physical Education and Training, Schools
ABSTRACT
BACKGROUND AND PURPOSE: The characteristics and long-term outcome of Lyme neuroborreliosis (LNB) according to diagnostic certainty (definite vs. possible) are incompletely understood. METHODS: In this retrospective cohort study of adults with definite or possible LNB, clinical and microbiological characteristics and long-term outcome over 12 months were evaluated at a single medical center. Severity of acute disease and long-term outcome were assessed using a composite clinical score encompassing clinical findings and symptoms, and by the probability of incomplete recovery. RESULTS: Amongst 311 adult patients enrolled from 2008 to 2017, 139 (44.7%) had definite LNB and 172 (55.3%) had possible LNB. The most frequent LNB manifestation was cranial neuropathy with or without meningitis (53.4%). Patients with definite LNB more often had Bannwarth syndrome (53.2% vs. 18.6%), more severe disease (6 points vs. 4 points), longer pre-treatment duration (median 21 days vs. 13.5 days), higher cerebrospinal fluid pleocytosis (median 139 × 10⁶/L vs. 11 × 10⁶/L) and a higher rate of Borrelia seropositivity (84.2% vs. 68.6%) than those with possible LNB. Ceftriaxone, rather than oral doxycycline, was prescribed more often in definite LNB than in possible LNB (96.4% vs. 65.7%). Unfavorable outcomes decreased during follow-up, being more frequent in patients with more severe disease at enrollment and in those with possible LNB, but were not associated with antibiotic therapy. CONCLUSIONS: Early LNB, most often presenting as cranial neuropathy, was definitively diagnosed in less than half of cases. A better diagnostic approach is needed to confirm borrelial etiology. Ceftriaxone was not superior to doxycycline in the treatment of early LNB, regardless of diagnostic certainty.
Subject(s)
Lyme Neuroborreliosis, Adult, Anti-Bacterial Agents/therapeutic use, Doxycycline, Europe, Humans, Lyme Neuroborreliosis/diagnosis, Lyme Neuroborreliosis/drug therapy, Lyme Neuroborreliosis/epidemiology, Retrospective Studies
ABSTRACT
BACKGROUND: For finite samples with binary outcomes, penalized logistic regression such as ridge logistic regression has the potential of achieving smaller mean squared errors (MSE) of coefficients and predictions than maximum likelihood estimation. There is evidence, however, that ridge logistic regression can result in highly variable calibration slopes in small or sparse data situations. METHODS: In this paper, we elaborate on this issue by performing a comprehensive simulation study, investigating the performance of ridge logistic regression in terms of coefficients and predictions and comparing it to Firth's correction, which has been shown to perform well in low-dimensional settings. In addition to tuned ridge regression, where the penalty strength is estimated from the data by minimizing some measure of the out-of-sample prediction error or an information criterion, we also considered ridge regression with a pre-specified degree of shrinkage. We included 'oracle' models in the simulation study in which the complexity parameter was chosen based on the true event probabilities (prediction oracle) or regression coefficients (explanation oracle) to demonstrate the capability of ridge regression if the truth were known. RESULTS: Performance of ridge regression strongly depends on the choice of complexity parameter. As shown in our simulation and illustrated by a data example, values optimized in small or sparse datasets are negatively correlated with optimal values and suffer from substantial variability, which translates into large MSE of coefficients and large variability of calibration slopes. In contrast, in our simulations, pre-specifying the degree of shrinkage prior to fitting led to accurate coefficients and predictions even in non-ideal settings such as those encountered in the context of rare outcomes or sparse predictors. CONCLUSIONS: Applying tuned ridge regression in small or sparse datasets is problematic as it results in unstable coefficients and predictions. In contrast, determining the degree of shrinkage according to some meaningful prior assumptions about true effects has the potential to reduce bias and stabilize the estimates.
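The sketch below illustrates the core contrast discussed (tuning the ridge penalty in a small dataset versus pre-specifying the degree of shrinkage) on one simulated replicate, judging each fit by its calibration slope on a large independent test set. The data-generating model, the fixed penalty C=0.1 and the single-replicate design are assumptions for illustration, not the paper's simulation protocol.

```python
# Illustrative sketch: tuned versus pre-specified ridge penalty, compared by
# calibration slope on a large test set (hypothetical setup, single replicate).
import numpy as np
import statsmodels.api as sm
from sklearn.linear_model import LogisticRegression, LogisticRegressionCV

rng = np.random.default_rng(4)
p, beta0 = 10, -2.0
beta = np.concatenate([[1.0, -0.8, 0.6], np.zeros(7)])

def simulate(n):
    X = rng.normal(size=(n, p))
    pr = 1 / (1 + np.exp(-(beta0 + X @ beta)))
    return X, rng.binomial(1, pr)

def calibration_slope(model, X_test, y_test):
    prob = model.predict_proba(X_test)[:, 1]
    lp = np.log(prob / (1 - prob))
    return sm.Logit(y_test, sm.add_constant(lp)).fit(disp=0).params[1]

X_tr, y_tr = simulate(100)          # small training data with a fairly rare outcome
X_te, y_te = simulate(100_000)      # large test set to estimate calibration

tuned = LogisticRegressionCV(Cs=20, cv=5, penalty="l2", scoring="neg_log_loss",
                             max_iter=2000).fit(X_tr, y_tr)
fixed = LogisticRegression(C=0.1, penalty="l2", max_iter=2000).fit(X_tr, y_tr)

print("tuned ridge calibration slope:        ", round(calibration_slope(tuned, X_te, y_te), 2))
print("pre-specified ridge calibration slope:", round(calibration_slope(fixed, X_te, y_te), 2))
```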
Subject(s)
Logistic Models, Bias, Computer Simulation, Humans, Probability
ABSTRACT
When building classifiers, it is natural to require that the classifier correctly estimates the event probability (Constraint 1), that it has equal sensitivity and specificity (Constraint 2), or that it has equal positive and negative predictive values (Constraint 3). We prove that in the balanced case, where there is an equal proportion of events and non-events, any classifier that satisfies one of these constraints will always satisfy all of them. Such unbiasedness with respect to events and non-events is much more difficult to achieve in the case of rare events, i.e. the situation in which the proportion of events is (much) smaller than 0.5. Here, we prove that it is impossible to meet all three constraints unless the classifier achieves perfect predictions. Any non-perfect classifier can only satisfy at most one constraint, and satisfying one constraint implies violating the other two constraints in a specific direction. Our results have implications for classifiers optimized using g-means or the F1-measure, which tend to satisfy Constraints 2 and 1, respectively. Our results are derived from basic probability theory and illustrated with simulations based on some frequently used classifiers.
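A small simulation (not from the paper) can make the rare-event result concrete: for an imperfect score with 2% prevalence, choosing the cut-off at which sensitivity equals specificity (Constraint 2) forces the predicted event rate well above the true prevalence (violating Constraint 1) and makes PPV and NPV very unequal (violating Constraint 3). The score distribution below is an arbitrary assumption.

```python
# Illustrative simulation: Constraint 2 forces violations of Constraints 1 and 3
# when events are rare and the classifier is imperfect.
import numpy as np

rng = np.random.default_rng(5)
n, prev = 50_000, 0.02                        # rare events: 2% prevalence
y = rng.binomial(1, prev, n)
score = rng.normal(loc=1.5 * y, scale=1.0)    # imperfect classifier score

# Find the threshold at which sensitivity equals specificity
thresholds = np.linspace(score.min(), score.max(), 500)
sens = np.array([np.mean(score[y == 1] >= t) for t in thresholds])
spec = np.array([np.mean(score[y == 0] < t) for t in thresholds])
t_eq = thresholds[np.argmin(np.abs(sens - spec))]

pred = (score >= t_eq).astype(int)
ppv = np.mean(y[pred == 1]) if pred.any() else float("nan")
npv = np.mean(1 - y[pred == 0])
print(f"sensitivity and specificity at t={t_eq:.2f}:",
      round(np.mean(score[y == 1] >= t_eq), 3), round(np.mean(score[y == 0] < t_eq), 3))
print("predicted event rate vs true prevalence:", round(pred.mean(), 3), prev)
print("PPV vs NPV:", round(ppv, 3), round(npv, 3))
```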
Subject(s)
Computer Simulation, Databases, Factual, Diagnosis, Computer-Assisted/classification, Models, Biological, Humans, Software
ABSTRACT
Methods for the determination of the postmortem interval (PMI) include methods that monitor the postmortem changes of cells and molecules in different tissues. The rate of pathological degradation of macromolecules in the extracellular matrix (ECM) of hyaline cartilage can be assessed from the intensity of collagen and proteoglycan (PG) staining. In the presented in vitro pilot study, this methodology was used for the first time to determine PMI. The osteochondral samples of three donors were stored at 11 °C and 35 °C and analyzed on day 1, day 12, and day 36 postmortem. The intensity of staining using Masson's trichrome and Sirius red for collagen, and Alcian blue and Safranin O dyes for PG, was estimated ten times according to the modified Bern grading scale. Statistical analysis showed that the Safranin O without Fast green method is the most appropriate (raters' agreement 0.5541) for up to 36 days postmortem, and that the influence of time (p = 0.023) is more important than the influence of temperature (p = 0.061) on the degradation of the ECM macromolecules. The described method, which is simple and can be performed in any histological laboratory, should be verified under in corpore conditions, on a large number of donors, and using an objective method for assessing the intensity of cartilage macromolecule staining for PMI determination.
Subject(s)
Collagen/metabolism, Extracellular Matrix/metabolism, Hyaline Cartilage/metabolism, Phenazines, Postmortem Changes, Proteoglycans/metabolism, Staining and Labeling/methods, Adult, Alcian Blue, Azo Compounds, Coloring Agents, Eosine Yellowish-(YS), Forensic Pathology/methods, Humans, Male, Methyl Green, Middle Aged, Pilot Projects, Specimen Handling, Young Adult
ABSTRACT
INTRODUCTION: We tested the hypothesis that individual susceptibility to freezing cold injury might be reflected in an attenuated cold-induced vasodilatation (CIVD) response by comparing the CIVD responses of an elite alpinist with a history of freezing cold injury in the feet (case alpinist) with those of an age- and ability-matched group of noninjured alpinists (controls). According to this hypothesis, the vasomotor responses of the case alpinist to a CIVD test would represent a pathophysiological response when compared with the normal physiological response of a noninjured cohort. METHODS: The case alpinist and the controls underwent a cold water immersion test comprising sequential immersion of a hand and foot for 5 min in 35°C water, followed by a 30-min immersion in 8°C water and a 10-min recovery period in room air. During this test we monitored the finger and toe skin temperatures. RESULTS: The case alpinist had a significantly attenuated CIVD response and a lower skin temperature in all injured and noninjured digits during immersion (~2°C lower than in the control group) and an attenuated recovery of finger skin temperatures (~6°C lower than in the control group). CONCLUSIONS: The attenuated CIVD response of the case alpinist may reflect a previously unrecognized enhanced susceptibility to frostbite. In addition to the poor vasomotor response observed in the injured toes, he also exhibited a poor vasomotor response in his noninjured fingers. The results of the present study indicate that a test of vasomotor activity during thermal stress may identify individuals predisposed to cold injury.
Subject(s)
Cold Temperature/adverse effects, Skin Temperature/physiology, Vasodilation/physiology, Adult, Case-Control Studies, Fingers/physiology, Frostbite/physiopathology, Humans, Immersion/physiopathology, Male, Mountaineering/physiology, Toes/injuries, Toes/physiology
ABSTRACT
Objectives: To evaluate the impact of European Antibiotic Awareness Day (EAAD) on antibiotic consumption, general public awareness and antibiotic resistance in Slovenia. Methods: Outpatient data for the period from 2002 to 2016 and hospital antibiotic consumption data for 2004-16 were collected using the Anatomical Therapeutic Chemical (ATC) classification/DDDs. Outpatient antibiotic consumption data were expressed in DDDs/1000 inhabitants/day (DIDs), number of packages/1000 inhabitant-days and number of prescriptions/1000 inhabitants/year. Hospital consumption data were expressed in DIDs, number of DDDs/100 bed-days and number of DDDs/100 admissions. Segmented regression analysis of interrupted time series was used to estimate the effects of EAAD on antibiotic consumption. Results: During the 8 year period after EAAD was established, a 9%-17% decrease in outpatient antibiotic consumption was observed, depending on the measurement unit, which was a little more than in the 6 years prior to EAAD (7%-12%). The trend change in hospital consumption after EAAD was established remained small, with a highly non-significant P value. Eurobarometer data did not show an increase in knowledge about antibiotic use. Resistance of Streptococcus pneumoniae to penicillin and macrolides decreased during EAAD activities. Conclusions: EAAD activities were associated with a decreasing trend in community consumption. Owing to many other national activities on the prudent use of antimicrobials in outpatients and inpatients, it is difficult to isolate the direct effect of EAAD.
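The sketch below shows the generic form of a segmented regression for an interrupted time series, with a pre-intervention slope, a level shift and a slope change after an intervention point, fitted with statsmodels OLS on simulated yearly values; the numbers and the assumed 2008/2009 intervention timing are placeholders, not the study's consumption data.

```python
# Illustrative segmented (interrupted time series) regression on simulated yearly
# outpatient consumption values expressed in DDDs/1000 inhabitants/day (DIDs).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
years = np.arange(2002, 2017)
dids = (18 - 0.15 * (years - 2002)
        - 0.25 * np.maximum(years - 2008, 0)
        + rng.normal(0, 0.3, years.size))

df = pd.DataFrame({
    "dids": dids,
    "time": years - years.min(),                  # years since start of the series
    "post": (years >= 2009).astype(int),          # level change after the intervention
    "time_post": np.maximum(years - 2008, 0),     # slope change after the intervention
})
fit = smf.ols("dids ~ time + post + time_post", df).fit()
print(fit.params)   # pre-intervention slope, level shift, and change in slope
```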
Subject(s)
Anti-Bacterial Agents/therapeutic use, Awareness, Bacteria/drug effects, Bacterial Infections/drug therapy, Drug Resistance, Bacterial, Drug Utilization/trends, Attitude of Health Personnel, Drug Utilization/statistics & numerical data, Humans, Inpatients, Interrupted Time Series Analysis, Outpatients, Slovenia
ABSTRACT
Background: Several guidelines advocate the same treatment approaches for both early disseminated Lyme borreliosis, manifested as multiple erythema migrans (EM), and early localized Lyme borreliosis, manifested as solitary EM. Methods: Oral doxycycline (100 mg q12h) was compared on a non-inferiority premise with intravenous ceftriaxone (2 g q24h), each given for 14 days, in 200 adult European patients with multiple EM in an open-label alternate-treatment observational trial performed in a single-centre university hospital. Treatment outcome was assessed at 14 days and at 2, 6 and 12 months post-enrolment. Non-specific symptoms in patients and in 192 control subjects without a history of Lyme borreliosis were evaluated and compared. This trial was registered at http://clinicaltrials.gov (identifier NCT01163994). Results: At the 12 month visit, 4/82 (4.9%) multiple EM patients prescribed doxycycline and 6/88 (6.8%) multiple EM patients prescribed ceftriaxone showed an incomplete response, manifested predominantly as post-Lyme symptoms (1.9% difference, upper limit of 95% CI 5.1%). The upper limit of the 95% CI for the difference in the proportion of patients with an incomplete response between the doxycycline and ceftriaxone groups did not exceed the predetermined non-inferiority margin of 10%. The frequency of non-specific symptoms in patients was similar to that in controls. Conclusions: The 14 day course of oral doxycycline was not inferior to the 14 day course of intravenous ceftriaxone in the treatment of adult European patients with early disseminated Lyme borreliosis manifested as multiple EM. The frequency of non-specific symptoms in patients was similar to that in controls without a history of Lyme borreliosis.
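Using the counts reported in the abstract, a simple Wald-type interval for the risk difference illustrates the non-inferiority logic; the abstract does not state which interval method the trial used, so this sketch should be read as an illustration rather than a reproduction of the published analysis.

```python
# Illustrative Wald-type non-inferiority check for the difference in the proportion of
# patients with incomplete response (doxycycline minus ceftriaxone) against the 10% margin.
import math

def risk_difference_upper_ci(x1, n1, x2, n2, z=1.96):
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff + z * se

# 4/82 incomplete responses with doxycycline, 6/88 with ceftriaxone
# (the abstract reports the 1.9% magnitude of this signed difference)
diff, upper = risk_difference_upper_ci(4, 82, 6, 88)
print(f"difference: {diff:.1%}, upper 95% CI limit: {upper:.1%}")
print("non-inferior at the 10% margin:", upper < 0.10)
```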
Subject(s)
Anti-Bacterial Agents/administration & dosage, Ceftriaxone/administration & dosage, Doxycycline/administration & dosage, Lyme Disease/drug therapy, Administration, Intravenous, Administration, Oral, Adult, Female, Follow-Up Studies, Hospitals, University, Humans, Male, Middle Aged, Treatment Outcome
ABSTRACT
BACKGROUND: Community acquired Clostridioides (Clostridium) difficile infection (CA-CDI) is a significant health problem in human and veterinary medicine. Animals are often considered potential reservoirs for CA-CDI. In Europe, family farming is the predominant farming operation, with a complex interaction between animals and the community. Therefore, it is pertinent to evaluate transmission patterns of C. difficile on this prominent European farming model. Fecal samples from calves (n = 2442) were collected biweekly over a period of one year on 20 mid-size family dairy farms. Environmental samples (n = 475) were collected at three-month intervals. Clostridioides difficile was detected using qPCR in 243 of the 2442 fecal samples; positive samples were then quantified. The association between the prevalence/load of C. difficile and the age of the calves was estimated with a logistic regression model. The most common C. difficile isolate from calves (n = 76) and the environment (n = 14) was ribotype 033, which was further analyzed using multilocus variable-number tandem-repeat analysis (MLVA) to assess intra- and between-farm relatedness. RESULTS: Clostridioides difficile was detected in feces of calves less than 24 h old. Results showed a non-linear, statistically significant decrease in the shedding load of C. difficile with age (P < 0.0001). A nonlinear relationship was also established between the number of calves and the farm-level C. difficile prevalence, whereas the prevalence of C. difficile ribotype 033 increased linearly with the number of calves. MLVA revealed close intra-farm relatedness among C. difficile ribotype 033 isolates. It also revealed that the close between-farm relatedness of ribotype 033 isolates can be a direct result of farm-to-farm trade of calves. CONCLUSIONS: Implementation of better hygiene and management measures on farms may help decrease the risk of spreading CA-CDI between animals and the community. Trading calves older than 3 weeks would decrease the possibility of C. difficile dissemination in the community because of the lower prevalence and lower fecal load of C. difficile in older calves.
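The age-prevalence association could, for example, be modelled with a logistic regression that allows a non-linear age effect; the sketch below does this with a B-spline basis on simulated placeholder data, since the abstract does not specify how the non-linearity was parameterised.

```python
# Illustrative sketch (simulated data, not the study's): logistic regression of
# C. difficile detection on calf age with a flexible (B-spline) age effect.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 2442
age_days = rng.integers(0, 180, n)
# Simulated non-linear decline in detection probability with age
p_true = 0.45 * np.exp(-age_days / 30) + 0.02
df = pd.DataFrame({"age_days": age_days,
                   "positive": rng.binomial(1, p_true)})

fit = smf.logit("positive ~ bs(age_days, df=4)", df).fit(disp=0)
grid = pd.DataFrame({"age_days": [1, 7, 21, 60, 120]})
print(fit.predict(grid).round(3))   # predicted detection probability at selected ages
```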
Subject(s)
Cattle Diseases/microbiology, Clostridioides difficile/isolation & purification, Clostridium Infections/veterinary, Feces/microbiology, Age Factors, Animals, Cattle, Cattle Diseases/epidemiology, Cattle Diseases/transmission, Clostridioides difficile/genetics, Clostridium Infections/epidemiology, Clostridium Infections/microbiology, Clostridium Infections/transmission, Dairying, Multilocus Sequence Typing/veterinary, Ribotyping, Slovenia/epidemiology
ABSTRACT
Animal studies suggest that dynamic predictors remain useful in patients with pneumoperitoneum, but human data are conflicting. Our aim was to determine the predictive values of pulse pressure variation (PPV) and stroke volume variation (SVV) in patients with pneumoperitoneum using the LiDCORapid™ haemodynamic monitor. Standardised colloid fluid challenges were administered to patients undergoing laparoscopic procedures, one fluid challenge per patient. Intra-abdominal pressure was automatically held at 12 mmHg. Fluid responsiveness was defined as an increase in nominal stroke index (nSI) ≥ 10%. Linear regression was used to assess the ability of PPV and SVV to track the changes in nSI, and logistic regression and the area under the receiver operating characteristic curve (AUROC) were used to assess the predictive value of PPV and SVV for fluid responsiveness. Threshold values for PPV and SVV were obtained using the "gray zone" approach. A p < 0.05 was considered statistically significant. In total, 56 patients were included in the analysis; 41 patients (73%) responded to fluids. Both PPV and SVV tracked changes in nSI (Spearman correlation coefficients 0.34 for PPV and 0.53 for SVV). The odds ratio for fluid responsiveness was 1.163 (95% CI 1.01-1.34) for PPV and 1.341 (95% CI 1.10-1.63) for SVV. PPV achieved an AUROC of 0.674 (95% CI 0.518-0.830) and SVV 0.80 (95% CI 0.668-0.932). The gray zone of PPV ranged between 6.5 and 20.5% and that of SVV between 7.5 and 13%. During pneumoperitoneum, as measured by LiDCORapid™, PPV and SVV can predict fluid responsiveness; however, their sensitivity is lower than that reported in conditions without pneumoperitoneum. Trial registry number (Australian New Zealand Clinical Trials Registry): ACTRN12612000456853.
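The sketch below, on simulated placeholder values, shows one way to obtain an AUROC and a simple "gray zone" for a dynamic predictor; the study used its own gray zone methodology, so the definition used here (the range between the cut-offs keeping sensitivity or specificity at 90%) is an assumed simplification.

```python
# Illustrative sketch (simulated values, not the study data): AUROC for a dynamic
# predictor of fluid responsiveness and a simple inconclusive ("gray") zone.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(8)
n = 56
responder = rng.binomial(1, 0.73, n)          # ~73% fluid responders, as in the abstract
ppv_values = rng.normal(8 + 5 * responder, 3) # simulated pulse pressure variation (%)

print("AUROC:", round(roc_auc_score(responder, ppv_values), 2))

fpr, tpr, thr = roc_curve(responder, ppv_values)
sens, spec = tpr, 1 - fpr
low = thr[sens >= 0.90].max()    # highest cut-off still keeping sensitivity >= 90%
high = thr[spec >= 0.90].min()   # lowest cut-off still keeping specificity >= 90%
print(f"gray zone: {low:.1f}% to {high:.1f}%")
```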
Subject(s)
Fluid Therapy, Hemodynamic Monitoring/methods, Pneumoperitoneum/physiopathology, Pneumoperitoneum/therapy, Adult, Aged, Analysis of Variance, Blood Pressure/physiology, Female, Hemodynamic Monitoring/statistics & numerical data, Hemodynamics/physiology, Humans, Male, Middle Aged, Predictive Value of Tests, Prospective Studies, Stroke Volume/physiology, Tidal Volume/physiology
ABSTRACT
BACKGROUND: The purpose of the study was to find out whether there is a difference in the early parameters of cardiotoxicity (left ventricular ejection fraction [LVEF] and N-terminal pro-B-type natriuretic peptide [NT-proBNP]) between two groups of patients: those treated for left breast cancer (left breast cancer group) and those treated for right breast cancer (right breast cancer group), after the treatment had been completed. PATIENTS AND METHODS: The study included 175 consecutive patients with human epidermal growth factor receptor-2 (HER2) positive early breast cancer, treated concurrently with trastuzumab and radiotherapy (RT) between June 2005 and December 2010. Echocardiography with LVEF measurement was performed before adjuvant RT (LVEF0) and after the completed treatment (LVEF1). NT-proBNP was also measured after the treatment. The difference (Δ) between LVEF0 and LVEF1 was analysed (Δ LVEF = LVEF0 - LVEF1) and compared between the two groups. RESULTS: There were 84 patients in the left and 91 in the right breast cancer group. Median observation time was 57 (37-71) months. Mean Δ LVEF (%) was -1.786% in the left and -2.607% in the right breast cancer group (p = 0.562, CI: -2.004 to 3.648). Median NT-proBNP was 111.0 ng/l in the left and 90.0 ng/l in the right breast cancer group (p = 0.545). Echocardiography showed that the patients in the left breast cancer group did not have significantly worse systolic and diastolic left ventricular function than the patients in the right breast cancer group, but they had a higher incidence of pericardial effusion (9 [11%] vs. 1 [1%]; p = 0.007). CONCLUSIONS: We did not find any significant differences in the early parameters of cardiotoxicity (LVEF, NT-proBNP) between the observed groups. Patients who received left breast/chest wall irradiation had a higher incidence of pericardial effusion.
ABSTRACT
BACKGROUND: Commercial enteral formulas are generally recommended for gastrostomy feeding in patients with severe neurologic impairment. However, pureed food diets are still widely used and are even gaining popularity among certain groups. We aimed to compare the effectiveness of gastrostomy feeding with either enteral formulas or pureed feeds for the treatment of severe malnutrition. PATIENTS AND METHODS: A 6-month nutritional intervention was conducted in 37 malnourished children, adolescents and young adults (2-26 years old) with severe neurologic impairment (Gross Motor Function Classification System [GMFCS] grade V). Individual nutritional needs were calculated. Participants were fed by gastrostomy with either enteral formulas (n = 17) or pureed food (n = 20). Measurements to assess nutritional status were made at the beginning and at the end of the intervention. RESULTS: The Z scores for weight-for-age and for body-mass index increased more in the enteral formula group than in the pureed food group (2.07 vs. 0.70, p = 0.0012; and 3.75 vs. 0.63, p = 0.0014, respectively). Fat mass index increased more in the enteral formula group than in the pureed food group (1.12 kg/m2 vs. 0.38 kg/m2; p = 0.0012). Patients in the enteral formula group showed an increase in lean body mass expressed as fat-free mass index (0.70 kg/m2), while those in the pureed food group did not (-0.06 kg/m2) (p = 0.0487). CONCLUSIONS: The results suggest that even a professionally planned pureed food diet is less effective than a commercial enteral formula for the nutritional rehabilitation of malnourished patients with severe neurologic impairment. However, larger and, if possible, randomised clinical studies are needed to confirm our findings.