ABSTRACT
Conventionally, glucose biosensors are optimized by varying the concentrations of the individual reagents used to immobilize the enzyme. In this work, the effects of, and interactions among, glucose oxidase (GOx), ferrocene methanol (Fc), and multi-walled carbon nanotubes (MWCNTs) at different concentrations were investigated by design of experiments (DoE). A factorial design with three factors at two levels each was used, with statistical analysis performed in RStudio. The data were obtained from electrochemical experiments on the immobilization of GOx-Fc/MWCNT at different concentrations. The adequacy of the factorial DoE was assessed through the non-normality of the residuals and the outliers of the experiment. Examination of the half-normal plot and of the effects and contrasts for GOx-Fc/MWCNT showed that the factors with the greatest influence on the electrochemical response were GOx, MWCNT, Fc, and the MWCNT:Fc interaction, and the analyses of homoscedasticity and multicollinearity showed a high correlation among these factors. With these statistical analyses and experimental designs, it was possible to find the optimal conditions: 10 mM mL-1 GOx, 2 mg mL-1 Fc, and 15 mg mL-1 MWCNT gave the greatest amperometric response for glucose oxidation. This work contributes to advancing enzyme immobilization strategies for glucose biosensor applications: the systematic DoE investigation leads to optimized immobilization of GOx, enables better performance of the glucose biosensor, and allows some outcomes to be predicted.
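As a minimal sketch of the effect estimation behind a two-level factorial DoE such as the one described above, the snippet below codes a 2^3 design and computes each factor's main effect as the difference between the mean responses at the high and low levels. The responses and factor ordering are hypothetical, not the study's data.

```python
from itertools import product

# Hypothetical amperometric responses (one per run) for a 2^3 full factorial
# design; factor order in each run tuple: (GOx, Fc, MWCNT), coded -1/+1.
runs = list(product([-1, 1], repeat=3))
response = [4.1, 5.0, 4.6, 5.9, 6.8, 8.2, 7.5, 9.6]  # illustrative values only

def main_effect(factor_idx):
    """Mean response at the +1 level minus mean response at the -1 level."""
    hi = [y for run, y in zip(runs, response) if run[factor_idx] == 1]
    lo = [y for run, y in zip(runs, response) if run[factor_idx] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

for name, idx in [("GOx", 0), ("Fc", 1), ("MWCNT", 2)]:
    print(f"{name} main effect: {main_effect(idx):+.3f}")
```

Interaction effects (e.g. MWCNT:Fc) follow the same recipe, with the contrast column taken as the product of the two factors' coded levels.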
ABSTRACT
Corrosion deterioration of materials is a major problem with economic, safety, and logistical consequences, especially in the aeronautical sector. Detecting the correct corrosion type in metal alloys is essential for knowing how to mitigate the corrosion problem. Electrochemical noise (EN) is a corrosion technique used to characterize the behavior of different alloys and determine the type of corrosion in a system. The objective of this research is to characterize different aeronautical alloys (Al, Ti, steels, and superalloys) by the EN technique using different analysis methods related to the corrosion process: time domain (visual analysis, statistics), frequency domain (power spectral density (PSD)), and time-frequency domain (wavelet decomposition, Hilbert-Huang transform (HHT), and recurrence plots (RP)). Optical microscopy (OM) was used to observe the surface of the tested samples. The alloys were exposed to 3.5 wt.% NaCl and H2SO4 solutions at room temperature. The results indicate that HHT and recurrence plots are the best options for determining the corrosion type compared with the other methods, owing to their ability to analyze dynamic and chaotic systems such as corrosion. Corrosion processes such as passivation and localized corrosion can be differentiated with the HHT and RP methods, a passive system presenting determinism values between 0.5 and 0.8. However, because of the similarity of the determinism values, the recurrence plot itself must be inspected to distinguish a passive system from a localized one. Noise impedance (Zn) is one of the best options for determining the corrosion kinetics of a system: Ti CP2 and Ti-6Al-4V presented 742,824 and 939,575 Ω·cm2, while Rn gave 271,851 and 325,751 Ω·cm2, the highest values when exposed to H2SO4.
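The time-domain statistics mentioned above reduce to simple formulas: noise resistance Rn is the ratio of the standard deviations of the potential and current records, and a localization index compares the current-noise standard deviation to its RMS value. The Gaussian records below are synthetic placeholders for measured EN time series, and the exact definitions vary somewhat across the EN literature.

```python
import random
import statistics

random.seed(1)
# Synthetic stand-ins for measured EN records (1 Hz sampling): potential
# noise in volts and current noise in amperes, zero-mean Gaussian here.
E = [random.gauss(0.0, 2e-4) for _ in range(1024)]
I = [random.gauss(0.0, 5e-9) for _ in range(1024)]

# Noise resistance Rn = sigma_E / sigma_I (multiply by the electrode area
# in cm^2 to report ohm*cm^2, as in the abstract).
Rn = statistics.pstdev(E) / statistics.pstdev(I)

# Localization index: current-noise standard deviation over its RMS value.
i_rms = (sum(x * x for x in I) / len(I)) ** 0.5
LI = statistics.pstdev(I) / i_rms

print(f"Rn = {Rn:.3e} ohm, LI = {LI:.3f}")
```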
ABSTRACT
Low-cost sensors integrated with the Internet of Things can enable real-time environmental monitoring networks and provide valuable water quality information to the public. However, the accuracy and precision of the values measured by the sensors are critical for widespread adoption. In this study, 19 different low-cost sensors commonly found in the literature, from four different manufacturers, were tested for measuring five water quality parameters: pH, dissolved oxygen, oxidation-reduction potential, turbidity, and temperature. The low-cost sensors were evaluated for each parameter by calculating the error and precision relative to a typical multiparameter probe taken as reference. The comparison was performed in a controlled environment with simultaneous measurements of real water samples. The relative error ranged from -0.33% to 33.77%, and most values were ≤ 5%. pH and temperature gave the most accurate results. In conclusion, low-cost sensors are a complementary alternative for quickly detecting changes in water quality parameters. Further studies are necessary to establish a guideline for the operation and maintenance of low-cost sensors.
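The two evaluation metrics reduce to elementary formulas: relative error against the reference probe, and precision as the spread of repeated readings. The pH readings below are hypothetical, not the study's measurements.

```python
import statistics

def relative_error(sensor, reference):
    """Percent relative error of a sensor reading against a reference probe."""
    return 100.0 * (sensor - reference) / reference

# Hypothetical paired pH readings: (low-cost sensor, reference probe).
pairs = [(7.12, 7.05), (6.98, 7.01), (7.30, 7.22)]
errors = [relative_error(s, r) for s, r in pairs]

# Precision: spread of repeated readings of the same sample by one sensor.
repeats = [7.12, 7.10, 7.13, 7.11]
precision = statistics.stdev(repeats)

print([f"{e:+.2f}%" for e in errors], f"precision = {precision:.3f}")
```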
Subject(s)
Environmental Monitoring , Water Quality , Environmental Monitoring/methods , Environmental Monitoring/instrumentation , Hydrogen-Ion Concentration , Temperature , Water Pollutants, Chemical/analysis , Oxygen/analysis
ABSTRACT
Pragmatic trials aim to assess intervention efficacy in usual patient care settings, contrasting with explanatory trials conducted under controlled conditions. In aging research, pragmatic trials are important designs for obtaining real-world evidence in elderly populations, which are often underrepresented in trials. In this review, we discuss statistical considerations from a frequentist approach for the design and analysis of pragmatic trials. When choosing the dependent variable, it is essential to use an outcome that is highly relevant to usual medical care while also providing sufficient statistical power. Besides traditionally used binary outcomes, ordinal outcomes can provide pragmatic answers with gains in statistical power. Cluster randomization requires careful consideration of sample size calculation and analysis methods, especially regarding missing data and outcome variables. Mixed effects models and generalized estimating equations (GEEs) are recommended for analysis to account for center effects, with tools available for sample size estimation. Multi-arm studies pose challenges in sample size calculation, requiring adjustment for design effects and consideration of multiple comparison correction methods. Secondary analyses are common but require caution due to the risk of reduced statistical power and false-discovery rates. Safety data collection methods should balance pragmatism and data quality. Overall, understanding statistical considerations is crucial for designing rigorous pragmatic trials that evaluate interventions in elderly populations under real-world conditions. In conclusion, this review focuses on various statistical topics of interest to those designing a pragmatic clinical trial, with consideration of aspects of relevance in the aging research field.
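The cluster-randomization sample-size adjustment discussed above is commonly handled through the design effect DE = 1 + (m - 1)·ICC, which inflates the sample size computed for individual randomization. The numbers below are assumed purely for illustration.

```python
import math

def cluster_sample_size(n_individual, cluster_size, icc):
    """Inflate an individually randomized sample size by the design effect
    DE = 1 + (m - 1) * ICC used for cluster-randomized trials."""
    design_effect = 1 + (cluster_size - 1) * icc
    return math.ceil(n_individual * design_effect)

# Assumed illustrative numbers: 400 patients needed under individual
# randomization, 17 patients per center, intracluster correlation 0.0625.
print(cluster_sample_size(400, 17, 0.0625))  # -> 800
```

With an ICC of zero (no center effect) the design effect is 1 and the individually randomized sample size is recovered unchanged.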
ABSTRACT
Meat products are known for a lipid profile rich in saturated fat and cholesterol and for the formation of oxidation compounds; therefore, a reduction in animal fat may yield a product less harmful to health. Pijuayo is an Amazonian fruit known for its nutritional properties, such as its fiber and lipid content. For these reasons, it is an attractive fruit for replacing animal fat in meat products. The present work used pijuayo pulp and peel flours to partially replace animal fat in beef-based burgers at 25% and 50% levels, considering sensory and physicochemical outcomes evaluated by Principal Component Analysis (PCA), Correspondence Analysis (CA), and Multiple Factor Analysis (MFA). Pijuayo flour affected the physicochemical characteristics evaluated by PCA, where the samples with greater fat replacement were characterized by high carbohydrate content and instrumental yellowness. In the PCA, the lower level of fat replacement did not markedly affect instrumental texture and color, proximate composition, yield properties, or lipid oxidation. Overall liking was greater for the 25% fat-replacement treatments, even greater than the control, and positive sensory attributes for liking were highlighted for those treatments. A small segment of consumers (11% of the total) preferred the treatment with greater replacement of fat with pijuayo peel flour, which these consumers tended to characterize as seasoned; however, this treatment had the lowest liking. The MFA showed that the sensory attributes "tender" and "tasty" were strongly correlated with overall liking and were highlighted in the 25% fat-replacement samples, suggesting that pijuayo improves the tenderness and flavor of reduced-fat burgers. Inclusion levels between 25% and 50% fat replacement could be explored, and optimization studies are needed.
In addition, the sensory characteristics and flavor-enhancing compounds of the fruit, as well as nutritional aspects of pijuayo inclusion such as the fatty acid profile, should be studied. These characteristics will inform the exploration of pijuayo as a fat replacer at pilot and industrial scales.
ABSTRACT
Hydraulic losses are a crucial variable in hydroelectric ventures, as they can cause significant reductions in power generation. This article analyses the impact of hydraulic losses at the Retiro Small Hydroelectric Power Plant (SHPP), investigating their effect on monthly average power and, consequently, on the efficiency of electricity generation. The study examines the historical series of head losses at the Retiro SHPP from 1990 to 2022, with calculations based on flow data available in HidroWeb. The maximum and minimum flow rates in the historical series were taken as constraints for hydropower generation. We used a multivariable function relating turbine flow rate and net water head to calculate the efficiency of the hydraulic turbine, and we developed mathematical relationships for the head losses occurring at the intake screen (grid), at the water intake, and due to friction from the intake to the turbine of the Retiro SHPP. The article compares the actual monthly average power of the SHPP with the simulated monthly average power. Turbine flow and head loss data were normalized for the statistical analysis, and kernel density estimation was applied to characterize the shape of the data distribution. Findings show that average monthly power is lowest in September, at approximately 0.81 MW, and highest in March, at approximately 14.19 MW. During the high-flow period, the simulated average power, which accounts for head losses, closely matched the actual average power generated at the Retiro SHPP. Although July to October is the period with the lowest head losses, it is also the time with the greatest opportunity to maximize energy generation at the plant. An inefficient generation system experiences significant head losses during specific periods of the year.
To minimize this effect, it is crucial to understand the behavior of hydraulic losses and to consider implementing mitigating measures.
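The power simulation described above rests on the standard hydropower relation P = ρ·g·Q·(H_gross - h_loss)·η. The sketch below uses that relation with purely illustrative inputs, not Retiro SHPP data.

```python
RHO, G = 1000.0, 9.81  # water density (kg/m^3), gravitational acceleration (m/s^2)

def power_mw(flow, gross_head, head_loss, efficiency):
    """Hydropower output in MW: P = rho * g * Q * (H_gross - h_loss) * eta."""
    return RHO * G * flow * (gross_head - head_loss) * efficiency / 1e6

# Illustrative inputs (not Retiro SHPP data): 12 m^3/s turbine flow,
# 140 m gross head, 4 m total head loss, 90% turbine efficiency.
print(f"{power_mw(12, 140, 4, 0.90):.2f} MW")
```

The formula makes the sensitivity explicit: every metre of head loss subtracts directly from the net head, so output falls linearly with h_loss at a given flow.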
ABSTRACT
OBJECTIVES: This study aimed to determine the prevalence of sarcopenia and its associations with sociodemographic, clinical, and psychological factors in community-dwelling older adults. STUDY DESIGN: A cross-sectional study was conducted on a probabilistic cluster sample of community-dwelling individuals aged 65 years or older. METHODS: Sarcopenia was defined according to the criteria of the European Working Group on Sarcopenia in Older People (EWGSOP2). Body composition was assessed using dual-energy X-ray absorptiometry (DXA). Associations were analyzed using networks based on mixed graphical models. Predictability indices of the estimated networks were assessed using the proportion of explained variance for numerical variables and the proportion of correct classification for categorical variables. RESULTS: The study included 278 participants, the majority female (61%). The prevalence of sarcopenia was 39.57%. Among those with sarcopenia, 67% were women and 33% were men. In the network model, age, race, education, family income, bone mass, depression, cardiovascular disease, diabetes, total cholesterol levels, and rheumatism were associated with sarcopenia. The covariates demonstrated high accuracy (62.9%) in predicting sarcopenia categories. CONCLUSION: The prevalence of sarcopenia was high, especially in women. Network analysis proved useful for visualizing complex relationships between sociodemographic and clinical factors and sarcopenia. The results support early screening for sarcopenia to enable appropriate treatment of this common geriatric syndrome in older adults in Brazil.
Subject(s)
Absorptiometry, Photon , Sarcopenia , Humans , Sarcopenia/epidemiology , Sarcopenia/diagnosis , Female , Male , Aged , Brazil/epidemiology , Cross-Sectional Studies , Prevalence , Aged, 80 and over , Body Composition , Independent Living/statistics & numerical data , Risk Factors , Geriatric Assessment/methods , Geriatric Assessment/statistics & numerical data
ABSTRACT
In the coffee industry, natural coffee extracts with differentiated attributes are desirable to drive new product development. This study evaluates the impact of ultrafiltration membrane processing on the sensory, metabolic, and physicochemical attributes of four commercially available coffee extracts: cold brew, lightly roasted, freeze concentrated, and evaporated standard. The sensory analysis revealed an increase in acidity in the permeate across all extracts, with the most pronounced profile changes observed in the lightly roasted and evaporated standard extracts, accompanied by an enhancement of fruity and floral attributes. Furthermore, the permeate showed reduced total dissolved solids, while the caffeine concentration increased. Metabolomic analysis highlighted key coffee-related metabolites, such as cinnamic and coumaric acids, whose passage through the membrane explains the observed variations. Our findings emphasize the potential of the permeate as a coffee-based ingredient for ready-to-drink product development, providing a unique coffee experience with organoleptic profiles distinct from traditional beverages.
Subject(s)
Coffea , Coffee , Plant Extracts , Taste , Ultrafiltration , Plant Extracts/chemistry , Coffee/chemistry , Coffea/chemistry , Humans , Food Handling , Caffeine/analysis , Caffeine/metabolism
ABSTRACT
Introduction: Implicative statistical analysis is a method based on multivariate statistical techniques, quasi-implication theory, artificial intelligence, and Boolean algebra. It is used to model interrelationships between subjects and variables that allow knowledge to be structured in the form of generalized norms and rules. Objective: To characterize implicative statistical analysis as a tool for processing statistical information in the health sciences. Methods: A search of bibliographic sources was carried out to characterize the method and its use in prognostic factors and profiles of visual functional organization in pathologies, extrapolable to different sample sizes. Development: Implicative statistical analysis organizes information, favors appropriate statistical treatment in data analysis, and allows the results to be graphed. Likewise, the rules obtained lead to hypotheses of causality without restricting the number of variables or the sample size. Its use has contributed to studies of prognostic factors in pathologies such as cancer and of visual-processing profiles in dyslexics. Conclusions: Implicative statistical analysis generates hypotheses of causality through methodological rules relating the study variables. In addition, it makes it possible to structure, analyze, and understand links between subjects and variables in health research.
ABSTRACT
This article presents a study on the tensile properties of knitted fabrics commonly employed in polymeric matrix textile composites. The key mechanical parameters investigated are stress (Pa), strain, Young's modulus (Pa), and work of rupture (J). The knitted fabrics were developed using the Cixing Knitting System software and subsequently manufactured on an electronic double-jersey flat knitting machine. The primary objective of this research was to explore the impact of various factors on the mechanical behavior of these knitted fabrics: wale and course directions, float stitch density, loop length (cm), and the type of synthetic knitting yarn used (100% polyester and 100% polyamide), along with different combinations of knitting yarns (100% cotton and 67% polyester/33% cotton hybrid). The ASTM D5034 standard was adopted, and Response Surface Methodology (RSM) and Analysis of Variance (ANOVA) were employed to evaluate the mechanical performance of these fabric structures. The findings revealed that the statistical adjustment of the data set for stress, strain, Young's modulus, and work of rupture significantly reduced the standard deviations of the mechanical responses. This is particularly significant given the frequent use of these knitted fabric structures as reinforcement in textile-reinforced composite materials. Overall, this study sheds light on the mechanical behavior of knitted fabric structures used in polymeric matrix composites, providing valuable insights for the design and optimization of advanced textile-based materials.
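The basic tensile quantities named above follow from single-point relations: engineering stress is force over cross-sectional area, strain is elongation over gauge length, and a secant modulus is their ratio. The specimen dimensions and load below are hypothetical, not the study's measurements.

```python
def tensile_point(force, area, elongation, gauge_length):
    """Engineering stress (Pa), strain (dimensionless), and the secant
    Young's modulus (Pa) from a single tensile test data point."""
    stress = force / area          # Pa
    strain = elongation / gauge_length
    return stress, strain, stress / strain

# Hypothetical specimen (not the study's data): 150 N on a 50 mm x 0.6 mm
# fabric cross-section, stretched 12 mm over a 100 mm gauge length.
stress, strain, modulus = tensile_point(150, 0.050 * 0.0006, 0.012, 0.100)
print(f"stress {stress:.3g} Pa, strain {strain:.3g}, E {modulus:.3g} Pa")
```

Work of rupture, the remaining parameter, would be obtained by numerically integrating the full force-elongation curve rather than from a single point.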
ABSTRACT
Atmospheric particulate matter (PM) is a pollutant with diverse origins and varying chemical compositions that undergoes several molecular transformations in the atmosphere. In this study, PM samples (PM2.5, PM10, and TSP) were collected in five Brazilian cities (Camboriú-SC, Catalão-GO, Florianópolis-SC, Limeira-SP, and Novo Hamburgo-RS) during the four seasons of the year. Analysis of Variance (ANOVA) was used to evaluate the differences in PM concentration between cities and seasons. Average PM10 concentrations were higher in Limeira than in the other cities (ANOVA p-values and Tukey's test). Moreover, Tukey's test demonstrated differences between the average PM10 concentrations in summer and winter and, for TSP and PM2.5, between winter and the warm seasons (spring and summer). Polar compounds from the samples collected in the summer (February) and winter (August) periods were also analyzed by ultra-high-performance liquid chromatography coupled to a quadrupole time-of-flight mass spectrometer following a non-targeted approach and annotated. This is the first study to carry out this type of analysis in these five Brazilian cities. Despite the differences in PM concentrations, the profiles of polar organic compounds showed similarities between samples and, in general, the same compounds were present, albeit with different intensities. The annotated compounds are associated with vehicle emissions and plastics, which are considered major global air pollutants. There is therefore an urgent need for comprehensive studies investigating the non-targeted compounds present in the atmosphere. Such research can provide invaluable insights to policymakers, enabling them to formulate effective guidelines and policies to mitigate particulate matter concentrations and enhance overall air quality.
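The one-way ANOVA used for the city and season comparisons can be sketched from first principles as the ratio of the between-group to the within-group mean squares. The PM10 values below are invented for illustration; a real analysis would follow the F test with post-hoc pairwise comparisons such as Tukey's test.

```python
import statistics

def one_way_anova_f(groups):
    """F statistic of a one-way ANOVA: between-group mean square over
    within-group mean square."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (statistics.mean(g) - grand_mean) ** 2
                     for g in groups)
    ss_within = sum(sum((x - statistics.mean(g)) ** 2 for x in g)
                    for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Invented PM10 concentrations (ug/m^3) for three cities, four samples each:
f_stat = one_way_anova_f([[28, 31, 35, 30], [18, 21, 19, 22], [20, 24, 23, 21]])
print(f"F = {f_stat:.2f}")
```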
Subject(s)
Air Pollutants , Air Pollution , Particulate Matter/analysis , Air Pollutants/analysis , Brazil , Environmental Monitoring/methods , Air Pollution/analysis , Cities , Seasons , China
ABSTRACT
Water-soluble polymers provide an alternative to the organic solvents required in membrane manufacture, in line with Green Chemistry principles. Poly(vinyl alcohol) (PVA) is a biodegradable, non-toxic polymer renowned for its solubility in water. However, PVA is little explored in membrane processes because its hydrophilicity reduces its stability and performance. Crosslinking through an esterification reaction with carboxylic acids can address this concern. Accordingly, experimental design methodology and statistical analysis were employed to find the optimal conditions for crosslinking PVA with citric acid, targeting the best permeate production and sodium diclofenac (DCF) removal from water. The membranes were produced following an experimental design and characterized using multiple techniques to understand the effect of crosslinking on membrane performance. Characterization and filtration results demonstrated that crosslinking regulates the membranes' properties, and the optimized conditions (crosslinking at 110 °C for 110 min) produced a membrane able to remove 44% of DCF from water with a permeate production of 2.2 L m-2 h-1 at 3 bar, comparable to commercial loose nanofiltration membranes. This study contributes to a deeper knowledge of green membranes, helping to make water treatment a sustainable practice in the near future.
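The two headline performance figures, permeate production and DCF removal, come from the standard flux and observed-rejection definitions. The operating values below are made up, chosen only to reproduce numbers of the same order as those reported.

```python
def permeate_flux(volume_l, area_m2, hours):
    """Permeate flux in L m^-2 h^-1: collected volume over area and time."""
    return volume_l / (area_m2 * hours)

def rejection(feed_conc, permeate_conc):
    """Observed solute rejection (%) from feed and permeate concentrations."""
    return 100.0 * (1 - permeate_conc / feed_conc)

# Made-up operating values: 0.066 L collected over a 0.01 m^2 membrane in
# 3 h, with DCF at 10 mg/L in the feed and 5.6 mg/L in the permeate.
print(f"{permeate_flux(0.066, 0.01, 3):.1f} L m^-2 h^-1")
print(f"{rejection(10, 5.6):.0f} % DCF removal")
```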
ABSTRACT
Abstract: Objective: When two independent groups of expert judges assess content validity, a formal test of the differences between their judgments is required, since different content validity judgments can be obtained. Generally, however, content validity research does not examine this possible source of discrepancies. This report describes the implementation of a method for evaluating the difference between Aiken's V coefficients, applied to research work in sports science. Methodology: The procedure applies an adaptation to construct the confidence interval of the difference between Aiken's V coefficients and also implements a standardized estimator of the size of that difference, specifically the arcsine transformation of the V coefficients. Results: Two examples are developed within a secondary data analysis framework, extracting data from two publications, and the difference between an impression-based conclusion and an empirically based conclusion with formal evaluation is demonstrated. Statistical differences not previously observed were detected. Conclusions and implications: The method for estimating differences in Aiken's content validity coefficients represents an advance in the methodology for validating measurement instruments. The applicability of this procedure in sports science and education science, as well as in research design, is assessed.
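Aiken's V and an arcsine-based difference estimator of the kind described above can be sketched directly from their definitions: V is the sum of rating distances from the lowest category over the maximum possible sum, and the transformed difference is analogous to Cohen's h for proportions. The judge ratings below are hypothetical, and the full method additionally constructs a confidence interval around the difference.

```python
import math

def aiken_v(ratings, lo=1, hi=5):
    """Aiken's V for one item: sum of (rating - lowest category) divided by
    n * (number of scale steps); ranges from 0 (no validity) to 1."""
    span = hi - lo
    return sum(r - lo for r in ratings) / (len(ratings) * span)

def arcsine_diff(v1, v2):
    """Standardized size of the difference between two V coefficients via
    the arcsine transformation (analogous to Cohen's h for proportions)."""
    return 2 * math.asin(math.sqrt(v1)) - 2 * math.asin(math.sqrt(v2))

# Hypothetical relevance ratings (1-5 scale) from two independent panels:
panel_a = [5, 4, 5, 5, 4, 5]
panel_b = [4, 3, 4, 4, 3, 4]
va, vb = aiken_v(panel_a), aiken_v(panel_b)
print(f"V_a = {va:.3f}, V_b = {vb:.3f}, h = {arcsine_diff(va, vb):.3f}")
```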
Subject(s)
Psychological Tests , Sports , Confidence Intervals , Psychometrics , Data Interpretation, Statistical , Validation Study
ABSTRACT
In this article, we propose a comparative study between two models that can be used by researchers for the analysis of survival data: (i) the Weibull regression model and (ii) the random survival forest (RSF) model. The models are compared in terms of error rate, performance measured by the Harrell C-index, and the identification of the variables relevant for survival prediction. A statistical analysis of a data set from the Heart Institute of the University of São Paulo, Brazil, was carried out, using the length of stay in the operating room of patients undergoing cardiac surgery as the response variable. The results show that the RSF model has lower error rates for the training and testing data sets (23.55% and 20.31%, respectively) than the Weibull model (23.82%). Regarding the Harrell C-index, we obtained values of 0.76 and 0.79 for the RSF model on the training and testing sets, and 0.76 for the Weibull model. After the selection procedure, the Weibull model retains the type of protocol and type of patient as variables statistically significant at the 5% level, while the RSF model selects age, type of patient, and type of protocol as relevant variables for prediction. We employed the randomForestSRC package in R for the data analysis and computational experiments. The proposed approach has many applications in biology and medicine, which are discussed in the conclusions of this work.
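Harrell's C-index used above is a pair-counting statistic: among usable pairs, it is the fraction in which the subject with the shorter survival time has the higher predicted risk. The sketch below uses a simplified rule (pairs with tied times are skipped) and invented data, not the study's data set.

```python
from itertools import combinations

def harrell_c(times, events, risk):
    """Harrell's C: among usable pairs (the earlier time is an observed
    event, no tie in time), the fraction where the subject with the shorter
    time has the higher predicted risk; ties in risk count as 0.5."""
    concordant, usable = 0.0, 0
    for i, j in combinations(range(len(times)), 2):
        if times[j] < times[i]:
            i, j = j, i  # make i the subject with the earlier time
        if times[i] == times[j] or not events[i]:
            continue  # unusable pair: tied times, or earlier time censored
        usable += 1
        if risk[i] > risk[j]:
            concordant += 1.0
        elif risk[i] == risk[j]:
            concordant += 0.5
    return concordant / usable

# Invented data: length of stay (h), event indicator (1 = observed,
# 0 = censored), and a model's predicted risk score per patient.
c = harrell_c([2, 5, 7, 9], [1, 1, 0, 1], [0.9, 0.6, 0.7, 0.2])
print(f"C-index = {c:.2f}")
```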
ABSTRACT
Chronic kidney disease (CKD) is a serious public health issue affecting thousands of people worldwide. CKD diagnosis is usually based on the estimated glomerular filtration rate (eGFR) and albuminuria, which limits knowledge of the mechanisms behind CKD progression. The aim of the present study was to identify changes in the metabolomic profile that occur as CKD advances. To this end, 77 plasma samples from patients with CKD were evaluated by 1D and 2D nuclear magnetic resonance (NMR) spectroscopy. The NMR data showed significant differences between the metabolomic profiles of CKD patients and the control group. Principal component analysis (PCA) clustered the patients into three distinct groups: control, stage 1 (G1) to stage 4 (G4), and stage 5 (G5). Lactate, glucose, acetate, and creatinine discriminated the control group from all the CKD stages. Valine, alanine, glucose, creatinine, glutamate, and lactate were responsible for the clustering of the G1-G4 stages. G5 was discriminated by calcium ethylenediaminetetraacetic acid, magnesium ethylenediaminetetraacetic acid, creatinine, betaine/choline/trimethylamine N-oxide (TMAO), lactate, and acetate. The CKD G5 plasma pool was submitted to pathway analysis on the MetaboAnalyst 4.0 platform (MetPA), which indicated 13 metabolic pathways involved in CKD pathophysiology. Metabolic changes associated with glycolysis and gluconeogenesis allowed discrimination between CKD and control patients. The identification of molecules involved in TMAO generation in G5 suggests an important role for this uremic toxin, which is linked to CKD and cardiovascular disease. These results support the feasibility of metabolic assessment of CKD by NMR during treatment and disease progression.
Subject(s)
Renal Insufficiency, Chronic , Humans , Proton Magnetic Resonance Spectroscopy , Creatinine , Renal Insufficiency, Chronic/diagnosis , Magnetic Resonance Spectroscopy , Lactates , Ethylenediamines
ABSTRACT
Background: implicative statistical analysis arose in the 1980s to solve problems in the didactics of mathematics. Its use in the medical sciences to identify risk factors and prognoses has recently been established. Objective: to evaluate the usefulness of implicative statistical analysis in identifying the prognostic factors that most affect mortality from lymphomas in children and adolescents. Method: a case-control study was carried out in children and adolescents diagnosed with Hodgkin and non-Hodgkin lymphoma treated at the Dr. Antonio María Béguez César Sur Pediatric Teaching Hospital in Santiago de Cuba from January 2008 to January 2021. The patient's status (deceased or alive) at the time of the study was analyzed as the dependent variable, and the following were taken as covariates: poor-prognosis stage, presence of B symptoms, histological subtype, presence of three or more extranodal sites, metastasis, age, and presence of tumor mass. Two statistical techniques were applied: binary logistic regression and implicative statistical analysis. Results: non-Hodgkin's lymphoma was more frequent in the cases, while Hodgkin's lymphoma predominated in the controls. Both techniques recognized the histological subtype and extranodal involvement as unfavorable prognostic factors. The implicative statistical analysis also recognized the stage and the presence of metastases. Conclusion: implicative statistical analysis is a technique that complements binary logistic regression in the identification of prognostic factors, allowing a better understanding of causality.
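The binary logistic regression used as the reference technique in this study can be sketched as follows. This is a minimal NumPy implementation fitted by gradient descent on simulated data; the single covariate and outcome here are hypothetical stand-ins for the study's clinical variables, not its data:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, n_iter=2000):
    """Binary logistic regression by gradient descent.
    X: (n, p) covariate matrix (e.g. stage, B symptoms, histological subtype);
    y: (n,) outcomes coded 1 = deceased, 0 = alive."""
    n, p = X.shape
    Xb = np.hstack([np.ones((n, 1)), X])       # add intercept column
    w = np.zeros(p + 1)
    for _ in range(n_iter):
        prob = 1.0 / (1.0 + np.exp(-Xb @ w))   # predicted P(y = 1)
        w -= lr * Xb.T @ (prob - y) / n        # average log-loss gradient
    return w  # log-odds coefficients; np.exp(w[1:]) gives odds ratios

# Toy illustration: one simulated binary risk factor raising mortality odds
rng = np.random.default_rng(1)
x = rng.integers(0, 2, size=200).astype(float)
y = (rng.random(200) < np.where(x == 1, 0.7, 0.3)).astype(float)
w = fit_logistic(x[:, None], y)
```

A positive fitted coefficient (odds ratio above 1) is what flags a covariate as an unfavorable prognostic factor; implicative statistical analysis complements this by quantifying quasi-implications between factor and outcome rather than adjusted odds.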
ABSTRACT
The degradation of biopolymers such as polylactic acid (PLA) has been studied for several years; however, the mechanism of degradation is not yet completely understood. PLA is easily processed by traditional techniques including injection molding, blow molding, extrusion, and thermoforming; in this research, the extrusion and injection molding processes were used to produce PLA samples for accelerated destructive testing. The methodology consisted of carrying out material testing under the guidelines of several ASTM standards; this research hypothesized that UV light, humidity, and temperature exposure produce a statistically significant difference in the PLA degradation rate. Multivariate analysis of non-parametric data is presented as an alternative for cases in which the data do not satisfy the essential assumptions of a regular MANOVA, such as multivariate normality. A package in the R software that allows the user to perform a non-parametric multivariate analysis when necessary was used. This paper presents a study to determine whether there is a significant difference in the degradation rate after 2000 h of accelerated degradation of a biopolymer, using multivariate and non-parametric analyses of variance. The combination of the statistical techniques, multivariate analysis of variance and repeated measures, provided information for a better understanding of the degradation path of the biopolymer.
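The idea behind a non-parametric multivariate comparison can be illustrated with a permutation test. This is only a sketch: the study used a dedicated R package, whereas the code below implements a simple permutation statistic in Python/NumPy on synthetic data, avoiding the multivariate-normality assumption by building the null distribution from label shuffles:

```python
import numpy as np

def permutation_manova(X, groups, n_perm=999, seed=0):
    """Permutation test of group differences on multivariate responses.
    Statistic: between-group sum of squared deviations of group mean
    vectors from the grand mean, weighted by group size."""
    rng = np.random.default_rng(seed)
    groups = np.asarray(groups)

    def stat(g):
        grand = X.mean(axis=0)
        return sum(
            (g == lvl).sum() * np.sum((X[g == lvl].mean(axis=0) - grand) ** 2)
            for lvl in np.unique(g)
        )

    observed = stat(groups)
    count = sum(stat(rng.permutation(groups)) >= observed for _ in range(n_perm))
    return observed, (count + 1) / (n_perm + 1)   # permutation p-value

# Hypothetical data: two exposure conditions, three measured properties each
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0.0, 1.0, (20, 3)), rng.normal(1.5, 1.0, (20, 3))])
observed, p = permutation_manova(X, [0] * 20 + [1] * 20)
```

A small p-value indicates the exposure groups differ jointly across the measured properties, with no distributional assumption beyond exchangeability under the null.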
ABSTRACT
Prokopidis et al have conducted a meta-analysis of randomized, placebo-controlled clinical trials to assess the effects of oral creatine supplementation on memory performance of healthy individuals. However, concerns were raised regarding the validity of their statistical analyses, which may have led to misleading conclusions. In this letter, we describe the statistical issue at hand and its potential implications.
Subject(s)
Creatine , Dietary Supplements , Humans , Creatine/pharmacology , Randomized Controlled Trials as Topic , Research Design , Health Status
ABSTRACT
The use of non-conventional carbon sources for biosurfactant-producing microorganisms is a promising alternative in fermentation to substitute costly substrates. So, the current research used pineapple peel as a cost-effective and renewable substrate because of its rich composition in minerals and sugars and high availability. Following a 2² full factorial design, a bacterial strain of Bacillus subtilis produced biosurfactants in fermentative media containing different concentrations of glucose and concentrated pineapple peel juice (CPPJ). The influence of these two independent variables was evaluated according to three different responses: surface tension reduction rate (STRR), emulsification index (EI24), and concentration of semi-purified biosurfactant (SPB). The maximum value for STRR (57.63%) was obtained in media containing 0.58% glucose (w/v) and 5.82% CPPJ (v/v), while the highest EI24 response (58.60%) was observed at 2% glucose (w/v) and 20% CPPJ (v/v) and maximum SPB (1.28 g/L) at 3.42% glucose (w/v) and 34.18% CPPJ (v/v). Statistical analysis indicated that the CPPJ variable mostly influenced the STRR and SPB responses, whereas the EI24 was significantly influenced by pineapple peel juice and glucose contents.
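In a 2² full factorial design like the one above, each factor's main effect is estimated as the mean response at its high level minus the mean response at its low level. The sketch below uses coded levels and illustrative response values, not the study's data:

```python
import numpy as np

# Coded design matrix for a 2^2 full factorial: glucose and CPPJ at
# low (-1) and high (+1) levels, with a hypothetical response column
# (e.g. surface tension reduction rate, %). Values are illustrative only.
design = np.array([
    # glucose, CPPJ, response
    [-1, -1, 40.0],
    [+1, -1, 45.0],
    [-1, +1, 55.0],
    [+1, +1, 58.0],
])

def main_effect(design, col):
    """Average response at the factor's high level minus at its low level."""
    y = design[:, -1]
    x = design[:, col]
    return y[x > 0].mean() - y[x < 0].mean()

glucose_effect = main_effect(design, 0)
cppj_effect = main_effect(design, 1)
```

With these illustrative numbers the CPPJ effect dominates the glucose effect, mirroring the study's finding that CPPJ concentration most influenced the STRR and SPB responses.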
Subject(s)
Ananas , Research Design , Surface-Active Agents/chemistry , Bacillus subtilis , Glucose , Surface Tension
ABSTRACT
BACKGROUND: Cardiac amyloidosis (CA) is an under-diagnosed disease presenting as a restrictive cardiomyopathy with high morbidity and mortality. Wild-type transthyretin amyloid cardiomyopathy (ATTR-CM) is mostly seen in elderly patients, with increasing prevalence as life expectancy is growing. New diagnostic imaging techniques and treatments allow for a better prognosis, but lack of clinical awareness delays timely diagnosis and appropriate management. Our purpose was to investigate the knowledge of clinicians regarding ATTR-CM and to assess the availability of imaging resources in the Latin-American region. METHODS AND RESULTS: Two online surveys were distributed among clinicians and nuclear medicine professionals, respectively: one asking about awareness of CA in different clinical scenarios, and the other about the availability of diagnostic resources and studies performed. 406 responses were received for the first survey and 82 for the second, representing 17 and 14 countries, respectively. A significant lack of awareness was identified among clinicians, although appropriate diagnostic resources are generally available. Survey data showed that very few patients are evaluated for ATTR-CM in most Latin-American countries. CONCLUSIONS: The surveys demonstrated the need for educational programs and other measures to increase clinical awareness and early detection of CA, so patients receive timely treatment and management of the disease.