Results 1 - 20 of 651
1.
Cytotherapy ; 26(4): 383-392, 2024 04.
Article in English | MEDLINE | ID: mdl-38349312

ABSTRACT

BACKGROUND AIMS: The appearance of genetically variant populations in human pluripotent stem cell (hPSC) cultures represents a concern for research and clinical applications. Genetic variations may alter hPSC differentiation potential or cause phenotype variation in differentiated cells. Further, variants may have properties, such as proliferative rate or response to the culture environment, that differ from those of wild-type cells. As such, understanding the behavior of these variants in culture, and any potential operational impact on manufacturing processes, will be necessary to control the quality of putative hPSC-based products that include a variant-proportion threshold in their quality specification. METHODS: Here we present a computational model that mathematically describes the growth dynamics between commonly occurring genetically variant hPSCs and their wild-type counterparts in culture. RESULTS: We show that our model is capable of representing the growth behaviors of both wild-type and variant hPSCs in individual and co-culture systems. CONCLUSIONS: This representation allows us to identify three critical process parameters that drive critical quality attributes when genetically variant cells are present within the system: total culture density, the proportion of variant cells within the culture system, and variant cell overgrowth. Lastly, we use our model to predict how the variability of these parameters affects the prevalence of both populations in culture.
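As an illustration of the kind of growth-dynamics model this abstract describes, the sketch below couples wild-type and variant hPSC populations through a shared carrying capacity; the growth rates, capacity, and seeding fractions are illustrative assumptions, not values from the paper.

```python
# Illustrative two-population co-culture growth sketch (not the authors' published model).
# Wild-type (W) and variant (V) cells share a culture carrying capacity K;
# growth rates mu_w < mu_v mimic a faster-proliferating variant.
import numpy as np
from scipy.integrate import solve_ivp

mu_w, mu_v = 0.03, 0.045   # 1/h, assumed specific growth rates
K = 1.0e6                  # cells, assumed carrying capacity of the vessel

def coculture(t, y):
    w, v = y
    crowding = 1.0 - (w + v) / K          # shared density limitation
    return [mu_w * w * crowding, mu_v * v * crowding]

y0 = [9.5e4, 5.0e3]                       # 5% variant at seeding (assumption)
sol = solve_ivp(coculture, (0.0, 120.0), y0, dense_output=True)

t = np.linspace(0.0, 120.0, 7)
w, v = sol.sol(t)
for ti, wi, vi in zip(t, w, v):
    print(f"t={ti:5.1f} h  total={wi+vi:9.0f}  variant fraction={vi/(wi+vi):.3f}")
```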


Subjects
Cell Culture Techniques, Pluripotent Stem Cells, Humans, Cell Differentiation/genetics, Coculture Techniques
2.
Biotechnol Bioeng ; 121(1): 53-70, 2024 01.
Article in English | MEDLINE | ID: mdl-37691172

ABSTRACT

Recombinant adeno-associated virus (rAAV) is rapidly emerging as the preferred delivery vehicle for gene therapies, with promising advantages in safety and efficacy. Key challenges in systemic in vivo rAAV gene therapy applications are the gap between production capabilities and potential market demand and the complexity of the production process. This review summarizes currently available information on rAAV upstream manufacturing processes and proposed optimizations for production. Advancements in rAAV production media are reviewed, with proposals to speed up cell culture process development. Furthermore, the major methods for delivering genetic elements to host cells are summarized, together with their advantages, limitations, and future directions for optimization. In addition, culture vessel selection criteria are listed based on the production cell system, scale, and development stage. Process control at the production step is also outlined, with an in-depth discussion of production kinetics and quality control.


Subjects
Dependovirus, Genetic Vectors, Genetic Vectors/genetics, Dependovirus/genetics, Cell Culture Techniques, Genetic Therapy
3.
Biotechnol Bioeng ; 121(2): 655-669, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38031493

ABSTRACT

A physics-based model for predicting cell culture fluid properties inside a stirred tank bioreactor with embedded PID controller logic is presented. The model provides a time-accurate solution for the fluid velocity field and the overall volumetric mass transfer coefficient, as well as the ongoing effects of interfacial mass transfer, species mixing, and aqueous chemical reactions. The modeled system also includes a direct coupling between process variables and system control variables via embedded controller logic. Satisfactory agreement is achieved between the model prediction and measured bioreactor data in terms of the steady-state operating conditions and the response to setpoint changes. Simulation runtimes are suitable for industrial research and design timescales.
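A minimal lumped-parameter sketch of the PID-coupling idea, standing in for the paper's full physics-based model: a dissolved-oxygen balance whose mass-transfer term is driven by a PID controller output. The kLa, oxygen uptake rate, setpoint, and gains are assumptions.

```python
# Minimal sketch of PID-controlled dissolved oxygen (DO) in a stirred tank:
# dC/dt = kLa_max * u * (C_sat - C) - OUR, where the PID output u (0..1)
# scales the volumetric mass-transfer coefficient (gas flow / agitation proxy).
C_sat, OUR = 0.21, 0.05          # mmol/L and mmol/(L*min), assumed
kla_max = 0.9                    # 1/min at full actuation, assumed
Kp, Ki, Kd = 4.0, 0.4, 0.0       # assumed PID gains
dt, setpoint = 0.1, 0.10         # min, mmol/L

C, integral, prev_err = 0.02, 0.0, 0.0
for step in range(3000):
    err = setpoint - C
    integral += err * dt
    deriv = (err - prev_err) / dt
    u = max(0.0, min(1.0, Kp * err + Ki * integral + Kd * deriv))  # clamp actuator
    prev_err = err
    C += dt * (kla_max * u * (C_sat - C) - OUR)                    # explicit Euler update
    if step % 600 == 0:
        print(f"t={step*dt:6.1f} min  DO={C:.4f} mmol/L  u={u:.2f}")
```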


Subjects
Bioreactors, Oxygen, Oxygen/chemistry, Cell Culture Techniques, Computer Simulation, Hydrogen-Ion Concentration
4.
Biotechnol Bioeng ; 121(10): 3114-3127, 2024 Oct.
Article in English | MEDLINE | ID: mdl-38938008

ABSTRACT

Ethanol production is a significant industrial bioprocess for energy. The primary objective of this study is to control the process reactor temperature so as to obtain the desired product, ethanol. Advanced model-based control systems face challenges due to model-process mismatch, but reinforcement learning (RL), a class of machine learning, can help by allowing agents to learn policies directly from the environment. Hence, an RL algorithm called twin delayed deep deterministic policy gradient (TD3) is employed. Reactor temperature control is addressed with two approaches, namely unconstrained and constrained control. TD3 with various reward functions is tested on a nonlinear bioreactor model. The results are compared with a popular existing RL algorithm, the deep deterministic policy gradient (DDPG), using the mean squared error (MSE) as the performance measure. In the unconstrained control of the bioreactor, the TD3-based controller designed with the integral absolute error (IAE) reward yields a lower MSE of 0.22, whereas DDPG produces an MSE of 0.29. Similarly, for the constrained controller, the TD3-based controller designed with the IAE reward yields a lower MSE of 0.38, whereas DDPG produces an MSE of 0.48. In addition, the TD3-trained agent successfully rejects disturbances in the input flow rate and inlet temperature, in addition to handling a setpoint change, with better performance metrics.
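A hedged sketch of the TD3 workflow the abstract describes, using stable-baselines3 on a toy first-order bioreactor temperature environment; the dynamics, the IAE-style reward, and the hyperparameters are illustrative assumptions rather than the paper's model or tuning.

```python
# Hedged sketch: TD3 (stable-baselines3) controlling a toy bioreactor temperature model.
import numpy as np
import gymnasium as gym
from gymnasium import spaces
from stable_baselines3 import TD3

class BioreactorTempEnv(gym.Env):
    def __init__(self):
        self.action_space = spaces.Box(-1.0, 1.0, shape=(1,), dtype=np.float32)   # cooling/heating duty
        self.observation_space = spaces.Box(-np.inf, np.inf, shape=(2,), dtype=np.float32)
        self.setpoint = 32.0   # degC, assumed
        self.dt = 0.5          # h

    def reset(self, seed=None, options=None):
        super().reset(seed=seed)
        self.T = 25.0 + self.np_random.uniform(-2.0, 2.0)
        self.t = 0
        return np.array([self.T, self.setpoint - self.T], dtype=np.float32), {}

    def step(self, action):
        heat_gen = 1.5                                  # metabolic heat term, assumed
        cooling = 3.0 * float(action[0])                # manipulated variable
        self.T += self.dt * (heat_gen + cooling - 0.1 * (self.T - 25.0))
        self.t += 1
        err = self.setpoint - self.T
        reward = -abs(err) * self.dt                    # integral-absolute-error style reward
        obs = np.array([self.T, err], dtype=np.float32)
        return obs, reward, False, self.t >= 200, {}

model = TD3("MlpPolicy", BioreactorTempEnv(), learning_rate=1e-3, verbose=0)
model.learn(total_timesteps=20_000)
```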


Subjects
Bioreactors, Ethanol, Fermentation, Machine Learning, Temperature, Ethanol/metabolism, Biological Models, Algorithms
5.
Biotechnol Bioeng ; 121(4): 1271-1283, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38258490

ABSTRACT

"Giving the cells exactly what they need, when they need it" is the core idea behind the proposed bioprocess control strategy: operating bioprocess based on the physiological behavior of the microbial population rather than exclusive monitoring of environmental parameters. We are envisioning to achieve this through the use of genetically encoded biosensors combined with online flow cytometry (FCM) to obtain a time-dependent "physiological fingerprint" of the population. We developed a biosensor based on the glnA promoter (glnAp) and applied it for monitoring the nitrogen-related nutritional state of Escherichia coli. The functionality of the biosensor was demonstrated through multiple cultivation runs performed at various scales-from microplate to 20 L bioreactor. We also developed a fully automated bioreactor-FCM interface for on-line monitoring of the microbial population. Finally, we validated the proposed strategy by performing a fed-batch experiment where the biosensor signal is used as the actuator for a nitrogen feeding feedback control. This new generation of process control, -based on the specific needs of the cells, -opens the possibility of improving process development on a short timescale and therewith, the robustness and performance of fermentation processes.


Subjects
Bioreactors, Biosensing Techniques, Fermentation, Escherichia coli, Nitrogen
6.
Biotechnol Bioeng ; 121(9): 2868-2880, 2024 Sep.
Article in English | MEDLINE | ID: mdl-38812405

ABSTRACT

Reinforcement learning (RL), a subset of machine learning (ML), could optimize and control biomanufacturing processes, such as improving the production of therapeutic cells. Here, the process of CAR T-cell activation by antigen-presenting beads and their subsequent expansion is formulated in silico. The simulation is used as an environment to train RL-agents to dynamically control the number of beads in culture in order to maximize the population of robust effector cells at the end of the culture. The agent makes periodic decisions to incrementally add beads or remove them completely. The simulation is designed to operate in OpenAI Gym, enabling testing of different environments, cell types, RL-agent algorithms, and state inputs to the RL-agent. RL-agent training is demonstrated with three different algorithms (PPO, A2C, and DQN), each sampling three different state input types (tabular, image, mixed); PPO-tabular performs best for this simulation environment. Using this approach, training of the RL-agent on different cell types is demonstrated, resulting in unique control strategies for each type. Sensitivity to input noise (sensor performance), the number of control-step interventions, and the advantages of pre-trained RL-agents are also evaluated. Therefore, we present an RL framework to maximize the population of robust effector cells in CAR T-cell therapy production.
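A hedged skeleton of what a bead-dosing decision environment could look like in the Gymnasium API, loosely mirroring the periodic add/remove decisions described; the cell and bead dynamics and the harvest reward are placeholder assumptions, not the authors' simulation.

```python
# Hedged skeleton: discrete bead-dosing decisions (incremental addition or complete removal)
# in a Gymnasium environment, trained with PPO. All dynamics are toy placeholders.
import numpy as np
import gymnasium as gym
from gymnasium import spaces
from stable_baselines3 import PPO

class BeadControlEnv(gym.Env):
    ACTIONS = [0.0, 0.25, 0.5, 1.0, None]   # bead:cell addition ratios; None = remove all beads

    def __init__(self, horizon=14):
        self.action_space = spaces.Discrete(len(self.ACTIONS))
        self.observation_space = spaces.Box(0.0, np.inf, shape=(3,), dtype=np.float32)
        self.horizon = horizon

    def reset(self, seed=None, options=None):
        super().reset(seed=seed)
        self.cells, self.effectors, self.beads, self.day = 1.0, 0.1, 0.0, 0
        return self._obs(), {}

    def _obs(self):
        return np.array([self.cells, self.effectors, self.beads], dtype=np.float32)

    def step(self, action):
        ratio = self.ACTIONS[action]
        self.beads = 0.0 if ratio is None else self.beads + ratio * self.cells
        stim = self.beads / (self.beads + self.cells)        # toy activation signal
        self.cells *= 1.0 + 0.6 * stim                       # stimulated expansion
        self.effectors += 0.3 * stim * self.cells - 0.05 * self.effectors
        self.day += 1
        done = self.day >= self.horizon
        reward = self.effectors if done else 0.0             # reward only at harvest
        return self._obs(), reward, done, False, {}

agent = PPO("MlpPolicy", BeadControlEnv(), verbose=0)
agent.learn(total_timesteps=50_000)
```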


Subjects
Machine Learning, T-Lymphocytes, T-Lymphocytes/immunology, Humans, Computer Simulation, Lymphocyte Activation, Chimeric Antigen Receptors/immunology, Adoptive Immunotherapy/methods, Cell Culture Techniques/methods
7.
Pharm Res ; 41(5): 983-1006, 2024 May.
Article in English | MEDLINE | ID: mdl-38561580

ABSTRACT

OBJECTIVE: This research aims to elucidate critical impurities in process validation batches of tacrolimus injection formulations, focusing on the identification and characterization of a previously unreported impurity at RRT 0.42, identified as the tacrolimus alcohol adduct. The potential root causes for the formation of the new impurity were determined using a structured risk assessment with a cause-and-effect (fishbone) diagram. The primary objective was to propose a mitigation plan and demonstrate control of the impurity with six-month accelerated stability results from development batches. METHODS: The investigation utilizes method validation and characterization studies to affirm the accuracy of quantifying the tacrolimus alcohol adduct. The research methodology employed characterization techniques such as rotational rheometry, ICP-MS, MALDI-MS, 1H NMR, 13C NMR, and DEPT-135 NMR for structural elucidation. Additionally, the exact mass of the impurity was confirmed using electrospray ionization mass spectra. RESULTS: Results indicate successful identification and characterization of the tacrolimus alcohol adduct. The study further explores the transformation of tacrolimus monohydrate under various conditions, unveiling the formation of tacrolimus hydroxy acid and proposing the existence of a novel degradation product, the tacrolimus alcohol adduct. Six-month data from development lots utilizing Manufacturing Process II demonstrate significantly lower levels of alcohol adducts. CONCLUSIONS: Manufacturing Process II selectively locates tacrolimus within the micellar core of HCO-60, preventing direct contact of ethanol with tacrolimus and thereby minimizing alcohol adduct formation. This research contributes to the understanding of tacrolimus formulations, offering ways to safeguard product integrity and stability during manufacturing and storage.


Subjects
Drug Contamination, Immunosuppressive Agents, Tacrolimus, Drug Contamination/prevention & control, Tacrolimus/chemistry, Tacrolimus/analysis, Immunosuppressive Agents/chemistry, Immunosuppressive Agents/analysis, Drug Stability, Alcohols/chemistry, Alcohols/analysis, Drug Compounding/methods, Magnetic Resonance Spectroscopy/methods
8.
Clin Chem Lab Med ; 62(12): 2451-2460, 2024 Nov 26.
Article in English | MEDLINE | ID: mdl-38748888

ABSTRACT

OBJECTIVES: Patient-based real-time quality control (PBRTQC) is an alternative tool for laboratories that has gained increasing attention. Despite the progress made with various algorithms, the imbalance in data volume between in-control and out-of-control results, as well as the issue of variation, remain challenges. We propose a novel integrated framework using anomaly detection and a graph neural network, combining clinical variables and statistical algorithms, to improve the error detection performance of patient-based quality control. METHODS: The testing results of three representative analytes (sodium, potassium, and calcium) and eight independent patient variables (test date, time, gender, age, department, patient type, and reference interval limits) were collected. A graph-based anomaly detection network was modeled and used to generate control limits. Proportional and random errors were simulated for performance evaluation. Five mainstream PBRTQC statistical algorithms were chosen for comparison. RESULTS: The framework of a patient-based graph anomaly detection network for real-time quality control (PGADQC) was established and proven feasible for error detection. Compared with classic PBRTQC, PGADQC showed a more balanced performance for both positive and negative biases. For different analytes, the average number of patient samples until error detection (ANPed) of PGADQC decreased to varying degrees, with reductions of up to approximately 95% at a small bias of 0.02, taking calcium as an example. CONCLUSIONS: PGADQC is an effective framework for patient-based quality control, integrating statistical and artificial intelligence algorithms. It improves error detection in a data-driven fashion and provides a new approach to PBRTQC from the data science perspective.
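For context, the sketch below implements one of the conventional PBRTQC baselines mentioned for comparison (a moving-average procedure with fixed limits) and estimates the number of patient samples until error detection on synthetic calcium results; it is not the proposed PGADQC network, and all numbers are assumed.

```python
# Hedged sketch of a classic moving-average PBRTQC baseline: patient results are tracked
# with a rolling mean, an error is flagged when the mean exits fixed control limits, and
# the number of patient samples until error detection (NPed) is recorded.
import numpy as np

rng = np.random.default_rng(0)
block = 20                                  # rolling window size (assumed)
lo, hi = 2.30, 2.50                         # control limits on the rolling mean (assumed), mmol/L

def npe_until_detection(bias):
    """Simulate in-control calcium results, inject a constant bias, count samples to alarm."""
    results = rng.normal(2.40, 0.10, size=5000) + bias     # toy calcium distribution
    for i in range(block, len(results)):
        m = results[i - block:i].mean()
        if m < lo or m > hi:
            return i
    return None

neds = [npe_until_detection(bias=0.05) for _ in range(200)]
print("ANPed at +0.05 mmol/L bias:", np.mean([n for n in neds if n is not None]))
```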


Subjects
Algorithms, Quality Control, Humans, Neural Networks (Computer), Female, Male, Sodium/analysis, Sodium/blood, Calcium/analysis, Calcium/blood, Potassium/analysis, Potassium/blood, Adult
9.
Environ Res ; 252(Pt 4): 119133, 2024 Jul 01.
Article in English | MEDLINE | ID: mdl-38735379

ABSTRACT

Phosphorus in wastewater poses a significant environmental threat, leading to water pollution and eutrophication. However, it plays a crucial role in the water-energy-resource recovery-environment (WERE) nexus. Recovering phosphorus from wastewater can close the phosphorus loop, supporting circular economy principles by reusing it as fertilizer or in industrial applications. Despite the recognized importance of phosphorus recovery, there is a lack of analysis of the cyber-physical framework concerning the WERE nexus. Advanced methods such as automatic control, optimal process technologies, artificial intelligence (AI), and life cycle assessment (LCA) have emerged to enhance wastewater treatment plant (WWTP) operations, focusing on improving effluent quality, energy efficiency, and resource recovery and on reducing greenhouse gas (GHG) emissions. Providing insights into implementing modeling and simulation platforms and control and optimization systems for phosphorus recovery within the WERE nexus (P-WERE) in WWTPs is therefore extremely important. This review highlights the valuable applications of AI algorithms, such as machine learning, deep learning, and explainable AI, for predicting phosphorus (P) dynamics in WWTPs. It emphasizes the importance of using AI to analyze microbial communities and optimize WWTPs for various objectives. Additionally, it discusses the benefits of integrating mechanistic and data-driven models into plant-wide frameworks, which can enhance GHG simulation and enable simultaneous nitrogen (N) and phosphorus (P) removal. The review underscores the significance of prioritizing recovery actions that redirect phosphorus from effluent into reusable products for future consideration.


Subjects
Phosphorus, Waste Disposal (Fluid), Wastewater, Phosphorus/analysis, Wastewater/chemistry, Wastewater/analysis, Waste Disposal (Fluid)/methods, Artificial Intelligence, Water Purification/methods, Water Pollutants (Chemical)/analysis, Water Pollutants (Chemical)/chemistry
10.
J Appl Clin Med Phys ; 25(2): e14154, 2024 Feb.
Article in English | MEDLINE | ID: mdl-37683120

ABSTRACT

BACKGROUND: Tolerance limits are defined on pre-treatment patient-specific quality assurance results to identify "out of the norm" dose discrepancies in a plan. An out-of-tolerance plan discovered during measurement can often cause treatment delays, especially if replanning is required. In this study, we aim to develop an outlier detection model to identify out-of-tolerance plans early, during the treatment planning phase, to mitigate the above-mentioned risks. METHODS: Patient-specific quality assurance results with portal dosimetry for stereotactic body radiotherapy measured between January 2020 and December 2021 were used in this study. Data were divided into thorax and pelvis sites, and gamma passing rates were recorded using 2%/2 mm, 2%/1 mm, and 1%/1 mm gamma criteria. A statistical process control method was used to determine six site- and criterion-specific tolerance and action limits. Using only the inliers identified with the determined tolerance limits, we trained three different outlier detection models on the plan complexity metrics extracted from each treatment field: robust covariance, isolation forest, and one-class support vector machine. The hyperparameters were optimized using the F1-score calculated from both the inlier and validation outlier data. RESULTS: A total of 308 pelvis and 200 thorax fields were used in this study. The tolerance (action) limits for the 2%/2 mm, 2%/1 mm, and 1%/1 mm gamma criteria in the pelvis site are 99.1% (98.1%), 95.8% (91.1%), and 91.7% (86.1%), respectively. The tolerance (action) limits in the thorax site are 99.0% (98.7%), 97.0% (96.2%), and 91.5% (87.2%). The one-class support vector machine performs best among all the algorithms. The best-performing model in the thorax (pelvis) site achieves a precision of 0.56 (0.54), recall of 1.0 (1.0), and F1-score of 0.72 (0.70) when using the 2%/2 mm (2%/1 mm) criterion. CONCLUSION: The model will help the planner identify an out-of-tolerance plan early so that the plan can be refined further during the planning stage without risking late discovery during measurement.
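A hedged sketch of the outlier-detection step: a one-class support vector machine fitted on plan-complexity metrics from inlier fields (those at or above the tolerance limit) and used to flag unusual new fields. The feature names, the synthetic data, and the hyperparameters are illustrative assumptions, not the study's values.

```python
# Hedged sketch: fit a one-class SVM on complexity metrics of fields whose gamma passing
# rates are inside the tolerance limit, then flag new fields as potential outliers.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(1)
# toy complexity metrics per field: [MU per Gy, mean leaf gap, modulation index] (assumed)
X = rng.normal([220.0, 12.0, 0.35], [30.0, 3.0, 0.08], size=(300, 3))
gamma_pass = rng.normal(98.5, 1.0, size=300)

tolerance_limit = 95.8                       # e.g., the 2%/1 mm pelvis limit reported above
inliers = X[gamma_pass >= tolerance_limit]

model = make_pipeline(StandardScaler(), OneClassSVM(kernel="rbf", nu=0.05, gamma="scale"))
model.fit(inliers)

new_fields = rng.normal([320.0, 6.0, 0.60], [20.0, 1.0, 0.05], size=(5, 3))  # unusually complex
print(model.predict(new_fields))             # -1 = predicted outlier, +1 = inlier
```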


Subjects
Radiosurgery, Intensity-Modulated Radiotherapy, Humans, Computer-Assisted Radiotherapy Planning/methods, Radiotherapy Dosage, Algorithms, Pelvis, Radiometry/methods, Intensity-Modulated Radiotherapy/methods, Health Care Quality Assurance
11.
Sensors (Basel) ; 24(19)2024 Oct 02.
Article in English | MEDLINE | ID: mdl-39409441

ABSTRACT

Corn syrup is a cost-effective sweetener ingredient for the food industry. In producing syrup from corn, process control to enhance and/or maintain a constant dextrose equivalent value (DE) is a persistent challenge, especially in semi-automated/batch production settings, which are common in small to medium-size factories. Existing work has focused on continuous process control to keep parameter values around a setpoint, applying machine learning methods designed for time-series data. This study focuses on building process control models to enable semi-automation in small to medium-size factories in which the data are not as time dependent. Correlation coefficients were used to identify key process parameters that contribute to feed pH value and DE. Artificial neural network (ANN), support vector machine (SVM), and linear regression (LR) models were built to predict feed pH and DE. The results suggest that (1) model accuracy ranges from 91% to 96%; (2) the ANN models yielded about 1% to 3% higher accuracy than the SVM and LR models, and the prediction accuracy is robust even with as few as six data sets; (3) both the SVM and ANN models have noise-tolerant properties, but ANN has a higher noise tolerance than SVM; (4) SVM performance can be hindered when using high-dimensional data sets; (5) the LR model yields higher variation in prediction accuracy than ANN and SVM; (6) distribution fitting is a good approach for generating data, but the fidelity of the fit can greatly impact accuracy; and (7) multi-stage models yield higher accuracy than single-stage models, but there are pros and cons to each approach.
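A hedged sketch of the model comparison described, fitting linear regression, SVM, and ANN regressors to predict DE from a few process parameters; the feature set and data are synthetic placeholders, not the factory data used in the study.

```python
# Hedged sketch: compare LR, SVM, and ANN regressors for DE prediction on synthetic data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
# toy inputs: [liquefaction temperature degC, enzyme dose %, holding time min, feed pH] (assumed)
X = rng.normal([105.0, 0.06, 90.0, 5.9], [2.0, 0.01, 10.0, 0.2], size=(120, 4))
DE = 0.4 * X[:, 1] * 1000 + 0.15 * X[:, 2] - 0.5 * (X[:, 0] - 105) + rng.normal(0, 1.0, 120)

models = {
    "LR": LinearRegression(),
    "SVM": make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0)),
    "ANN": make_pipeline(StandardScaler(), MLPRegressor(hidden_layer_sizes=(16, 8),
                                                        max_iter=5000, random_state=0)),
}
for name, model in models.items():
    r2 = cross_val_score(model, X, DE, cv=5, scoring="r2").mean()
    print(f"{name}: mean CV R^2 = {r2:.3f}")
```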

12.
Sensors (Basel) ; 24(12)2024 Jun 14.
Article in English | MEDLINE | ID: mdl-38931654

ABSTRACT

Conveyor belts serve as the primary mode of ore transportation in mineral processing plants. Feeders, composed of shorter conveyors, regulate the material flow from silos to longer conveyor belts by adjusting their velocity. This velocity manipulation is performed by automatic controllers that gauge the material weight on the conveyor using scales. However, due to the positioning constraints of these scales, a notable delay ensues between measurement and the adjustment of the feeder speed. This dead time poses a significant challenge for control design aimed at preventing oscillations in material levels on the conveyor belt. This paper contributes in two key areas: first, through a simulation-based comparison of various control techniques addressing this issue across diverse scenarios; second, by implementing the Smith predictor solution in an operational plant and contrasting its performance with that of a single PID controller. The evaluation spans both the transient flow rate during setpoint step changes and a month-long assessment. The experimental results reveal a notable production increase of 355 t/h and a substantial reduction in flow-rate oscillations on the conveyor belt, evidenced by a 55% decrease in the standard deviation.
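A hedged discrete-time sketch of a Smith predictor wrapped around a PI controller for a first-order-plus-dead-time loop, illustrating the structure compared in the paper; the plant gain, time constant, dead time, and tuning are assumptions, not the plant values reported.

```python
# Hedged sketch: Smith predictor + PI control of a first-order-plus-dead-time feeder loop.
from collections import deque

Kp_plant, tau, dead_steps = 2.0, 8.0, 12      # plant gain, time constant (s), delay (samples)
dt = 1.0
a = dt / (tau + dt)                           # first-order discretization factor
Kc, Ki = 0.4, 0.05                            # PI gains (assumed)

y = ym = 0.0                                  # plant output and undelayed model output
delay_plant = deque([0.0] * dead_steps, maxlen=dead_steps)   # input delay line inside the plant
delay_model = deque([0.0] * dead_steps, maxlen=dead_steps)   # delayed internal-model output
integral, setpoint = 0.0, 100.0               # desired flow rate (t/h)

for k in range(300):
    ym_delayed = delay_model[0]
    feedback = ym + (y - ym_delayed)          # Smith predictor feedback signal
    err = setpoint - feedback
    integral += err * dt
    u = Kc * err + Ki * integral              # PI control action (feeder speed command)

    # plant update (input delayed by dead time), then internal model update (no delay)
    u_delayed = delay_plant[0]
    delay_plant.append(u)
    y += a * (Kp_plant * u_delayed - y)
    ym += a * (Kp_plant * u - ym)
    delay_model.append(ym)

    if k % 50 == 0:
        print(f"k={k:3d}  flow={y:7.2f} t/h")
```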

13.
Sensors (Basel) ; 24(14)2024 Jul 12.
Article in English | MEDLINE | ID: mdl-39065909

ABSTRACT

This research proposes advanced model-based control strategies for a countercurrent-flow plate heat exchanger in a virtual environment. A virtual environment with visual and auditory effects is designed, built on a mathematical model describing the real dynamics of the process; it reproduces parallel fluid movement in opposite directions with hot and cold outlet temperatures and incorporates control-monitoring interfaces as communication links between the virtual heat exchanger and the control applications. A multivariable, nonlinear process such as the countercurrent plate heat exchanger requires careful analysis in the controller design; therefore, this work proposes and compares two control strategies to identify the better-performing one. The first controller is based on the inverse model of the plant, using linear algebra techniques and numerical methods; the second is a model predictive controller (MPC), which computes optimal control actions that minimize steady-state errors and aggressive actuator variations while respecting the temperature constraints and operating limits, based on a predictive model of the plant. The controllers are tested for different setpoint changes and disturbances, showing that neither overshoots and that the MPC controller has the shorter settling time and lower steady-state error.

14.
Sensors (Basel) ; 24(10)2024 May 15.
Article in English | MEDLINE | ID: mdl-38793986

ABSTRACT

In this paper, a dispersion of glass beads of different sizes in an ammonium nitrate solution is investigated with the aid of Raman spectroscopy. The signal losses caused by the dispersion are quantified by an additional scattered-light measurement and used to correct the measured ammonium nitrate concentration. Each individual glass bead represents an interface at which the excitation laser is deflected from its path, causing distortion in the received Raman signal. It is shown that the scattering losses measured with the scattered-light probe correlate with the loss of the Raman signal, which means that the data obtained can be used to correct the measured values. The resulting correction function covers particle sizes in the range of 2-99 µm as well as ammonium nitrate concentrations of 0-20 wt% and delivers an RMSEP of 1.952 wt%. This correction provides easier process access to dispersions that were previously difficult or impossible to measure.
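A hedged sketch of the correction idea: relate the attenuation of the Raman signal to the loss measured by a scattered-light probe, fit a correction function on calibration dispersions, and apply it to recover the concentration; all data below are synthetic, not the authors' measurements.

```python
# Hedged sketch: scattered-light-based correction of a Raman concentration measurement.
import numpy as np

rng = np.random.default_rng(3)
true_conc = rng.uniform(0.0, 20.0, 60)              # wt% ammonium nitrate (calibration set)
scatter_loss = rng.uniform(0.0, 0.6, 60)            # fractional loss from scattered-light probe
raman = true_conc * (1.0 - 0.9 * scatter_loss) + rng.normal(0.0, 0.3, 60)   # attenuated signal

# calibrate: apparent/true concentration ratio as a linear function of the scattering loss
ratio = raman / np.where(true_conc > 1.0, true_conc, np.nan)
mask = ~np.isnan(ratio)
slope, intercept = np.polyfit(scatter_loss[mask], ratio[mask], 1)

def corrected_concentration(raman_signal, loss):
    return raman_signal / (intercept + slope * loss)

rmsep = np.sqrt(np.mean((corrected_concentration(raman, scatter_loss) - true_conc) ** 2))
print(f"RMSEP after correction: {rmsep:.2f} wt% (synthetic data)")
```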

15.
Sensors (Basel) ; 24(11)2024 May 30.
Article in English | MEDLINE | ID: mdl-38894312

ABSTRACT

To evaluate the suitability of an analytical instrument, essential figures of merit such as the limit of detection (LOD) and the limit of quantification (LOQ) can be employed. However, as the definitions known in the literature are mostly applicable to one signal per sample, estimating the LOD for substances with instruments yielding multidimensional results, such as electronic noses (eNoses), is still challenging. In this paper, we present and compare different approaches to estimating the LOD for eNoses by employing commonly used multivariate data analysis and regression techniques, including principal component analysis (PCA), principal component regression (PCR), and partial least squares regression (PLSR). These methods could subsequently be used to assess the suitability of eNoses to help control and steer processes where volatiles are key process parameters. As a use case, we determined the LODs for key compounds involved in beer maturation, namely acetaldehyde, diacetyl, dimethyl sulfide, ethyl acetate, isobutanol, and 2-phenylethanol, and discussed the suitability of our eNose for that determination process. The LOD estimates from the different methods differed by up to a factor of eight. For diacetyl, the LOD and the LOQ were sufficiently low to suggest potential for monitoring via eNose.
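A hedged sketch of one common way such an LOD can be estimated: calibrate a PLS regression on the multidimensional sensor response and apply the pseudo-univariate rule LOD = 3.3·s_blank/slope; the synthetic sensor array and concentration levels are assumptions, not the authors' data or their exact estimators.

```python
# Hedged sketch: PLS calibration of a synthetic eNose array and a pseudo-univariate LOD estimate.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(5)
conc = np.repeat(np.array([0.0, 0.05, 0.1, 0.2, 0.5, 1.0]), 8)   # mg/L diacetyl (assumed levels)
loadings = rng.uniform(0.5, 2.0, size=10)                         # 10-sensor response pattern
X = np.outer(conc, loadings) + rng.normal(0.0, 0.05, size=(conc.size, 10))

pls = PLSRegression(n_components=2)
pls.fit(X, conc)
pred = pls.predict(X).ravel()

slope = np.polyfit(conc, pred, 1)[0]                # sensitivity of the calibration
s_blank = pred[conc == 0.0].std(ddof=1)             # noise of blank-sample predictions
print(f"estimated LOD = {3.3 * s_blank / slope:.3f} mg/L, LOQ = {10 * s_blank / slope:.3f} mg/L")
```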


Subjects
Beer, Electronic Nose, Limit of Detection, Principal Component Analysis, Beer/analysis, Least-Squares Analysis, Volatile Organic Compounds/analysis
16.
Sensors (Basel) ; 24(2)2024 Jan 05.
Article in English | MEDLINE | ID: mdl-38257407

ABSTRACT

In the present study, the influence of disperse systems on Raman scattering was investigated. It is shown how an increasing particle concentration weakens the quantitative signal of the Raman spectrum. Furthermore, the change in the position of the optimal measurement point in the fluid was considered in detail. Additional transmission measurements can be used to derive a simple and robust correction method that allows the actual concentration of the continuous phase to be determined with a standard deviation of 2.6%.

17.
Molecules ; 29(13)2024 Jun 26.
Article in English | MEDLINE | ID: mdl-38998979

ABSTRACT

To reduce unwanted fat bloom in the manufacturing and storage of chocolates, detailed knowledge of the chemical composition and molecular mobility of the oils and fats they contain is required. Although the formation of fat bloom on chocolate products has been studied for many decades with regard to its prevention and reduction, questions at the molecular level still remain to be answered. Chocolate products with nut-based fillings are especially prone to undesirable fat bloom. The chemical composition of fat bloom is thought to be dominated by the triacylglycerides of the chocolate matrix, which migrate to the chocolate's surface and recrystallize there. Migration of oils from the fillings into the chocolate as a driving force for fat bloom formation is an additional factor in the discussion. In this work, the migration was studied and confirmed by MRI, while the chemical composition of the fat bloom was measured by NMR spectroscopy and HPLC-MS, revealing the most important triacylglycerides in the fat bloom. The combination of HPLC-MS with NMR spectroscopy at 800 MHz allows for detailed chemical structure determination. A rapid routine combining the two modalities was developed and then applied to investigate the aging, the impact of chocolate composition, and the influence of hazelnut-filling processing parameters, such as the degree of roasting and grinding of the nuts or the mixing time, on fat bloom formation.


Subjects
Chocolate, Magnetic Resonance Spectroscopy, Chocolate/analysis, High-Pressure Liquid Chromatography/methods, Magnetic Resonance Spectroscopy/methods, Mass Spectrometry/methods, Triglycerides/analysis, Triglycerides/chemistry, Cacao/chemistry, Food Analysis/methods, Corylus/chemistry, Liquid Chromatography-Mass Spectrometry
18.
Behav Res Methods ; 56(3): 1459-1475, 2024 Mar.
Article in English | MEDLINE | ID: mdl-37118646

ABSTRACT

Retrospective analyses of experience sampling (ESM) data have shown that changes in mean and variance levels may serve as early warning signs of an imminent depression. Detecting such early warning signs prospectively would pave the way for timely intervention and prevention. The exponentially weighted moving average (EWMA) procedure seems a promising method to scan ESM data for the presence of mean changes in real time. Based on simulation and empirical studies, computing and monitoring day averages using EWMA works particularly well. We therefore expand this idea to the detection of variance changes and propose to use EWMA to prospectively scan for mean changes in day variability statistics (i.e., s², s, ln(s)). When both mean and variance changes are of interest, the multivariate extension of EWMA (MEWMA) can be applied to both the day averages and a day statistic of variability. We evaluate these novel approaches to detecting variance changes by comparing them to EWMA-type procedures that have been specifically developed to detect a combination of mean and variance changes in the raw data: EWMA-S², EWMA-ln(S²), and EWMA-X̄-S². We ran a simulation study to examine the performance of the two approaches in detecting mean, variance, or both types of changes. The results indicate that monitoring day statistics using (M)EWMA works well and outperforms EWMA-S² and EWMA-ln(S²); the performance difference with EWMA-X̄-S² is smaller but notable. Based on the results, we provide recommendations on which statistic of variability to monitor based on the type of change (i.e., variance increase or decrease) one expects.
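A hedged sketch of EWMA monitoring applied to day averages, the approach the abstract builds on: a baseline phase sets the in-control mean and standard deviation of the day statistic, and an alarm is raised when the EWMA leaves its control limits; the smoothing constant, limit width, and simulated mean shift are illustrative choices.

```python
# Hedged sketch: EWMA control chart on day averages of an ESM score, with a simulated shift.
import numpy as np

rng = np.random.default_rng(11)
baseline_days = rng.normal(3.0, 0.4, 30)                 # day means of, e.g., negative affect
monitor_days = np.r_[rng.normal(3.0, 0.4, 20), rng.normal(3.6, 0.4, 20)]   # mean shift after day 20

lam, L = 0.2, 2.7                                        # EWMA smoothing constant and limit width
mu0, sigma0 = baseline_days.mean(), baseline_days.std(ddof=1)

z = mu0
for day, x in enumerate(monitor_days, start=1):
    z = lam * x + (1.0 - lam) * z
    sigma_z = sigma0 * np.sqrt(lam / (2.0 - lam) * (1.0 - (1.0 - lam) ** (2 * day)))
    if abs(z - mu0) > L * sigma_z:
        print(f"EWMA signal on monitoring day {day}: z={z:.2f}")
        break
```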


Subjects
Ecological Momentary Assessment, Statistical Models, Humans, Retrospective Studies, Computer Simulation
19.
Environ Monit Assess ; 196(3): 231, 2024 Feb 03.
Article in English | MEDLINE | ID: mdl-38308016

ABSTRACT

Across the globe, governments are developing policies and strategies to reduce carbon emissions to address climate change. Monitoring the impact of governments' carbon reduction policies can significantly enhance our ability to combat climate change and meet emissions reduction targets. One promising area in this regard is the role of artificial intelligence (AI) in carbon reduction policy and strategy monitoring. While researchers have explored applications of AI on data from various sources, including sensors, satellites, and social media, to identify areas for carbon emissions reduction, AI applications in tracking the effect of governments' carbon reduction plans have been limited. This study presents an AI framework based on long short-term memory (LSTM) and statistical process control (SPC) for the monitoring of variations in carbon emissions, using UK annual CO2 emission (per capita) data, covering a period between 1750 and 2021. This paper used LSTM to develop a surrogate model for the UK's carbon emissions characteristics and behaviours. As observed in our experiments, LSTM has better predictive abilities than ARIMA, Exponential Smoothing and feedforward artificial neural networks (ANN) in predicting CO2 emissions on a yearly prediction horizon. Using the deviation of the recorded emission data from the surrogate process, the variations and trends in these behaviours are then analysed using SPC, specifically Shewhart individual/moving range control charts. The result shows several assignable variations between the mid-1990s and 2021, which correlate with some notable UK government commitments to lower carbon emissions within this period. The framework presented in this paper can help identify periods of significant deviations from a country's normal CO2 emissions, which can potentially result from the government's carbon reduction policies or activities that can alter the amount of CO2 emissions.
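A hedged sketch of the SPC stage only (the LSTM surrogate is omitted): Shewhart individuals/moving-range limits are built from a baseline period of residuals between recorded emissions and surrogate-model predictions, and points outside the limits are flagged as assignable variation; the residuals here are synthetic.

```python
# Hedged sketch: Shewhart individuals/moving-range (I-MR) limits on surrogate-model residuals.
import numpy as np

rng = np.random.default_rng(42)
residuals = np.r_[rng.normal(0.0, 0.05, 150), rng.normal(-0.25, 0.05, 30)]  # synthetic late drift

baseline = residuals[:150]
mr_bar = np.abs(np.diff(baseline)).mean()                   # average moving range
center = baseline.mean()
ucl, lcl = center + 2.66 * mr_bar, center - 2.66 * mr_bar   # standard I-chart constants (n=2)

signals = np.where((residuals > ucl) | (residuals < lcl))[0]
print(f"I-chart limits: [{lcl:.3f}, {ucl:.3f}]; first signal at index "
      f"{signals[0] if signals.size else None}")
```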


Subjects
Air Pollutants, Deep Learning, Humans, Air Pollutants/analysis, Carbon Dioxide/analysis, Carbon/analysis, Artificial Intelligence, Environmental Monitoring, Government, Policies
20.
J Food Sci Technol ; 61(12): 2223-2234, 2024 Dec.
Article in English | MEDLINE | ID: mdl-39431196

ABSTRACT

Advancements in coffee processing technologies have led to improved efficiency in field operations, but challenges still exist in their practical implementation. Various alternatives and solutions have been proposed to enhance processing efficiency and address issues related to safety, standardization, and quality improvement in coffee production. A literature review using SciMAT and ScientoPy software highlighted advancements in fermentation tanks and the emergence of novel fermentation methodologies. However, these innovations lack sufficient scientific evidence. Researchers are now focusing on systematic approaches, such as controlled fermentations and evaluating the influence of microorganisms and process conditions on sensory attributes and coffee composition. Brazil is the leader in coffee bean fermentation research, but the number of published papers in the field has recently decreased. Despite this, efforts continue to improve process control and optimize product quality. The study emphasizes the need for further innovation in coffee fermentation technologies to increase efficiency, sustainability, and profitability while minimizing environmental impact. Implementing these advancements promises a more sustainable and quality-driven future for the coffee industry.
