ABSTRACT
Advances in upstream production of biologics, particularly intensified fed-batch processes beyond 10% cell solids, have severely strained harvest operations, especially depth filtration. Bioreactors containing high amounts of cell debris (more than 40% of particles <10 µm in diameter) are increasingly common and drive the need for more robust depth filtration processes, while compressed timelines heighten the need for predictive tools that accelerate development. Both needs are constrained by the currently limited mechanistic understanding of the harvest filter-feedstream system. Historically, process development relied on screening scale-down depth filter devices and conditions to define throughput before fouling, indicated by increasing differential pressure and/or particle breakthrough (measured via turbidity). This approach is straightforward but resource-intensive, and its results are inherently limited by the variability of the feedstream. Semi-empirical models have been developed from first principles to describe various mechanisms of filter fouling, that is, pore constriction, pore blocking, and/or surface deposition. Fitting these models to experimental data can assist in identifying the dominant fouling mechanism. Still, this approach sees limited application in guiding process development, as it is descriptive, not predictive. To address this gap, we developed a hybrid modeling approach. Leveraging historical bench-scale filtration process data, we built a partial least squares regression model to predict particle breakthrough from filter and feedstream attributes, and used the model to demonstrate a priori prediction of filter performance. The fouling models are used to interpret and give physical meaning to these computational models. This hybrid approach, combining the mechanistic insights of fouling models with the predictive capability of computational models, was used to establish a robust platform strategy for depth filtration of Chinese hamster ovary cell cultures. As new data continue to train the computational models, in silico tools will become an essential part of harvest process development by enabling prospective experimental design, reducing total experimental load, and accelerating development under strict timelines.
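As an illustrative aside, a minimal sketch of such a partial least squares step in scikit-learn is shown below; all column names and data are hypothetical stand-ins, not the study's actual filter or feedstream attributes.

```python
# Minimal sketch of a PLS regression predicting particle breakthrough
# (turbidity proxy) from filter/feedstream attributes. Hypothetical data.
import numpy as np
import pandas as pd
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "nominal_pore_um":   rng.uniform(0.5, 10, 60),   # filter attribute
    "media_area_cm2":    rng.uniform(20, 340, 60),   # filter attribute
    "cell_density_e6":   rng.uniform(10, 60, 60),    # feedstream attribute
    "pct_fines_sub10um": rng.uniform(10, 60, 60),    # feedstream attribute
    "turbidity_ntu":     rng.uniform(5, 200, 60),    # breakthrough proxy
})
X = df.drop(columns="turbidity_ntu").values
y = df["turbidity_ntu"].values

pls = PLSRegression(n_components=3)   # number of latent variables, tuned by CV
scores = cross_val_score(pls, X, y, cv=5, scoring="r2")
print(f"cross-validated R2: {scores.mean():.2f}")

pls.fit(X, y)
y_new = pls.predict(X[:5])            # a priori prediction for new conditions
```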
Subjects
Biological Products, Bioreactors, Cricetulus, Filtration, Filtration/methods, Animals, CHO Cells, Biological Models
ABSTRACT
Digitalization has paved the way for new paradigms such as digital shadows and digital twins for fermentation processes, opening the door to real-time process monitoring, control, and optimization. With a digital shadow, real-time model adaptation to accommodate complex metabolic phenomena, such as metabolic shifts of a process, can be monitored. Despite the many benefits of digitalization, its potential has not been fully realized in the industry. This study investigates the development of a digital shadow for a very complex fungal fermentation process, in terms of microbial physiology and fermentation operation, at pilot scale at Novonesis, and the challenges thereof. The process has historically been difficult to optimize and control due to a lack of offline measurements and an absence of biomass measurements. Pilot-scale and lab-scale fermentations were conducted for model development and validation. With all available pilot-scale data, a data-driven soft sensor was developed to estimate the main substrate concentration (glucose) with a normalized root mean squared error (N-RMSE) of 2%. This robust data-driven soft sensor was able to estimate accurately at lab scale (volume < 20× pilot) with an N-RMSE of 7.8%. A hybrid soft sensor was developed by combining the data-driven soft sensor with a mass balance to estimate the glycerol and biomass concentrations on pilot-scale data, with N-RMSEs of 11% and 21%, respectively. A digital shadow modeling framework was developed by coupling a mechanistic model (MM) with the hybrid soft sensor. The digital shadow modeling framework significantly improved predictability compared with the MM alone. This study brings the application of digital shadows closer to industrial implementation. It demonstrates the high potential of this type of modeling framework for scale-up and leads the way to a new generation of in silico-based process development.
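For illustration, the sketch below shows the general shape of such a hybrid soft sensor: a data-driven glucose estimate feeding a simple mass balance for biomass and product. The signal map, yield coefficients, and data are hypothetical, not the Novonesis process values.

```python
# Minimal sketch of a hybrid soft sensor, assuming hypothetical online
# signals and yield coefficients; not the study's actual model.
import numpy as np

def glucose_soft_sensor(online_signals: np.ndarray, coef: np.ndarray) -> np.ndarray:
    """Data-driven part: e.g. a linear/PLS map from online signals to glucose."""
    return online_signals @ coef

def mass_balance_step(X, S_prev, S_now, Y_xs=0.5, Y_ps=0.2):
    """Mechanistic part: consumed substrate split into biomass and glycerol
    via (assumed) yield coefficients."""
    dS = max(S_prev - S_now, 0.0)   # substrate consumed in this interval
    X_new = X + Y_xs * dS           # biomass gained
    P_new = Y_ps * dS               # glycerol formed in this interval
    return X_new, P_new

# Usage on a synthetic batch trajectory
signals = np.random.rand(100, 4)    # e.g. feed totals, off-gas, base addition
S = glucose_soft_sensor(signals, np.array([2.0, 1.0, 0.5, 0.1]))
X_hat = 1.0
for t in range(1, len(S)):
    X_hat, P_hat = mass_balance_step(X_hat, S[t - 1], S[t])
```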
Subjects
Bioreactors, Glucose, Fermentation, Bioreactors/microbiology, Glycerol, Biomass
ABSTRACT
The combination of physical equations with deep learning is becoming a promising methodology for bioprocess digitalization. In this paper, we investigate for the first time the combination of long short-term memory (LSTM) networks with first-principles equations in a hybrid workflow to describe human embryonic kidney 293 (HEK293) culture dynamics. Experimental data on 27 extracellular state variables in 20 fed-batch HEK293 cultures were collected in a parallel high-throughput 250 mL cultivation system in an industrial process development setting. The adaptive moment estimation method with stochastic regularization and cross-validation was employed for deep learning. A total of 784 hybrid models with varying deep neural network architectures, depths, layer sizes, and node activation functions were compared. In most scenarios, hybrid LSTM models outperformed classical hybrid feedforward neural network (FFNN) models in terms of training and testing error. Hybrid LSTM models also proved less sensitive to data resampling than hybrid FFNN models. As disadvantages, hybrid LSTM models are in general more complex (higher number of parameters) and have a higher computational cost than hybrid FFNN models. The hybrid model with the highest prediction accuracy consisted of an LSTM network with seven internal states connected in series with dynamic material balance equations. This hybrid model correctly predicted the dynamics of the 27 state variables (R2 = 0.93 on the test data set), including biomass, key substrates, amino acids, and metabolic by-products, over around 10 cultivation days.
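A minimal PyTorch sketch of this serial hybrid structure follows: an LSTM proposes specific rates, and a material balance integrates them. Dimensions, the rate structure, and the Euler step are illustrative assumptions, not the paper's 27-variable setup.

```python
# Minimal sketch of an LSTM-in-series-with-material-balances hybrid model.
import torch
import torch.nn as nn

class HybridLSTM(nn.Module):
    def __init__(self, n_states=5, n_hidden=7):
        super().__init__()
        self.lstm = nn.LSTM(n_states, n_hidden, batch_first=True)
        self.head = nn.Linear(n_hidden, n_states)   # specific rates q(c)

    def forward(self, c0, feed, dt=0.1):
        """Euler-integrate dc/dt = q(c) + feed over the horizon; the LSTM
        hidden state carries culture history (history dependence)."""
        c, h, out = c0, None, []
        for k in range(feed.shape[1]):
            q, h = self.lstm(c.unsqueeze(1), h)
            rates = self.head(q.squeeze(1))
            c = c + dt * (rates + feed[:, k])        # material balance update
            out.append(c)
        return torch.stack(out, dim=1)

model = HybridLSTM()
c0 = torch.rand(8, 5)           # batch of initial extracellular states
feed = torch.rand(8, 240, 5)    # feeding profile over time
traj = model(c0, feed)          # predicted concentration trajectories
```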
Subjects
Short-Term Memory, Neural Networks (Computer), Humans, HEK293 Cells, Kidney
ABSTRACT
PURPOSE: Our aim was to elicit a value set for Capability-Adjusted Life Years Sweden (CALY-SWE), a capability-grounded quality-of-life instrument intended for use in economic evaluations of social interventions with broad consequences beyond health. METHODS: Building on methods commonly used in the quality-adjusted life years EQ-5D context, we collected time trade-off (TTO) and discrete choice experiment (DCE) data through an online survey from a general population sample of 1697 Swedish participants. We assessed data quality using a score based on the severity of inconsistencies. For generating the value set, we compared different model features, including hybrid modeling of DCE and TTO versus TTO data only, censoring of TTO answers, varying intercepts, and accommodation of heteroskedasticity. We also assessed the models' DCE logit fidelity to measure agreement with potentially less biased DCE data. To anchor the best capability state to 1 on the 0-to-1 scale, we included a multiplicative scaling factor. RESULTS: We excluded the 20% of TTO answers from participants with the largest inconsistencies to improve data quality. A hybrid model with an anchor scale and censoring was chosen to generate the value set; models with heteroskedasticity considerations or individually varying intercepts did not offer substantial improvement. The lowest capability weight was 0.114. Health, social relations, and finance and housing attributes contributed the largest capability gains, followed by occupation, security, and political and civil rights. CONCLUSION: We elicited a value set for CALY-SWE for use in economic evaluations of interventions with broad social consequences.
Subjects
Health Status, Quality of Life, Humans, Quality of Life/psychology, Quality-Adjusted Life Years, Sweden, Surveys and Questionnaires
ABSTRACT
Human induced pluripotent stem (hiPS) cells have demonstrated promising potential in regenerative medical therapeutics. After successful clinical trials, the demand for hiPS cells has steadily increased. Therefore, optimization of hiPS cell freezing processes for storage and transportation is essential. Here, we present a computer-aided exploration of multiobjective optimal temperature profiles for the slow freezing of hiPS cells. This study was based on a model that calculates cell survival rates after thawing, and the model was extended to evaluate cell potential up to 24 h after seeding. To estimate parameter values for this extension, freezing experiments were performed using constant cooling rates. Using quality and productivity indicators, we evaluated 16,206 temperature profiles with our model, and a promising profile was obtained. Finally, an experimental investigation of the profile was undertaken, and the contribution of the temperature profile to both quality and productivity was confirmed.
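To make the search concrete, the sketch below enumerates piecewise-constant cooling profiles and scores them on toy quality and productivity objectives. The scoring functions only loosely echo the two-factor hypothesis of freezing injury; they are stand-ins, not the authors' survival and potency model, and the 16,206-profile grid is not reproduced.

```python
# Minimal sketch of multiobjective screening of cooling profiles.
import itertools
import numpy as np

def survival_score(rates):
    """Toy stand-in: penalize very fast (intracellular ice) and very slow
    (solution effects) cooling around an assumed optimum of 1 degC/min."""
    r = np.asarray(rates)
    return float(np.exp(-((np.log(r)) ** 2).sum()))

def productivity_score(rates, span_per_stage=10.0):
    """Toy stand-in: shorter total freezing time is better."""
    return 1.0 / (span_per_stage / np.asarray(rates)).sum()

candidate_rates = [0.3, 0.5, 1.0, 2.0, 5.0]        # degC/min per stage
best = max(
    itertools.product(candidate_rates, repeat=3),   # 3-stage profiles
    key=lambda p: 0.7 * survival_score(p) + 0.3 * productivity_score(p),
)
print("best 3-stage cooling profile (degC/min):", best)
```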
Subjects
Cell Survival, Cryopreservation, Freezing, Induced Pluripotent Stem Cells, Humans, Induced Pluripotent Stem Cells/cytology, Cryopreservation/methods, Temperature, Computer Simulation
ABSTRACT
Global warming poses a threat to lizard populations by raising ambient temperatures above historical norms and reducing thermoregulation opportunities. Whereas the reptile fauna of desert systems is relatively well studied, the lizard fauna of saline environments has received little attention, and, to our knowledge, the thermal ecology and the effects of global warming on lizards from saline environments have not yet been addressed. This pioneering study investigates the thermal ecology, locomotor performance, and potential effects of climate warming on Liolaemus ditadai, a lizard endemic to one of the largest salt flats on Earth. We sampled L. ditadai using traps and active searches along its known distribution, as well as in other areas within Salinas Grandes and Salinas de Ambargasta where the species had not been previously recorded. Using ensemble models (GAM, MARS, RandomForest), we modeled climatically suitable habitats for L. ditadai in the present and under a pessimistic future scenario (SSP585, 2070). L. ditadai emerges as an efficient thermoregulator, tolerating temperatures near its upper thermal limits. Our ecophysiological model suggests that available activity hours predict its distribution, and the projected temperature increase due to global climate change should minimally impact its persistence, or may even have a positive effect on suitable thermal habitat. However, this theoretical increase in habitat could be linked to the future distribution of halophilous scrub. Our surveys reveal a widespread distribution along the borders of Salinas Grandes and Salinas de Ambargasta, suggesting a potential presence along the entire border of both salt flats wherever halophytic vegetation exists. Optimistic model results, the extended distribution, and the absence of evidence for flood-related adverse effects offer insights for assessing the conservation status of L. ditadai, making it and the Salinas Grandes system suitable models for studying lizard ecophysiology in largely unknown saline environments.
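As a schematic of the ensemble step, the sketch below averages habitat-suitability predictions from several presence/absence learners. The scikit-learn classifiers merely stand in for the GAM/MARS/RandomForest members used in the study, and the data are synthetic.

```python
# Minimal sketch of an ensemble species distribution model (SDM).
import numpy as np
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))        # bioclimatic predictors per site
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, 300) > 0).astype(int)

members = [
    RandomForestClassifier(n_estimators=200, random_state=0),
    GradientBoostingClassifier(random_state=0),
    LogisticRegression(max_iter=1000),
]
for m in members:
    m.fit(X, y)                       # presence (1) / absence (0)

X_future = X + rng.normal(0, 0.2, X.shape)   # e.g. climate shifted under SSP585
suitability = np.mean([m.predict_proba(X_future)[:, 1] for m in members], axis=0)
print("mean projected suitability:", round(float(suitability.mean()), 2))
```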
Subjects
Lizards, Animals, Lizards/physiology, Argentina, Body Temperature Regulation, Extremophiles/physiology, Ecosystem, Global Warming, Climate Change, Biological Models, Hot Temperature
ABSTRACT
Global change ecology nowadays embraces ever-growing observational datasets (big data) and complex mathematical models that track hundreds of ecological processes (big models). The rapid advancement of this big-data-big-model paradigm has reached a bottleneck: high computational requirements prevent further development of models that need to be integrated over long time scales to simulate the distribution of ecosystem carbon and nutrient pools and fluxes. Here, we introduce a machine-learning acceleration (MLA) tool to tackle this grand challenge. We focus on the most resource-consuming step in terrestrial biosphere models (TBMs): the equilibration of biogeochemical cycles (spin-up), a prerequisite that can take up to 98% of the computational time. Using three members of the ORCHIDEE TBM family, part of the IPSL Earth System Model, including versions that describe the complex interactions between nitrogen, phosphorus, and carbon and that have no analytical solution for the spin-up, we show that an unoptimized MLA reduced the computational demand by 77%-80% for global studies by interpolating the equilibrated state of biogeochemical variables from a subset of model pixels. Despite small biases in the MLA-derived equilibrium, the resulting impact on the predicted regional carbon balance over recent decades is minor. We expect a one-order-of-magnitude lower computational demand from optimizing the choice of machine learning algorithms and their settings, and from balancing the trade-off between the quality of MLA predictions and the need for TBM simulations for training data generation and bias reduction. Our tool is agnostic to gridded models (beyond TBMs), compatible with existing spin-up acceleration procedures, and opens the door to a wide variety of future applications, with complex non-linear models benefiting most from the computational efficiency.
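The core idea can be sketched in a few lines: spin up only a subset of pixels with the full model, train a regressor mapping pixel features to equilibrated pools, and predict the rest. The TBM call below is a hypothetical placeholder, not ORCHIDEE.

```python
# Minimal sketch of machine-learning acceleration (MLA) of spin-up.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def expensive_spinup(features):
    """Placeholder for the full TBM spin-up of one pixel (the costly step)."""
    return features @ np.array([1.5, -0.4, 0.8]) + 50.0   # fake C pool (tC/ha)

rng = np.random.default_rng(1)
pixels = rng.random((5000, 3))                 # climate/soil features per pixel
subset = rng.choice(len(pixels), size=500, replace=False)   # ~10% spun up fully

pools_subset = np.array([expensive_spinup(p) for p in pixels[subset]])
mla = RandomForestRegressor(n_estimators=300, random_state=0)
mla.fit(pixels[subset], pools_subset)          # learn equilibrium vs. features

pools_all = mla.predict(pixels)                # near-equilibrium initial state
# The TBM then resumes from pools_all, removing most of the spin-up cost.
```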
Subjects
Ecosystem, Theoretical Models, Carbon, Nitrogen, Carbon Cycle
ABSTRACT
Constructing predictive models to simulate complex bioprocess dynamics, particularly time-varying (i.e., parameters varying over time) and history-dependent (i.e., current kinetics dependent on historical culture conditions) behavior, has been a longstanding research challenge. Recent advances in hybrid modeling offer a solution by integrating kinetic models with data-driven techniques. This article proposes a novel two-step framework: first (i) speculate and combine several possible kinetic model structures sourced from process and phenomenological knowledge, then (ii) identify the most likely kinetic model structure and its parameter values using model-free reinforcement learning (RL). Specifically, Step 1 collates feasible history-dependent model structures, and Step 2 uses RL to simultaneously identify the correct model structure and the time-varying parameter trajectories. To demonstrate the performance of this framework, a range of in silico case studies was carried out. The results show that the proposed framework can efficiently construct high-fidelity models that quantify both time-varying and history-dependent kinetic behaviors while minimizing the risks of over-parameterization and over-fitting. Finally, the primary advantages of the proposed framework and its limitations are thoroughly discussed in comparison to other existing hybrid modeling and model structure identification techniques, highlighting the potential of this framework for general bioprocess modeling.
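To convey the flavor of Step 2, the sketch below casts structure identification as a simple model-free search: an epsilon-greedy agent picks among candidate kinetic structures and is rewarded by the negative fit error. A bandit is a deliberately simplified stand-in for the paper's RL formulation, and the candidate structures are invented.

```python
# Minimal sketch: model-free selection among speculated kinetic structures.
import numpy as np
from scipy.optimize import curve_fit

t = np.linspace(0, 10, 40)
y_obs = 2.0 * t / (1.5 + t) + np.random.default_rng(2).normal(0, 0.02, t.size)

candidates = {                                   # Step 1: speculated structures
    "monod":    lambda t, a, b: a * t / (b + t),
    "linear":   lambda t, a, b: a * t + b,
    "logistic": lambda t, a, b: a / (1 + np.exp(-b * (t - 5))),
}

q = {name: 0.0 for name in candidates}           # action-value estimates
rng = np.random.default_rng(3)
for episode in range(200):                       # Step 2: epsilon-greedy search
    name = (rng.choice(list(candidates)) if rng.random() < 0.1
            else max(q, key=q.get))
    try:
        p, _ = curve_fit(candidates[name], t, y_obs, p0=[1, 1], maxfev=2000)
        reward = -np.mean((candidates[name](t, *p) - y_obs) ** 2)
    except RuntimeError:                         # fit failed to converge
        reward = -np.inf
    q[name] += 0.1 * (reward - q[name])          # incremental value update

print("identified structure:", max(q, key=q.get))
```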
Subjects
Kinetics
ABSTRACT
Effective treatment of sewage by wastewater treatment plants (WWTPs) is essential to protecting the water environment and public health worldwide. However, the operation of WWTPs is usually intricate due to precarious influent characteristics and nonlinear sewage treatment processes. Effective modeling of WWTPs can provide valuable decision-making support to facilitate their daily operation and management. In this study, we built a novel hybrid model by combining a process-based WWTP model (GPS-X) with a data-driven machine learning model (Random Forest) to improve the simulation of the long-term hourly effluent ammonium-nitrogen concentration of a WWTP. Our results show that the hybrid GPS-X-RF model performs best, with a coefficient of determination (R2) of 0.95 and a root mean squared error (RMSE) of 0.23 mg/L, followed by the GPS-X model with an R2 of 0.93 and an RMSE of 0.33 mg/L, and lastly the Random Forest model with an R2 of 0.84 and an RMSE of 0.41 mg/L. By incorporating wastewater treatment mechanisms while exploiting the data-mining capabilities of machine learning, the hybrid model adapts better to the large fluctuations in the influent and operating conditions of the WWTP. The proposed hybrid modeling framework can easily be extended to WWTPs of various sizes and types to simulate their operation under increasingly variable environmental and operating conditions.
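One common way to couple such models is residual correction, sketched below: the Random Forest learns the gap between the process-model simulation and observations. The coupling form, column names, and data are assumptions for illustration; GPS-X output is represented by a placeholder column.

```python
# Minimal sketch of a process-model + Random Forest residual-correction hybrid.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(4)
n = 2000
df = pd.DataFrame({
    "gpsx_nh4": rng.gamma(2.0, 0.5, n),       # process-model (GPS-X-like) output
    "inflow":   rng.normal(1000, 150, n),     # influent flow
    "temp_c":   rng.normal(18, 4, n),
})
df["observed_nh4"] = (df["gpsx_nh4"] + 0.1 * np.sin(df["inflow"] / 100)
                      + rng.normal(0, 0.1, n))

train, test = df.iloc[:1500], df.iloc[1500:]
features = ["gpsx_nh4", "inflow", "temp_c"]
rf = RandomForestRegressor(n_estimators=300, random_state=0)
rf.fit(train[features], train["observed_nh4"] - train["gpsx_nh4"])  # residuals

pred = test["gpsx_nh4"] + rf.predict(test[features])
print("hybrid R2:", round(r2_score(test["observed_nh4"], pred), 2))
print("hybrid RMSE (mg/L):",
      round(mean_squared_error(test["observed_nh4"], pred) ** 0.5, 3))
```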
Subjects
Sewage, Water Purification, Humans, Liquid Waste Disposal/methods, Computer Simulation, Machine Learning, Water Purification/methods
ABSTRACT
Laser cutting is a non-contact machining process, unlike traditional turning and milling. To improve the machining accuracy of laser cutting, a thermal error prediction and dynamic compensation strategy is proposed. Building on the time-varying characteristics of digital twin technology, a hybrid model combining thermal elastic-plastic finite element (TEP-FEM) analysis and a T-XGBoost algorithm is established. The temperature field and thermal deformation under 12 common working conditions are simulated and analyzed with TEP-FEM. Real-time machining data obtained from the TEP-FEM simulations are used in the intelligent algorithms. Based on the XGBoost algorithm, with the simulation data set as the training data set, a time-series-based segmentation algorithm (T-XGBoost) is proposed. This algorithm can reduce the maximum deformation at the slit by more than 45%. At the same time, by reducing the average volume strain under most working conditions, it achieves an improvement rate of up to 63%, clearly outperforming plain XGBoost. The strategy addresses otherwise uncontrollable thermal deformation during cutting and provides a theoretical basis for implementing intelligent operation strategies such as predictive machining and quality monitoring.
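The data-driven half of such a pipeline can be sketched with plain gradient boosting trained on FEM-simulated data under a time-ordered split; this stands in for the proposed T-XGBoost, whose segmentation logic is not reproduced. It assumes the xgboost package, and features and targets are illustrative.

```python
# Minimal sketch: gradient boosting on TEP-FEM-style simulation output.
import numpy as np
from xgboost import XGBRegressor

rng = np.random.default_rng(5)
n = 4000
t = np.linspace(0, 600, n)                      # cutting time (s)
X = np.column_stack([
    t,
    rng.uniform(800, 1600, n),                  # laser power (W)
    rng.uniform(5, 30, n),                      # cutting speed (mm/s)
])
y = 0.02 * np.sqrt(t) * X[:, 1] / 1000 + rng.normal(0, 0.005, n)  # deformation (mm)

split = int(0.8 * n)                            # time-ordered split, no shuffle
model = XGBRegressor(n_estimators=400, max_depth=5, learning_rate=0.05)
model.fit(X[:split], y[:split])
pred = model.predict(X[split:])
# The predicted thermal deformation would drive the dynamic compensation.
print("max predicted deformation (mm):", float(pred.max()))
```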
ABSTRACT
Disaster management systems require accurate disaster monitoring and prediction services to reduce the damage caused by natural disasters. Digital twins of natural environments can provide such services with physics-based and data-driven disaster models. However, digital twins may generate erroneous disaster predictions, owing to the impracticability of defining high-fidelity physics-based models for complex natural disaster behavior and the dependency of data-driven models on their training dataset. Erroneous predictions cause disaster management systems to misallocate disaster response resources, including medical personnel, rescue equipment, and relief supplies, which may increase the damage caused by the disaster. This study proposes a digital twin architecture that provides accurate disaster prediction services through a similarity-based hybrid modeling scheme. The hybrid modeling scheme creates a hybrid disaster model that compensates for the errors of physics-based prediction results with a data-driven error correction model, enhancing prediction accuracy. The similarity-based scheme reduces errors stemming from the hybrid model's data dependency by constructing a training dataset using similarity assessments between the target disaster and historical disasters. Evaluations in wildfire scenarios show that the digital twin decreases prediction errors by approximately 50% compared with existing schemes.
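A compact sketch of the similarity-based idea follows: select the historical events closest to the target, then train an error-correction model on the physics model's residuals for those events only. All features, distances, and data here are synthetic assumptions.

```python
# Minimal sketch of similarity-based hybrid error correction.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(6)
hist_features = rng.random((500, 4))   # e.g. wind, humidity, fuel, slope
hist_physics = rng.random(500) * 100   # physics-based burned-area predictions
hist_actual = (hist_physics * (1 + 0.3 * (hist_features[:, 0] - 0.5))
               + rng.normal(0, 2, 500))

target = np.array([0.7, 0.3, 0.6, 0.4])          # current wildfire's conditions
dist = np.linalg.norm(hist_features - target, axis=1)
alike = np.argsort(dist)[:100]                   # most similar past events

corrector = GradientBoostingRegressor(random_state=0)
corrector.fit(hist_features[alike], (hist_actual - hist_physics)[alike])

physics_pred = 60.0                              # physics prediction for target
hybrid_pred = physics_pred + corrector.predict(target[None, :])[0]
print("hybrid burned-area prediction:", round(hybrid_pred, 1))
```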
Subjects
Disaster Planning, Disasters, Natural Disasters, Disaster Planning/methods, Health Personnel, Humans
ABSTRACT
Astaxanthin is a high-value compound commercially synthesized through Xanthophyllomyces dendrorhous fermentation. Using mixed sugars derived from biowastes for yeast fermentation is a promising option to improve process sustainability. However, little effort has been made to investigate the effects of multiple sugars on X. dendrorhous biomass growth and astaxanthin production. Furthermore, constructing a high-fidelity model is challenging due to the system's variability, also known as batch-to-batch variation. Two innovations are proposed in this study to address these challenges. First, a kinetic model was developed to compare process kinetics between single-sugar (glucose) and mixed-sugar (glucose and sucrose) fermentation. Then, the kinetic model parameters were themselves modeled as Gaussian processes, a probabilistic machine learning technique, to improve the accuracy and robustness of model predictions. We conclude that although the presence of sucrose does not affect biomass growth kinetics, it introduces a competitive inhibitory mechanism that enhances astaxanthin accumulation by inducing adverse environmental conditions such as osmotic gradients. Moreover, the hybrid model greatly reduced model simulation error and was particularly robust to uncertainty propagation. This study demonstrates the advantage of mixed-sugar fermentation and provides a novel approach to bioprocess dynamic modeling.
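The probabilistic layer can be sketched as a Gaussian process mapping batch conditions to a kinetic parameter, so each simulation draws a parameter with quantified uncertainty. The conditions, parameter values, and kernel below are illustrative assumptions, not the fitted study model.

```python
# Minimal sketch: kinetic parameters as Gaussian processes over conditions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Batch conditions (initial glucose g/L, sucrose fraction) vs. fitted mu_max
X = np.array([[20, 0.0], [20, 0.5], [40, 0.0], [40, 0.5], [30, 0.25]])
mu_max = np.array([0.18, 0.17, 0.16, 0.15, 0.165])   # one fit per batch

gp = GaussianProcessRegressor(
    kernel=RBF(length_scale=[10.0, 0.5]) + WhiteKernel(1e-4),
    normalize_y=True,
)
gp.fit(X, mu_max)

mean, std = gp.predict(np.array([[35, 0.4]]), return_std=True)
print(f"mu_max = {mean[0]:.3f} +/- {2 * std[0]:.3f} (new condition)")
# gp.sample_y(...) would propagate parameter uncertainty into the kinetic model.
```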
Subjects
Fermentation/physiology, Biological Models, Saccharomyces cerevisiae/metabolism, Biomass, Bioreactors/microbiology, Glucose/metabolism, Kinetics, Metabolic Engineering, Uncertainty, Xanthophylls/analysis, Xanthophylls/metabolism
ABSTRACT
Mechanism-based kinetic models are rigorous tools for analyzing enzymatic reactions, but their extension to the actual conditions of biocatalytic synthesis can be difficult. Here, we demonstrate mechanistic-empirical hybrid modeling for systematic optimization of the sucrose phosphorylase-catalyzed glycosylation of glycerol from sucrose to synthesize the cosmetic ingredient α-glucosyl glycerol (GG). The empirical model part was developed to capture nonspecific effects of high sucrose concentrations (up to 1.5 M) on microscopic steps of the enzymatic trans-glycosylation mechanism. Based on verified predictions of enzyme performance under initial-rate conditions (Level 1), the hybrid model was expanded with microscopic terms of the reverse reaction to account for the full time course of GG synthesis (Level 2). Lastly (Level 3), the application of the hybrid model to comprehensive window-of-operation analysis and constrained optimization of GG production (~250 g/L) was demonstrated. Using two candidate sucrose phosphorylases (from Leuconostoc mesenteroides and Bifidobacterium adolescentis), we reveal the hybrid model as a powerful tool of "process decision making" to guide rational selection of the best-suited enzyme catalyst. Our study exemplifies a closing of the gap between enzyme kinetic models intended for mechanistic research and models applicable under technologically relevant reaction conditions, and it highlights the important benefit thus realizable for biocatalytic process development.
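The mechanistic-empirical split can be illustrated as a rate law whose catalytic term is scaled by an empirical high-sucrose factor. The functional forms, constants, and time units below are toy assumptions, not the fitted sucrose phosphorylase model.

```python
# Minimal sketch of a mechanistic rate law with an empirical correction.
def empirical_sucrose_effect(S, k_e=0.35):
    """Empirical part: nonspecific slowdown at molar sucrose levels (S in M)."""
    return 1.0 / (1.0 + k_e * S)

def gg_rate(S, G, Vmax=10.0, Km_S=0.05, Km_G=0.3):
    """Mechanistic part: two-substrate trans-glycosylation rate (toy form)."""
    v = Vmax * S * G / ((Km_S + S) * (Km_G + G))
    return v * empirical_sucrose_effect(S)

# Simple forward simulation of GG formation at 1.5 M sucrose, 2 M glycerol
S, G, GG, dt = 1.5, 2.0, 0.0, 0.001
for _ in range(10000):
    v = gg_rate(S, G)
    S, G, GG = S - v * dt, G - v * dt, GG + v * dt
print(f"GG after {10000 * dt:.0f} time units: {GG:.2f} M")
```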
Subjects
Bifidobacterium adolescentis/metabolism, Biocatalysis, Glucosides/metabolism, Leuconostoc mesenteroides/metabolism, Biological Models, Sucrose/metabolism
ABSTRACT
The objective of this study is to present a practical example of scale-independent design space development using a step-wise approach. A detailed description of the development process with a systematic outline of the main steps is provided. The design space is developed for film coating of tablets with a moisture-protective polyvinyl alcohol (PVA) based coating. The impact of scale-independent coating process parameters on the properties of film-coated tablets (FCTs), i.e., water activity and film coating protection ability, and consequently on product long-term stability, is explored. The main finding is that with model simplifications, a step-wise approach, and rational development of a scale-independent design space for the coating process, it is possible to efficiently predict, control, and optimize the long-term stability of a moisture-sensitive product. However, the PVA moisture-protective coating itself is recognized as having conflicting effects on product stability.
Subjects
Drug Compounding/methods, Excipients/chemistry, Research Design, Tablets/chemistry, Pharmaceutical Chemistry, Humidity, Polyvinyl Alcohol/chemistry, Temperature
ABSTRACT
Limitations in the applicability, accuracy, and precision of individual structure characterization methods can sometimes be overcome via an integrative modeling approach that relies on information from all available sources, including all available experimental data and prior models. The open-source Integrative Modeling Platform (IMP) is one piece of software that implements all computational aspects of integrative modeling. To maximize the impact of integrative structures, the coordinates should be made publicly available, as is already the case for structures based on X-ray crystallography, NMR spectroscopy, and electron microscopy. Moreover, the associated experimental data and modeling protocols should also be archived, such that the original results can easily be reproduced. Finally, it is essential that integrative structures are validated as part of their publication and deposition. A number of research groups have already developed software to implement integrative modeling and have generated a number of structures, prompting the formation of an Integrative/Hybrid Methods Task Force. Following the recommendations of this task force, the existing PDBx/mmCIF data representation used for atomic PDB structures has been extended to address the requirements for archiving integrative structural models. This IHM dictionary adds a flexible model representation, including coarse graining, models in multiple states and/or related by time or other order, and multiple input experimental information sources. A prototype archiving system called PDB-Dev (https://pdb-dev.wwpdb.org) has also been created to archive integrative structural models, together with a Python library to facilitate handling of integrative models in PDBx/mmCIF format.
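As a hedged illustration of working with such files, the sketch below assumes the open-source python-ihm library (a Python library for IHM-dictionary mmCIF handling); the file name is a placeholder, and the exact API should be verified against the library's documentation.

```python
# Minimal sketch: reading an integrative model from PDBx/mmCIF with python-ihm.
import ihm.reader

with open("integrative_model.cif") as fh:   # placeholder file name
    systems = ihm.reader.read(fh)           # parse IHM-dictionary mmCIF

for system in systems:
    print(system.title)
    # Walk the multi-state hierarchy: state groups -> states -> model groups
    for state_group in system.state_groups:
        for state in state_group:
            for model_group in state:
                print("  model group with", len(model_group), "models")
```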
Subjects
Molecular Models, Molecular Conformation, Software, Protein Conformation, Proteins/chemistry, Workflow
ABSTRACT
Hybrid semi-parametric modeling, combining mechanistic and machine-learning methods, has proven to be a powerful approach for process development. This paper proposes bootstrap aggregation to increase the predictive power of hybrid semi-parametric models when the process data are obtained by statistical design of experiments. A fed-batch Escherichia coli optimization problem is addressed, in which three factors (biomass growth setpoint, temperature, and biomass concentration at induction) were designed statistically to identify optimal cell growth and recombinant protein expression conditions. Synthetic data sets were generated applying three distinct design methods, namely Box-Behnken, central composite, and Doehlert designs. Bootstrap-aggregated hybrid models were developed for the three designs and compared against the respective non-aggregated versions. It is shown that bootstrap aggregation significantly decreases the prediction mean squared error on new batch experiments for all three designs. The number of (best) models to aggregate is a key calibration parameter that needs to be fine-tuned for each problem. The Doehlert design was slightly better than the other designs at identifying the process optimum. Finally, the availability of several predictions allowed computing error bounds for the different parts of the model, which provides additional insight into the variation of predictions within the model components.
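The aggregation step itself is simple to sketch: refit a model on resampled DoE runs, average the predictions, and use the ensemble spread as an error bound. A plain scikit-learn regressor stands in for the hybrid semi-parametric model, and the factor ranges and response are invented.

```python
# Minimal sketch of bootstrap aggregation over designed experiments.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(7)
# Three designed factors: growth setpoint (1/h), temperature (degC), biomass
# at induction (g/L); 15 DoE runs with a synthetic response.
X = rng.uniform([0.1, 30, 5], [0.3, 37, 15], size=(15, 3))
y = 50 * X[:, 0] - 0.2 * (X[:, 1] - 33) ** 2 + X[:, 2] + rng.normal(0, 0.5, 15)

models = []
for b in range(30):                               # bootstrap replicates
    idx = rng.integers(0, len(X), len(X))         # resample runs with replacement
    m = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=b)
    models.append(m.fit(X[idx], y[idx]))

x_new = np.array([[0.2, 33.0, 10.0]])
preds = np.array([m.predict(x_new)[0] for m in models])
print(f"bagged prediction: {preds.mean():.1f} +/- {2 * preds.std():.1f}")
```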
Subjects
Biomass, Escherichia coli/growth & development, Biological Models
ABSTRACT
The hotel industry is an important energy consumer that needs efficient energy management methods to guarantee its performance and sustainability. The new role of hotels as prosumers increases the difficulty of designing these methods, and the scenario becomes more complex when renewable energy systems are present in the hotel energy mix. The performance of energy management systems greatly depends on reliable predictions of energy load. This paper presents a new methodology to predict the energy load of a hotel based on intelligent techniques. The proposed model is based on a hybrid intelligent topology implemented with a combination of clustering techniques and intelligent regression methods (Artificial Neural Networks and Support Vector Regression). The model takes the hotel's own energy demand information, occupancy rate, and temperature as inputs. Validation was performed using real hotel data, with comparison against time-series models. The forecasts obtained were satisfactory, showing promising potential for use in energy management systems in hotel resorts.
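The hybrid topology can be sketched as cluster-then-regress: group operating conditions, fit one regressor per cluster, and route new samples to their cluster's model. SVR is used below; the inputs mirror those named above (demand, occupancy, temperature), but the data and parameters are synthetic assumptions.

```python
# Minimal sketch of a clustering + per-cluster regression topology.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVR

rng = np.random.default_rng(8)
n = 1000
X = np.column_stack([
    rng.uniform(50, 400, n),     # previous-hour demand (kWh)
    rng.uniform(0, 1, n),        # occupancy rate
    rng.uniform(-5, 35, n),      # outdoor temperature (degC)
])
y = (0.8 * X[:, 0] + 120 * X[:, 1] + 2.0 * np.abs(X[:, 2] - 21)
     + rng.normal(0, 5, n))

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
experts = {c: SVR(C=100).fit(X[km.labels_ == c], y[km.labels_ == c])
           for c in range(4)}                     # one regressor per regime

x_new = np.array([[250, 0.9, 30]])
cluster = km.predict(x_new)[0]                    # route to its regime's model
print("predicted load (kWh):", round(float(experts[cluster].predict(x_new)[0]), 1))
```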
ABSTRACT
Indirect (S)QM/MM free energy simulations (FES) are vital to efficiently incorporating sufficient sampling and accurate (QM) energetic evaluations when estimating free energies of practical/experimental interest. Connecting between levels of theory, i.e., calculating ΔA_low→high, remains the most challenging step within an indirect FES protocol. To improve calculations of ΔA_low→high, we must: (1) compare the performance of all FES methods currently available; and (2) compile and maintain datasets of ΔA_low→high calculated for a wide variety of molecules so that future practitioners may replicate or improve upon the current state of the art. Towards these two aims, we introduce a new dataset, "HiPen", which tabulates ΔA_gas(MM→3ob) (the free energy associated with switching from an MM to an SCC-DFTB molecular description using the 3ob parameter set in gas phase), calculated for 22 drug-like small molecules. We compare the calculation of this value using free energy perturbation, Bennett's acceptance ratio, Jarzynski's equation, and Crooks' equation. We also predict the reliability of each calculated ΔA_gas(MM→3ob) by evaluating several convergence criteria, including sample size hysteresis, overlap statistics, and a bias metric (Π). Within the total dataset, three distinct categories of molecules emerge: the "good" molecules, for which we can obtain converged ΔA_gas(MM→3ob) using Jarzynski's equation; "bad" molecules, which require Crooks' equation to obtain a converged ΔA_gas(MM→3ob); and "ugly" molecules, for which we cannot obtain reliably converged ΔA_gas(MM→3ob) with either Jarzynski's or Crooks' equations. We discuss, in depth, results from several example molecules in each of these categories and describe how dihedral discrepancies between levels of theory cause convergence failures even for these gas phase free energy simulations.
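For reference, Jarzynski's estimator on a set of nonequilibrium work values can be written in a few lines of NumPy, as below; the work values are synthetic, and the HiPen protocols, Crooks/BAR variants, and convergence diagnostics are considerably more involved than this sketch.

```python
# Minimal sketch of the Jarzynski estimator: dA = -kT ln < exp(-W/kT) >.
import numpy as np

kT = 0.0019872041 * 300.0                 # kcal/mol at 300 K

def jarzynski(work: np.ndarray) -> float:
    """Exponential work average, computed via log-sum-exp for stability."""
    w = np.asarray(work) / kT
    return -kT * (np.logaddexp.reduce(-w) - np.log(len(w)))

rng = np.random.default_rng(9)
W_forward = rng.normal(5.0, 1.5, 500)     # MM -> 3ob switching work (kcal/mol)

dA = jarzynski(W_forward)
print(f"Jarzynski estimate of dA_gas(MM->3ob): {dA:.2f} kcal/mol")
# If a few low-work trajectories dominate the average, the estimate is poorly
# converged; the study's good/bad/ugly classification rests on such diagnostics.
```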
Subjects
Energy Metabolism, Proteins/metabolism, Thermodynamics, Water/metabolism, Entropy, Quantum Theory
ABSTRACT
Although social support and social integration are key predictors of depression and exhibit racial/ethnic patterns in the US, previous research has not examined how they shape racial/ethnic disparities in depression. Applying hybrid models to data from the Americans' Changing Lives study from 1986 to 2002, this study analyzes how sources of social support (spouse and friend/relative) and types of social integration (informal/formal) explain black-white and Hispanic-white disparities in depression. We find that strong social support and high social integration are negatively associated with depression and that the patterns of social support and integration vary by race/ethnicity. The results of hybrid models show that social support from one's spouse and friend/relative account for over 25 percent of the black-white disparity, whereas formal social integration including religious groups widens the black-white differential by roughly 10 percent. However, Hispanic-white disparities in depression are mostly a result of the difference in socioeconomic status. The change in spousal support is the most powerful predictor for the change in depression across race/ethnicity groups. Our findings suggest that the racial/ethnic differences in sources of social support and types of social integration play important roles in shaping racial/ethnic disparities in depression.
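The hybrid (within-between) specification can be sketched as follows: each time-varying predictor is split into a person mean and a deviation from that mean, then fit with a random intercept. Variable names and data are illustrative, not the Americans' Changing Lives measures.

```python
# Minimal sketch of a hybrid within-between panel model with statsmodels.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(10)
n_person, n_wave = 300, 4
df = pd.DataFrame({
    "pid": np.repeat(np.arange(n_person), n_wave),
    "support": rng.normal(0, 1, n_person * n_wave),
})
df["depression"] = -0.4 * df["support"] + rng.normal(0, 1, len(df))

# Hybrid decomposition: between-person mean and within-person deviation
df["support_between"] = df.groupby("pid")["support"].transform("mean")
df["support_within"] = df["support"] - df["support_between"]

model = smf.mixedlm("depression ~ support_between + support_within",
                    df, groups=df["pid"]).fit()
print(model.summary().tables[1])   # separate between- and within-effects
```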
Subjects
Black or African American/psychology, Depressive Disorder/epidemiology, Ethnicity/psychology, Health Status Disparities, Hispanic or Latino/psychology, Social Support, White People/psychology, Adult, Black or African American/statistics & numerical data, Aged, Aged 80 and over, Ethnicity/statistics & numerical data, Female, Hispanic or Latino/statistics & numerical data, Humans, Male, Middle Aged, Sex Factors, Social Class, Socioeconomic Factors, United States/epidemiology, White People/statistics & numerical data
ABSTRACT
Accurate structure determination from electron density maps at 3-5 Å resolution necessitates a balance between extensive global and local sampling of atomistic models and the stereochemical correctness of backbone and sidechain geometries. Molecular Dynamics Flexible Fitting (MDFF), particularly through a resolution-exchange scheme, ReMDFF, provides a robust way of achieving this balance for hybrid structure determination. Employing two high-resolution density maps, namely those of β-galactosidase at 3.2 Å and TRPV1 at 3.4 Å, we showcase the quality of ReMDFF-generated models, comparing them against ones submitted by independent research groups for the 2015-2016 Cryo-EM Model Challenge. This comparison offers a clear evaluation of ReMDFF's strengths and shortcomings, and those of data-guided real-space refinements in general. ReMDFF results scored highly on the various metrics for judging quality-of-fit and quality-of-model. However, a MolProbity analysis also notes some systematic discrepancies that are reproducible across multiple competition entries. A space of key refinement parameters is explored within ReMDFF to observe their impact on the final model. The choice of force field parameters and of the initial model appears to have the most significant impact on ReMDFF model quality. To this end, the recently developed CHARMM36m force field parameters now provide more refined ReMDFF models than the ones originally submitted to the Cryo-EM challenge. Finally, a set of good practices is prescribed for the community to benefit from the MDFF developments.