Results 1 - 20 of 474
1.
Article in English | MEDLINE | ID: mdl-38977033

ABSTRACT

PURPOSE: This study aimed to compare and evaluate the efficiency and accuracy of computerized adaptive testing (CAT) under two stopping rules (SEM 0.3 and 0.25) using both real and simulated data in medical examinations in Korea. METHODS: This study employed post-hoc simulation and real data analysis to explore the optimal stopping rule for CAT in medical examinations. The real data were obtained from the responses of 3rd-year medical students during examinations in 2020 at Hallym University College of Medicine. Simulated data were generated in R using parameters estimated from a real item bank. Outcome variables included the number of examinees passing or failing with SEM values of 0.25 and 0.30, the number of items administered, and the correlation between ability estimates. The consistency of the real CAT results was evaluated by examining pass/fail agreement based on a cut score of 0.0. The efficiency of all CAT designs was assessed by comparing the average number of items administered under both stopping rules. RESULTS: Both SEM 0.25 and SEM 0.30 provided a good balance between accuracy and efficiency in CAT. The real data showed minimal differences in pass/fail outcomes between the two SEM conditions, with a high correlation (r = 0.99) between ability estimates. The simulation results confirmed these findings, indicating similar average item numbers between real and simulated data. CONCLUSION: The findings suggest that both SEM 0.25 and 0.30 are effective termination criteria in the context of the Rasch model, balancing accuracy and efficiency in CAT.


Subject(s)
Educational Measurement , Psychometrics , Students, Medical , Humans , Educational Measurement/methods , Educational Measurement/standards , Republic of Korea , Psychometrics/methods , Computer Simulation , Data Analysis , Education, Medical, Undergraduate/methods , Male , Female
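
A minimal sketch of the stopping rule compared in this study, assuming a Rasch (1PL) model, maximum-information item selection, and a Newton-Raphson maximum likelihood ability estimate; the item bank, true ability, and clipping bounds are illustrative, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
bank = rng.normal(0.0, 1.0, 500)   # item difficulties under a Rasch (1PL) model
true_theta = 0.8                   # simulated examinee ability
SEM_TARGET = 0.30                  # stopping rule; compare with 0.25

def p_correct(theta, b):
    return 1.0 / (1.0 + np.exp(-(theta - b)))

theta, administered, responses = 0.0, [], []
while True:
    # Rasch item information is p(1-p); administer the most informative unused item
    p_bank = p_correct(theta, bank)
    info = p_bank * (1.0 - p_bank)
    info[administered] = -np.inf
    item = int(np.argmax(info))
    administered.append(item)
    responses.append(rng.random() < p_correct(true_theta, bank[item]))

    # Newton-Raphson ML estimate of ability from the responses so far
    for _ in range(20):
        p = p_correct(theta, bank[administered])
        theta -= np.sum(np.array(responses) - p) / (-np.sum(p * (1.0 - p)))
    theta = float(np.clip(theta, -4.0, 4.0))  # keep estimate finite for all-correct/all-wrong patterns

    p = p_correct(theta, bank[administered])
    sem = 1.0 / np.sqrt(np.sum(p * (1.0 - p)))  # SEM = 1/sqrt(test information)
    if sem <= SEM_TARGET or len(administered) == len(bank):
        break

print(f"items administered: {len(administered)}, theta = {theta:.2f}, SEM = {sem:.2f}")
print("pass" if theta >= 0.0 else "fail")  # cut score of 0.0, as in the study
```

Lowering the target to 0.25 only requires more items when the remaining bank still carries usable information near the current ability estimate, which is why the two rules can perform so similarly in practice.
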
2.
Psychiatry Investig ; 21(6): 672-679, 2024 Jun.
Article in English | MEDLINE | ID: mdl-38960445

ABSTRACT

OBJECTIVE: Borderline personality disorder (BPD) is known to share characteristics with a variety of personality disorders (PDs) and exhibits diverse patterns of defense mechanisms. To enhance our understanding of BPD, it is crucial to shift our focus from traditional categorical diagnostics to the dimensional traits shared with other PDs, as the borderline personality organization (BPO) model suggests. This approach illuminates the nuanced spectrum of BPD characteristics, offering deeper insights into its complexity. While studies have investigated the comorbidity of BPD with other PDs, research exploring the relationship between various personality factors and defense mechanisms within BPD itself has been scarce. The present study investigated the complex interrelationships between various personality factors and defense styles in individuals diagnosed with BPD. METHODS: Using a network analysis approach, data from 227 patients diagnosed with BPD were examined with the Defense Style Questionnaire and the Personality Disorder Questionnaire-4+. RESULTS: Intricate connections were observed between personality factors and defense styles. Significant associations were identified between various personality factors and defense styles, with immature defense styles, such as the maladaptive and image-distorting styles, being particularly prominent in BPD in the centrality analysis. The maladaptive defense style had the highest expected influence centrality. Furthermore, the schizotypal, dependent, and narcissistic personality factors demonstrated relatively high centrality within the network. CONCLUSION: Network analysis can effectively delineate the complexity of various PDs and defense styles. These findings are expected to facilitate a deeper understanding of why BPD exhibits various levels of organization and presents with heterogeneous characteristics, consistent with the perspectives proposed by the BPO model.
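
A toy sketch of the centrality measure named in the abstract: in a weighted psychological network, a node's (one-step) expected influence is the sum of its signed edge weights. The edge-weight matrix below is invented for illustration, and the abstract does not state which network estimator the authors used.

```python
import numpy as np

# Toy signed edge-weight matrix over a few illustrative nodes
nodes = ["maladaptive", "image_distorting", "schizotypal", "dependent", "narcissistic"]
W = np.array([
    [0.00, 0.35, 0.20, 0.25, 0.15],
    [0.35, 0.00, 0.10, 0.05, 0.30],
    [0.20, 0.10, 0.00, 0.15, -0.05],
    [0.25, 0.05, 0.15, 0.00, 0.10],
    [0.15, 0.30, -0.05, 0.10, 0.00],
])

# One-step expected influence: sum of the signed edge weights incident to each node
ei = W.sum(axis=1)
for name, value in sorted(zip(nodes, ei), key=lambda t: -t[1]):
    print(f"{name:18s} {value:+.2f}")
```
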

4.
Chemosphere ; : 142701, 2024 Jun 24.
Article in English | MEDLINE | ID: mdl-38925516

ABSTRACT

A prediction model based on XGBoost is proposed for the kinetic constants of ultrasonic degradation of micropollutants. After iterative parameter optimization, the model achieves R2 and SMAPE values of 0.99 and 2.06%, respectively. The impact of design parameters on predicting kinetic constants for ultrasound degradation of trace pollutants was assessed using Shapley additive explanations (SHAP). Results indicate that power density and frequency significantly affect the predictive performance. The database was therefore sorted by power density and frequency values, and the 800 raw data points were split into small databases of 200 each. After confirming that reducing the database size does not affect prediction accuracy, ultrasound degradation experiments were conducted for five pollutants, yielding experimental data. A small database whose experimental conditions fell within the relevant numerical range was selected; data meeting both feature conditions were filtered, resulting in an optimized 60-data group. After incorporating the experimental data, a model was trained for prediction. Degradation kinetic constants from the experiments (kE) were compared with the predicted constants (kP-800 for the 800-data model and kP-60 for the 60-data model). Results showed that ibuprofen, bisphenol A, carbamazepine, and 17β-estradiol performed better on the 60-data group (kP-60/kE: 1.00, 0.99, 1.00, 1.00), while caffeine suited the model trained on the 800-data group (kP-800/kE: 1.02).
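
A minimal sketch of the pipeline the abstract describes, pairing an XGBoost regressor with SHAP attributions; the feature names, synthetic data, and hyperparameters are placeholders rather than the authors' setup.

```python
import numpy as np
import pandas as pd
import shap
import xgboost as xgb
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the 800-record sonolysis database
rng = np.random.default_rng(42)
X = pd.DataFrame({
    "power_density_W_L": rng.uniform(20, 400, 800),
    "frequency_kHz": rng.uniform(20, 1000, 800),
    "initial_conc_uM": rng.uniform(0.1, 50, 800),
})
y = 1e-3 * X["power_density_W_L"] / np.sqrt(X["frequency_kHz"]) + rng.normal(0, 1e-3, 800)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = xgb.XGBRegressor(n_estimators=300, max_depth=4, learning_rate=0.05)
model.fit(X_tr, y_tr)

pred = model.predict(X_te)
smape = 100 * np.mean(2 * np.abs(pred - y_te) / (np.abs(pred) + np.abs(y_te)))
print(f"R2 = {r2_score(y_te, pred):.3f}, SMAPE = {smape:.2f}%")

# SHAP values attribute each prediction to the design parameters
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_te)
print(pd.Series(np.abs(shap_values).mean(axis=0), index=X.columns))
```
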

5.
Food Chem ; 458: 140260, 2024 Jun 27.
Article in English | MEDLINE | ID: mdl-38944927

ABSTRACT

The study aimed to assess the extent to which protein aggregation, and even the modality of aggregation, can affect gastric digestion, down to the nature of the hydrolyzed peptide bonds. By controlling pH and ionic strength during heating, linear or spherical ovalbumin (OVA) aggregates were prepared, then digested with pepsin. Statistical analysis characterized the peptide bonds specifically hydrolyzed versus those not hydrolyzed for a given condition, based on a detailed description of all these bonds. Aggregation limits pepsin access to buried regions of native OVA, but some cleavage sites specific to aggregates reflect specific hydrolysis pathways due to the denaturation-aggregation process. Cleavage sites specific to linear aggregates indicate greater denaturation compared to spherical aggregates, consistent with theoretical models of heat-induced aggregation of OVA. Thus, the peptides released during the gastric phase may vary depending on the aggregation modality. Precisely tuned aggregation may therefore allow subtle control of the digestion process.

6.
Methods Mol Biol ; 2809: 101-113, 2024.
Article in English | MEDLINE | ID: mdl-38907893

ABSTRACT

HLA somatic mutations can alter the expression and function of HLA molecules, which in turn affect the ability of the immune system to recognize and respond to cancer cells. Therefore, it is crucial to accurately identify HLA somatic mutations to enhance our understanding of the interaction between cancer and the immune system and improve cancer treatment strategies. ALPHLARD-NT is a reliable tool that can accurately identify HLA somatic mutations as well as HLA genotypes from whole genome sequencing data of paired normal and tumor samples. Here, we provide a comprehensive guide on how to use ALPHLARD-NT and interpret the results.


Subject(s)
HLA Antigens , Histocompatibility Testing , Mutation , Neoplasms , Whole Genome Sequencing , Humans , Whole Genome Sequencing/methods , Histocompatibility Testing/methods , Neoplasms/genetics , Neoplasms/immunology , HLA Antigens/genetics , Software , Computational Biology/methods , Genotype , Genome, Human , High-Throughput Nucleotide Sequencing/methods , Alleles
7.
Heliyon ; 10(9): e30762, 2024 May 15.
Article in English | MEDLINE | ID: mdl-38765132

ABSTRACT

In survival and stochastic lifespan modeling, numerous families of distributions are sometimes considered unnatural, theoretically unjustifiable, or simply superfluous. Here, a novel parsimonious survival model is developed using the Bilal distribution (BD) and the Kavya-Manoharan (KM) parsimonious transformation family. In addition to other analytical properties, the form of the probability density function (PDF) and the behavior of the distribution's hazard rate are analyzed. The insights are both theoretical and practical. Theoretically, we offer explicit equations for the single and product moments of order statistics from the Kavya-Manoharan Bilal distribution. Practically, the maximum likelihood (ML) technique, based on simple random sampling (SRS) and ranked set sampling (RSS) schemes, is employed to estimate the parameters. Numerical simulations are used as the primary methodology to compare the various sampling techniques.
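
For orientation, a sketch of the construction in the notation common to this literature, assuming the standard published definitions of the two ingredients (the paper's own parameterization may differ): the Kavya-Manoharan transform applied to the Bilal baseline CDF.

```latex
% Baseline: Bilal distribution with scale parameter \theta > 0
F_{\mathrm{BD}}(x) = 1 - e^{-2x/\theta}\left(3 - 2e^{-x/\theta}\right),
\qquad
f_{\mathrm{BD}}(x) = \frac{6}{\theta}\, e^{-2x/\theta}\left(1 - e^{-x/\theta}\right),
\quad x > 0.

% Kavya-Manoharan (KM) parsimonious transform of a baseline CDF F with density f:
G(x) = \frac{e}{e-1}\left(1 - e^{-F(x)}\right),
\qquad
g(x) = \frac{e}{e-1}\, f(x)\, e^{-F(x)}.
```

The transform is parsimonious in the sense that it introduces no extra parameters beyond those of the baseline distribution.
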

8.
J Anat ; 2024 May 09.
Article in English | MEDLINE | ID: mdl-38720634

ABSTRACT

Characterizing suture morphological variation is a crucial step in investigating the influence of sutures on infant head biomechanics. This study aimed to establish a comprehensive quantitative framework for accurately capturing cranial suture and fontanelle morphologies in infants. A total of 69 CT scans of 2-4-month-old infant heads were segmented to identify semilandmarks at the borders of cranial sutures and fontanelles. Morphological characteristics, including length, width, sinuosity index (SI), and surface area, were measured. For this, an automatic method was developed to determine the junction points between sutures and fontanelles, and thin-plate splines (TPS) were utilized for area calculation. Different dimensionality reduction methods were compared, including nonlinear and linear principal component analysis (PCA), as well as a deep-learning-based variational autoencoder (VAE). Finally, the significance of various covariates was analyzed, and regression analysis was performed to establish a statistical model relating morphological parameters to global parameters. This study successfully developed a quantitative morphological framework and demonstrated its application in quantifying the morphologies of infant sutures and fontanelles; suture SI and surface area were shown to relate significantly to the global parameter of cranial size for infants aged 2-4 months. The developed framework proved reliable and applicable for extracting infant suture morphology features from CT scans, highlighting its potential to provide valuable insights into the morphologies of infant cranial sutures and fontanelles and to aid in the diagnosis of suture-related skull fractures.
Keywords: infant suture; infant fontanelle; morphological variation; morphology analysis framework; statistical model.
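
As a concrete illustration of one measured characteristic: the sinuosity index of a suture is conventionally the arc length of the suture polyline divided by the chord length between its endpoints. The semilandmark coordinates below are placeholders.

```python
import numpy as np

def sinuosity_index(points: np.ndarray) -> float:
    """Arc length of a polyline divided by the straight-line (chord) distance
    between its endpoints; SI = 1 for a straight suture, > 1 for a wavy one."""
    segments = np.diff(points, axis=0)
    arc = np.linalg.norm(segments, axis=1).sum()
    chord = np.linalg.norm(points[-1] - points[0])
    return arc / chord

# Placeholder semilandmarks along a (wavy) suture border, in mm
t = np.linspace(0, 60, 50)
suture = np.column_stack([t, 3.0 * np.sin(t / 5.0), np.zeros_like(t)])
print(f"SI = {sinuosity_index(suture):.3f}")
```
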

9.
Health Econ Rev ; 14(1): 27, 2024 Apr 12.
Article in English | MEDLINE | ID: mdl-38607501

ABSTRACT

BACKGROUND: Based on the legal framework laid down in section 130b (9) of Book V of the German Social Code, various criteria are relevant for the negotiated price of new patented drugs in Germany. European reference prices (ERPs) are one criterion. The ERP is based on the ex-factory prices (EFPs) of the countries included in the European country basket. However, in some of these countries, the EFP is not published due to confidential wholesale margins. Wholesale margins must therefore be estimated and deducted from purchase prices. Literature-based estimates to date do not assume margins that decline (regressively) as pharmaceutical prices rise. This assumption is questionable and can lead to systematically underestimated country prices, especially for high-priced drugs, since in most European countries percentage wholesale margins decline to a comparable extent as prices increase. It should therefore be examined (1) whether statistical models can predict the margins of individual countries, in principle and especially for countries where margins are unknown and regressive trends are likely, and (2) to what extent the estimation of margins improves when regressive statistical models are used instead of the cross-price averages published in the literature. METHODS: Qualitative preliminary research explored the basic wholesale pricing mechanisms in countries with confidential wholesale margins. Wholesale margins for reimbursable drugs were then modeled for regulated European countries. The estimation quality and impact of the model were compared with estimations based on average margins. RESULTS: In both regulated countries and countries with confidential wholesale margins, percentage wholesaler margins decline regressively as drug prices rise. These regressive margin curves can be robustly modeled for the regulated countries using a power function, with significantly lower mean squared errors in a linear mixed model compared to literature-based estimations with country-specific cross-price averages. CONCLUSION: If there is reason to believe that margins are regressive, confidential wholesale margins are expected to be better estimated by the power function based on the margins of regulated countries than by the published country-specific average margins, substantially reducing inaccuracies in margin estimates for high-priced drugs.
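
A sketch of the core estimation idea under simple assumptions: percentage margins following a power law margin = a·price^b with b < 0 (regressive), fit here by ordinary least squares on one invented country; the paper's linear mixed model across countries is not reproduced.

```python
import numpy as np
from scipy.optimize import curve_fit

# Placeholder (price, percentage margin) observations for one regulated country
prices = np.array([10, 50, 100, 500, 1000, 5000, 20000], dtype=float)   # EUR
margins = np.array([12.0, 8.5, 7.0, 4.2, 3.5, 2.1, 1.3])                # percent

def power_margin(price, a, b):
    return a * price ** b

(a, b), _ = curve_fit(power_margin, prices, margins, p0=(20.0, -0.3))
print(f"margin(price) ~= {a:.2f} * price^{b:.3f}  (b < 0 => regressive)")

# Illustrative deduction of the estimated percentage margin from a purchase price
purchase_price = 3000.0
est_margin_pct = power_margin(purchase_price, a, b)
print(f"estimated margin at EUR {purchase_price:.0f}: {est_margin_pct:.2f}%")
print(f"estimated EFP: EUR {purchase_price * (1 - est_margin_pct / 100):.2f}")
```
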

10.
J Vasc Access ; : 11297298241237830, 2024 Apr 24.
Article in English | MEDLINE | ID: mdl-38658814

ABSTRACT

OBJECTIVE: Failure to mature and early stenosis remain the Achilles' heel of hemodialysis arteriovenous fistula (AVF) creation. The maturation and patency of an AVF can be influenced by a variety of demographic, comorbidity, and anatomical factors. This study aims to review prediction models of AVF maturation and patency based on various risk scores and machine learning models. DATA SOURCES AND REVIEW METHODS: A literature search was performed on PubMed, Scopus, and Embase to identify eligible articles. The quality of the studies was assessed using the Prediction model Risk Of Bias ASsessment Tool (PROBAST). The performance (discrimination and calibration) of the included studies was extracted. RESULTS: Fourteen studies (seven using risk score approaches; seven using machine learning approaches) were included in the review. Among them, 12 studies were rated as high or unclear "risk of bias," and six studies were rated as high concern or unclear for "applicability." C-statistics (a model discrimination metric) were reported in five studies using the risk score approach (0.70-0.886) and in three using machine learning methods (0.80-0.85). Model calibration was reported in three studies. A failure-to-mature risk score developed in one of the studies has been externally validated in three different patient populations; however, the model discrimination degraded significantly (C-statistics: 0.519-0.53). CONCLUSION: The performance of existing predictive models for AVF maturation/patency is underreported. The models showed satisfactory performance in their own study populations, but there was a high risk of bias in the methodology used to build some of them, and the reviewed models also lack external validation or showed reduced performance in external cohorts.

11.
Sensors (Basel) ; 24(8)2024 Apr 21.
Article in English | MEDLINE | ID: mdl-38676265

ABSTRACT

A systematic study of the nonlinear response of silicon photomultipliers (SiPMs) was conducted through Monte Carlo (MC) simulations. The MC code was validated against experimental data for two different SiPMs. Nonlinearity mainly depends on the balance between the photon rate and the pixel recovery time. Additionally, nonlinearity was found to depend on the light pulse shape, the correlated noise, the overvoltage dependence of the photon detection efficiency, and the impedance of the readout circuit. Correlated noise was shown to have a minor impact on nonlinearity, but it can significantly affect the shape of the SiPM output current. Considering these dependencies and a previous statistical analysis of the nonlinear response of SiPMs, two phenomenological fitting models were proposed for exponential-like and finite light pulses, explaining the roles of their various terms and parameters. These models provide an accurate description of the nonlinear response of SiPMs at the level of a few percent over a wide range of situations.
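
For orientation, a sketch of the standard first-order SiPM saturation formula that such phenomenological models typically extend, assuming an instantaneous light pulse much shorter than the pixel recovery time; the paper's two proposed models add terms for pulse shape, correlated noise, and recovery that this simplest form omits.

```python
import numpy as np

def fired_pixels(n_photons, n_pixels, pde):
    """First-order saturation model for an instantaneous light pulse:
    each of the n_pixels microcells fires at most once, so the expected
    number of fired pixels saturates exponentially with the photon count."""
    return n_pixels * (1.0 - np.exp(-n_photons * pde / n_pixels))

n_pixels, pde = 3600, 0.4
for n_ph in (100, 1000, 10000, 100000):
    n_fired = fired_pixels(n_ph, n_pixels, pde)
    linear = n_ph * pde
    print(f"N_ph={n_ph:>6d}  fired={n_fired:8.1f}  linear={linear:9.1f}  "
          f"nonlinearity={(1 - n_fired / linear) * 100:5.1f}%")
```
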

12.
Environ Sci Pollut Res Int ; 31(20): 30009-30025, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38598159

ABSTRACT

In this work, we present a water quality assessment of an urban river, the San Luis River, located in San Luis Province, Argentina. The San Luis River flows through two developing cities; hence, urban anthropic activities affect its water quality. The river was sampled spatially and temporally, and ten physicochemical variables were evaluated for each water sample. These data were used to calculate a Simplified Index of Water Quality in order to estimate river water quality and infer possible contamination sources. Data were statistically analyzed with the open-source software R, version 4.1.0. Principal component analysis, cluster analysis, correlation matrices, and heatmap analysis were performed. Results indicated that water quality decreases in areas where anthropogenic activities take place. Robust inferential statistical analysis was performed employing an alternative to multivariate analysis of variance (MANOVA), the MANOVA.wide function. The physicochemical variables most statistically relevant to the decrease in water quality were used to develop a multiple linear regression model to estimate organic matter, reducing the variables necessary for continuous monitoring of the river and, hence, reducing costs. Given the limited information available in the region about the characteristics and recovery of this specific river category, the model developed is of vital importance, since it can quickly detect anthropic alterations and contribute to the environmental management of rivers. The model was also used to estimate organic matter at sites on other similar rivers, with satisfactory results.


Subject(s)
Environmental Monitoring , Rivers , Water Quality , Rivers/chemistry , Argentina , Environmental Monitoring/methods , Multivariate Analysis , Cities , Water Pollutants, Chemical/analysis , Principal Component Analysis
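
A sketch of the final modeling step, written in Python rather than the R workflow the authors describe; the predictor names and synthetic data are placeholders for the most statistically relevant physicochemical variables identified by the screening analyses.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

# Placeholder monitoring data: predictors chosen from the PCA/MANOVA screening
rng = np.random.default_rng(1)
n = 60
df = pd.DataFrame({
    "conductivity_uS_cm": rng.uniform(200, 1500, n),
    "turbidity_NTU": rng.uniform(1, 80, n),
    "dissolved_oxygen_mg_L": rng.uniform(4, 11, n),
})
# Synthetic organic-matter response, invented purely for the sketch
df["organic_matter_mg_L"] = (0.004 * df["conductivity_uS_cm"]
                             + 0.05 * df["turbidity_NTU"]
                             - 0.3 * df["dissolved_oxygen_mg_L"]
                             + rng.normal(0, 0.3, n))

X = df.drop(columns="organic_matter_mg_L")
y = df["organic_matter_mg_L"]
mlr = LinearRegression().fit(X, y)
print(dict(zip(X.columns, mlr.coef_.round(4))), round(mlr.intercept_, 3))
print("R2 =", round(r2_score(y, mlr.predict(X)), 3))
```
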
13.
Regul Toxicol Pharmacol ; 149: 105612, 2024 May.
Article in English | MEDLINE | ID: mdl-38570022

ABSTRACT

Chemical equivalence testing can be used to assess the biocompatibility implications of a materials or manufacturing change for a medical device. This testing can provide a relatively facile means of evaluating whether the change may result in additional or different toxicological concerns. However, one of the major challenges in the interpretation of chemical equivalence data is the lack of established criteria for determining whether two sets of extractables data are effectively equivalent. To address this gap, we propose a two-part approach based upon a relatively simple statistical model. First, the probability of a false positive conclusion, wherein there is an incorrectly perceived increase for a given analyte in the comparator relative to the baseline device, can be reduced to a prescribed level by establishing an appropriate acceptance criterion for the ratio of the observed means. Second, the probability of a false negative conclusion, wherein an actual increase in a given analyte cannot be discerned from the test results, can be minimized by specifying a limiting value of applicability based on the margin of safety (MoS) of the analyte. This approach provides a quantitative, statistically motivated method for interpreting chemical equivalence data, despite the relatively high intrinsic variability and small number of replicates typically associated with a chemical characterization evaluation.


Subject(s)
Equipment and Supplies , Equipment and Supplies/standards , Humans , Models, Statistical , Materials Testing/methods , Biocompatible Materials/chemistry , Risk Assessment , Equipment Safety
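
A sketch of the first part of the proposed approach under stated assumptions: the acceptance criterion on the ratio of observed means is calibrated by Monte Carlo so that, for truly equivalent devices, a false positive occurs with probability at most alpha. The lognormal noise model, CV, and triplicate design are illustrative, not the paper's prescription.

```python
import numpy as np

rng = np.random.default_rng(7)

def acceptance_ratio(cv, n_reps, alpha=0.05, n_sim=100_000):
    """Monte Carlo calibration: threshold on mean(comparator)/mean(baseline)
    that is exceeded with probability alpha when the two devices are truly
    equivalent. Assumes lognormal measurement noise with coefficient of
    variation cv and n_reps replicates per device."""
    sigma = np.sqrt(np.log(1 + cv**2))
    base = rng.lognormal(0.0, sigma, (n_sim, n_reps)).mean(axis=1)
    comp = rng.lognormal(0.0, sigma, (n_sim, n_reps)).mean(axis=1)
    return np.quantile(comp / base, 1 - alpha)

# High intrinsic variability (CV = 50%) and triplicate extractions per device
criterion = acceptance_ratio(cv=0.5, n_reps=3)
print(f"conclude no increase if the observed mean ratio <= {criterion:.2f}")
```
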
14.
J Med Signals Sens ; 14: 2, 2024.
Article in English | MEDLINE | ID: mdl-38510673

ABSTRACT

Background: Optical coherence tomography (OCT) imaging has emerged as a promising diagnostic tool, especially in ophthalmology. However, speckle noise and downsampling significantly degrade the quality of OCT images and hinder the development of OCT-assisted diagnostics. In this article, we address the super-resolution (SR) problem of retinal OCT images from a statistical modeling point of view. Methods: In the first step, we utilized a Weibull mixture model (WMM) as a comprehensive model to capture the specific features of the intensity distribution of retinal OCT data, such as asymmetry and heavy tails. To fit the WMM to the low-resolution OCT images, the expectation-maximization (EM) algorithm is used to estimate the parameters of the model. Then, to reduce the noise in the data, a combination of a Gaussian transform and a spatially constrained Gaussian mixture model (GMM) is applied. To super-resolve the OCT images, the expected patch log-likelihood (EPLL) is then used, a patch-based algorithm with a multivariate GMM prior that restores the high-resolution (HR) images with a maximum a posteriori (MAP) estimator. Results: The proposed method is compared with several well-known super-resolution algorithms, both visually and numerically. In terms of the mean-to-standard-deviation ratio (MSR) and the equivalent number of looks, our method shows clear superiority over the other competitors. Conclusion: The proposed method is simple and does not require any special preprocessing or measurements. The results illustrate that our method not only significantly suppresses the noise but also successfully reconstructs the image, leading to improved visual quality.
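
A compact sketch of the first step only, fitting a two-component Weibull mixture by EM; because the Weibull M-step has no closed form, each component is refit by weighted maximum likelihood. Initialization, component count, and optimizer are illustrative, and the later steps (Gaussian transform, spatially constrained GMM, EPLL restoration) are not reproduced.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

def weibull_mixture_em(x, k=2, n_iter=30, seed=0):
    """EM fit of a k-component Weibull mixture to 1-D intensity data."""
    rng = np.random.default_rng(seed)
    w = np.full(k, 1.0 / k)                        # mixing weights
    shapes = rng.uniform(0.8, 2.5, k)
    scales = np.quantile(x, (np.arange(k) + 1) / (k + 1))
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point
        dens = np.stack([w[j] * weibull_min.pdf(x, shapes[j], scale=scales[j])
                         for j in range(k)])
        resp = dens / dens.sum(axis=0, keepdims=True)
        # M-step: update weights, then refit each component by weighted MLE
        w = resp.mean(axis=1)
        for j in range(k):
            def nll(p, r=resp[j]):
                c, s = np.exp(p)                   # keep parameters positive
                return -np.sum(r * weibull_min.logpdf(x, c, scale=s))
            res = minimize(nll, np.log([shapes[j], scales[j]]), method="Nelder-Mead")
            shapes[j], scales[j] = np.exp(res.x)
    return w, shapes, scales

# Synthetic low-resolution intensities, invented for the sketch
rng = np.random.default_rng(3)
x = np.concatenate([rng.weibull(1.2, 2000) * 30, rng.weibull(3.0, 1000) * 120])
w, shapes, scales = weibull_mixture_em(x)
print("weights:", w.round(3), "shapes:", shapes.round(2), "scales:", scales.round(1))
```
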

15.
J Hazard Mater ; 469: 133825, 2024 May 05.
Article in English | MEDLINE | ID: mdl-38430587

ABSTRACT

Permeable reactive barrier (PRB) is an effective in-situ technology for groundwater remediation. The key factors in PRB design are the barrier width and the reactive material. In this study, beaded coal mine drainage sludge (BCMDS) was employed as the filling material to adsorb arsenic pollutants in groundwater, with the aim of designing the width of the PRB. Design methods involving traditional continuous column experiments and empirical formulas, as well as machine learning (ML) predictions and statistical methods, are compared with each other. Traditional methods are based on breakthrough curves measured under several conditions. The ML method has advantages in predicting the width of the mass transfer zone (WMTZ), as it simultaneously considers the characteristics of the material, the pollutant, and the environmental conditions, with data collected from published articles. After data preprocessing and model optimization, the XGBoost algorithm was selected for its high accuracy, showing good prediction of WMTZ (R2 = 0.97, RMSE = 0.15). The experimentally derived WMTZ values were also used to validate the predictions, demonstrating a low ML error rate of 7.04% and the feasibility of the approach. A subsequent multiple linear regression (MLR) analysis showed an error rate of 39.43%, underscoring the superiority of ML given the complexity of the influencing factors and the limited precision of mathematical regression. Compared to traditional width design methods, ML can improve design efficiency and save experimental time and manpower. Further expansion of the dataset and optimization of algorithms could enhance the accuracy of ML, overcoming existing limitations and enabling broader applications.

16.
Pharmaceuticals (Basel) ; 17(3)2024 Feb 24.
Article in English | MEDLINE | ID: mdl-38543078

ABSTRACT

The antimicrobial quantitative structure-activity relationship of plant flavonoids against Gram-positive bacteria was established in our previous works, and the cell membrane was confirmed as a major site of action. To investigate whether plant flavonoids have similar antibacterial effects and mechanisms against Gram-negative and Gram-positive bacteria, the minimum inhibitory concentrations (MICs) of 37 plant flavonoids against Escherichia coli were determined here using the broth microdilution method, and the correlation between their lipophilic parameter, the ACD/LogP or LogD7.40 value, and their MIC was analyzed. Simultaneously, the correlation between the ACD/LogP or LogD7.40 value and the MIC of 46 plant flavonoids reported in the literature against E. coli was also analyzed. Both sets of results showed a significant correlation between the LogP value and the MIC of plant flavonoids against Gram-negative bacteria. However, it is difficult to effectively predict the MIC of plant flavonoids against Gram-negative bacteria from their lipophilic parameters alone. By comparing the two regression curves derived for Gram-negative and Gram-positive bacteria, it was further discovered that the antibacterial activities of most plant flavonoids against Gram-negative bacteria are stronger than those against Gram-positive bacteria when their LogP values are less than approximately 3.0, while the opposite is true when their LogP values are greater than approximately 3.6. Moreover, this comparison also suggests that, unlike their action on Gram-positive bacteria, which mainly targets the cell membrane, plant flavonoids have multiple mechanisms against Gram-negative species, with the cell membrane remaining an important site of action. Combined with correlation analyses between the enzyme inhibitory activity and the LogP value of the reported flavonoids, it is further suggested that DNA gyrase is another important target of plant flavonoids against Gram-negative bacteria.
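
A sketch of the comparison behind the crossover claim, simplifying the two regression curves to straight lines in LogP so their intersection can be read off; the (LogP, log2 MIC) values are invented for illustration.

```python
import numpy as np
from scipy.stats import linregress

# Placeholder (LogP, log2 MIC) pairs for the two bacterial groups
logp = np.array([0.5, 1.0, 1.8, 2.5, 3.2, 4.0, 4.8])
mic_gram_neg = np.array([7.0, 7.4, 8.0, 8.6, 9.1, 9.8, 10.4])   # log2(ug/mL)
mic_gram_pos = np.array([9.5, 9.2, 8.9, 8.7, 8.4, 8.0, 7.6])

neg = linregress(logp, mic_gram_neg)
pos = linregress(logp, mic_gram_pos)

# Lower MIC = stronger activity; the lines cross where the advantage flips
crossover = (pos.intercept - neg.intercept) / (neg.slope - pos.slope)
print(f"Gram-negative fit: MIC = {neg.slope:.2f}*LogP + {neg.intercept:.2f}")
print(f"Gram-positive fit: MIC = {pos.slope:.2f}*LogP + {pos.intercept:.2f}")
print(f"activity advantage flips near LogP = {crossover:.2f}")
```
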

17.
Materials (Basel) ; 17(6)2024 Mar 11.
Article in English | MEDLINE | ID: mdl-38541454

ABSTRACT

Thermal power plant (TPP) slag is a waste product currently generated by many power stations all over the world. A possible method for its utilization is in the production of concrete. This paper analyses the effect of TPP slag on the technological properties of concrete mixtures and the mechanical properties of concrete subjected to heat-moisture processing. Quantitative estimates of the investigated factors' influence on the concrete mixture's water demand and the strength of steamed concrete were obtained. The influences of TPP slag content and its water demand on concrete composition features, as well as on concrete strength, are shown. The novelty of the work lies in the use of an experimental-statistical model to optimize the composition of steamed concrete using slag from the viewpoint of maximum strength per kilogram of cement. It has been demonstrated that the optimal proportion of slag in the aggregate, which provides maximum strength at 4 h and 28 days after steaming, is 0.5-0.55 and 0.45-0.55, respectively. A method for the design of concrete compositions using slag from thermal power plants is proposed.

18.
Environ Sci Pollut Res Int ; 31(19): 28178-28197, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38528221

ABSTRACT

The present paper considers the results of long-term (up to 17 years) in situ and laboratory research carried out on oiled French, Spanish, and Russian seacoasts. The objective of this research is to quantify the influence of geographical factors on the rates of natural transformation of the heavy fuel oil stranded ashore and to develop an empirical statistical model in order to evaluate the self-cleansing capacity of the coastal environment. In a number of field campaigns, 363 samples of weathered oil slicks and tar balls have been collected and analysed with the use of thin-layer chromatography combined with optical and gravimetric methods. The results obtained have been subjected to multiple nonlinear regression analyses. It has been shown that heavy fuel oil natural attenuation is more active in continental or estuarine environments influenced by nutrient-rich freshwater runoff and characterised by a higher number of sunny days, solar irradiation, and large temperature fluctuations. On the oceanic coasts, especially in sectors with low hydrodynamic energy, these processes take more time. The resulting model allows for the identification and mapping of the most vulnerable seacoasts, characterised by a low potential to degrade oil pollution. This information may be used in the contingency plans in order to optimise clean-up techniques and associated costs.


Subject(s)
Fuel Oils , Environmental Monitoring , Petroleum Pollution , Water Pollutants, Chemical/analysis , Models, Theoretical , Petroleum
19.
Microb Cell Fact ; 23(1): 67, 2024 Feb 24.
Article in English | MEDLINE | ID: mdl-38402403

ABSTRACT

BACKGROUND: In recent years, the production of inclusion bodies that retain substantial catalytic activity has been demonstrated. These catalytically active inclusion bodies (CatIBs) are formed by genetic fusion of an aggregation-inducing tag to a gene of interest via short linker polypeptides. The resulting CatIBs are known for their easy and cost-efficient production, recyclability, and improved stability. Recent studies have outlined the cooperative effects of the linker and the aggregation-inducing tag on CatIB activities. However, no a priori prediction of the best combination is yet possible, so extensive screening is required to find the best-performing CatIB variant. RESULTS: In this work, a semi-automated cloning workflow was implemented and used for the fast generation of 63 CatIB variants with glucose dehydrogenase of Bacillus subtilis (BsGDH). Furthermore, the variant BsGDH-PT-CBDCell was used to develop, optimize, and validate an automated CatIB screening workflow, enhancing the analysis of many CatIB candidates in parallel. Compared to previous studies with CatIBs, important optimization steps included the exclusion of plate position effects in the BioLector by changing the cultivation temperature. For the overall workflow, including strain construction, the manual workload could be reduced from 59 to 7 h for 48 variants (88%). After demonstrating high reproducibility, with a 1.9% relative standard deviation across 42 biological replicates, the workflow was performed in combination with a Bayesian process model and Thompson sampling. While the process model is crucial to derive key performance indicators of CatIBs, Thompson sampling serves as a strategy to balance exploitation and exploration in screening procedures. Our methodology allowed the analysis of 63 BsGDH-CatIB variants within only three batch experiments. Because of the high likelihood of TDoT-PT-BsGDH being the best CatIB performer, it was selected in 50 biological replicates during the three screening rounds, far more often than the low-performing variants. CONCLUSIONS: At the current state of knowledge, every new enzyme requires screening of different linker/aggregation-inducing tag combinations. For this purpose, the presented CatIB toolbox facilitates fast and simplified construction and screening procedures. The methodology thus assists in finding the best CatIB producer from large libraries in a short time, rendering possible automated Design-Build-Test-Learn cycles to generate structure/function learnings.


Subject(s)
Automation, Laboratory , High-Throughput Screening Assays , Reproducibility of Results , Bayes Theorem , Inclusion Bodies , Automation
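
A sketch of the screening strategy named in the abstract: Gaussian Thompson sampling over variant activities, which concentrates later measurements on the apparently best variant, mirroring how TDoT-PT-BsGDH accumulated 50 replicates. The priors, noise level, and batch layout are illustrative, and the Bayesian process model that derives activities from BioLector data is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(11)
n_variants, n_rounds = 63, 144             # e.g., 3 batches of 48 positions

true_activity = rng.gamma(2.0, 1.0, n_variants)   # unknown CatIB activities
mu = np.zeros(n_variants)                         # posterior means
tau = np.full(n_variants, 1e-2)                   # posterior precisions
noise_prec = 4.0                                  # assumed measurement precision

counts = np.zeros(n_variants, dtype=int)
for _ in range(n_rounds):
    # Thompson sampling: draw one sample per variant from its posterior,
    # then measure the variant whose draw is highest
    draws = rng.normal(mu, 1.0 / np.sqrt(tau))
    i = int(np.argmax(draws))
    y = true_activity[i] + rng.normal(0, 1.0 / np.sqrt(noise_prec))
    # Conjugate Gaussian update of the chosen variant's posterior
    mu[i] = (tau[i] * mu[i] + noise_prec * y) / (tau[i] + noise_prec)
    tau[i] += noise_prec
    counts[i] += 1

best = int(np.argmax(counts))
print(f"most-sampled variant: #{best} ({counts[best]} replicates), "
      f"true rank: {int((true_activity >= true_activity[best]).sum())}")
```
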
20.
Alzheimers Res Ther ; 16(1): 48, 2024 02 29.
Article in English | MEDLINE | ID: mdl-38424559

ABSTRACT

BACKGROUND: The clinical meaningfulness of the effects of recently approved disease-modifying treatments (DMT) in Alzheimer's disease is under debate. Available evidence is limited to short-term effects on clinical rating scales which may be difficult to interpret and have limited intrinsic meaning to patients. The main value of DMTs accrues over the long term as they are expected to cause a delay or slowing of disease progression. While awaiting such evidence, the translation of short-term effects to time delays or slowing of progression could offer a powerful and readily interpretable representation of clinical outcomes. METHODS: We simulated disease progression trajectories representing two arms, active and placebo, of a hypothetical clinical trial of a DMT. The placebo arm was simulated based on estimated mean trajectories of clinical dementia rating scale-sum of boxes (CDR-SB) recordings from amyloid-positive subjects with mild cognitive impairment (MCI) from Alzheimer's Disease Neuroimaging Initiative (ADNI). The active arm was simulated to show an average slowing of disease progression versus placebo of 20% at each visit. The treatment effects in the simulated trials were estimated with a progression model for repeated measures (PMRM) and a mixed model for repeated measures (MMRM) for comparison. For PMRM, the treatment effect is expressed in units of time (e.g., days) and for MMRM in units of the outcome (e.g., CDR-SB points). PMRM results were implemented in a health economics Markov model extrapolating disease progression and death over 15 years. RESULTS: The PMRM model estimated a 19% delay in disease progression at 18 months and 20% (~ 7 months delay) at 36 months, while the MMRM model estimated a 25% reduction in CDR-SB (~ 0.5 points) at 36 months. The PMRM model had slightly greater power compared to MMRM. The health economic model based on the estimated time delay suggested an increase in life expectancy (10 months) without extending time in severe stages of disease. CONCLUSION: PMRM methods can be used to estimate treatment effects in terms of slowing of progression which translates to time metrics that can be readily interpreted and appreciated as meaningful outcomes for patients, care partners, and health care practitioners.


Subject(s)
Alzheimer Disease , Cognitive Dysfunction , Humans , Alzheimer Disease/diagnostic imaging , Alzheimer Disease/drug therapy , Cognitive Dysfunction/diagnostic imaging , Cognitive Dysfunction/drug therapy , Disease Progression , Mental Status and Dementia Tests , Research Design , Clinical Trials as Topic , Models, Theoretical
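
A sketch of the time-based reading of a proportional treatment effect: if the active arm moves along the placebo trajectory at 80% of its speed (20% slowing), the implied delay at time t is simply 0.2·t, which is how 20% slowing translates to roughly 7 months at month 36. The quadratic CDR-SB trajectory below is a placeholder, not the ADNI-estimated curve.

```python
import numpy as np

def placebo_cdr_sb(t_months):
    """Placeholder mean CDR-SB trajectory for amyloid-positive MCI."""
    return 1.9 + 0.045 * t_months + 0.0009 * t_months**2

slowing = 0.20
for t in (18, 36):
    active = placebo_cdr_sb((1 - slowing) * t)   # progresses at 80% speed
    placebo = placebo_cdr_sb(t)
    delay = slowing * t                          # time shift implied by the slowing
    print(f"t={t:2d} mo: placebo={placebo:.2f}, active={active:.2f} CDR-SB, "
          f"diff={placebo - active:.2f} points, delay={delay:.1f} months")
```
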