Results 1 - 20 of 191
1.
Front Neurosci ; 18: 1380150, 2024.
Article in English | MEDLINE | ID: mdl-38560044

ABSTRACT

Background: The wheelchair is a widely used rehabilitation device and is indispensable for people with limited mobility. Wheelchair users often sit for long periods, which readily fatigues the lumbar muscles. This paper therefore aims to provide more scientific guidance and suggestions for daily wheelchair use by studying the relationship between the development of muscle fatigue and sitting posture. Methods: First, we collected surface electromyography (sEMG) signals of the human vertical spine muscle and analyzed them in the frequency domain; the resulting Mean Power Frequency (MPF) served as the dependent variable. Then, body-posture information, including the percentage of pressure points, the span, and the center of mass, was collected with an array of thin-film pressure sensors and analyzed with a multivariate nonlinear regression model. Results: Muscle fatigue developed most slowly, with only a small percentage decrease in the subjects' MPF values, when the centroid row coordinate of the cushion pressure points was about 16 (range, 7.7-16.9), the cushion pressure-area percentage about 80% (range, 70.8%-89.7%), the cushion pressure span about 27 (range, 25-31), the backrest pressure-point centroid row coordinate about 15 (range, 9.1-18.2), the backrest pressure-area percentage about 35% (range, 11.8%-38.7%), and the backrest pressure span about 16 (range, 9-22). In addition, the pressure-area percentage at the seat cushion is the most sensitive independent variable: a percentage that is too large or too small readily causes lumbar muscle fatigue. Conclusion: The results show that wheelchair users should sit toward the middle and back of the seat cushion, so that the hip joint remains at a natural angle, with the thighs in full contact with the cushion to avoid concentrating body weight on the buttocks. The back should be in full contact with the backrest to reduce the load on the waist. Spinal posture can be adjusted according to personal habit, but maintaining a chest-forward sitting position for a long time should be avoided, as it holds the lumbar spine at an unnatural physiological angle and easily fatigues the lumbar muscles.
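
For illustration, a minimal sketch of a multivariate nonlinear regression of this kind, fitting MPF against two posture predictors with scipy (the variable names, quadratic model form, and data are assumptions for demonstration; the paper's actual equation is not reproduced here):

```python
# Sketch: multivariate nonlinear regression of MPF on posture features.
# Variable names and the quadratic model form are illustrative assumptions.
import numpy as np
from scipy.optimize import curve_fit

def mpf_model(X, b0, b1, b2, b3, b4, b5):
    """Quadratic surface in two posture predictors (area %, centroid row)."""
    area_pct, centroid_row = X
    return (b0 + b1 * area_pct + b2 * centroid_row
            + b3 * area_pct**2 + b4 * centroid_row**2
            + b5 * area_pct * centroid_row)

rng = np.random.default_rng(0)
area_pct = rng.uniform(60, 95, 100)       # cushion pressure-area percentage
centroid_row = rng.uniform(7, 17, 100)    # centroid row coordinate
mpf = (80 - 0.02 * (area_pct - 80)**2 - 0.5 * (centroid_row - 16)**2
       + rng.normal(0, 1, 100))           # synthetic MPF, optimum near 80%, 16

params, _ = curve_fit(mpf_model, (area_pct, centroid_row), mpf)
print(np.round(params, 3))
```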

2.
Heliyon ; 10(7): e28775, 2024 Apr 15.
Article in English | MEDLINE | ID: mdl-38617962

ABSTRACT

Addressing the low helium content of natural gas resources in China and the high cost of helium extraction, an OPEX prediction model for helium extraction based on Response Surface Methodology (RSM) is proposed. The method uses ASPEN-HYSYS software to simulate the helium extraction process flow for a given product composition, pressure, and temperature. The Design Expert module is then applied for RSM parameter design: combined with the OPEX of existing projects, the key influencing factors and the upper and lower limits of OPEX are determined, and the corresponding OPEX is obtained for different parameter values. Finally, the Box-Behnken Design (BBD) principle is applied to optimize the RSM helium extraction process parameters, and the OPEX prediction model is built from the fitting results and parameter significance tests of a second-order regression function. Applied to a domestic helium extraction project, the method yields a unit helium extraction cost between 100 and 119.52 yuan/m³ and an IRR of 13.37%. The results show that the project has economic benefit and that the method has good application prospects.
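
A hedged sketch of the core RSM step described here: generating a three-factor Box-Behnken design and fitting a second-order regression surface to the run responses (factor meanings and OPEX values are illustrative stand-ins for the HYSYS simulation output):

```python
# Sketch: three-factor Box-Behnken design and a second-order (quadratic)
# response-surface fit. Factor meanings (e.g., feed pressure, temperature,
# helium content) and the synthetic OPEX values are illustrative assumptions.
import numpy as np

# Box-Behnken design in coded units (-1, 0, +1): 12 edge runs + 3 center runs.
design = ([(a, b, 0) for a in (-1, 1) for b in (-1, 1)] +
          [(a, 0, c) for a in (-1, 1) for c in (-1, 1)] +
          [(0, b, c) for b in (-1, 1) for c in (-1, 1)] +
          [(0, 0, 0)] * 3)
X = np.array(design, dtype=float)
x1, x2, x3 = X.T

# Synthetic OPEX responses standing in for the HYSYS simulation output.
y = (110 + 6*x1 - 4*x2 + 2*x3 + 1.5*x1*x2 + 3*x1**2 + 2*x2**2
     + np.random.default_rng(1).normal(0, 0.5, len(X)))

# Full second-order model: intercept, linear, interaction, and quadratic terms.
M = np.column_stack([np.ones_like(x1), x1, x2, x3,
                     x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2])
coef, *_ = np.linalg.lstsq(M, y, rcond=None)
print(np.round(coef, 2))  # fitted second-order regression coefficients
```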

3.
Heliyon ; 10(6): e27779, 2024 Mar 30.
Article in English | MEDLINE | ID: mdl-38533045

ABSTRACT

Background and objective: Hypertension is a potentially dangerous health condition that can be detected by measuring blood pressure (BP). BP monitoring and measurement are essential for preventing and treating cardiovascular diseases. However, cuff-based devices are uncomfortable and do not allow continuous BP measurement. Methods: In this study, a new non-invasive, cuff-less method for estimating Systolic Blood Pressure (SBP), Mean Arterial Pressure (MAP), and Diastolic Blood Pressure (DBP) is proposed using characteristic features of photoplethysmogram (PPG) signals and nonlinear regression algorithms. PPG signals were collected from 219 participants and subjected to preprocessing and feature extraction. From the PPG signal and its derivatives, a total of 46 time, frequency, and time-frequency domain features were extracted; the age and gender of each subject were also included as features. Correlation-based feature selection (CFS) and ReliefF feature selection were then used to select relevant features and reduce the risk of over-fitting. Finally, support vector regression (SVR), K-nearest neighbour regression (KNR), decision tree regression (DTR), and random forest regression (RFR) models were developed for BP estimation. The regression models were trained and evaluated on all features as well as on the selected features, and the best models for SBP, MAP, and DBP estimation were selected separately. Results: The SVR model with ReliefF-based feature selection outperforms the other algorithms, estimating SBP, MAP, and DBP with mean absolute errors of 2.49, 1.62, and 1.43 mmHg, respectively. The proposed method meets the Association for the Advancement of Medical Instrumentation standard for BP estimation, and the results fall within Grade A of the British Hypertension Society standard for SBP, MAP, and DBP. Conclusion: The findings show that the method can estimate blood pressure non-invasively, without a cuff or calibration, using only characteristic features of the PPG signal.
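
A minimal sketch of this kind of pipeline in scikit-learn, training an RBF-kernel support vector regressor on PPG-style features (synthetic data; the ReliefF selection step, which is not part of scikit-learn, is omitted):

```python
# Sketch of the cuff-less BP estimation pipeline: PPG-derived features ->
# support vector regression. Feature values here are synthetic stand-ins;
# the paper extracts 46 PPG features plus age and gender.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(219, 48))                   # 46 PPG features + age + gender
sbp = 120 + X[:, 0] * 8 - X[:, 5] * 4 + rng.normal(0, 2, 219)  # synthetic SBP

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.5))
mae = -cross_val_score(model, X, sbp, cv=5,
                       scoring="neg_mean_absolute_error").mean()
print(f"cross-validated MAE: {mae:.2f} mmHg")
```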

4.
Open Mind (Camb) ; 8: 235-264, 2024.
Article in English | MEDLINE | ID: mdl-38528907

ABSTRACT

The dynamics of the mind are complex. Mental processes unfold continuously in time and may be sensitive to a myriad of interacting variables, especially in naturalistic settings. But statistical models used to analyze data from cognitive experiments often assume simplistic dynamics. Recent advances in deep learning have yielded startling improvements to simulations of dynamical cognitive processes, including speech comprehension, visual perception, and goal-directed behavior. But due to poor interpretability, deep learning is generally not used for scientific analysis. Here, we bridge this gap by showing that deep learning can be used, not just to imitate, but to analyze complex processes, providing flexible function approximation while preserving interpretability. To do so, we define and implement a nonlinear regression model in which the probability distribution over the response variable is parameterized by convolving the history of predictors over time using an artificial neural network, thereby allowing the shape and continuous temporal extent of effects to be inferred directly from time series data. Our approach relaxes standard simplifying assumptions (e.g., linearity, stationarity, and homoscedasticity) that are implausible for many cognitive processes and may critically affect the interpretation of data. We demonstrate substantial improvements on behavioral and neuroimaging data from the language processing domain, and we show that our model enables discovery of novel patterns in exploratory analyses, controls for diverse confounds in confirmatory analyses, and opens up research questions in cognitive (neuro)science that are otherwise hard to study.
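
The paper parameterizes the convolution kernel with an artificial neural network; the sketch below swaps in a simple two-parameter gamma-shaped kernel so the same idea, inferring the shape and temporal extent of an effect by convolving predictor history, can be fit with scipy alone (all values synthetic):

```python
# Simplified stand-in for the paper's model: the response is the predictor
# history convolved with a continuous-time kernel whose shape is inferred from
# the data. The paper parameterizes the kernel with a neural network; this
# sketch swaps in a gamma-shaped kernel so scipy can fit it.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import gamma

dt = 0.5
t = np.arange(0, 60, dt)                                            # time grid (s)
x = (np.random.default_rng(0).random(t.size) < 0.1).astype(float)   # event series

def convolved_response(x_series, shape, scale, amplitude):
    lags = np.arange(0, 20, dt)
    kernel = amplitude * gamma.pdf(lags, a=shape, scale=scale)
    return np.convolve(x_series, kernel)[:x_series.size] * dt

y = (convolved_response(x, 4.0, 1.5, 2.0)
     + np.random.default_rng(1).normal(0, 0.05, t.size))

popt, _ = curve_fit(convolved_response, x, y, p0=[2.0, 1.0, 1.0],
                    bounds=([0.1, 0.1, 0.0], [10.0, 10.0, 10.0]))
print(popt)  # recovered kernel shape, scale, and amplitude
```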

5.
Proc Natl Acad Sci U S A ; 121(10): e2307876121, 2024 Mar 05.
Article in English | MEDLINE | ID: mdl-38422017

ABSTRACT

During real-time language comprehension, our minds rapidly decode complex meanings from sequences of words. The difficulty of doing so is known to be related to words' contextual predictability, but what cognitive processes do these predictability effects reflect? In one view, predictability effects reflect facilitation due to anticipatory processing of words that are predictable from context. This view predicts a linear effect of predictability on processing demand. In another view, predictability effects reflect the costs of probabilistic inference over sentence interpretations. This view predicts either a logarithmic or a superlogarithmic effect of predictability on processing demand, depending on whether it assumes pressures toward a uniform distribution of information over time. The empirical record is currently mixed. Here, we revisit this question at scale: We analyze six reading datasets, estimate next-word probabilities with diverse statistical language models, and model reading times using recent advances in nonlinear regression. Results support a logarithmic effect of word predictability on processing difficulty, which favors probabilistic inference as a key component of human language processing.
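
The three hypotheses map onto different link functions between contextual probability and reading time; in generic surprisal notation (not the authors' exact specification):

```latex
% Competing forms for the effect of a word's contextual probability
% p(w_t | w_{<t}) on reading time RT.
\mathrm{RT} \;\propto\; -\,p(w_t \mid w_{<t})
  \quad\text{(linear: anticipatory facilitation)}
\mathrm{RT} \;\propto\; -\log p(w_t \mid w_{<t})
  \quad\text{(logarithmic: surprisal / probabilistic inference)}
\mathrm{RT} \;\propto\; \bigl(-\log p(w_t \mid w_{<t})\bigr)^{k},\ k > 1
  \quad\text{(superlogarithmic: inference plus uniform-information pressure)}
```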


Subjects
Comprehension, Language, Humans, Models, Statistical
6.
Regen Med ; 19(1): 27-45, 2024 Jan.
Article in English | MEDLINE | ID: mdl-38247346

ABSTRACT

Aim: Cell viability assays are critical for cell-based products. Here, we demonstrate a combined experimental and computational approach to identify fit-for-purpose cell assays that can predict changes in cell proliferation, a critical biological response in cell expansion. Materials & methods: Jurkat cells were systematically injured using heat (45 ± 1°C). Cell viability was measured at 0 h and 24 h after treatment using assays for membrane integrity, metabolic function and apoptosis. Proliferation kinetics for longer term cultures were modeled using the Gompertz distribution to establish predictive models between cell viability results and proliferation. Results & conclusion: We demonstrate an approach for ranking these assays as predictors of cell proliferation and for setting cell viability specifications when a particular proliferation response is required.


In recent years, there has been a surge in the number of cellular therapy products engineered to treat patients with severe diseases. These products use living cells to treat disease, and their quality is critical for ensuring product safety and effectiveness. Throughout the engineering and manufacturing of these products, many cells can die or be in the process of dying, and the amount of dead cells in the product can impact product yield and quality. In any given cell product, at any given time during the manufacturing process, cells are exposed to stresses that can injure them through several mechanisms, leading to a range of cell death events that can follow different timelines. Many existing assays, known as cell viability assays, evaluate the health of cells, and these assays can be based on many different cell features that indicate a cell has been injured (e.g., cell membrane permeability, changes in cell metabolism, molecular markers for cell death). Viability assays provide different insights into the state of cell health or injury depending on which cell features are evaluated and when the measurements are taken, and some assays may be more appropriate than others for specific applications. Therefore, a method is needed to select cell viability assays that are suited to evaluating the cell injuries that occur in a specific bioprocess. In this series of studies, we used a range of analytical methods to count living and dead cells in a series of cell populations that we had treated to induce damage and reduce their ability to grow. We then used mathematical models to determine the relationship between cell viability measurements and cell growth over time, and used the results to determine the sensitivity of the viability assays to changes in cell growth. Although a specific cell line was used in this example, the technique can be applied to any cell line or cell sample population, and different types of injuries can be applied to the cells. This approach can be used by manufacturers of cell-based products and therapies to identify cell viability assays that are meaningful for monitoring cell production and characterizing product quality.
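
For illustration, a minimal sketch of fitting a Gompertz growth curve, the distribution family named in the abstract, to longitudinal cell counts with scipy (the Zwietering parameterization and all data values are assumptions for demonstration):

```python
# Sketch: fitting a Gompertz growth curve to cell expansion data, the kind of
# proliferation-kinetics model the paper links to day-0 viability readouts.
# Parameterization (Zwietering form) and data are illustrative, not the paper's.
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, A, mu, lam):
    """A: plateau (log fold-expansion), mu: max growth rate, lam: lag time."""
    return A * np.exp(-np.exp(mu * np.e / A * (lam - t) + 1))

t = np.array([0, 12, 24, 48, 72, 96, 120], float)          # hours
log_fold = np.array([0.0, 0.05, 0.3, 1.2, 2.1, 2.6, 2.8])  # ln(N/N0), synthetic

popt, _ = curve_fit(gompertz, t, log_fold, p0=[3.0, 0.05, 20.0], maxfev=5000)
print(dict(zip(["A", "mu", "lam"], np.round(popt, 3))))
```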


Subjects
Apoptosis, Humans, Cell Survival, Cell Proliferation
7.
Micromachines (Basel) ; 15(1)2024 Jan 05.
Article in English | MEDLINE | ID: mdl-38258221

ABSTRACT

This paper presents the measurement and evaluation of the surfaces of molds produced using additive technologies, an emerging trend in mold production. The surfaces of such molds must be treated, usually with laser-based alternative machining methods, and regular evaluation is necessary because the quality of the mold surface gradually deteriorates. However, because scanning the original surface of an injection mold is difficult, the surface must be replicated. This study therefore describes the production of surface replicas for in-house-developed polymer molds, together with the determination of suitable descriptive parameters and methods for comparing variances and mean values in surface evaluation. Overall, this study presents a new summary of the process for evaluating replicas of polymer mold surfaces. The nonlinear regression methodology provides the corresponding functional dependencies between the relevant parameters. A neural network with two hidden layers, based on the principle of Rosenblatt's perceptron, has been proposed and its statistical significance verified. Additionally, machine learning was utilized to better compare the original surface and its replica.
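
As a hedged sketch of the neural-network component, a two-hidden-layer perceptron regressor in scikit-learn relating an original-surface roughness parameter to its replica counterpart (feature choice and data are illustrative, not the study's):

```python
# Sketch: a two-hidden-layer perceptron regressor relating a roughness
# parameter measured on a mold surface to the value measured on its replica.
# The feature/target choice and data are illustrative assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
original_Sa = rng.uniform(0.5, 5.0, size=(200, 1))   # mold areal roughness (um)
replica_Sa = (0.95 * original_Sa + 0.05 * original_Sa**1.2
              + rng.normal(0, 0.05, size=(200, 1)))  # synthetic replica response

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000,
                                   random_state=0))
model.fit(original_Sa, replica_Sa.ravel())
print(model.predict([[2.5]]))  # predicted replica roughness for a 2.5 um surface
```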

8.
Sci Total Environ ; 918: 170422, 2024 Mar 25.
Article in English | MEDLINE | ID: mdl-38290674

ABSTRACT

Although mechanochemical remediation of organic-contaminated soil has received substantial attention in recent years, the effects of soil properties on remediation performance remain unclear. In this work, the properties and elemental compositions of 16 soils were measured, and the mechanochemical degradation of lindane in these soils was investigated experimentally. Most importantly, the relationships between soil variables and the mechanochemical degradation rates of lindane in the additive-free and CaO systems were elucidated. The results showed that the mechanochemical degradation efficiencies of lindane in the 16 soils differed significantly without additives, ranging from 31.0% to 97.2% after 4 h. After the addition of 9% CaO, the degradation rates of lindane in the 16 soils varied from 0.7 h⁻¹ to 15 h⁻¹. Correlation analysis, redundancy analysis, and partial least squares path modeling clearly showed that the main factors affecting the reaction rate (k1) without additives were organic matter (-) > clay (+) > bound water (-) > Si (+). After the addition of 9% CaO, the order of the main factors affecting the reaction rate (k2) was organic matter (-) > bound water (-) > Ti/Fe/Al (-) > pH (+) > clay (+). The established and corrected multiple nonlinear regression equations can be used to accurately predict the mechanochemical degradation performance of hexachlorocyclohexanes in actual soils with and without additives.
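
For illustration, a first-order kinetic fit of the kind behind the reported rate constants, estimating k from residual lindane fractions over milling time (synthetic data):

```python
# Sketch: estimating a first-order mechanochemical degradation rate k from
# residual lindane fractions over milling time, C(t)/C0 = exp(-k t).
# Data points are synthetic stand-ins for one of the 16 soils.
import numpy as np
from scipy.optimize import curve_fit

t = np.array([0.0, 0.5, 1.0, 2.0, 3.0, 4.0])            # milling time (h)
frac = np.array([1.0, 0.55, 0.32, 0.11, 0.04, 0.015])   # C/C0, synthetic

k, _ = curve_fit(lambda t, k: np.exp(-k * t), t, frac, p0=[1.0])
print(f"k = {k[0]:.2f} h^-1")   # comparable in form to the 0.7-15 h^-1 range
```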

9.
Biom J ; 66(1): e2200092, 2024 Jan.
Article in English | MEDLINE | ID: mdl-37068189

ABSTRACT

Quantifying drug potency, which requires an accurate estimation of the dose-response relationship, is essential for drug development in biomedical research and the life sciences. However, the standard estimation procedure of the median-effect equation used to describe the dose-response curve is vulnerable to the extreme observations common in experimental data. To facilitate appropriate statistical inference, many powerful estimation tools have been developed in R, including various dose-response packages based on the nonlinear least squares method with different optimization strategies. Recently, beta regression-based methods have also been introduced for estimating the median-effect equation. In theory, they can overcome nonnormality, heteroscedasticity, and asymmetry and accommodate flexible robust frameworks and coefficient penalization. To identify reliable methods for estimating dose-response curves even with extreme observations, we conducted a comparative study reviewing 14 different tools in R and examined their robustness and efficiency via Monte Carlo simulation under a comprehensive list of scenarios. The simulation results demonstrate that penalized beta regression using the mgcv package outperforms the other methods in terms of stable, accurate estimation and reliable uncertainty quantification.
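
A minimal sketch of the median-effect equation fit that the compared tools estimate, here by plain nonlinear least squares in Python rather than any of the 14 R tools (dose-response values are synthetic):

```python
# Sketch: fitting the median-effect equation fa = 1 / (1 + (Dm / D)^m)
# (Chou's form) by nonlinear least squares. Real analyses should consider the
# robust/penalized beta-regression alternatives the paper compares.
import numpy as np
from scipy.optimize import curve_fit

dose = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])     # drug dose
fa = np.array([0.05, 0.15, 0.42, 0.71, 0.90, 0.97])   # fraction affected, synthetic

def median_effect(D, Dm, m):
    return 1.0 / (1.0 + (Dm / D) ** m)

(Dm, m), _ = curve_fit(median_effect, dose, fa, p0=[1.0, 1.0])
print(f"Dm (median-effect dose) = {Dm:.2f}, slope m = {m:.2f}")
```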


Subjects
Computer Simulation, Regression Analysis, Uncertainty, Monte Carlo Method
10.
J Environ Manage ; 351: 119760, 2024 Feb.
Article in English | MEDLINE | ID: mdl-38086124

ABSTRACT

The saturated hydraulic conductivity (Ks) of the filler layer in grassed swales varies in a changing environment. In most hydrological models, Ks is assumed to be constant or to decrease with a clogging factor. However, the Ks measured on site cannot be used directly as input to a hydrological model. Therefore, in this study, an Ensemble Kalman Filter (EnKF) based approach was used to estimate the Ks of the whole system in two monitored grassed swales at Enschede and Utrecht, the Netherlands. The relationships between Ks and possible influencing factors (antecedent dry period, temperature, rainfall, rainfall duration, total rainfall, and seasonal factors) were studied, and a multivariate nonlinear function was established to optimize the hydrological model. The results revealed that the EnKF method performed satisfactorily in estimating Ks, which showed a notable decrease after long-term operation but recovered in summer and winter. After the multivariate nonlinear function of Ks was added to the hydrological model, 63.8% of the predictions for the validation events improved compared with the constant-Ks model. A sensitivity analysis revealed that the effect of each influencing factor on Ks varies with the type of grassed swale. However, these findings require further investigation and data support.
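
A minimal, generic sketch of the EnKF analysis step for a scalar Ks state (not the authors' full hydrological model; ensemble values and the observation are synthetic):

```python
# Minimal Ensemble Kalman Filter analysis step for a scalar state (Ks).
# Generic EnKF with perturbed observations; the forecast ensemble would come
# from the hydrological model and the observation from site data.
import numpy as np

rng = np.random.default_rng(0)
N = 100
ks_forecast = rng.normal(10.0, 2.0, N)       # forecast ensemble of Ks (mm/h)
obs, obs_var = 8.5, 0.5**2                   # observed Ks and its error variance

# Kalman gain from the ensemble variance; perturb the observation per member.
P = np.var(ks_forecast, ddof=1)
K = P / (P + obs_var)
obs_perturbed = obs + rng.normal(0, np.sqrt(obs_var), N)
ks_analysis = ks_forecast + K * (obs_perturbed - ks_forecast)

print(f"forecast mean {ks_forecast.mean():.2f} -> "
      f"analysis mean {ks_analysis.mean():.2f}")
```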


Subjects
Poaceae, Soil, Netherlands, Chemical Phenomena, Hydrology
11.
Clin Chem Lab Med ; 62(4): 635-645, 2024 Mar 25.
Article in English | MEDLINE | ID: mdl-37982680

ABSTRACT

OBJECTIVES: Patient-based real-time quality control (PBRTQC), a laboratory tool for monitoring the performance of the testing process, has gained increasing attention in recent years. Its generalizability among analytes, instruments, laboratories, and hospitals in real-world settings has been questioned. Our purpose was to build a machine learning, nonlinear regression-adjusted, patient-based real-time quality control method (mNL-PBRTQC) with wide applicability. METHODS: Using computer simulation, artificial biases were added to patient population data for 10 measurands. An mNL-PBRTQC was created using eight hospital laboratory databases as a training set and validated on three other hospitals' independent patient datasets. Three patient-based models were compared on these datasets: the IFCC PBRTQC model, linear regression-adjusted real-time quality control (L-RARTQC), and the mNL-PBRTQC model. RESULTS: Our study showed that in the three independent test datasets, mNL-PBRTQC outperformed the IFCC PBRTQC and L-RARTQC for all measurands and all biases. Using platelets as an example, it was found that for a 20% bias, both positive and negative, the uncertainty of error detection for mNL-PBRTQC was smallest at the median and maximum values. CONCLUSIONS: mNL-PBRTQC is a robust machine learning framework allowing accurate error detection, especially for analytes that demonstrate instability and for detecting small biases.
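
For orientation, a sketch of the underlying PBRTQC idea: a moving statistic over consecutive patient results is tested against control limits derived from a stable baseline. The regression-adjustment stage that distinguishes mNL-PBRTQC is omitted here (all numbers synthetic):

```python
# Sketch of the basic PBRTQC mechanism: a moving mean of consecutive patient
# results compared against control limits. mNL-PBRTQC additionally
# regression-adjusts each result with a nonlinear ML model before averaging.
import numpy as np

rng = np.random.default_rng(0)
results = rng.normal(250, 40, 2000)       # e.g., platelet counts (10^9/L)
results[1200:] *= 1.20                    # inject a +20% analytical bias

window = 50
moving_mean = np.convolve(results, np.ones(window) / window, mode="valid")
baseline = moving_mean[:1000]             # bias-free period
ucl = baseline.mean() + 4 * baseline.std()  # illustrative upper control limit

alarm_idx = np.argmax(moving_mean > ucl)  # first window exceeding the limit
print(f"bias flagged at patient #{alarm_idx + window}")
```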


Subjects
Machine Learning, Humans, Computer Simulation, Quality Control
12.
Bone ; 180: 116999, 2024 03.
Article in English | MEDLINE | ID: mdl-38158169

ABSTRACT

Bone Mineral Density (BMD) is an important parameter in the development of orthopedic fracture-healing methods. A recent article (Inoue, S., et al. Bone. 2023, 177, 116916) investigated the use of higher-intensity ultrasound to promote murine bone formation by measuring BMD levels. In this work, we analyze the numerical values of BMD, which show sigmoid kinetics and a hyperbolic asymptotic increase with the application of higher-intensity ultrasound. Our analysis may provide a foundation for understanding and applying ultrasound to the human body.
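
Generic functional forms consistent with the reported behaviour, a sigmoidal time course whose plateau saturates hyperbolically with intensity, might be written as follows (symbols are illustrative, not the note's notation):

```latex
% Illustrative forms only: a Hill-type sigmoid in time (n > 1 gives the
% sigmoidal shape) with an asymptote that saturates hyperbolically in the
% ultrasound intensity I.
\mathrm{BMD}(t) = \mathrm{BMD}_{\max}\,\frac{t^{n}}{K^{n} + t^{n}},
\qquad
\mathrm{BMD}_{\max}(I) = \frac{a\,I}{b + I}
```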


Subjects
Bone Density, Physiologic Calcification, Humans, Mice, Animals, Ultrasonography, Bone and Bones/diagnostic imaging, Osteogenesis
13.
Beilstein J Nanotechnol ; 14: 1200-1207, 2023.
Article in English | MEDLINE | ID: mdl-38116471

ABSTRACT

AFM sharp tips are used to characterize nanostructures and quantify the mechanical properties of materials in several areas of research. Analytical results can show unpredictable errors if the exact AFM tip radius is not known. Many techniques exist for in situ measurement of the actual AFM tip radius, but they are limited to uncoated tips. This paper presents an alternative, simple method to determine the radii of coated and uncoated tips. Pt-coated, Cr/Au-coated, and uncoated Si tips were used to scan a calibration standard grating in AFM contact mode with a sub-nanonewton load to obtain the curved scan profile of the edge corner of the grating structure. The data points of each tip's curved profile were fitted with a nonlinear regression function to estimate the tip's radius of curvature. The results show that the estimated radii of the coated tips fall within the range of nominal values provided by the tip manufacturer, while the estimated radius of the uncoated Si tip is larger than the nominal radius because of tip blunting during the scan. However, the method yields an accurate estimate of the tip radius, with a low root mean squared error in the curve fits.
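
For illustration, one simple way to estimate a radius of curvature from such a profile is an algebraic (Kåsa) circle fit, shown below on synthetic apex data; the paper's own nonlinear regression function is not reproduced here:

```python
# Sketch: estimating a tip radius by least-squares circle fit (Kasa fit) to
# the curved apex of a scan profile. Profile data are synthetic stand-ins for
# the measured edge-corner profile.
import numpy as np

rng = np.random.default_rng(0)
R_true = 25.0                                        # nm; synthetic tip radius
theta = np.linspace(np.pi / 3, 2 * np.pi / 3, 80)    # apex arc only
x = R_true * np.cos(theta) + rng.normal(0, 0.2, theta.size)
z = R_true * np.sin(theta) + rng.normal(0, 0.2, theta.size)

# Kasa fit: x^2 + z^2 = 2 a x + 2 b z + c, which is linear in (a, b, c);
# the radius follows from R^2 = c + a^2 + b^2.
A = np.column_stack([2 * x, 2 * z, np.ones_like(x)])
a, b, c = np.linalg.lstsq(A, x**2 + z**2, rcond=None)[0]
R_est = np.sqrt(c + a**2 + b**2)
print(f"estimated tip radius: {R_est:.1f} nm (true {R_true} nm)")
```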

14.
Environ Monit Assess ; 196(1): 53, 2023 Dec 18.
Article in English | MEDLINE | ID: mdl-38110584

ABSTRACT

Soil contamination around smelting sites shows high spatial heterogeneity. This study investigated the impacts of distance, land use/cover types, land slopes, wind direction, and soil properties on the distribution and ecological risk of trace metals in the soil around a copper smelter. The results demonstrated that the average concentrations of As, Cd, Cu, Pb, and Zn were 248.0, 16.8, 502.4, 885.6, and 250.2 mg kg⁻¹, respectively, all higher than their background values. The hotspots of trace metals were primarily distributed in the soil of smelting production areas, runoff pollution areas, and areas along the dominant wind direction. The concentrations of trace metals decreased with distance to the smelting production area. An exponential decay regression revealed that, depending on the metal species, the influence distances of smelting emissions on trace metals in soil ranged from 450 to 1000 m. Land use/cover types and land slopes significantly affected trace element concentrations in the soil around the smelter. High concentrations of trace metals were observed in farmland, grassland, and flatland areas. The average concentrations of trace metals in the soil decreased in the order flat land > gentle slope > steep slope. Soil pH values were significantly positively correlated with Cd, Cu, Pb, Zn, and As, and soil organic matter (SOM) was significantly positively correlated with Cd, Pb, and Zn. Trace metals in the soil of the study area posed a significant ecological risk. The primary factors influencing the distribution of ecological risk, as determined by the Ctree analysis, were land slope, soil pH, and distance to the source. These results can support the rapid identification of high-risk sites and facilitate risk prevention and control around smelting sites.
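
A sketch of the exponential-decay regression described here, including how an influence distance can be read off the fitted decay length (concentrations are synthetic stand-ins):

```python
# Sketch: exponential-decay regression of soil metal concentration against
# distance to the smelter, and the implied influence distance where the
# smelter's contribution falls to a small fraction. Values are synthetic.
import numpy as np
from scipy.optimize import curve_fit

d = np.array([50, 100, 200, 400, 600, 800, 1200], float)   # distance (m)
c = np.array([900, 650, 380, 160, 90, 60, 45], float)      # Pb (mg/kg), synthetic

def decay(d, c0, lam, bg):
    return c0 * np.exp(-d / lam) + bg        # bg ~ background concentration

(c0, lam, bg), _ = curve_fit(decay, d, c, p0=[1000, 300, 40])
influence = -lam * np.log(0.05)              # excess drops to 5% of c0
print(f"decay length {lam:.0f} m, influence distance ~{influence:.0f} m")
```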


Subjects
Heavy Metals, Soil Pollutants, Trace Elements, Soil/chemistry, Heavy Metals/analysis, Copper/analysis, Environmental Monitoring/methods, Cadmium/analysis, Lead/analysis, Soil Pollutants/analysis, Risk Assessment, Trace Elements/analysis, China
15.
Math Biosci Eng ; 20(9): 16045-16059, 2023 08 07.
Article in English | MEDLINE | ID: mdl-37920002

ABSTRACT

In this study, prompted by multiple dengue fever cases at two locations in Haikou, Hainan, several factors affecting the transmission of dengue fever in Haikou in 2019 were analyzed. Dengue fever was found to have spread from two sites: a construction site, which was an epidemic site in Haikou, and a university, where only four confirmed cases were reported. Comparative analysis revealed that the important factors affecting the spread of dengue fever in Haikou were environmental hygiene status, popularization of knowledge about dengue fever, educational background, medical insurance coverage, knowledge of the free treatment policy, and active response by the government.


Subjects
Dengue, Epidemics, Humans, Dengue/epidemiology, Environment, Cities/epidemiology
16.
Comput Med Imaging Graph ; 110: 102314, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37988845

ABSTRACT

In this paper, we address the problem of estimating remaining surgery duration (RSD) from surgical video frames. We propose a Bayesian long short-term memory (LSTM) network-based Deep Negative Correlation Learning approach called BD-Net for accurate regression of RSD prediction as well as estimation of prediction uncertainty. Our method aims to extract discriminative visual features from surgical video frames and model the temporal dependencies among frames to improve the RSD prediction accuracy. To this end, we propose to train an ensemble of Bayesian LSTMs on top of a backbone network by the way of deep negative correlation learning (DNCL). More specifically, we deeply learn a pool of decorrelated Bayesian regressors with sound generalization capabilities through managing their intrinsic diversities. BD-Net is simple and efficient. After training, it can produce both RSD prediction and uncertainty estimation in a single inference run. We demonstrate the efficacy of BD-Net on publicly available datasets of two different types of surgeries: one containing 101 cataract microscopic surgeries with short durations and the other containing 80 cholecystectomy laparoscopic surgeries with relatively longer durations. Experimental results on both datasets demonstrate that the proposed BD-Net achieves better results than the state-of-the-art (SOTA) methods. A reference implementation of our method can be found at: https://github.com/jywu511/BD-Net.
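
A minimal sketch of the negative correlation learning penalty that underlies DNCL, computed for a toy ensemble (this is the generic NCL loss, not the full Bayesian-LSTM BD-Net):

```python
# Sketch of the negative correlation learning (NCL) penalty that decorrelates
# an ensemble of regressors: each member is trained on its squared error plus
# a term rewarding disagreement with the ensemble mean.
import numpy as np

def ncl_member_losses(preds, y, lam=0.5):
    """preds: (n_members, n_samples) predictions; y: (n_samples,) targets."""
    ens_mean = preds.mean(axis=0)
    mse_term = (preds - y) ** 2
    # NCL penalty: (f_i - mean) * sum_{j!=i}(f_j - mean) = -(f_i - mean)^2,
    # so subtracting lam * (f_i - mean)^2 implements the standard form.
    diversity = (preds - ens_mean) ** 2
    return (mse_term - lam * diversity).mean(axis=1)

preds = np.array([[10.2, 11.0, 9.5], [9.8, 10.5, 10.1], [10.0, 10.8, 9.9]])
y = np.array([10.0, 10.7, 9.8])
print(ncl_member_losses(preds, y))   # per-member training loss
```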


Subjects
Learning, Bayes Theorem, Uncertainty
17.
BMC Bioinformatics ; 24(1): 393, 2023 Oct 19.
Article in English | MEDLINE | ID: mdl-37858091

ABSTRACT

BACKGROUND: An important problem in toxicology in the context of gene expression data is the simultaneous inference of a large number of concentration-response relationships. The quality of the inference substantially depends on the choice of design of the experiments, in particular, on the set of different concentrations, at which observations are taken for the different genes under consideration. As this set has to be the same for all genes, the efficient planning of such experiments is very challenging. We address this problem by determining efficient designs for the simultaneous inference of a large number of concentration-response models. For that purpose, we both construct a D-optimality criterion for simultaneous inference and a K-means procedure which clusters the support points of the locally D-optimal designs of the individual models. RESULTS: We show that a planning of experiments that addresses the simultaneous inference of a large number of concentration-response relationships yields a substantially more accurate statistical analysis. In particular, we compare the performance of the constructed designs to the ones of other commonly used designs in terms of D-efficiencies and in terms of the quality of the resulting model fits using a real data example dealing with valproic acid. For the quality comparison we perform an extensive simulation study. CONCLUSIONS: The design maximizing the D-optimality criterion for simultaneous inference improves the inference of the different concentration-response relationships substantially. The design based on the K-means procedure also performs well, whereas a log-equidistant design, which was also included in the analysis, performs poorly in terms of the quality of the simultaneous inference. Based on our findings, the D-optimal design for simultaneous inference should be used for upcoming analyses dealing with high-dimensional gene expression data.
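
In standard optimal-design notation, a compound D-criterion for simultaneous inference over G concentration-response models can be written as below (this is the textbook construction; the paper's exact weighting may differ):

```latex
% Compound locally D-optimal design over G gene-specific models \eta_g with
% parameters \theta_g: choose the design \xi (support concentrations x_i with
% weights \xi_i) maximizing a weighted sum of log-determinants of the
% per-model information matrices.
\xi^{*} = \arg\max_{\xi}\; \sum_{g=1}^{G} w_g \,\log \det M_g(\xi, \theta_g),
\qquad
M_g(\xi,\theta_g) = \sum_{i} \xi_i\,
  \frac{\partial \eta_g(x_i,\theta_g)}{\partial \theta_g}\,
  \frac{\partial \eta_g(x_i,\theta_g)}{\partial \theta_g^{\top}}
```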


Subjects
Research Design, Computer Simulation
18.
Methods Enzymol ; 690: 131-157, 2023.
Article in English | MEDLINE | ID: mdl-37858528

ABSTRACT

A common mantra in drug discovery is that "You get what you screen for." This is not a promise that you will always get an effective drug candidate, but rather a warning that inaccuracies in your protocol for screening will more likely produce a compound that fails to be an effective candidate because it matches the properties of your screen, not the desired features of an ideal lead compound. It is with this in mind that we examine the current protocols for evaluating drug candidates and highlight some deficiencies while pointing the way to better methods. Many of the errors in data fitting can be rectified by abandoning the traditional equation-based data fitting methods and adopting the more rigorous mechanism-based fitting afforded by computer simulation based on numerical integration of rate equations. Using these methods bypasses the errors in judgement in choosing the appropriate equation for data fitting and the approximations required to derive those equations. In this chapter we outline the limitations and systematic errors in conventional methods of data fitting and illustrate the advantages of computer simulation and introduce the methods of analysis.
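
A minimal sketch of mechanism-based fitting by numerical integration: the rate equations for a toy binding mechanism are integrated with scipy and the rate constants fitted directly to a progress curve, bypassing any closed-form equation (mechanism and values are illustrative):

```python
# Sketch of mechanism-based fitting: numerically integrate the rate equations
# for E + S <-> ES and fit the rate constants to a progress curve, instead of
# fitting a derived closed-form equation. Mechanism and numbers are
# illustrative, not a specific system from the chapter.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import curve_fit

def progress_curve(t, kon, koff, E0=1.0, S0=10.0):
    def rhs(_, y):
        E, S, ES = y
        v = kon * E * S - koff * ES
        return [-v, -v, v]
    sol = solve_ivp(rhs, (0, t.max()), [E0, S0, 0.0], t_eval=t, rtol=1e-8)
    return sol.y[2]                      # [ES](t), the observed signal

t = np.linspace(0, 5, 40)
data = progress_curve(t, 0.8, 0.3) + np.random.default_rng(0).normal(0, 0.01, t.size)
(kon, koff), _ = curve_fit(progress_curve, t, data, p0=[0.5, 0.5])
print(f"kon = {kon:.2f}, koff = {koff:.2f}")
```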


Subjects
Drug Discovery, Research Design, Computer Simulation
19.
Bio Protoc ; 13(18): e4820, 2023 Sep 20.
Article in English | MEDLINE | ID: mdl-37753469

ABSTRACT

Information on RNA localisation is essential for understanding physiological and pathological processes, such as gene expression, cell reprogramming, host-pathogen interactions, and signalling pathways involving RNA transactions at the level of membrane-less or membrane-bounded organelles and extracellular vesicles. In many cases, it is important to assess the topology of RNA localisation, i.e., to distinguish the transcripts encapsulated within an organelle of interest from those merely attached to its surface. This allows establishing which RNAs can, in principle, engage in local molecular interactions and which are prevented from interacting by membranes or other physical barriers. The most widely used techniques interrogating RNA localisation topology are based on the treatment of isolated organelles with RNases with subsequent identification of the surviving transcripts by northern blotting, qRT-PCR, or RNA-seq. However, this approach produces incoherent results and many false positives. Here, we describe Controlled Level of Contamination coupled to deep sequencing (CoLoC-seq), a more refined subcellular transcriptomics approach that overcomes these pitfalls. CoLoC-seq starts by the purification of organelles of interest. They are then either left intact or lysed and subjected to a gradient of RNase concentrations to produce unique RNA degradation dynamics profiles, which can be monitored by northern blotting or RNA-seq. Through straightforward mathematical modelling, CoLoC-seq distinguishes true membrane-enveloped transcripts from degradable and non-degradable contaminants of any abundance. The method has been implemented in the mitochondria of HEK293 cells, where it outperformed alternative subcellular transcriptomics approaches. It is applicable to other membrane-bounded organelles, e.g., plastids, single-membrane organelles of the vesicular system, extracellular vesicles, or viral particles.

Key features:
• Tested on human mitochondria; potentially applicable to cell cultures, non-model organisms, extracellular vesicles, enveloped viruses, and tissues; does not require genetic manipulations or highly pure organelles.
• In the case of human cells, the required amount of starting material is ~2,500 cm² of 80% confluent cells (or ~3 × 10⁸ HEK293 cells).
• CoLoC-seq implements a special RNA-seq strategy to selectively capture intact transcripts, which requires RNases generating 5'-hydroxyl and 2'/3'-phosphate termini (e.g., RNase A, RNase I).
• Relies on nonlinear regression software with customisable exponential functions.
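
For illustration, the kind of exponential degradation model such an analysis rests on: across the RNase gradient, encapsulated transcripts retain a protected plateau while contaminants decay toward zero (functional form and numbers are assumptions, not the protocol's exact model):

```python
# Sketch of the CoLoC-seq modelling idea: transcript abundance across an RNase
# concentration gradient follows exponential decay; membrane-protected
# transcripts in intact organelles keep a protected plateau, while surface
# contaminants decay to zero. Form and numbers are illustrative.
import numpy as np
from scipy.optimize import curve_fit

rnase = np.array([0, 0.01, 0.03, 0.1, 0.3, 1.0, 3.0])   # RNase conc. (a.u.)

def abundance(r, protected_frac, k):
    return protected_frac + (1 - protected_frac) * np.exp(-k * r)

signal = abundance(rnase, 0.9, 8.0) * (
    1 + np.random.default_rng(0).normal(0, 0.02, rnase.size))
(pf, k), _ = curve_fit(abundance, rnase, signal,
                       p0=[0.5, 1.0], bounds=([0, 0], [1, np.inf]))
print(f"protected fraction ~{pf:.2f}")   # near 1 for encapsulated transcripts
```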

20.
Bioengineering (Basel) ; 10(8)2023 Aug 06.
Article in English | MEDLINE | ID: mdl-37627818

ABSTRACT

Microarray gene expression-based detection and classification of medical conditions have been prominent in research over the past few decades. However, extracting relevant data from high-volume microarray gene expression measurements, with their inherent nonlinearity and inseparable noise components, raises significant challenges for data classification and disease detection. The dataset used for the research is the Lung Harvard 2 (LH2) dataset, which consists of 150 Adenocarcinoma subjects and 31 Mesothelioma subjects. The paper proposes a two-level strategy involving feature extraction and selection before the classification step. The feature extraction step utilizes the Short-Time Fourier Transform (STFT), and the feature selection step employs the Particle Swarm Optimization (PSO) and Harmony Search (HS) metaheuristics. The classifiers employed are Nonlinear Regression, Gaussian Mixture Model, Softmax Discriminant, Naive Bayes, SVM (Linear), SVM (Polynomial), and SVM (RBF). The classification results for the two-level extracted features are compared with raw-data classification results, including a Convolutional Neural Network (CNN) methodology. Among the methods, STFT with PSO feature selection and the SVM (RBF) classifier produced the highest accuracy, 94.47%.
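
A hedged sketch of the two-level idea on synthetic data: STFT summary features feeding an RBF-kernel SVM, with the PSO/HS selection stage omitted (nothing here reproduces the LH2 data or the reported accuracy):

```python
# Sketch of the two-level pipeline: STFT-based feature extraction followed by
# an RBF-kernel SVM. The PSO/HS feature-selection step is omitted; dataset
# values are random stand-ins for the LH2 expression profiles.
import numpy as np
from scipy.signal import stft
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
expr = rng.normal(size=(181, 1024))            # 181 subjects x expression values
labels = np.r_[np.zeros(150), np.ones(31)]     # adenocarcinoma vs mesothelioma

def stft_features(row):
    _, _, Z = stft(row, nperseg=64)
    mag = np.abs(Z)
    return np.r_[mag.mean(axis=1), mag.std(axis=1)]   # per-band summary stats

X = np.apply_along_axis(stft_features, 1, expr)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
print(cross_val_score(clf, X, labels, cv=5).mean())
```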
