Results 1 - 20 of 76
1.
Funct Integr Genomics ; 23(1): 47, 2023 Jan 24.
Article in English | MEDLINE | ID: mdl-36692535

ABSTRACT

Climate change seriously impacts global agriculture, with rising temperatures directly affecting yield. Vegetables are an essential part of the daily human diet and are therefore important among all agricultural crops. As the human population grows, alternative approaches are needed to maximize the harvestable yield of vegetables. Rising temperature directly affects plants' biochemical and molecular processes, with a significant impact on quality and yield. Breeding climate-resilient crops with good yields takes a long time and substantial breeding effort. However, with the advent of omics technologies such as genomics, transcriptomics, proteomics, and metabolomics, the efficiency of uncovering pathways associated with high-temperature stress resilience has improved in many vegetable crops. Besides omics, genomics-assisted breeding and new breeding approaches such as gene editing and speed breeding allow the creation of modern vegetable cultivars that are more resilient to high temperatures. Collectively, these approaches will shorten the time needed to create and release novel vegetable varieties that meet growing demands for productivity and quality. This review discusses the effects of heat stress on vegetables and highlights recent research, focusing on how omics and genome editing can produce temperature-resilient vegetables more efficiently and faster.


Subjects
Plant Breeding , Vegetables , Humans , Vegetables/genetics , Crops, Agricultural/genetics , Genomics , Proteomics
2.
Mol Biol Rep ; 50(8): 6783-6793, 2023 Aug.
Article in English | MEDLINE | ID: mdl-37392286

ABSTRACT

BACKGROUND: Bacterial diseases are a major threat to tomato production. During infection, pathogens affect the biochemical, oxidative, and molecular properties of tomato. It is therefore necessary to study the antioxidant enzymes, oxidation state, and genes involved during bacterial infection in tomato. METHODS AND RESULTS: Different bioinformatic analyses were performed for homology and gene promoter analysis and to determine protein structure. Antioxidant, MDA, and H2O2 responses were measured in the Falcon, Rio Grande, and Sazlica tomato cultivars. In this study, the RNA Polymerase II (RNAP) C-Terminal Domain Phosphatase-like 3 (SlCPL-3) gene was identified and characterized. It contains 11 exons and encodes two protein domains, CPDCs and BRCT. The online bioinformatic tools SOPMA and Phyre2 were used to predict secondary structure, the CASTp web-based tool was used to identify protein pockets, and NetPhos and PONDR were used to predict phosphorylation sites and protein disordered regions. Promoter analysis revealed that SlCPL-3 is involved in defense-related mechanisms. We further amplified and sequenced two different regions of SlCPL-3; they showed homology to the reference tomato genome. Our results showed that the SlCPL-3 gene was triggered during bacterial stress, with expression upregulated at different time intervals. Rio Grande showed a high level of SlCPL-3 expression after 72 hpi. Biochemical and gene expression analyses showed that under biotic stress the Rio Grande cultivar is more sensitive to Pst DC3000. CONCLUSION: This study lays a solid foundation for the functional characterization of the SlCPL-3 gene in tomato cultivars. These findings will support further analysis of SlCPL-3 and may help in the development of resilient tomato cultivars.


Subjects
Solanum lycopersicum , Solanum lycopersicum/genetics , RNA Polymerase II/genetics , Antioxidants , Phosphoric Monoester Hydrolases/genetics , Hydrogen Peroxide/metabolism , Stress, Physiological/genetics , Plant Diseases/genetics , Plant Diseases/microbiology , Gene Expression Regulation, Plant/genetics
3.
Sensors (Basel) ; 23(16)2023 Aug 14.
Article in English | MEDLINE | ID: mdl-37631699

ABSTRACT

In the era of interconnected and intelligent cyber-physical systems (CPSs), preserving privacy has become a paramount concern, and ensuring the security and privacy of sensitive information within CPSs poses significant challenges. This paper presents a proof-of-concept (PoC) design that leverages consortium blockchain technology to address these privacy challenges. The proposed design introduces a novel approach to safeguarding sensitive information and ensuring data integrity while maintaining a high level of trust among stakeholders. By harnessing consortium blockchain, the design establishes a decentralized and tamper-resistant framework for privacy preservation. A consortium blockchain, with its permissioned nature, provides a trusted framework for governing the network and validating transactions. By employing it, secrets in CPSs can be securely stored, shared, and accessed by authorized entities only, mitigating the risks of unauthorized access and data breaches. The proposed approach offers enhanced security, privacy preservation, increased trust and accountability, as well as interoperability and scalability. This paper aims to address the limitations of traditional security mechanisms in CPSs and harness the potential of consortium blockchain to revolutionize the management of secrets, contributing to the advancement of CPS security and privacy. The effectiveness of the design is demonstrated through extensive simulations and performance evaluations. The results indicate that the proposed approach offers significant advancements in privacy protection, paving the way for secure and trustworthy cyber-physical systems in various domains.
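The two properties the abstract leans on, a permissioned validator set and tamper resistance, can be illustrated with a minimal hash-chained log. This is our own toy sketch, not the paper's design; the class and method names are invented for illustration.

```python
import hashlib
import json

class ConsortiumLedger:
    """Toy permissioned ledger: only authorized validators may append,
    and every block commits to the hash of its predecessor."""

    def __init__(self, validators):
        self.validators = set(validators)
        self.chain = [{"index": 0, "prev": "0" * 64, "data": "genesis"}]

    def _hash(self, block):
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    def append(self, data, validator):
        # Permissioned consortium: reject writes from non-members.
        if validator not in self.validators:
            raise PermissionError("unauthorized validator")
        block = {"index": len(self.chain),
                 "prev": self._hash(self.chain[-1]),
                 "data": data}
        self.chain.append(block)
        return block

    def verify(self):
        # Recompute every link; tampering with any block breaks the chain.
        return all(self.chain[i]["prev"] == self._hash(self.chain[i - 1])
                   for i in range(1, len(self.chain)))
```

In a real CPS deployment the `data` field would hold only references or commitments to secrets, never the secrets themselves.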

4.
Sensors (Basel) ; 23(17)2023 Aug 28.
Article in English | MEDLINE | ID: mdl-37687931

ABSTRACT

Precision medicine has emerged as a transformative approach to healthcare, aiming to deliver personalized treatments and therapies tailored to individual patients. However, the realization of precision medicine relies heavily on the availability of comprehensive and diverse medical data. In this context, blockchain-enabled federated learning, coupled with electronic medical records (EMRs), presents a groundbreaking solution to unlock revolutionary insights in precision medicine. This abstract explores the potential of blockchain technology to empower precision medicine by enabling secure and decentralized data sharing and analysis. By leveraging blockchain's immutability, transparency, and cryptographic protocols, federated learning can be conducted on distributed EMR datasets without compromising patient privacy. The integration of blockchain technology ensures data integrity, traceability, and consent management, thereby addressing critical concerns associated with data privacy and security. Through the federated learning paradigm, healthcare institutions and research organizations can collaboratively train machine learning models on locally stored EMR data, without the need for data centralization. The blockchain acts as a decentralized ledger, securely recording the training process and aggregating model updates while preserving data privacy at its source. This approach allows the discovery of patterns, correlations, and novel insights across a wide range of medical conditions and patient populations. By unlocking revolutionary insights through blockchain-enabled federated learning and EMRs, precision medicine can revolutionize healthcare delivery. This paradigm shift has the potential to improve diagnosis accuracy, optimize treatment plans, identify subpopulations for clinical trials, and expedite the development of novel therapies. Furthermore, the transparent and auditable nature of blockchain technology enhances trust among stakeholders, enabling greater collaboration, data sharing, and collective intelligence in the pursuit of advancing precision medicine. In conclusion, this abstract highlights the transformative potential of blockchain-enabled federated learning in empowering precision medicine. By unlocking revolutionary insights from diverse and distributed EMR datasets, this approach paves the way for a future where healthcare is personalized, efficient, and tailored to the unique needs of each patient.
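The "aggregating model updates" step described above is typically federated averaging (FedAvg): each institution trains locally, and only parameter vectors, weighted by local dataset size, are combined. A minimal sketch under our own naming, not the paper's implementation:

```python
def fed_avg(client_params, client_sizes):
    """Combine locally trained parameter vectors by a dataset-size-weighted
    average; raw EMR records never leave the institutions."""
    total = sum(client_sizes)
    n = len(client_params[0])
    return [sum(p[i] * s for p, s in zip(client_params, client_sizes)) / total
            for i in range(n)]

# Two hospitals, the second with 3x the data, so its parameters count 3x.
global_params = fed_avg([[1.0, 2.0], [3.0, 4.0]], [1, 3])  # -> [2.5, 3.5]
```

In the blockchain-enabled variant, each round's update hashes would be recorded on the ledger for auditability; that bookkeeping is omitted here.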


Subjects
Blockchain , Precision Medicine , Humans , Electronic Health Records , Information Dissemination , Power, Psychological
5.
Adv Anat Pathol ; 29(4): 241-251, 2022 Jul 01.
Article in English | MEDLINE | ID: mdl-35249993

ABSTRACT

Metastases to the kidney are rare and were historically described in autopsy series, and the incidence ranged between 2.36% and 12.6%. However, in the contemporary literature with the improvements in imaging modalities (computed tomography scan and magnetic resonance imaging) and other health care screening services, metastatic tumors to the kidney are being diagnosed more frequently in surgical specimens. The utility of needle core biopsies in the primary evaluation of renal masses has also increased the number of sampled metastases, and as a result, only limited histologic material is available for evaluation in some cases and may potentially lead to diagnostic pitfalls. In the last decade, a few large clinical series have been published. In these series, the majority of metastatic tumors to the kidney are carcinomas, with the lung being the most common primary site. A significant number of the various tumor types with metastasis to the kidney are also associated with widespread metastases to other organs, and the renal metastasis may present several years after diagnosis of the primary tumor. The majority of secondary tumors of the kidney are asymptomatic, incidentally discovered, and solitary. There should be a high index of suspicion of metastasis to the kidney in patients with an associated enlarging renal lesion with minimal to no enhancement on imaging and tumor progression of a known high-grade nonrenal malignancy. Secondary tumors of the kidney can be accurately diagnosed by correlating histopathologic features with clinical and radiographic findings and the judicious use of ancillary studies.


Subjects
Carcinoma , Kidney Neoplasms , Carcinoma/pathology , Humans , Kidney/diagnostic imaging , Kidney/pathology , Kidney Neoplasms/diagnosis , Kidney Neoplasms/pathology , Tomography, X-Ray Computed
6.
Sensors (Basel) ; 22(8)2022 Apr 13.
Article in English | MEDLINE | ID: mdl-35458962

ABSTRACT

Emotions are an essential part of daily human communication. The emotional states and dynamics of the brain can be linked to electroencephalography (EEG) signals, which can be used by a Brain-Computer Interface (BCI) to provide better human-machine interaction. Several studies have been conducted in the field of emotion recognition. However, one of the most important issues facing emotion recognition from EEG signals is recognition accuracy. This paper proposes a deep learning-based approach for emotion recognition through EEG signals, comprising data selection, feature extraction, feature selection, and classification phases. This research serves the medical field, as an emotion recognition model helps diagnose psychological and behavioral disorders, and improving its accuracy in turn aids correct medical decisions. The standard pre-processed Database for Emotion Analysis using Physiological signals (DEAP) was used in this work. Statistical features, wavelet features, and the Hurst exponent were extracted from the dataset. Feature selection was implemented through the Binary Grey Wolf Optimizer. At the classification stage, a stacked bi-directional Long Short-Term Memory (Bi-LSTM) model was used to recognize human emotions. Emotions are classified into three main classes: arousal, valence, and liking. The proposed approach achieved high accuracy compared to methods used in past studies, with average accuracies of 99.45%, 96.87%, and 99.68% for valence, arousal, and liking, respectively.
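The statistical-feature stage mentioned above can be sketched for a single EEG channel. The feature names below are our own illustrative choices, not the paper's exact feature set (which also includes wavelet features and the Hurst exponent):

```python
import statistics

def channel_features(signal):
    """Illustrative per-channel statistical features of the kind extracted
    from an EEG signal before feature selection."""
    mean = statistics.fmean(signal)
    std = statistics.pstdev(signal)
    # Mean absolute first difference: a crude proxy for signal activity.
    mean_abs_diff = statistics.fmean(abs(b - a) for a, b in zip(signal, signal[1:]))
    return {"mean": mean, "std": std, "mean_abs_diff": mean_abs_diff}
```

A feature selector such as the Binary Grey Wolf Optimizer would then keep only the subset of such features that maximizes classification accuracy.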


Subjects
Brain-Computer Interfaces , Deep Learning , Electroencephalography , Emotions , Memory, Short-Term
7.
Sensors (Basel) ; 22(9)2022 Apr 28.
Article in English | MEDLINE | ID: mdl-35591061

ABSTRACT

Web applications have become ubiquitous in many business sectors due to their platform independence and low operating cost, and billions of users visit them to accomplish daily tasks. However, many of these applications are either vulnerable to web defacement attacks or are created and managed by hackers, as with fraudulent and phishing websites. Detecting malicious websites is essential to prevent the spread of malware and protect end-users from becoming victims. Most existing solutions rely on extracting features from a website's content, which can be harmful to the detection machines themselves and is subject to obfuscation. Detecting malicious Uniform Resource Locators (URLs) is safer and more efficient than content analysis, but it is still not well addressed due to insufficient features and inaccurate classification. This study aims to improve detection accuracy by designing and developing a cyber threat intelligence (CTI)-based malicious URL detection model using two-stage ensemble learning. Reports from cybersecurity analysts and users around the globe provide important information regarding malicious websites, so CTI-based features extracted from Google searches and Whois records are used to improve detection performance. The study also proposes a two-stage ensemble learning model that combines the random forest (RF) algorithm for pre-classification with a multilayer perceptron (MLP) for final decision making. The trained MLP classifier replaces the majority-voting scheme of the three trained random forest classifiers: the probabilistic outputs of the random forests' weak classifiers are aggregated and used as input to the MLP for final classification. Results show that the extracted CTI-based features with two-stage classification outperform other studies' detection models. The proposed CTI-based detection model achieved a 7.8% accuracy improvement and a 6.7% reduction in false-positive rate compared with the traditional URL-based model.
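The two-stage structure, probabilistic pre-classifiers feeding a meta-classifier instead of majority voting, can be sketched with plain callables. The stand-in classifiers and feature names below are assumptions for illustration, not the paper's trained models:

```python
def two_stage_predict(url_features, forests, meta):
    """Stage 1: each pre-classifier emits a probability that the URL is
    malicious. Stage 2: a meta-classifier makes the final decision from
    the stacked probabilities, replacing majority voting."""
    probs = [f(url_features) for f in forests]
    return meta(probs)

# Stand-ins for trained random forests (invented rules, not the paper's):
forests = [
    lambda x: 0.9 if x["suspicious_tokens"] else 0.2,
    lambda x: 0.8 if x["domain_age_days"] < 30 else 0.1,
    lambda x: 0.7 if not x["in_whois"] else 0.2,
]
# Stand-in for the trained MLP: threshold on the mean probability.
meta = lambda probs: int(sum(probs) / len(probs) > 0.5)
```

The design point is that the meta-classifier can learn how much to trust each pre-classifier, which a fixed majority vote cannot.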


Subjects
Machine Learning , Neural Networks, Computer , Algorithms , Computer Security , Intelligence
8.
Sensors (Basel) ; 22(9)2022 Apr 19.
Article in English | MEDLINE | ID: mdl-35590801

ABSTRACT

Data streaming applications such as the Internet of Things (IoT) require processing or predicting from sequential sensor data. However, most of the data are unlabeled, making fully supervised learning algorithms inapplicable. The online manifold regularization approach allows sequential learning from partially labeled data, which is useful in environments with scarcely labeled data. Unfortunately, manifold regularization does not work out of the box, as it requires determining the radial basis function (RBF) kernel width parameter. This parameter directly impacts performance, as it informs the model of the class to which each piece of data most likely belongs. The width parameter is often determined off-line via hyperparameter search, which requires a vast amount of labeled data; this limits its utility in applications where labeled data are hard to collect, such as data stream mining. To address this issue, we proposed eliminating the RBF kernel from manifold regularization altogether by combining it with a prototype learning method, which uses a finite set of prototypes to approximate the entire data set. Instead of relying on the RBF kernel, this approach queries the prototype-based learner for the samples most similar to each sample, so the RBF kernel is no longer needed, which improves practicality. In experiments on benchmark data sets, the proposed approach learned faster and achieved higher classification performance than other manifold regularization techniques, performing well even without the RBF kernel.
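The core substitution, a similarity query against a finite prototype set in place of an RBF kernel evaluation, can be sketched as a nearest-prototype lookup. This is our own minimal illustration, not the paper's learner:

```python
def nearest_prototypes(x, prototypes, k=2):
    """Return the k prototypes most similar to x (Euclidean distance).
    The prototype set is a small, fixed-size approximation of the full
    data stream, so no kernel width parameter is needed."""
    dist = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5
    return sorted(prototypes, key=lambda p: dist(x, p))[:k]
```

The manifold regularizer would then smooth predictions over these retrieved neighbours rather than over RBF-weighted pairs.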


Subjects
Internet of Things , Supervised Machine Learning , Algorithms , Benchmarking , Data Mining
9.
Sensors (Basel) ; 22(7)2022 Apr 06.
Article in English | MEDLINE | ID: mdl-35408423

ABSTRACT

A vehicular ad hoc network (VANET) is an emerging technology that improves road safety, traffic efficiency, and passenger comfort. VANET applications rely on co-operation among vehicles, which periodically share context information such as position, speed, and acceleration at a high rate due to high vehicle mobility. However, rogue nodes that exploit this co-operative feature and share false messages can disrupt the fundamental operations of any potential application and cause loss of life and property. Unfortunately, most current solutions cannot effectively detect rogue nodes due to continuous context change and their failure to consider dynamic data uncertainty during identification. Although a few context-aware solutions have been proposed for VANETs, most are data-centric: a vehicle is considered malicious if it shares false or inaccurate messages. Such a rule is fuzzy and not consistently accurate given the dynamic uncertainty of the vehicular context, which leads to a poor detection rate. To this end, this study proposes a fuzzy-based context-aware detection model to improve overall detection performance. A fuzzy inference system is constructed to evaluate vehicles based on the information they generate, and its output is used to build a dynamic context reference. Vehicles are classified as honest or rogue nodes based on the deviation of their evaluation scores from the context reference. Extensive experiments were carried out to evaluate the proposed model. Results show that it outperforms state-of-the-art models, achieving a 7.88% improvement in overall performance and a 16.46% improvement in detection rate compared to the state-of-the-art model. The proposed model can be used to evict rogue nodes and thus improve the safety and traffic efficiency of crewed or uncrewed vehicles designed for land, naval, or air environments.
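The deviation-based classification step can be sketched with a standard triangular membership function and a threshold rule. The function names and the tolerance rule are our own illustration; the paper's actual fuzzy rule base is more elaborate:

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership: 0 outside [a, c], peaking at 1 when x == b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify_vehicle(score, context_ref, tolerance):
    """Flag a vehicle as rogue when its fuzzy evaluation score deviates from
    the dynamic context reference by more than `tolerance` (illustrative rule)."""
    return "rogue" if abs(score - context_ref) > tolerance else "honest"
```

Because `context_ref` is recomputed from recent traffic, the same absolute score can be honest in one context and rogue in another, which is the context-aware part of the design.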

10.
Sensors (Basel) ; 22(10)2022 May 10.
Article in English | MEDLINE | ID: mdl-35632016

ABSTRACT

The Internet of Things (IoT) is a widely used technology in automated network systems across the world, and in recent years it has had a substantial impact on many industries. Many IoT nodes collect, store, and process personal data, which is an ideal target for attackers. Several researchers have worked on this problem and presented many intrusion detection systems (IDSs), but existing systems have difficulty improving performance and identifying subcategories of cyberattacks. This paper proposes a deep-convolutional-neural-network (DCNN)-based IDS consisting of two convolutional layers and three fully connected dense layers. The proposed model aims to improve performance while reducing computational requirements. Experiments were conducted on the IoTID20 dataset, and the performance of the proposed model was analysed with several metrics, such as accuracy, precision, recall, and F1-score. A number of optimization techniques were applied to the proposed model, among which Adam, AdaMax, and Nadam performed best. In addition, the proposed model was compared with various advanced deep learning (DL) and traditional machine learning (ML) techniques. All experimental analysis indicates that the accuracy of the proposed approach is high and more robust than existing DL-based algorithms.


Subjects
Internet of Things , Algorithms , Machine Learning , Neural Networks, Computer
11.
Int J Mol Sci ; 23(21)2022 Oct 30.
Article in English | MEDLINE | ID: mdl-36362018

ABSTRACT

Determining and modeling the possible behaviour and actions of molecules requires investigating the basic structural features and physicochemical properties that determine their behaviour during chemical, physical, biological, and environmental processes. Computational approaches such as machine learning methods are alternatives for predicting the physicochemical properties of molecules based on their structures. However, the limited accuracy and high error rates of such predictions restrict their use. In this paper, a novel technique based on a deep learning convolutional neural network (CNN) for predicting the bioactivity of chemical compounds is proposed and developed. The molecules are represented in the new matrix format Mol2mat, a molecular matrix representation adapted from the well-known 2D-fingerprint descriptors. To evaluate the performance of the proposed methods, a series of experiments was conducted on two standard datasets, the MDL Drug Data Report (MDDR) and Sutherland datasets, comprising 10 homogeneous and 14 heterogeneous activity classes. After analysing eight fingerprints, all probable combinations of the five best descriptors were investigated. The results showed that a combination of three fingerprints, ECFP4, EPFP4, and ECFC4, with a CNN activity-prediction process achieved the highest performance, 98% AUC, compared to the state-of-the-art ML algorithms NaiveB, LSVM, and RBFN.


Subjects
Machine Learning , Neural Networks, Computer , Algorithms
12.
Sensors (Basel) ; 22(1)2021 Dec 28.
Article in English | MEDLINE | ID: mdl-35009725

ABSTRACT

Due to the wide availability and usage of connected devices in Internet of Things (IoT) networks, the number of attacks on these networks is continually increasing. A particularly serious and dangerous type of attack in the IoT environment is the botnet attack, where attackers control IoT systems to generate enormous networks of "bot" devices for malicious activities. To detect this type of attack, several Intrusion Detection Systems (IDSs) based on machine learning and deep learning methods have been proposed for IoT networks. As IoT systems are characterised by limited battery power and processor capacity, maximizing the efficiency of intrusion detection for IoT networks remains a research challenge: methods must use less computational time while maintaining high detection rates. This paper proposes an aggregated mutual information-based feature selection approach with machine learning methods to enhance detection of IoT botnet attacks. The N-BaIoT benchmark dataset, comprising real traffic data gathered from nine commercial IoT devices, was used to detect botnet attack types in both binary and multi-class settings. The feature selection method incorporates the Mutual Information (MI) technique, Principal Component Analysis (PCA), and an ANOVA F-test at a fine-grained detection level to select the features most relevant to improving the performance of IoT botnet classifiers. In the classification step, several ensemble and individual classifiers were used, including Random Forest (RF), XGBoost (XGB), Gaussian Naïve Bayes (GNB), k-Nearest Neighbor (k-NN), Logistic Regression (LR) and Support Vector Machine (SVM). The experimental results showed the efficiency and effectiveness of the proposed approach, which outperformed other techniques across various evaluation metrics.
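The mutual-information criterion at the heart of the feature selection step scores how much knowing a feature reduces uncertainty about the attack label. A self-contained sketch for discrete samples (the paper aggregates MI with PCA and an ANOVA F-test; only the MI part is shown):

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """I(X;Y) in bits, estimated from paired samples. Features with higher
    MI against the attack label are kept; uninformative ones are dropped."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())
```

A perfectly predictive binary feature scores 1 bit against a balanced binary label; an independent feature scores 0.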


Subjects
Internet of Things , Bayes Theorem , Machine Learning , Principal Component Analysis , Support Vector Machine
13.
Sensors (Basel) ; 21(23)2021 Nov 30.
Article in English | MEDLINE | ID: mdl-34884022

ABSTRACT

Wireless Sensor Networks (WSNs) have received significant research and development attention due to their applications in collecting data from fields such as smart cities, power grids, transportation systems, medical sectors, the military, and rural areas. Accurate and reliable measurements for insightful data analysis and decision-making are the ultimate goals of sensor networks in critical domains. However, the raw data collected by WSNs are usually unreliable and inaccurate due to the imperfect nature of WSNs. Identifying misbehaviour or anomalies in the network is important for reliable and secure functioning, but resource constraints make a lightweight detection scheme a major design challenge. This paper aims at designing and developing a lightweight anomaly detection scheme that reduces computational complexity and communication overhead and improves memory utilization while maintaining high accuracy. To achieve this aim, one-class learning and dimension-reduction concepts were used in the design. The One-Class Support Vector Machine (OCSVM) with hyper-ellipsoid variance was used for anomaly detection due to its advantage in classifying unlabelled, multivariate data. Various OCSVM formulations were investigated, and the Centred-Ellipsoid kernel was adopted as the most effective among the studied formulations. To decrease computational complexity and improve memory utilization, the data dimensions were reduced using the Candid Covariance-Free Incremental Principal Component Analysis (CCIPCA) algorithm. Extensive experiments were conducted to evaluate the proposed lightweight anomaly detection scheme. Results in terms of detection accuracy, memory utilization, computational complexity, and communication overhead show that the proposed scheme is effective and efficient compared with the existing schemes evaluated: it achieved an accuracy higher than 98%, with O(nd) memory utilization and no communication overhead.
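CCIPCA earns its "covariance-free" name by updating an eigenvector estimate one sample at a time, never materializing a covariance matrix, which is what makes it fit sensor-node memory budgets. A sketch of the core update for the first principal component (our simplification, omitting the amnesic parameter of the full algorithm):

```python
def ccipca_update(v, x, n):
    """One CCIPCA step for the first component: blend the running estimate v
    with the new sample x projected onto v. After the update, v / ||v|| is
    the eigenvector estimate and ||v|| is the eigenvalue estimate."""
    norm = sum(vi * vi for vi in v) ** 0.5
    proj = sum(xi * vi for xi, vi in zip(x, v)) / norm  # x . v / ||v||
    return [((n - 1) / n) * vi + (1 / n) * proj * xi
            for vi, xi in zip(v, x)]
```

Each step costs O(d) time and memory for d-dimensional samples, consistent with the O(nd) footprint the abstract reports for the overall scheme.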


Subjects
Computer Communication Networks , Wireless Technology , Algorithms , Principal Component Analysis , Support Vector Machine
14.
Sensors (Basel) ; 21(24)2021 Dec 17.
Article in English | MEDLINE | ID: mdl-34960516

ABSTRACT

This study presents a novel feature-engineered-natural gradient descent ensemble-boosting (NGBoost) machine-learning framework for detecting fraud in power consumption data. The proposed framework was sequentially executed in three stages: data pre-processing, feature engineering, and model evaluation. It utilized the random forest algorithm-based imputation technique initially to impute the missing data entries in the acquired smart meter dataset. In the second phase, the majority weighted minority oversampling technique (MWMOTE) algorithm was used to avoid an unequal distribution of data samples among different classes. The time-series feature-extraction library and whale optimization algorithm were utilized to extract and select the most relevant features from the kWh reading of consumers. Once the most relevant features were acquired, the model training and testing process was initiated by using the NGBoost algorithm to classify the consumers into two distinct categories ("Healthy" and "Theft"). Finally, each input feature's impact (positive or negative) in predicting the target variable was recognized with the tree SHAP additive-explanations algorithm. The proposed framework achieved an accuracy of 93%, recall of 91%, and precision of 95%, which was greater than all the competing models, and thus validated its efficacy and significance in the studied field of research.


Subjects
Algorithms , Machine Learning , Electricity , Fraud , Time Factors
15.
J Urol ; 204(4): 734-740, 2020 10.
Article in English | MEDLINE | ID: mdl-32347780

ABSTRACT

PURPOSE: Accurate preoperative staging of prostate cancer is essential for treatment planning, and conventional imaging is limited in the detection of metastases. Our primary aim was to determine whether [18F]fluciclovine positron emission tomography/computerized tomography is an early indicator of subclinical metastasis among patients with high risk prostate cancer. MATERIALS AND METHODS: A total of 68 patients with unfavorable intermediate to very high risk prostate cancer without systemic disease on conventional imaging were recruited before robotic radical prostatectomy with extended pelvic lymph node dissection. The diagnostic performance of [18F]fluciclovine positron emission tomography/computerized tomography and conventional imaging for detection of metastatic disease, and the correlation of positivity to node and metastatic deposit size, were determined. RESULTS: Overall 57 of 68 patients completed the protocol, of whom 31 had nodal metastasis on histology. [18F]Fluciclovine positron emission tomography/computerized tomography sensitivity and specificity in detecting nodal metastasis were 55.3% and 84.8% per patient, and 54.8% and 96.4% per region (right and left pelvis, presacral and nonregional), respectively. Compared with conventional imaging, [18F]fluciclovine positron emission tomography/computerized tomography had significantly higher sensitivity on patient based (55.3% vs 33.3%, p <0.01) and region based (54.8% vs 19.4%, p <0.01) analysis, detecting metastasis in 7 more patients and 22 more regions, with similarly high specificity. Four additional patients had distant disease or another cancer detected on [18F]fluciclovine positron emission tomography/computerized tomography, which precluded surgery. Detection of metastasis was related to the size of metastatic deposits within lymph nodes and overall metastatic burden. CONCLUSIONS: [18F]Fluciclovine positron emission tomography/computerized tomography detects occult metastases not identified on conventional imaging and may help guide treatment decisions and lymph node dissection due to its high specificity for metastatic disease.


Subjects
Carboxylic Acids , Cyclobutanes , Positron Emission Tomography Computed Tomography , Prostatic Neoplasms/diagnostic imaging , Prostatic Neoplasms/pathology , Aged , Humans , Male , Middle Aged , Neoplasm Metastasis , Neoplasm Staging , Positron Emission Tomography Computed Tomography/methods , Preoperative Period , Prospective Studies , Prostatic Neoplasms/surgery
16.
Int J Qual Health Care ; 32(Supplement_1): 84-88, 2020 Feb 06.
Article in English | MEDLINE | ID: mdl-32026936

ABSTRACT

This paper examines the principles of benchmarking in healthcare and how benchmarking can contribute to practice improvement and improved health outcomes for patients. It uses the Deepening our Understanding of Quality in Australia (DUQuA) study published in this Supplement and DUQuA's predecessor in Europe, the Deepening our Understanding of Quality improvement in Europe (DUQuE) study, as models. Benchmarking is where the performances of institutions or individuals are compared using agreed indicators or standards. The rationale for benchmarking is that institutions will respond positively to being identified as a low outlier or desire to be or stay as a high performer, or both, and patients will be empowered to make choices to seek care at institutions that are high performers. Benchmarking often begins with a conceptual framework that is based on a logic model. Such a framework can drive the selection of indicators to measure performance, rather than their selection being based on what is easy to measure. A Donabedian range of indicators can be chosen, including structure, process and outcomes, created around multiple domains or specialties. Indicators based on continuous variables allow organizations to understand where their performance is within a population, and their interdependencies and associations can be understood. Benchmarking should optimally target providers, in order to drive them towards improvement. The DUQuA and DUQuE studies both incorporated some of these principles into their design, thereby creating a model of how to incorporate robust benchmarking into large-scale health services research.


Subjects
Benchmarking/methods , Health Services Research/methods , Quality Indicators, Health Care , Australia , Benchmarking/standards , Hospitals, Public/standards , Humans , Patient Safety , Quality Improvement/organization & administration
17.
Molecules ; 26(1)2020 Dec 29.
Article in English | MEDLINE | ID: mdl-33383976

ABSTRACT

Virtual screening (VS) is a computational practice applied in drug discovery research, widely used in computer-based searches for new lead molecules based on molecular similarity. In chemical databases, similarity searching identifies molecules that resemble a user-defined reference structure, evaluated by quantitative measures of intermolecular structural similarity. Among existing approaches, 2D fingerprints are the most widely used, and the similarity of a reference structure to a database structure is measured by computing association coefficients. Most classical similarity approaches assume that all molecular features carry the same weight, whether or not they are related to biological activity. However, based on the chemical structure, some distinguishable features have been found to be more important than others, and this difference should be taken into consideration by placing more weight on the important fragments. The main aim of this research is to enhance the performance of similarity searching by using multiple descriptors. In this paper, a deep learning method known as a deep belief network (DBN) is used to reweight molecular features. Several descriptors, each representing different important features, were computed for the MDL Drug Data Report (MDDR) dataset. The proposed method was applied to each descriptor individually to select the important features under new weights with a lower error rate, and the new features from all descriptors were then merged into a single descriptor for similarity searching. Extensive experiments show that the proposed method outperformed several existing benchmark similarity methods, including Bayesian inference networks (BIN), the Tanimoto similarity method (TAN), the adapted similarity measure of text processing (ASMTP) and the quantum-based similarity method (SQB).
The proposed multi-descriptor method based on a stack of deep belief networks (SDBN) demonstrated higher accuracy than existing methods on structurally heterogeneous datasets.
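As an illustration of the baseline this line of work builds on, here is a minimal sketch of plain and feature-weighted Tanimoto similarity over 2D fingerprints, represented as sets of on-bit indices. The per-bit weights stand in for learned feature importances (such as the DBN-derived weights in the paper) and are hypothetical.

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto coefficient between two binary 2D fingerprints,
    represented as sets of on-bit indices."""
    shared = len(fp_a & fp_b)
    return shared / (len(fp_a) + len(fp_b) - shared)

def weighted_tanimoto(fp_a, fp_b, weights):
    """Tanimoto variant where each bit carries a weight (default 1.0),
    so important fragments count more toward the similarity score."""
    num = sum(weights.get(bit, 1.0) for bit in fp_a & fp_b)
    den = sum(weights.get(bit, 1.0) for bit in fp_a | fp_b)
    return num / den

ref, mol = {1, 2, 3}, {2, 3, 4}
print(tanimoto(ref, mol))                     # 0.5
print(weighted_tanimoto(ref, mol, {2: 2.0}))  # 0.6
```

Upweighting bit 2 raises the score because that shared fragment now contributes more to both the intersection and the union.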


Subjects
Deep Learning , Drug Design , Drug Discovery/methods , Pharmaceutical Preparations/chemistry , Bayes Theorem , Cheminformatics/methods , Databases, Pharmaceutical , Neural Networks, Computer , Principal Component Analysis
18.
Transpl Infect Dis ; 21(1): e13005, 2019 Feb.
Article in English | MEDLINE | ID: mdl-30276937

ABSTRACT

Primary effusion lymphoma (PEL) is a rare mature B-cell non-Hodgkin's lymphoma arising in body cavities and presenting with effusions. It has been described predominantly in patients with impaired immunity from the acquired immunodeficiency syndrome and is associated with the Human Herpesvirus-8 (HHV-8). Seldom has PEL been diagnosed in persons negative for the human immunodeficiency virus (HIV), and in such cases it has occurred primarily in the setting of posttransplant immunosuppression. We report an instructive case of a Caribbean-American HIV-negative orthotopic heart transplant recipient with a history of HHV-8-associated Kaposi's sarcoma who developed HHV-8 viremia and PEL of the pleural space early in the posttransplant course. This case highlights the importance of considering PEL in the differential diagnosis of a new pleural effusion in a transplant recipient at risk for HHV-8-associated disease.


Subjects
Heart Transplantation/adverse effects , Herpesvirus 8, Human/isolation & purification , Lymphoma, Primary Effusion/diagnosis , Postoperative Complications/diagnosis , Viremia/diagnosis , Diagnosis, Differential , Humans , Immunosuppression Therapy/adverse effects , Immunosuppression Therapy/methods , Lymphoma, Primary Effusion/immunology , Lymphoma, Primary Effusion/pathology , Lymphoma, Primary Effusion/virology , Male , Middle Aged , Neoplasm Recurrence, Local/immunology , Neoplasm Recurrence, Local/pathology , Neoplasm Recurrence, Local/virology , Pleural Cavity/pathology , Pleural Cavity/virology , Postoperative Complications/immunology , Postoperative Complications/virology , Sarcoma, Kaposi/immunology , Sarcoma, Kaposi/pathology , Sarcoma, Kaposi/virology , Skin Neoplasms/immunology , Skin Neoplasms/pathology , Skin Neoplasms/virology , Viremia/immunology , Viremia/virology
19.
J Comput Aided Mol Des ; 31(4): 365-378, 2017 Apr.
Article in English | MEDLINE | ID: mdl-28220440

ABSTRACT

Chemical libraries contain thousands of compounds that need screening, which increases the need for computational methods that can rank or prioritize compounds. Virtual screening tools are widely exploited to enhance the cost-effectiveness of lead drug discovery programs by ranking chemical compound databases in decreasing probability of biological activity, based upon the probability ranking principle (PRP). In this paper, we developed a novel ranking approach for molecular compounds inspired by quantum mechanics, called the quantum probability ranking principle (QPRP). The QPRP ranking criteria draw an analogy between a physical experiment and the process of ranking molecular structures represented by 2D fingerprints in ligand-based virtual screening (LBVS). The development of the QPRP criteria for LBVS employs quantum concepts at three levels. First, at the representation level, the model develops a new framework for molecular representation by connecting molecular compounds with a mathematical quantum space. Second, it estimates the similarity between chemical libraries and reference structures using a quantum-based similarity searching method. Finally, it ranks the molecules using the QPRP approach. Simulated virtual screening experiments with MDL Drug Data Report (MDDR) datasets showed that QPRP outperformed the classical probability ranking principle (PRP) for molecular chemical compounds.
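For contrast with QPRP, the classical PRP baseline can be sketched in a few lines: sort database molecules by decreasing score, where the score is any proxy for probability of activity. The sketch below uses Tanimoto similarity over hypothetical fingerprint sets; the quantum-based scoring itself is not reproduced here.

```python
def tanimoto(a, b):
    """Tanimoto coefficient for fingerprints stored as sets of on-bits."""
    shared = len(a & b)
    return shared / (len(a) + len(b) - shared)

def prp_rank(reference, database):
    """Classical PRP: rank database molecules by decreasing similarity
    to the reference, used as a proxy for probability of activity."""
    return sorted(database,
                  key=lambda name: tanimoto(reference, database[name]),
                  reverse=True)

# Hypothetical reference and database fingerprints
reference = {1, 2, 3}
database = {"mol1": {1, 2, 3, 4}, "mol2": {1, 9}, "mol3": {1, 2, 3}}
print(prp_rank(reference, database))  # ['mol3', 'mol1', 'mol2']
```

Any alternative ranking principle, QPRP included, plugs in at the scoring step; the evaluation question is whether active molecules rise higher in the resulting ordering.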


Subjects
Drug Discovery/methods , Small Molecule Libraries/chemistry , Algorithms , Computer Simulation , Humans , Ligands , Models, Molecular , Probability , Proteins/metabolism , Quantum Theory , Small Molecule Libraries/pharmacology