Results 1 - 7 of 7
1.
Comput Biol Med ; 172: 108152, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38452470

ABSTRACT

Healthcare has significantly contributed to the well-being of individuals around the globe; nevertheless, further benefits could be derived from a more streamlined healthcare system without incurring additional costs. Recently, the main attributes of cloud computing, such as on-demand service, high scalability, and virtualization, have brought benefits to many areas, especially medical services. Cloud computing is considered an important element of healthcare services, enhancing their performance and efficacy. The healthcare industry must supply products and services in a way that remains viable for everyone involved. Developing new approaches for discovering and selecting healthcare services in the cloud has become more critical due to the rising popularity of these services. Given the diverse array of healthcare services, service composition enables the execution of intricate operations by integrating the functionalities of multiple services into a single procedure. However, many methods in this field suffer from high energy consumption, cost, and response time. This article introduces a novel layered method for selecting and evaluating healthcare services, finding optimal service selection and composition solutions based on Deep Reinforcement Learning (Deep RL), Kalman filtering, and repeated training to address these issues. Compared to other methods, the proposed approach achieves acceptable availability, reliability, energy consumption, and response time.


Subjects
Cloud Computing, Delivery of Health Care, Humans, Reproducibility of Results
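The abstract above describes a service-selection scheme combining Deep RL with Kalman filtering but gives no implementation details. The following is a minimal, hypothetical sketch of that idea: a tabular Q-learning agent picks one service per composition step, and a scalar Kalman filter smooths noisy response-time measurements before they enter the reward. All names, weights, and the reward shape are illustrative assumptions, not the authors' method.

```python
import numpy as np

# candidates: one list per composition step, each entry a dict with
# "availability", "reliability", "energy", "response_time" keys (normalized),
# and the same number of candidates at every step.

class KalmanQoS:
    """Scalar Kalman filter for a noisy QoS metric (e.g., response time)."""
    def __init__(self, q=1e-3, r=0.05):
        self.x, self.p, self.q, self.r = 0.0, 1.0, q, r

    def update(self, z):
        self.p += self.q                    # predict
        k = self.p / (self.p + self.r)      # Kalman gain
        self.x += k * (z - self.x)          # correct with measurement z
        self.p *= (1.0 - k)
        return self.x

def reward(svc, rt_estimate, w=(0.3, 0.3, 0.2, 0.2)):
    # Availability/reliability are benefits; energy and response time are costs.
    return (w[0] * svc["availability"] + w[1] * svc["reliability"]
            - w[2] * svc["energy"] - w[3] * rt_estimate)

def select_services(candidates, episodes=500, eps=0.1, alpha=0.5, gamma=0.9):
    steps = len(candidates)
    q_table = np.zeros((steps, len(candidates[0])))
    filters = [[KalmanQoS() for _ in cs] for cs in candidates]
    for _ in range(episodes):
        for s in range(steps):
            a = (np.random.randint(len(candidates[s])) if np.random.rand() < eps
                 else int(q_table[s].argmax()))
            svc = candidates[s][a]
            noisy_rt = svc["response_time"] + np.random.normal(0, 0.05)
            rt_hat = filters[s][a].update(noisy_rt)   # smoothed response time
            nxt = q_table[s + 1].max() if s + 1 < steps else 0.0
            q_table[s, a] += alpha * (reward(svc, rt_hat) + gamma * nxt - q_table[s, a])
    return [int(q_table[s].argmax()) for s in range(steps)]   # chosen service per step
```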
2.
Artif Intell Med ; 149: 102779, 2024 Mar.
Article in English | MEDLINE | ID: mdl-38462281

ABSTRACT

The healthcare sector, characterized by vast datasets and many diseases, is pivotal in shaping community health and overall quality of life. Traditional healthcare methods, often limited in disease prevention, predominantly react to illnesses after their onset rather than proactively averting them. The advent of Artificial Intelligence (AI) has ushered in a wave of transformative applications designed to enhance healthcare services, with Machine Learning (ML) as a noteworthy subset of AI. ML empowers computers to analyze extensive datasets, while Deep Learning (DL), a specific ML methodology, excels at extracting meaningful patterns from these data troves. Despite notable technological advancements in recent years, the full potential of these applications within medical contexts remains largely untapped, primarily due to the medical community's cautious stance toward novel technologies. The motivation of this paper lies in recognizing the pivotal role of the healthcare sector in community well-being and the necessity for a shift toward proactive healthcare approaches. To our knowledge, there is a notable absence of a comprehensive published review that delves into ML, DL, and distributed systems, all aimed at elevating the Quality of Service (QoS) in healthcare. This study seeks to bridge this gap by presenting a systematic and organized review of prevailing ML, DL, and distributed system algorithms as applied in healthcare settings. Within our work, we outline key challenges that both current and future developers may encounter, with a particular focus on aspects such as approach, data utilization, strategy, and development processes. Our study findings reveal that the Internet of Things (IoT) stands out as the most frequently utilized platform (44.3%), with disease diagnosis emerging as the predominant healthcare application (47.8%). Notably, discussions center significantly on the prevention and identification of cardiovascular diseases (29.2%). The studies under examination employ a diverse range of ML and DL methods, along with distributed systems, with Convolutional Neural Networks (CNNs) being the most commonly used (16.7%), followed by Long Short-Term Memory (LSTM) networks (14.6%) and shallow learning networks (12.5%). In evaluating QoS, the predominant emphasis revolves around the accuracy parameter (80%). This study highlights how ML, DL, and distributed systems reshape healthcare. It contributes to advancing healthcare quality, bridging the gap between technology and medical adoption, and benefiting practitioners and patients.


Subjects
Artificial Intelligence, Quality of Life, Humans, Machine Learning, Computer Communication Networks, Quality of Health Care
3.
Sensors (Basel) ; 23(16)2023 Aug 17.
Article in English | MEDLINE | ID: mdl-37631769

ABSTRACT

The Internet of Things (IoT) represents a cutting-edge technical domain, encompassing billions of intelligent objects capable of bridging the physical and virtual worlds across various locations. IoT services are responsible for delivering essential functionalities. In this dynamic and interconnected IoT landscape, providing high-quality services is paramount to enhancing user experiences and optimizing system efficiency. Service composition techniques come into play to address user requests in IoT applications, allowing various IoT services to collaborate seamlessly. Given the resource limitations of IoT devices, they often leverage cloud infrastructures to overcome technological constraints, benefiting from effectively unlimited resources and capabilities. Moreover, fog computing has gained prominence, facilitating IoT application processing in edge networks closer to IoT sensors and effectively reducing the delays inherent in cloud data centers. In this context, our study proposes a cloud-/fog-based service composition approach for IoT, introducing a novel fuzzy-based hybrid algorithm. This algorithm combines the Ant Colony Optimization (ACO) and Artificial Bee Colony (ABC) optimization algorithms, taking energy consumption and Quality of Service (QoS) factors into account during service selection. By leveraging this fuzzy-based hybrid algorithm, our approach aims to improve service composition in IoT environments by enabling intelligent decision-making and ensuring optimal user satisfaction. Our experimental results demonstrate the effectiveness of the proposed strategy in successfully fulfilling service composition requests by identifying suitable services. Compared to recently introduced methods, our hybrid approach yields significant benefits: on average, it reduces energy consumption by 17.11%, enhances availability and reliability by 8.27% and 4.52%, respectively, and reduces the average cost by 21.56%.
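As a rough illustration of the hybrid approach sketched in this abstract, the following assumes a simple fuzzy score over availability and energy (both normalized to [0, 1]), an ACO construction phase guided by that score, and an ABC-style neighbourhood move on the best composition. The membership functions, parameters, and candidate format are assumptions, not the paper's actual algorithm.

```python
import random

def fuzzy_score(svc):
    # Crude triangular-style memberships: "good QoS" rises with availability
    # above 0.5, "low energy" falls with consumption; the score is their mean.
    good_qos = min(1.0, max(0.0, (svc["availability"] - 0.5) / 0.5))
    low_energy = min(1.0, max(0.0, 1.0 - svc["energy"]))
    return (good_qos + low_energy) / 2.0

def quality(composition):
    return sum(fuzzy_score(s) for s in composition) / len(composition)

def compose(candidates, ants=20, iters=50, evaporation=0.1):
    # candidates: one list of service dicts per abstract task in the workflow.
    tau = [[1.0] * len(cs) for cs in candidates]          # pheromone trails
    best, best_idx, best_q = None, None, -1.0
    for _ in range(iters):
        for _ in range(ants):                             # ACO construction
            idx = []
            for step, cs in enumerate(candidates):
                weights = [tau[step][i] * (fuzzy_score(s) + 1e-9)
                           for i, s in enumerate(cs)]
                idx.append(random.choices(range(len(cs)), weights)[0])
            comp = [candidates[s][i] for s, i in enumerate(idx)]
            q = quality(comp)
            if q > best_q:
                best, best_idx, best_q = comp, idx, q
        for step in range(len(candidates)):               # evaporate + reinforce best
            tau[step] = [(1.0 - evaporation) * t for t in tau[step]]
            tau[step][best_idx[step]] += best_q
        step = random.randrange(len(candidates))          # ABC-style onlooker move
        alt = random.randrange(len(candidates[step]))
        neighbour = best[:step] + [candidates[step][alt]] + best[step + 1:]
        nq = quality(neighbour)
        if nq > best_q:
            best, best_q = neighbour, nq
            best_idx[step] = alt
    return best, best_q
```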

4.
Neural Comput Appl ; 34(18): 15313-15348, 2022.
Article in English | MEDLINE | ID: mdl-35702664

ABSTRACT

Recently, the COVID-19 epidemic has resulted in millions of deaths and has impacted practically every area of human life. Several machine learning (ML) approaches are employed in the medical field in many applications, including detecting and monitoring patients, notably in COVID-19 management. Different medical imaging systems, such as computed tomography (CT) and X-ray, offer ML an excellent platform for combating the pandemic. Because of this need, a significant quantity of research has been carried out; thus, in this work, we employed a systematic literature review (SLR) to cover all aspects of the outcomes reported in related papers. Imaging methods, survival analysis, forecasting, economic and geographical issues, monitoring methods, medication development, and hybrid applications are the seven key uses of ML employed in the COVID-19 pandemic. Convolutional neural networks (CNNs), long short-term memory networks (LSTMs), recurrent neural networks (RNNs), generative adversarial networks (GANs), autoencoders, random forests, and other ML techniques are frequently used in such scenarios. Next, cutting-edge applications of ML techniques for pandemic medical issues are discussed. Various problems and challenges linked with ML applications for this pandemic are reviewed. Additional research is expected in the coming years to limit the spread of the disease and improve catastrophe management. According to the data, most papers are evaluated mainly on characteristics such as flexibility and accuracy, while other factors such as safety are overlooked. Also, Keras was the most often used library in the studies reviewed, appearing in 24.4 percent of them. Furthermore, medical imaging systems are employed for diagnostic purposes in 20.4 percent of applications.

5.
Comput Biol Med ; 141: 105141, 2022 02.
Article in English | MEDLINE | ID: mdl-34929464

ABSTRACT

Since December 2019, the COVID-19 outbreak has resulted in countless deaths and has harmed all facets of human existence. COVID-19 has been designated a pandemic by the World Health Organization (WHO), which has placed a tremendous burden on nearly all countries, especially those with weak health systems. Meanwhile, Deep Learning (DL) has been applied in several applications and many types of detection tasks in the medical field, including thyroid diagnosis, lung nodule recognition, fetal localization, and detection of diabetic retinopathy. Furthermore, various clinical imaging sources, like Magnetic Resonance Imaging (MRI), X-ray, and Computed Tomography (CT), make DL a well-suited technique for tackling the COVID-19 epidemic. Inspired by this fact, a considerable amount of research has been done. A Systematic Literature Review (SLR) has been used in this study to discover, assess, and integrate findings from relevant studies. DL techniques used in COVID-19 have been categorized into seven main distinct categories: Long Short-Term Memory (LSTM) networks, Self-Organizing Maps (SOMs), Convolutional Neural Networks (CNNs), Generative Adversarial Networks (GANs), Recurrent Neural Networks (RNNs), autoencoders, and hybrid approaches. Then, the state-of-the-art studies connected to DL techniques and applications for health problems related to COVID-19 are highlighted. Moreover, many issues and problems associated with DL implementation for COVID-19 are addressed, which is anticipated to stimulate further investigations into controlling the prevalence of the disease and managing disasters in the future. According to the findings, most papers are assessed using characteristics such as accuracy, delay, robustness, and scalability, while other features, such as security and convergence time, are underutilized. Python is also the most commonly used language, appearing in 75% of the papers. According to the investigation, 37.83% of applications rely on chest CT or chest X-ray images of patients.


Subjects
COVID-19, Deep Learning, Algorithms, Humans, Neural Networks (Computer), SARS-CoV-2
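The review reports that CNNs applied to chest CT/X-ray images are the most common setup in the surveyed papers. Purely as an illustration of that setup, here is a small Keras-style CNN for binary COVID-19 vs. normal chest X-ray classification; the input size, depth, and training pipeline are assumptions and do not reproduce any specific reviewed model.

```python
import tensorflow as tf

def build_cnn(input_shape=(224, 224, 1)):
    # Two small convolution blocks followed by a dense head with a sigmoid
    # output for a binary (COVID-19 vs. normal) chest X-ray label.
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=input_shape),
        tf.keras.layers.Conv2D(16, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])

model = build_cnn()
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # hypothetical datasets
```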
6.
PeerJ Comput Sci ; 7: e580, 2021.
Article in English | MEDLINE | ID: mdl-34141897

ABSTRACT

Query optimization is the process of identifying the best Query Execution Plan (QEP). The query optimizer produces a close-to-optimal QEP for the given queries based on minimum resource usage. The problem is that, for a given query, there are many different equivalent execution plans, each with a corresponding execution cost; producing an effective query plan thus requires examining a large number of alternative plans. Access plan recommendation is an alternative technique to database query optimization that reuses previously generated QEPs to execute new queries. In this technique, the query optimizer uses clustering methods to identify groups of similar queries. However, clustering such large datasets is challenging for traditional clustering algorithms due to the huge processing time involved. Numerous cloud-based platforms, such as Hadoop, Hive, and Pig, have been introduced that offer low-cost solutions for processing distributed queries. This paper applies and tests a model for clustering large query datasets of varying sizes in parallel using MapReduce. The results demonstrate the effectiveness of the parallel implementation of query workload clustering in achieving good scalability.
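To make the MapReduce clustering idea concrete, here is a minimal local simulation of one k-means iteration phrased as map (assign each query's feature vector to its nearest centroid) and reduce (recompute centroids), rather than an actual Hadoop job. The feature encoding, the value of k, and the helper names are assumptions for illustration.

```python
import numpy as np
from collections import defaultdict

def map_phase(queries, centroids):
    # map: emit (nearest-centroid id, query feature vector) pairs
    for q in queries:
        cid = int(np.argmin([np.linalg.norm(q - c) for c in centroids]))
        yield cid, q

def reduce_phase(pairs, k, dim):
    # reduce: average all vectors assigned to each centroid
    groups = defaultdict(list)
    for cid, q in pairs:
        groups[cid].append(q)
    return np.array([np.mean(groups[c], axis=0) if groups[c] else np.zeros(dim)
                     for c in range(k)])

def kmeans_mapreduce(queries, k=3, iters=10, seed=0):
    # queries: (n_queries, n_features) array, e.g., per-query operator counts
    rng = np.random.default_rng(seed)
    centroids = queries[rng.choice(len(queries), size=k, replace=False)]
    for _ in range(iters):
        centroids = reduce_phase(map_phase(queries, centroids), k, queries.shape[1])
    return centroids

# Usage with random data standing in for encoded query workloads:
workload = np.random.default_rng(1).random((200, 8))
print(kmeans_mapreduce(workload, k=4))
```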

7.
PeerJ Comput Sci ; 7: e539, 2021.
Article in English | MEDLINE | ID: mdl-34084936

ABSTRACT

Cloud computing is one of the most important computing paradigms, using a pay-as-you-go model to process data and execute applications; therefore, numerous enterprises are migrating their applications to cloud environments. Data-intensive applications not only deal with enormous quantities of data but also frequently exhibit compute-intensive behavior. The dynamicity of cloud environments, coupled with the mismatch between advertised resources and users' resource-requirement queries, remains an important issue that hampers efficient discovery in a cloud environment. As network size and complexity keep increasing dynamically, cloud service discovery becomes an NP-hard problem that requires effective discovery approaches. One of the best-known cloud service discovery methods is the Ant Colony Optimization (ACO) algorithm; however, it suffers from a load-balancing problem among the discovered nodes, and an inefficient workload balance limits the use of resources. This paper addresses the problem by applying an Inverted Ant Colony Optimization (IACO) algorithm for load-aware service discovery in cloud computing; IACO treats pheromone as a repulsive rather than an attractive signal. We design a model for service discovery in the cloud environment to overcome these traditional shortcomings. Numerical results demonstrate that the proposed mechanism yields an efficient service discovery method. The algorithm is simulated using the CloudSim simulator, and the results show better performance. Reduced energy consumption, lower response time, and fewer Service Level Agreement (SLA) violations in cloud environments are the advantages of the proposed method.
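A hedged sketch of the inverted-pheromone idea described above: each discovery request deposits pheromone on the chosen node, and later requests are repelled by high-pheromone (recently loaded) nodes, which spreads load across the discovered nodes. Capacities, the decay rate, and the repulsion formula are illustrative assumptions, not the paper's model.

```python
import random

class InvertedACODiscovery:
    """Pheromone repels later requests instead of attracting them."""
    def __init__(self, nodes, evaporation=0.1):
        self.nodes = list(nodes)
        self.pheromone = {n: 0.0 for n in self.nodes}
        self.evaporation = evaporation

    def discover(self):
        # Selection weight shrinks as pheromone (recent load) grows.
        weights = [1.0 / (1.0 + self.pheromone[n]) for n in self.nodes]
        chosen = random.choices(self.nodes, weights)[0]
        self.pheromone[chosen] += 1.0            # deposit marks the node as busy
        for n in self.nodes:                     # evaporation lets nodes recover
            self.pheromone[n] *= (1.0 - self.evaporation)
        return chosen

# Usage: 1000 requests end up roughly evenly spread over the nodes.
discovery = InvertedACODiscovery(["n1", "n2", "n3", "n4"])
counts = {}
for _ in range(1000):
    node = discovery.discover()
    counts[node] = counts.get(node, 0) + 1
print(counts)
```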
