Results 1 - 6 of 6
1.
Article in English | MEDLINE | ID: mdl-38019634

ABSTRACT

A probabilistic load forecast that is accurate and reliable is crucial not only to the efficient operation of power systems but also to the efficient use of energy resources. To estimate the uncertainties in forecasting models and nonstationary electric load data, this study proposes a probabilistic load forecasting model, BFEEMD-LSTM-TWSVRSOA. The model combines a data filtering method, fast ensemble empirical mode decomposition (FEEMD); a twin support vector regression (TWSVR) whose features are extracted by deep learning-based long short-term memory (LSTM) networks; and parameters optimized by the seeker optimization algorithm (SOA). We compared the probabilistic forecasting performance of BFEEMD-LSTM-TWSVRSOA and its point forecasting version with different machine learning and deep learning algorithms on the Global Energy Forecasting Competition 2014 (GEFCom2014) data. The most representative month of each season was selected from the one-year GEFCom2014 data, yielding four monthly datasets. Several bootstrap methods are compared to determine the best prediction intervals (PIs) for the proposed model, and various forecasting step sizes are considered to obtain the most satisfactory point forecasting results. Experimental results on these four datasets indicate that the wild bootstrap method and a 24-h step size are the best bootstrap method and forecasting step size for the proposed model. The proposed model outperforms the suboptimal model by an average of 46%, 11%, 36%, and 44% on the four datasets for point forecasting, and by an average of 53%, 48%, 46%, and 51% for probabilistic forecasting.
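The wild bootstrap prediction intervals the abstract selects can be sketched as follows. The exact variant is not given in the abstract, so the Rademacher (random ±1) weighting of resampled residuals and all function names here are illustrative assumptions, not the paper's implementation:

```python
import random

def wild_bootstrap_pi(point_forecast, residuals, n_boot=2000, alpha=0.1, seed=42):
    """Build a (1 - alpha) prediction interval around a point forecast by
    perturbing it with wild-bootstrap pseudo-errors: a resampled model
    residual multiplied by a random +/-1 (Rademacher) weight."""
    rng = random.Random(seed)
    draws = []
    for _ in range(n_boot):
        e = rng.choice(residuals) * rng.choice((-1.0, 1.0))
        draws.append(point_forecast + e)
    draws.sort()
    lo = draws[int(alpha / 2 * n_boot)]
    hi = draws[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

residuals = [-3.0, -1.5, 0.5, 1.0, 2.5]   # toy in-sample forecast errors (MW)
lo, hi = wild_bootstrap_pi(100.0, residuals)
```

Because the sign flips symmetrize the resampled errors, the interval brackets the point forecast and never exceeds the largest residual magnitude on either side.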

2.
Medicina (Kaunas) ; 58(2)2022 Feb 18.
Article in English | MEDLINE | ID: mdl-35208634

ABSTRACT

A coronavirus outbreak caused by a novel virus known as SARS-CoV-2 originated in the latter half of 2019. COVID-19's abrupt emergence and unchecked global expansion highlight the inability of current healthcare services to respond to public health emergencies promptly. This paper comprehensively reviews the different aspects of human life affected by COVID-19. It then discusses various tools and technologies from the leading domains and their integration into people's lives to overcome issues resulting from pandemics. The paper further provides a detailed review of existing and prospective Artificial Intelligence (AI), Internet of Things (IoT), Augmented Reality (AR), Virtual Reality (VR), and blockchain-based solutions. The COVID-19 pandemic brings several challenges from the viewpoint of a nation's healthcare, security, privacy, and economy. AI offers predictive services and intelligent strategies for detecting coronavirus signs, promoting drug development, enabling remote healthcare, detecting fake news, and countering security attacks. Incorporating AI in the COVID-19 response yields robust and reliable solutions that enhance healthcare systems, increase users' life expectancy, and boost a nation's economy. Furthermore, AR/VR helps in distance learning, factory automation, and setting up work-from-home environments. Blockchain helps protect consumers' privacy and secure medical supply chain operations. IoT is helpful for remote patient monitoring, remote sanitising via drones, managing social distancing (using IoT cameras), and much more in combating the pandemic. This study provides an up-to-date analysis of the use of blockchain, AI, AR/VR, and IoT for combating the COVID-19 pandemic across various applications. These technologies provide new emerging initiatives and use cases to deal with the COVID-19 pandemic. Finally, we discuss challenges and potential research paths that will promote further research into future pandemic outbreaks.


Subjects
COVID-19 , Pandemics , Artificial Intelligence , Humans , SARS-CoV-2 , Technology
3.
Sensors (Basel) ; 21(21)2021 Oct 29.
Article in English | MEDLINE | ID: mdl-34770508

ABSTRACT

Content-Centric Networking (CCN) has emerged as a potential Internet architecture that supports a name-based content retrieval mechanism, in contrast to the current host-location-oriented IP architecture. The in-network caching capability of CCN ensures higher content availability and lower network delay, and reduces server load. However, caching content on every intermediate node does not use network resources efficiently. Hence, efficient content caching decisions are crucial to improving the Quality of Service (QoS) for end-user devices and overall network performance. Towards this, a novel content caching scheme is proposed in this paper. The proposed scheme first clusters the network nodes based on hop count and bandwidth to reduce content redundancy and caching operations. It then makes content placement decisions using the cluster information, content popularity, and hop count, where the caching probability increases as the content travels toward the requester. Using the proposed heuristics, popular content is thus placed near the edges of the network to achieve a high cache hit ratio. Once a cache becomes full, the scheme applies a Least-Frequently-Used (LFU) replacement policy to evict the least-accessed content from the network routers. Extensive simulations are conducted, and the performance of the proposed scheme is investigated under different network parameters, demonstrating the superiority of the proposed strategy over peer competing strategies.
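Two mechanisms the abstract names can be sketched minimally: a caching probability that grows as content travels toward the requester, and LFU eviction on a full router cache. The linear probability formula and all names below are illustrative assumptions, not the paper's exact heuristic:

```python
from collections import Counter

def cache_probability(hops_from_source, path_length):
    """Caching probability grows as content moves toward the requester
    (assumed linear in the fraction of the path already traversed)."""
    return hops_from_source / path_length

class LFUCache:
    """Fixed-capacity router cache with Least-Frequently-Used eviction."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = {}          # content name -> payload
        self.freq = Counter()    # content name -> access count

    def get(self, name):
        if name in self.store:
            self.freq[name] += 1
            return self.store[name]
        return None

    def put(self, name, payload):
        if name not in self.store and len(self.store) >= self.capacity:
            # evict the least-frequently accessed content
            victim = min(self.store, key=lambda n: self.freq[n])
            del self.store[victim]
            del self.freq[victim]
        self.store[name] = payload
        self.freq[name] += 1

cache = LFUCache(capacity=2)
cache.put("video/a", b"...")
cache.get("video/a")            # bump popularity of video/a
cache.put("video/b", b"...")
cache.put("video/c", b"...")    # cache full: evicts video/b, not video/a
```

Popular content survives eviction, which is what lets the scheme keep high-hit-ratio items near the network edge.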

4.
Sensors (Basel) ; 21(5)2021 Mar 05.
Article in English | MEDLINE | ID: mdl-33807724

ABSTRACT

The escalated growth of the Internet of Things (IoT) has started to reform and reshape our lives. The deployment of a large number of internet-connected objects has unlocked the vision of a smart world around us, paving a road towards automation and enormous data generation and collection. This automation and the continuous flow of personal and professional information into the digital world provide fertile ground for adversaries to perform numerous cyber-attacks, making security in IoT a sizeable concern. Hence, timely detection and prevention of such threats are prerequisites for avoiding serious consequences. This survey provides a brief insight into the technology, with prime attention to the various attacks and anomalies and their detection by intelligent intrusion detection systems (IDS). The comprehensive overview presented in this paper provides an in-depth analysis and assessment of diverse machine learning and deep learning-based network intrusion detection systems (NIDS). Additionally, a case study of healthcare in IoT is presented, depicting the architecture, the security and privacy issues, and the application of learning paradigms in this sector. The assessment concludes by listing the results derived from the literature. The paper also discusses numerous research challenges to allow further refinement of the approaches for dealing with unusual complications.
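As a toy illustration of the anomaly-based detectors within the NIDS family the survey reviews (not any specific system from the paper), a statistical detector can flag flows whose features deviate sharply from a baseline learned on normal traffic; the feature choice and threshold are assumptions:

```python
import statistics

def train_baseline(normal_rates):
    """Fit a simple baseline of normal traffic: mean and standard
    deviation of one flow feature (e.g. packets per second)."""
    return statistics.mean(normal_rates), statistics.stdev(normal_rates)

def is_intrusion(rate, mean, stdev, threshold=3.0):
    """Flag a flow as anomalous when its feature deviates from the
    learned baseline by more than `threshold` standard deviations."""
    return abs(rate - mean) > threshold * stdev

mean, stdev = train_baseline([100, 104, 98, 101, 97, 103, 99, 102])
flood = is_intrusion(400, mean, stdev)    # flood-like traffic spike
normal = is_intrusion(101, mean, stdev)   # ordinary traffic
```

ML- and DL-based NIDS replace this hand-set threshold with classifiers or autoencoders trained on many such features at once.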

5.
Sci Rep ; 10(1): 11417, 2020 Jul 10.
Article in English | MEDLINE | ID: mdl-32651418

ABSTRACT

To detect substation faults for timely repair, this paper proposes a fault detection method based on a time series model and statistical process control (SPC) to analyze the regularities and characteristics of behavior in the switching process. For the first time, this paper combines a SARIMA model, SPC methods, and the 3σ criterion into a fault detection model for the characteristics of a substation's switching process. Although both approaches are common tools in statistics, by effectively combining them with industrial process fault diagnosis, these common statistical tools deliver substantial technical contributions. Finally, across different fault samples, the proposed method improves the detection rate by at least 9% (and up to 15%) over other methods.
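The abstract's pipeline, forecast the switching signal with a time series model and then flag readings whose residuals breach the 3σ control limits, can be sketched as follows. A seasonal-naive forecast stands in for the paper's SARIMA model so the example stays self-contained, and all values are synthetic:

```python
import statistics

def detect_faults(series, period, sigma_mult=3.0):
    """Flag indices whose residual against a seasonal-naive forecast
    (the value one period earlier; a stand-in for SARIMA) falls outside
    the mean +/- sigma_mult * sigma control limits (3-sigma criterion)."""
    residuals = [series[i] - series[i - period] for i in range(period, len(series))]
    mu = statistics.mean(residuals)
    sigma = statistics.stdev(residuals)
    faults = []
    for i, r in zip(range(period, len(series)), residuals):
        if abs(r - mu) > sigma_mult * sigma:
            faults.append(i)
    return faults

# synthetic periodic switching signal (period 4) with one injected fault
signal = [10, 20, 15, 12] * 10
signal[13] = 60          # abnormal switching behaviour
faults = detect_faults(signal, period=4)
```

Note that a seasonal-naive forecast also produces a large residual one period after the fault (the faulty reading becomes the next forecast), an echo a fitted SARIMA model would largely avoid.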

6.
Sensors (Basel) ; 20(14)2020 Jul 13.
Article in English | MEDLINE | ID: mdl-32668605

ABSTRACT

The lifetime of a node in a wireless sensor network (WSN) directly determines the longevity of the network, and routing packets is the most energy-consuming activity for a sensor node. Thus, finding an energy-efficient routing strategy for packet transmission is of utmost importance. Opportunistic routing (OR) is one of the newer routing protocols that promise reliability and energy efficiency during packet transmission in WSNs. In this paper, we propose an intelligent opportunistic routing protocol (IOP) that uses a machine learning technique to select a relay node from the list of potential forwarder nodes, achieving energy efficiency and reliability in the network. The proposed approach may have applications in e-healthcare services: by improving network reliability it can connect several healthcare devices more dependably, and by saving energy it lets a remote patient stay connected to healthcare services for a longer duration through integrated IoT services.
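The abstract does not specify which machine learning technique IOP uses, but the relay-selection step it describes, picking one forwarder from a candidate list, can be illustrated with a simple scoring rule. The features (residual energy, distance to sink, link quality) and weights below are assumptions, standing in for the learned model:

```python
def relay_score(residual_energy, distance_to_sink, link_quality,
                w_energy=0.5, w_dist=0.3, w_link=0.2):
    """Score a candidate forwarder; higher is better. The features and
    weights are illustrative; the paper learns this choice with ML."""
    return (w_energy * residual_energy
            - w_dist * distance_to_sink
            + w_link * link_quality)

def select_relay(candidates):
    """Pick the candidate forwarder with the highest score."""
    return max(candidates, key=lambda c: relay_score(c["energy"],
                                                     c["dist"],
                                                     c["link"]))

candidates = [
    {"id": "n1", "energy": 0.9, "dist": 3.0, "link": 0.8},
    {"id": "n2", "energy": 0.4, "dist": 1.0, "link": 0.9},
    {"id": "n3", "energy": 0.8, "dist": 2.0, "link": 0.6},
]
best = select_relay(candidates)
```

Favoring high residual energy spreads forwarding load across nodes, which is what extends both individual node lifetime and the longevity of the whole WSN.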


Subjects
Algorithms , Machine Learning , Telemedicine , Wireless Technology , Humans , Reproducibility of Results , Time Factors