1.
Sensors (Basel); 23(15), 2023 Aug 01.
Article in English | MEDLINE | ID: mdl-37571633

ABSTRACT

The Internet of Things is growing rapidly, and with it the demand for low-power, long-range wireless communication technologies. Long Range Wide Area Network (LoRaWAN) is one such technology that has gained significant attention in recent years because it provides long-range communication with low power consumption. One of the main issues in LoRaWAN is the efficient utilization of radio resources (e.g., spreading factor and transmission power) by the end devices. To solve this resource allocation issue, machine learning (ML) methods have been used to improve LoRaWAN network performance. The primary aim of this survey is to examine how the resource-management problem in LoRaWAN has been addressed with state-of-the-art ML methods. Further, the survey presents the publicly available LoRaWAN frameworks that can be used for dataset collection, discusses the features required for efficient resource management together with suggested ML methods, and highlights the existing publicly available datasets. The survey also explores and evaluates Network Simulator-3-based ML frameworks that can be leveraged for efficient resource management. Finally, it offers recommendations on applying ML to resource management in LoRaWAN, providing a comprehensive guide for researchers and practitioners interested in using ML to improve LoRaWAN network performance.
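To make the resource-management task concrete, the sketch below shows one way an ML model could map link-quality features to a LoRaWAN spreading factor. It is purely illustrative and not from the surveyed paper: the features, the synthetic labelling rule, and all thresholds are assumptions.

```python
# Illustrative sketch (not from the surveyed paper): a supervised model that
# maps link-quality features to a LoRaWAN spreading factor (SF7-SF12).
# Features, labelling rule, and thresholds are hypothetical.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Synthetic training data: [SNR (dB), RSSI (dBm), distance to gateway (km)]
X = rng.uniform([-20, -130, 0.1], [10, -60, 10.0], size=(500, 3))
# Hypothetical labelling rule: poorer links need higher spreading factors.
link_quality = X[:, 0] - 0.5 * X[:, 2]
y = np.clip(12 - ((link_quality + 20) / 5).astype(int), 7, 12)

clf = DecisionTreeClassifier(max_depth=4).fit(X, y)

# Suggest an SF for a device observed at SNR=-12 dB, RSSI=-110 dBm, 6 km.
print("suggested SF:", clf.predict([[-12.0, -110.0, 6.0]])[0])
```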

2.
Sensors (Basel); 23(11), 2023 May 24.
Article in English | MEDLINE | ID: mdl-37299760

ABSTRACT

Terahertz (THz) is a promising technology for future wireless communication networks, particularly for 6G and beyond. The ultra-wide THz band, ranging from 0.1 to 10 THz, can potentially address the limited capacity and spectrum scarcity of current wireless systems such as 4G-LTE and 5G. Furthermore, it is expected to support advanced wireless applications requiring high data rates and service quality, such as terabit-per-second backhaul systems, ultra-high-definition streaming, virtual/augmented reality, and high-bandwidth wireless communications. In recent years, artificial intelligence (AI) has been used mainly for resource management, spectrum allocation, modulation and bandwidth classification, interference mitigation, beamforming, and medium access control layer protocols to improve THz performance. This survey examines the use of AI in state-of-the-art THz communications, discussing the challenges, potential, and shortcomings. Additionally, it covers the available platforms for THz communications, including commercial platforms, testbeds, and publicly available simulators. Finally, it outlines future strategies for improving the existing THz simulators and for using AI methods, including deep learning, federated learning, and reinforcement learning, to improve THz communications.
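A back-of-the-envelope Shannon-capacity calculation illustrates why the ultra-wide THz band is a candidate for terabit-per-second links. The sketch below is not from the surveyed paper; the SNR value and the bandwidth figures are assumptions chosen only to show the scaling.

```python
# Illustrative calculation (not from the surveyed paper): Shannon capacity
# C = B * log2(1 + SNR) for bandwidths typical of sub-6 GHz, mmWave, and
# THz systems. The assumed SNR of 10 dB is hypothetical.
import math

snr_db = 10.0                      # assumed received SNR in dB
snr = 10 ** (snr_db / 10)

bandwidths = {
    "5G sub-6 GHz (100 MHz)": 100e6,
    "mmWave (2 GHz)": 2e9,
    "THz window (50 GHz)": 50e9,
}

for label, b_hz in bandwidths.items():
    capacity_gbps = b_hz * math.log2(1 + snr) / 1e9
    print(f"{label}: {capacity_gbps:.1f} Gbit/s")
```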


Subjects
Augmented Reality, Virtual Reality, Artificial Intelligence, Technology
3.
Sensors (Basel); 22(23), 2022 Dec 05.
Article in English | MEDLINE | ID: mdl-36502211

ABSTRACT

IEEE 802.11ah, known as Wi-Fi HaLow, is envisioned for long-range and low-power communication. It is a sub-1 GHz technology designed for massive Internet of Things (IoT) and machine-to-machine deployments, aiming to overcome key IoT challenges such as providing connectivity to huge numbers of power-constrained devices distributed over a large geographical area. To accomplish this, IEEE 802.11ah introduces several unique physical and medium access control (MAC) layer features. In recent years, the MAC features of IEEE 802.11ah, including the restricted access window, authentication (e.g., centralized and distributed) and association, relay and sectorization, target wake time, and the traffic indication map, have been intensively investigated from various aspects to improve resource allocation and enhance network performance in terms of device association time, throughput, delay, and energy consumption. This survey presents an in-depth assessment and analysis of these MAC features along with current solutions, their potential, and key challenges, showing how these novel features can be used to meet rigorous IoT requirements.
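The restricted access window (RAW) mentioned above spreads stations across time slots so that only a subset contends at once. The sketch below shows the commonly cited slot-mapping form, slot = (AID + offset) mod n_slots; the parameter values are assumptions for illustration, not taken from the paper.

```python
# Minimal sketch of the Restricted Access Window (RAW) idea in IEEE 802.11ah:
# stations are spread across RAW slots so only a subset contends at a time.
# Slot count and offset below are assumed values.
def raw_slot(aid: int, n_slots: int, offset: int = 0) -> int:
    """Map a station's association ID (AID) to a RAW slot index."""
    return (aid + offset) % n_slots

n_slots = 4          # assumed number of slots in the RAW
offset = 1           # assumed mapping offset (can rotate per beacon)

for aid in range(1, 9):     # eight hypothetical stations
    print(f"AID {aid} -> RAW slot {raw_slot(aid, n_slots, offset)}")
```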


Subjects
Internet of Things, Caffeine, Communication, Resource Allocation, Technology
4.
Stud Health Technol Inform; 281: 504-505, 2021 May 27.
Article in English | MEDLINE | ID: mdl-34042622

ABSTRACT

This paper presents a scoping review of federated learning for the Internet of Medical Things (IoMT) and demonstrates the limited amount of research in an area that has the potential to improve patient care. Federated learning and the IoMT, as standalone technologies, have already proved to be highly disruptive, but further research is needed to apply federated learning to the IoMT.
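For readers unfamiliar with the technique the review surveys, the sketch below shows the core federated-averaging (FedAvg) loop: clients train locally on private data and share only model weights, which the server averages. The model, client data, and hyperparameters are hypothetical, not taken from any reviewed study.

```python
# Minimal FedAvg sketch, purely illustrative: linear regression trained
# across three hypothetical clients (e.g., hospitals) without sharing data.
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local training step: gradient descent on squared error."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

rng = np.random.default_rng(0)
true_w = np.array([1.5, -2.0])

# Three hypothetical clients with private local datasets.
clients = []
for _ in range(3):
    X = rng.normal(size=(40, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=40)
    clients.append((X, y))

global_w = np.zeros(2)
for _ in range(10):  # communication rounds
    # Each client trains locally; only weights (not data) are shared.
    local_ws = [local_update(global_w, X, y) for X, y in clients]
    # Server aggregates: equal-sized clients here, so a plain mean.
    global_w = np.mean(local_ws, axis=0)

print("learned weights:", global_w)   # should approach [1.5, -2.0]
```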


Subjects
Internet of Things, Humans, Internet, Learning
5.
Sensors (Basel); 20(22), 2020 Nov 12.
Article in English | MEDLINE | ID: mdl-33198298

ABSTRACT

A long-range wide area network (LoRaWAN) is one of the leading communication technologies for Internet of Things (IoT) applications. To fulfill the requirements of IoT-enabled applications, LoRaWAN employs an adaptive data rate (ADR) mechanism at both the end device (ED) and the network server (NS). NS-managed ADR aims to offer reliable and battery-efficient resource allocation to EDs by managing the spreading factor (SF) and transmit power (TP). However, such management is severely affected by a lack of agility in adapting to variable channel conditions, so several hours or even days may be required to converge to stable and energy-efficient communication. Therefore, we propose two NS-managed ADR schemes: a Gaussian filter-based ADR (G-ADR) and an exponential moving average-based ADR (EMA-ADR). Both schemes operate as low-pass filters that resist rapid changes in the signal-to-noise ratio (SNR) of packets received at the NS. The proposed methods allocate the best SF and TP to both static and mobile EDs while seeking to reduce the convergence period in the confirmed mode of LoRaWAN. Simulation results show that the G-ADR and EMA-ADR schemes reduce the convergence period by 16% and 68% in a static scenario, and by 17% and 81% in a mobility scenario, respectively, compared to typical ADR. Moreover, the proposed schemes reduce energy consumption and enhance the packet success ratio.
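The sketch below illustrates the low-pass-filtering idea behind EMA-ADR: the network server smooths the SNR of received uplinks before making the ADR decision, instead of reacting to single noisy samples. The smoothing factor and the SNR trace are assumptions, not values from the paper.

```python
# Illustrative sketch of the exponential-moving-average idea behind EMA-ADR.
# Smoothing factor and SNR trace below are hypothetical.
import random

def ema(samples, alpha=0.2):
    """Exponential moving average over a sequence of SNR samples (dB)."""
    smoothed = samples[0]
    history = [smoothed]
    for s in samples[1:]:
        smoothed = alpha * s + (1 - alpha) * smoothed
        history.append(smoothed)
    return history

random.seed(0)
# Hypothetical SNR trace: slow fading plus fast per-packet fluctuations.
snr_trace = [(-5 - 0.1 * t) + random.gauss(0, 3) for t in range(20)]

filtered = ema(snr_trace)
print(f"raw last sample:   {snr_trace[-1]:6.2f} dB")
print(f"filtered estimate: {filtered[-1]:6.2f} dB")
# An NS-side ADR would now compare the filtered SNR against each spreading
# factor's demodulation-floor margin to pick the SF and TP.
```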

6.
Sensors (Basel); 20(9), 2020 May 06.
Article in English | MEDLINE | ID: mdl-32384656

ABSTRACT

A long-range wide area network (LoRaWAN) adapts the ALOHA concept for channel access, resulting in packet collisions caused by intra- and inter-spreading factor (SF) interference and hence a high packet loss ratio. In LoRaWAN, each end device (ED) increments its SF after every two consecutive failed retransmissions, which forces EDs toward high SFs. When numerous EDs switch to the highest SF, the network loses its advantage of orthogonality, and the collision probability of ED packets increases drastically. In this study, we propose two SF allocation schemes that enhance the packet success ratio by lowering the impact of interference. The first scheme, the channel-adaptive SF recovery algorithm, increments or decrements the SF based on the retransmission behaviour of ED packets, which indicates the channel status in the network. The second scheme allocates SFs to EDs based on ED sensitivity during initial deployment. Both schemes are validated through extensive simulations that consider channel interference in both the confirmed and unconfirmed modes of LoRaWAN. The simulation results show that SFs are adaptively assigned to each ED and that the proposed schemes enhance the packet delivery success ratio compared with typical SF allocation schemes.
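A minimal sketch of the channel-adaptive SF recovery idea follows: rather than only incrementing the SF after failed retransmissions (as plain LoRaWAN does), the SF is also decremented after a run of successes, so devices drift back to faster, more collision-resistant settings. The thresholds and the outcome trace are assumptions for illustration, not the paper's exact algorithm.

```python
# Minimal sketch of channel-adaptive SF recovery: raise SF after repeated
# failures, lower it after repeated successes. Thresholds are assumptions.
SF_MIN, SF_MAX = 7, 12

def adapt_sf(sf, consecutive_failures, consecutive_successes,
             fail_threshold=2, success_threshold=3):
    """Return the next spreading factor for an end device."""
    if consecutive_failures >= fail_threshold and sf < SF_MAX:
        return sf + 1   # channel looks poor: slow down for more range
    if consecutive_successes >= success_threshold and sf > SF_MIN:
        return sf - 1   # channel looks good: speed up, shorten airtime
    return sf

# Hypothetical transmission outcomes for one device (True = ACK received).
outcomes = [False, False, True, True, True, True, False, True]
sf, fails, succs = 9, 0, 0
for ok in outcomes:
    fails, succs = (0, succs + 1) if ok else (fails + 1, 0)
    new_sf = adapt_sf(sf, fails, succs)
    if new_sf != sf:
        fails = succs = 0        # restart counting after each adjustment
        sf = new_sf
    print(f"ack={ok!s:5} -> SF{sf}")
```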
