Results 1 - 2 of 2
1.
Heliyon; 9(9): e19548, 2023 Sep.
Article in English | MEDLINE | ID: mdl-37809766

ABSTRACT

In this study, we present our findings on the deployment of a machine learning (ML) technique to enhance the performance of LTE applications employing quasi-Yagi-Uda antennas in the 2100 MHz UMTS band. Several techniques, including simulation, measurement, and an RLC equivalent-circuit model, are discussed in this article as ways to assess the antenna's suitability for the intended applications. The CST simulation gives the proposed antenna a reflection coefficient of -38.40 dB at 2.1 GHz and a -10 dB bandwidth of 357 MHz (1.95-2.31 GHz). With dimensions of 0.535λ0 × 0.714λ0, the antenna is compact and offers a maximum gain of 6.9 dB, a maximum directivity of 7.67, a VSWR of 1.001 at the center frequency, and a maximum efficiency of 89.9%. The antenna is made of a low-cost FR4 substrate. The RLC circuit, also referred to as the lumped-element model, exhibits characteristics that closely match those of the proposed Yagi antenna. We then apply supervised regression machine learning (ML) techniques to predict the antenna's resonant frequency and directivity. The performance of the ML models is evaluated using a variety of metrics, including the variance score, R-square, mean square error (MSE), mean absolute error (MAE), root mean square error (RMSE), and mean squared logarithmic error (MSLE). Out of the seven ML models, the linear regression (LR) model has the lowest error and highest accuracy when predicting directivity, whereas the ridge regression (RR) model performs best when predicting frequency. The proposed antenna is a strong candidate for the intended UMTS LTE applications, as shown by the modeling results from CST and ADS, as well as the measured results and the ML-based predictions.
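
As a minimal sketch of the regression-and-metrics workflow the abstract describes (assuming a scikit-learn pipeline and synthetic placeholder data rather than the authors' CST dataset), the following Python snippet fits linear and ridge regression models for directivity and resonant frequency and reports the metrics listed above:

# Hypothetical sketch: linear regression for directivity, ridge regression for
# frequency, scored with the metrics named in the abstract. The feature and
# target arrays are placeholders, not the paper's simulated antenna data.
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.model_selection import train_test_split
from sklearn.metrics import (explained_variance_score, r2_score,
                             mean_squared_error, mean_absolute_error,
                             mean_squared_log_error)

rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 4))  # placeholder swept design parameters
directivity = 7.0 + X @ np.array([0.3, -0.2, 0.5, 0.1]) + rng.normal(0, 0.05, 200)
frequency = 2.1 + X @ np.array([0.05, 0.02, -0.03, 0.04]) + rng.normal(0, 0.01, 200)  # GHz

for name, model, y in [("LR / directivity", LinearRegression(), directivity),
                       ("RR / frequency", Ridge(alpha=1.0), frequency)]:
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=1)
    y_hat = model.fit(X_tr, y_tr).predict(X_te)
    mse = mean_squared_error(y_te, y_hat)
    print(name,
          "var:", explained_variance_score(y_te, y_hat),
          "R2:", r2_score(y_te, y_hat),
          "MSE:", mse, "RMSE:", np.sqrt(mse),
          "MAE:", mean_absolute_error(y_te, y_hat),
          "MSLE:", mean_squared_log_error(y_te, y_hat))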

2.
Sci Rep; 13(1): 12590, 2023 Aug 03.
Article in English | MEDLINE | ID: mdl-37537201

ABSTRACT

In this study, we present our findings from investigating the use of a machine learning (ML) technique to improve the performance of quasi-Yagi-Uda antennas operating in the n78 band for 5G applications. Several techniques, including simulation, measurement, and an RLC equivalent-circuit model, are used to evaluate the performance of the antenna. The CST modelling tools are used to develop a high-gain, low-return-loss Yagi-Uda antenna for the 5G communication system. Relative to the antenna's operating frequency, its dimensions are [Formula: see text]. The antenna has an operating frequency of 3.5 GHz, a return loss of [Formula: see text] dB, a bandwidth of 520 MHz, a maximum gain of 6.57 dB, and an efficiency of almost 97%. The impedance analysis tools in CST Studio and the circuit design tools in Agilent ADS are used to derive the antenna's equivalent circuit (RLC). We use supervised regression ML methods to create an accurate prediction of the frequency and gain of the antenna. The ML models are evaluated using a variety of measures, including the variance score, R-square, mean square error, mean absolute error, root mean square error, and mean squared logarithmic error. Among the nine ML models, linear regression is superior to the other models for resonant-frequency prediction, and Gaussian process regression shows outstanding performance for gain prediction. The R-square and variance scores represent the accuracy of the prediction, which is close to 99% for both frequency and gain. Considering these factors, the antenna can be deemed an excellent choice for the n78 band of a 5G communication system.
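
A comparable sketch for this abstract (again assuming scikit-learn and synthetic placeholder data standing in for the authors' n78-band simulations) shows how Gaussian process regression for gain and linear regression for resonant frequency could be trained and scored:

# Hypothetical sketch: Gaussian process regression predicts gain, linear
# regression predicts resonant frequency, as the abstract reports. The
# synthetic data below is illustrative only.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_absolute_error

rng = np.random.default_rng(0)
X = rng.uniform(size=(150, 3))  # placeholder design parameters
gain = 6.5 + 0.4 * np.sin(3 * X[:, 0]) + 0.2 * X[:, 1] + rng.normal(0, 0.02, 150)  # dB
freq = 3.5 + 0.1 * X[:, 2] - 0.05 * X[:, 0] + rng.normal(0, 0.005, 150)            # GHz

X_tr, X_te, g_tr, g_te, f_tr, f_te = train_test_split(
    X, gain, freq, test_size=0.2, random_state=1)

gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
g_hat = gpr.fit(X_tr, g_tr).predict(X_te)
f_hat = LinearRegression().fit(X_tr, f_tr).predict(X_te)

print("gain  R2:", r2_score(g_te, g_hat), "MAE:", mean_absolute_error(g_te, g_hat))
print("freq  R2:", r2_score(f_te, f_hat), "MAE:", mean_absolute_error(f_te, f_hat))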
