Results 1 - 6 of 6
1.
Sensors (Basel) ; 23(13)2023 Jun 25.
Article in English | MEDLINE | ID: mdl-37447729

ABSTRACT

The template matching technique is one of the most widely applied methods for finding patterns in images, in which a reduced-size image, called the target, is searched for within another image that represents the overall environment. In this work, template matching is implemented as a co-design system. A hardware coprocessor is designed for the computationally demanding step of template matching: the calculation of the normalized cross-correlation coefficient. This coefficient provides invariance to global brightness changes in the images, but it becomes computationally expensive for larger images or for sets of images. Furthermore, we investigate the performance of six different swarm intelligence techniques aimed at accelerating the target search process. To evaluate the proposed design, the processing time, the number of iterations, and the success rate were compared. The results show that it is possible to obtain approaches capable of processing video images at 30 frames per second with an acceptable average success rate for detecting the tracked target. The search strategies based on PSO, ABC, FFA, and CS meet the 30 frames/s processing-time requirement, yielding average accuracy rates above 80% for the pipelined co-design implementation. However, FWA, EHO, and BFOA could not meet the required timing constraint and achieved success rates of only around 60%. Among all the investigated search strategies, PSO provides the best performance, with an average processing time of 16.22 ms and a 95% success rate.
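
The abstract's core computation is the normalized cross-correlation (NCC) coefficient. Below is a minimal NumPy sketch of that coefficient and of the brute-force search it makes expensive; the function names and the exhaustive loop are illustrative only and do not reproduce the paper's hardware coprocessor or its swarm-based search strategies.

```python
import numpy as np

def ncc(window: np.ndarray, template: np.ndarray) -> float:
    """Normalized cross-correlation between an image window and a template.

    Both inputs are zero-mean normalized, so the score is invariant to
    global brightness (offset) and contrast (gain) changes.
    """
    w = window - window.mean()
    t = template - template.mean()
    denom = np.sqrt((w ** 2).sum() * (t ** 2).sum())
    return float((w * t).sum() / denom) if denom > 0 else 0.0

def exhaustive_match(image: np.ndarray, template: np.ndarray):
    """Brute-force baseline: score every candidate position and keep the best.

    This is the full search that the swarm-based strategies in the paper
    try to avoid evaluating exhaustively.
    """
    th, tw = template.shape
    best_score, best_pos = -1.0, (0, 0)
    for y in range(image.shape[0] - th + 1):
        for x in range(image.shape[1] - tw + 1):
            score = ncc(image[y:y + th, x:x + tw], template)
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos, best_score

# Toy usage: plant the template at (40, 25) and recover it.
rng = np.random.default_rng(0)
img = rng.random((120, 160))
tpl = img[40:56, 25:41].copy()
print(exhaustive_match(img, tpl))   # -> ((40, 25), ~1.0)
```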


Subject(s)
Algorithms, Artificial Intelligence, Intelligence
2.
Cluster Comput ; : 1-19, 2022 Nov 17.
Article in English | MEDLINE | ID: mdl-36415683

ABSTRACT

Edge computing (EC) relieves Internet of Things (IoT)-based face recognition systems of the limited storage and computing resources of local or mobile terminals. However, data privacy leakage remains a concern. Previous studies focused only on certain stages of face data processing, whereas this study focuses on protecting the privacy of face data throughout its entire life cycle. We therefore propose a general privacy protection framework for edge-based face recognition (EFR) systems. To protect the privacy of face images and training models transmitted between the edge and the remote cloud, we design a local differential privacy (LDP) algorithm based on the proportion difference of feature information. In addition, we introduce identity authentication and hashing to ensure the legitimacy of the terminal device and the integrity of the face image during the data acquisition phase. Theoretical analysis demonstrates the rationality and feasibility of the scheme. Compared with the non-private baseline and the equal privacy budget allocation method, our method achieves the best balance between availability and privacy protection in the numerical experiments.
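
The abstract mentions an LDP algorithm driven by the proportion difference of feature information. The sketch below shows only the generic Laplace-mechanism baseline with an equal per-dimension budget split (the allocation the paper compares against); the paper's proportion-based allocation is not reproduced, and the 128-dimensional embedding is an assumed placeholder.

```python
import numpy as np

rng = np.random.default_rng(1)

def laplace_ldp(features: np.ndarray, epsilon: float, sensitivity: float = 1.0) -> np.ndarray:
    """Perturb a feature vector with Laplace noise before it leaves the edge device.

    Generic epsilon-LDP sketch with an equal per-dimension budget split
    (epsilon / d each). The paper instead allocates the budget according to
    each feature's information proportion, which this sketch does not model.
    """
    d = features.size
    scale = sensitivity / (epsilon / d)      # Laplace scale = sensitivity / per-dimension budget
    return features + rng.laplace(loc=0.0, scale=scale, size=d)

face_embedding = rng.normal(size=128)        # assumed 128-d embedding computed on the edge device
noisy = laplace_ldp(face_embedding, epsilon=8.0)
print(float(np.linalg.norm(noisy - face_embedding)))
```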

3.
Sci Rep ; 14(1): 8067, 2024 Apr 05.
Article in English | MEDLINE | ID: mdl-38580655

ABSTRACT

The prediction of hydrological time series is of great significance for developing flood and drought prevention approaches and is an important component in research on smart water resources. The nonlinear characteristics of hydrological time series are important factors affecting the accuracy of predictions. To enhance the prediction of the nonlinear component in hydrological time series, we employed an improved whale optimisation algorithm (IWOA) to optimise an attention-based long short-term memory (ALSTM) network. The proposed model is termed IWOA-ALSTM. Specifically, we introduced an attention mechanism between two LSTM layers, enabling adaptive focus on distinct features within each time unit to gather information pertaining to a hydrological time series. Furthermore, given the critical impact of the model hyperparameter configuration on the prediction accuracy and operational efficiency, the proposed improved whale optimisation algorithm facilitates the discovery of optimal hyperparameters for the ALSTM model. In this work, we used nonlinear water level information obtained from Hankou station as experimental data. The results of this model were compared with those of genetic algorithms, particle swarm optimisation algorithms and whale optimisation algorithms. The experiments were conducted using five evaluation metrics, namely, the RMSE, MAE, NSE, SI and DR. The results show that the IWOA is effective at optimising the ALSTM and significantly improves the prediction accuracy of nonlinear hydrological time series.
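
For orientation, the sketch below implements the standard whale optimisation algorithm (encircling and spiral updates) as a hyperparameter search loop; it is not the paper's improved variant (IWOA), and the objective function is a stand-in for the ALSTM validation error over assumed hyperparameters such as learning rate and hidden units.

```python
import numpy as np

def woa_minimize(objective, bounds, n_whales=10, iters=50, seed=0):
    """Standard whale optimisation algorithm (encircling + spiral updates)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    X = rng.uniform(lo, hi, size=(n_whales, len(bounds)))
    fitness = np.array([objective(x) for x in X])
    best = X[fitness.argmin()].copy()

    for t in range(iters):
        a = 2 - 2 * t / iters                       # control parameter decreases 2 -> 0
        for i in range(n_whales):
            r = rng.random(len(bounds))
            A, C = 2 * a * r - a, 2 * rng.random(len(bounds))
            if rng.random() < 0.5:
                if np.all(np.abs(A) < 1):           # exploit: encircle the current best
                    X[i] = best - A * np.abs(C * best - X[i])
                else:                               # explore: move relative to a random whale
                    rand = X[rng.integers(n_whales)]
                    X[i] = rand - A * np.abs(C * rand - X[i])
            else:                                   # spiral (bubble-net) update around the best
                l = rng.uniform(-1, 1)
                X[i] = np.abs(best - X[i]) * np.exp(l) * np.cos(2 * np.pi * l) + best
            X[i] = np.clip(X[i], lo, hi)
        fitness = np.array([objective(x) for x in X])
        if fitness.min() < objective(best):
            best = X[fitness.argmin()].copy()
    return best, objective(best)

# Stand-in objective: pretend the best hyperparameters are lr = 0.01 and 64 hidden units.
obj = lambda x: (x[0] - 0.01) ** 2 + ((x[1] - 64) / 64) ** 2
print(woa_minimize(obj, bounds=[(1e-4, 0.1), (8, 256)]))
```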

4.
Math Biosci Eng ; 20(5): 7905-7921, 2023 Feb 22.
Article in English | MEDLINE | ID: mdl-37161178

ABSTRACT

Cloud storage has become a crucial service for many users who deal with big data. A cloud storage auditing scheme is a mechanism that checks the integrity of outsourced data. Cloud storage deduplication helps cloud service providers save storage costs by keeping only one copy of a file when multiple users outsource the same file to cloud servers. However, combining storage auditing and deduplication is challenging. To address this challenge, Hou et al. proposed, in 2019, a cloud storage auditing scheme with deduplication supporting different security levels of data popularity. The proposal is interesting and has practical applications. However, in this paper we show that it has a flaw: the cloud or other adversaries can easily forge the data blocks' authenticators, which means the cloud can delete all the outsourced encrypted data blocks and still provide a correct storage proof to the third-party auditor. Building on Hou et al.'s scheme, we propose an improved cloud storage auditing scheme with deduplication and analyze its security. The analysis shows that the proposed scheme is more secure.
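
To illustrate why unforgeable block authenticators matter for storage auditing, here is a generic keyed-MAC challenge-response sketch; it is neither Hou et al.'s scheme nor the improved scheme, and real auditing schemes use publicly verifiable homomorphic authenticators so the third-party auditor does not need the owner's secret key.

```python
import hmac, hashlib, secrets

def tag_blocks(key: bytes, blocks: list[bytes]) -> list[bytes]:
    """Data owner: bind each block to its index with a keyed MAC.

    Because the server never learns the tagging key, it cannot forge a valid
    tag for a block it has deleted or altered.
    """
    return [hmac.new(key, i.to_bytes(8, "big") + b, hashlib.sha256).digest()
            for i, b in enumerate(blocks)]

def audit(key: bytes, challenge: list[int], proof: dict[int, tuple[bytes, bytes]]) -> bool:
    """Auditor: spot-check the randomly challenged block indices."""
    return all(
        hmac.compare_digest(
            proof[i][1],
            hmac.new(key, i.to_bytes(8, "big") + proof[i][0], hashlib.sha256).digest())
        for i in challenge)

key = secrets.token_bytes(32)
blocks = [f"block-{i}".encode() for i in range(100)]
tags = tag_blocks(key, blocks)
challenge = [secrets.randbelow(100) for _ in range(5)]
proof = {i: (blocks[i], tags[i]) for i in challenge}    # honest server
print(audit(key, challenge, proof))                     # True
proof[challenge[0]] = (b"tampered", tags[challenge[0]])
print(audit(key, challenge, proof))                     # False
```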

5.
Biomimetics (Basel) ; 8(5)2023 Aug 25.
Article in English | MEDLINE | ID: mdl-37754139

ABSTRACT

Open-circuit, short-circuit, and discrete parameter faults are the most commonly used fault models in the simulation-before-test methodology. However, since analog circuits exhibit continuous responses to input signals, faults injected into specific circuit elements may not capture all potential component faults. Consequently, diagnosing faults in analog circuits requires three key aspects: identifying faulty components, determining the faulty element values, and accounting for circuit tolerance constraints. To tackle this problem, a fault diagnosis methodology based on swarm intelligence is proposed and implemented. The investigated optimization techniques are Particle Swarm Optimization (PSO) and the Bat Algorithm (BA). In this methodology, the nonlinear equations of the tested circuit are used to calculate its parameters. The primary objective is to identify the circuit component that could potentially exhibit the fault by comparing the responses of the actual circuit with those obtained through the optimization process. Two circuits are used as case studies to evaluate the proposed methodologies: the Tow-Thomas biquad filter (case study 1) and the Butterworth filter (case study 2). The proposed methodologies are able to identify the faulty component or at least reduce the number of possible faulty components. Four main performance metrics are reported: accuracy, precision, sensitivity, and specificity. The BA demonstrates superior performance when the maximum combination of accessible nodes in the tested circuit is used, with an average accuracy of 95.5%, whereas PSO achieves only 93.9%. The BA also outperforms in execution time, with an average reduction of 7.95% for the faultless circuit and 8.12% for the faulty cases. Compared with a machine-learning-based approach, BA combined with the proposed methodology achieves similar accuracy rates but requires neither datasets nor time-consuming training to perform the circuit diagnosis.
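
As a rough illustration of the response-matching idea, the sketch below uses plain PSO to recover the parameters of a second-order low-pass magnitude response from "measured" data; the filter model, the parameterisation (natural frequency and quality factor), and the bounds are assumptions and do not reproduce the paper's Tow-Thomas or Butterworth case studies or its Bat Algorithm variant.

```python
import numpy as np

FREQS = np.logspace(2, 5, 60)                       # 100 Hz .. 100 kHz test frequencies

def lowpass_gain(wn, Q, f):
    """Magnitude response of a second-order low-pass section."""
    w = 2 * np.pi * f
    return wn**2 / np.sqrt((wn**2 - w**2) ** 2 + (w * wn / Q) ** 2)

# "Measured" response of a circuit whose parameters have drifted from nominal.
WN_TRUE, Q_TRUE = 2 * np.pi * 8e3, 0.9
measured = lowpass_gain(WN_TRUE, Q_TRUE, FREQS)

def cost(params):
    """Mismatch between the measured response and the simulated one."""
    wn, Q = params
    return float(np.sum((measured - lowpass_gain(wn, Q, FREQS)) ** 2))

def pso(cost, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Plain global-best PSO over the unknown circuit parameters."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    pos = rng.uniform(lo, hi, (n_particles, len(bounds)))
    vel = np.zeros_like(pos)
    pbest, pbest_cost = pos.copy(), np.array([cost(p) for p in pos])
    gbest = pbest[pbest_cost.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        costs = np.array([cost(p) for p in pos])
        improved = costs < pbest_cost
        pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
        gbest = pbest[pbest_cost.argmin()].copy()
    return gbest

wn_est, q_est = pso(cost, bounds=[(2 * np.pi * 1e3, 2 * np.pi * 50e3), (0.3, 3.0)])
print(f"estimated corner frequency = {wn_est / (2 * np.pi):.0f} Hz, Q = {q_est:.2f}")
```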

6.
Neural Comput Appl ; : 1-14, 2021 Sep 07.
Article in English | MEDLINE | ID: mdl-34511731

ABSTRACT

Over the course of this year, more than a billion people have been affected by the COVID-19 outbreak. As long as individuals maintain social distancing, they should remain safe during this period. As a result, the use of various online technologies has risen, but so has the likelihood of various cyber-attacks. A DDoS attack, the most prevalent and damaging of these, renders an online resource unavailable to its users. In this paper, we therefore propose a filtering approach that works efficiently in the COVID-19 scenario and detects DDoS attacks. The proposed approach is based on statistical methods, namely the packet score and entropy variation, for identifying DDoS attack traffic. We implemented the approach in OMNeT++ and evaluated its efficiency with different test cases. The proposed approach detects DDoS attack traffic with 96% accuracy and clearly differentiates DDoS attack traffic from flash crowds.
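
The entropy-variation half of such an approach can be illustrated with a short sketch: compute the Shannon entropy of the source-IP distribution per time window and flag windows that deviate from a baseline. The packet-score component, the OMNeT++ setup, and the threshold value below are not taken from the paper and are assumptions for illustration.

```python
import math
from collections import Counter

def source_ip_entropy(packets: list[str]) -> float:
    """Shannon entropy of the source-IP distribution inside one time window.

    A flash crowd keeps many distinct, roughly balanced sources (high entropy),
    while a flood dominated by a few botnet or spoofed sources drives the
    entropy away from the baseline.
    """
    counts = Counter(packets)
    total = len(packets)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def detect(window: list[str], baseline: float, delta: float = 1.0) -> bool:
    """Flag the window when its entropy deviates from the baseline by more than delta."""
    return abs(source_ip_entropy(window) - baseline) > delta

# Toy windows: a balanced flash crowd vs. a flood dominated by one source.
flash_crowd = [f"10.0.{i % 50}.{i % 200}" for i in range(1000)]
ddos = ["203.0.113.7"] * 950 + [f"10.0.0.{i}" for i in range(50)]
baseline = source_ip_entropy(flash_crowd)
print(detect(flash_crowd, baseline))  # False
print(detect(ddos, baseline))         # True
```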
