1.
Sensors (Basel) ; 24(8)2024 Apr 19.
Article in English | MEDLINE | ID: mdl-38676223

ABSTRACT

Vector Quantization (VQ) is a technique with a wide range of applications. For example, it can be used for image compression. The codebook design for VQ has great significance in the quality of the quantized signals and can benefit from the use of swarm intelligence. Initialization of the Linde-Buzo-Gray (LBG) algorithm, which is the most popular VQ codebook design algorithm, is a step that directly influences VQ performance, as the convergence speed and codebook quality depend on the initial codebook. A widely used initialization alternative is random initialization, in which the initial set of codevectors is drawn randomly from the training set. Other initialization methods can lead to a better quality of the designed codebooks. The present work evaluates the impacts of initialization strategies on swarm intelligence algorithms for codebook design in terms of the quality of the designed codebooks, assessed by the quality of the reconstructed images, and in terms of the convergence speed, evaluated by the number of iterations. Initialization strategies consist of a combination of codebooks obtained by initialization algorithms from the literature with codebooks composed of vectors randomly selected from the training set. The possibility of combining different initialization techniques provides new perspectives in the search for the quality of the VQ codebooks. Nine initialization strategies are presented, which are compared with random initialization. Initialization strategies are evaluated on the following algorithms for codebook design based on swarm clustering: modified firefly algorithm-Linde-Buzo-Gray (M-FA-LBG), modified particle swarm optimization-Linde-Buzo-Gray (M-PSO-LBG), modified fish school search-Linde-Buzo-Gray (M-FSS-LBG) and their accelerated versions (M-FA-LBGa, M-PSO-LBGa and M-FSS-LBGa) which are obtained by replacing the LBG with the accelerated LBG algorithm. 
The simulation results point to the benefits of the proposed initialization strategies. The results show gains of up to 4.43 dB in PSNR for the image Clock with M-PSO-LBG codebooks of size 512, and codebook design time savings of up to 67.05% for the image Clock with M-FA-LBGa codebooks of size N=512, when initialization strategies are used in place of random initialization.
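The LBG baseline with random initialization, against which the initialization strategies above are compared, can be sketched as follows (a minimal NumPy sketch under assumed names, not the authors' implementation):

```python
import numpy as np

def lbg(training, codebook_size, n_iter=50, rng=None):
    """Basic LBG (generalized Lloyd) codebook design with random initialization.

    training: (n_vectors, dim) array of training vectors.
    Returns a (codebook_size, dim) codebook.
    """
    rng = np.random.default_rng(rng)
    # Random initialization: draw the initial codevectors from the training set.
    idx = rng.choice(len(training), size=codebook_size, replace=False)
    codebook = training[idx].astype(float)
    for _ in range(n_iter):
        # Nearest-neighbor partition of the training set.
        d = ((training[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
        nearest = d.argmin(axis=1)
        # Centroid update; empty cells keep their previous codevector.
        for k in range(codebook_size):
            members = training[nearest == k]
            if len(members):
                codebook[k] = members.mean(axis=0)
    return codebook
```

The initialization strategies evaluated in the paper would replace the `rng.choice` step with (combinations of) codebooks produced by other initialization algorithms.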

2.
Biomimetics (Basel) ; 8(5)2023 Aug 25.
Article in English | MEDLINE | ID: mdl-37754139

ABSTRACT

Open- or short-circuit faults, as well as discrete parameter faults, are the most commonly used models in the simulation-before-test methodology. However, since analog circuits exhibit continuous responses to input signals, faults in specific circuit elements may not fully capture all potential component faults. Consequently, diagnosing faults in analog circuits requires three key aspects: identifying faulty components, determining faulty element values, and considering circuit tolerance constraints. To tackle this problem, a methodology for fault diagnosis using swarm intelligence is proposed and implemented. The investigated optimization techniques are Particle Swarm Optimization (PSO) and the Bat Algorithm (BA). In this methodology, the nonlinear equations of the tested circuit are employed to calculate its parameters. The primary objective is to identify the circuit component that could potentially exhibit the fault by comparing the responses of the actual circuit with those obtained through the optimization process. Two circuits are used as case studies to evaluate the performance of the proposed methodologies: the Tow-Thomas biquad filter (case study 1) and the Butterworth filter (case study 2). The proposed methodologies are able to identify, or at least reduce the number of, possible faulty components. Four main performance metrics are extracted: accuracy, precision, sensitivity, and specificity. The BA demonstrates superior performance when utilizing the maximum combination of accessible nodes in the tested circuit, with an average accuracy of 95.5%, whereas PSO achieved only 93.9%. The BA also outperforms PSO in execution time, with an average reduction of 7.95% for the fault-free circuit and 8.12% for the faulty cases.
Compared to a machine-learning-based approach, the BA with the proposed methodology achieves similar accuracy rates but requires neither datasets nor time-consuming training to perform the circuit diagnosis.
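A minimal sketch of the Bat Algorithm loop used in this kind of diagnosis, where `objective` would measure the mismatch between the measured circuit responses and those computed from candidate parameter values (all names and constants here are illustrative assumptions):

```python
import numpy as np

def bat_algorithm(objective, bounds, n_bats=20, n_iter=200, rng=None):
    """Minimal Bat Algorithm minimizing `objective` over the box `bounds`.

    bounds: (dim, 2) array of [low, high] limits for each parameter
    (e.g. candidate resistor and capacitor values of the tested circuit).
    """
    rng = np.random.default_rng(rng)
    lo, hi = bounds[:, 0], bounds[:, 1]
    dim = len(bounds)
    loudness, pulse_rate = 0.9, 0.5
    f_min, f_max = 0.0, 2.0  # frequency range controlling the step size
    x = rng.uniform(lo, hi, (n_bats, dim))
    v = np.zeros_like(x)
    fit = np.array([objective(p) for p in x])
    best = x[fit.argmin()].copy()
    for _ in range(n_iter):
        for i in range(n_bats):
            freq = f_min + (f_max - f_min) * rng.random()
            v[i] += (x[i] - best) * freq
            cand = np.clip(x[i] + v[i], lo, hi)
            if rng.random() > pulse_rate:
                # Local random walk around the current best solution.
                cand = np.clip(best + 0.01 * (hi - lo) * rng.normal(size=dim), lo, hi)
            f_cand = objective(cand)
            # Greedy acceptance, gated by the loudness.
            if f_cand < fit[i] and rng.random() < loudness:
                x[i], fit[i] = cand, f_cand
        best = x[fit.argmin()].copy()
    return best, float(fit.min())
```

In a diagnosis setting, the returned parameter vector would be compared against nominal component values (within tolerance) to flag the likely faulty element.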

3.
Sensors (Basel) ; 23(13)2023 Jun 25.
Article in English | MEDLINE | ID: mdl-37447729

ABSTRACT

The template matching technique is one of the most widely applied methods for finding patterns in images, in which a reduced-size image, called the target, is searched for within another image that represents the overall environment. In this work, template matching is performed via a co-design system. A hardware coprocessor is designed for the computationally demanding step of template matching, the calculation of the normalized cross-correlation coefficient. This measure is invariant to global brightness changes in the images, but it becomes computationally expensive for images of larger dimensions, or for sets of images. Furthermore, we investigate the performance of six different swarm intelligence techniques aimed at accelerating the target search process. To evaluate the proposed design, the processing time, the number of iterations, and the success rate were compared. The results show that it is possible to obtain approaches capable of processing video images at 30 frames per second with an acceptable average success rate for detecting the tracked target. The search strategies based on PSO, ABC, FFA, and CS are able to meet the processing time of 30 frames/s, yielding average accuracy rates above 80% for the pipelined co-design implementation. However, FWA, EHO, and BFOA could not meet the required timing restriction, achieving success rates of around 60%. Among all the investigated search strategies, PSO provides the best performance, yielding an average processing time of 16.22 ms coupled with a 95% success rate.
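The normalized cross-correlation coefficient computed by the coprocessor can be sketched in software as follows (a NumPy sketch of the standard formula, not of the hardware design; `ncc` is an illustrative name):

```python
import numpy as np

def ncc(window, template):
    """Normalized cross-correlation coefficient between an image window and a
    template of the same shape. Subtracting each patch's mean makes the score
    invariant to global brightness shifts; the result lies in [-1, 1]."""
    w = window - window.mean()
    t = template - template.mean()
    denom = np.sqrt((w * w).sum() * (t * t).sum())
    return (w * t).sum() / denom if denom else 0.0
```

A swarm-based search evaluates this score only at candidate positions proposed by the particles, instead of exhaustively sliding the template over every pixel.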


Subject(s)
Algorithms, Artificial Intelligence, Intelligence
4.
PeerJ Comput Sci ; 9: e1728, 2023.
Article in English | MEDLINE | ID: mdl-38192486

ABSTRACT

The one-dimensional cutting-stock problem (1D-CSP) consists of obtaining a set of items of different lengths from stocks of one or several lengths, where the minimization of waste is one of the main objectives. This problem arises in several industries, such as wood, glass, and paper, among others. Different approaches have been designed to deal with it, ranging from exact algorithms to hybrid methods combining heuristics or metaheuristics. In this work, the African Buffalo Optimization (ABO) algorithm is used to address the 1D-CSP. This algorithm was recently introduced to solve combinatorial problems such as the traveling salesman and bin packing problems. A procedure was designed to improve the search by taking advantage of the location of the buffaloes just before the herd needs to be restarted, so as not to lose the progress already made in the search. Different instances from the literature were used to test the algorithm. The results show that the developed method is competitive in waste minimization against other heuristic, metaheuristic, and hybrid approaches.
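As a point of reference for the 1D-CSP, the classic first-fit-decreasing greedy heuristic (a common baseline that metaheuristics such as ABO are compared against, not the ABO method itself) can be sketched as:

```python
def first_fit_decreasing(items, stock_length):
    """Greedy baseline for the 1D cutting-stock problem: place each item
    (longest first) into the first stock piece with enough remaining length.
    Returns the cutting patterns as a list of lists of item lengths."""
    free = []      # remaining length of each opened stock piece
    patterns = []  # items cut from each stock piece
    for item in sorted(items, reverse=True):
        for i, remaining in enumerate(free):
            if item <= remaining:
                free[i] -= item
                patterns[i].append(item)
                break
        else:
            # No open stock piece fits this item: open a new one.
            free.append(stock_length - item)
            patterns.append([item])
    return patterns
```

Waste is then `len(patterns) * stock_length - sum(items)`, the quantity a metaheuristic tries to drive down further.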

5.
Sensors (Basel) ; 22(3)2022 Feb 08.
Article in English | MEDLINE | ID: mdl-35162025

ABSTRACT

Video tracking involves detecting previously designated objects of interest within a sequence of image frames. It can be applied in robotics, unmanned vehicles, and automation, among other fields of interest. Video tracking is still regarded as an open problem due to a number of obstacles that still need to be overcome, including the need for high precision and real-time results, as well as portability and low-power demands. This work presents the design, implementation and assessment of a low-power embedded system based on an SoC-FPGA platform and the honeybee search algorithm (HSA) for real-time video tracking. HSA is a meta-heuristic that combines evolutionary computing and swarm intelligence techniques. Our findings demonstrated that the combination of SoC-FPGA and HSA reduced the consumption of computational resources, allowing real-time multiprocessing without a reduction in precision, and with the advantage of lower power consumption, which enabled portability. A starker difference was observed when measuring the power consumption. The proposed SoC-FPGA system consumed about 5 Watts, whereas the CPU-GPU system required more than 200 Watts. A general recommendation obtained from this research is to use SoC-FPGA over CPU-GPU to work with meta-heuristics in computer vision applications when an embedded solution is required.


Subject(s)
Algorithms, Software, Animals, Bees
6.
Sensors (Basel) ; 21(5)2021 Mar 09.
Article in English | MEDLINE | ID: mdl-33803171

ABSTRACT

This work proposes a new approach to improve swarm intelligence algorithms for dynamic optimization problems by promoting a balance between knowledge transfer and particle diversity. The proposed method was designed for the problem of video tracking of targets in environments with almost constant lighting. The approach also delimits the solution space for a more efficient search. An outlier-robust version of the double exponential smoothing (DES) model is used to predict the target position in the frame, delimiting the solution space to a more promising region for target tracking. To assess the quality of the proposed approach, a tracker suited to a discrete solution space was implemented using the meta-heuristic Shuffled Frog Leaping Algorithm (SFLA) adapted to dynamic optimization problems, named the Dynamic Shuffled Frog Leaping Algorithm (DSFLA). The DSFLA was compared with other classic and current trackers whose algorithms are based on swarm intelligence. The trackers were compared in terms of the average processing time per frame and the area under the curve of the success rate under the Pascal metric. For the experiment, we used a random sample of videos from the public Hanyang visual tracker benchmark. The experimental results suggest that the DSFLA has an efficient processing time and higher tracking quality than the other competing trackers analyzed in this work. The success rate of the DSFLA tracker is about 7.2% to 76.6% higher on average than that of its competitors. Its average processing time per frame is at least about 10% faster than that of the competing trackers, except for one, which was about 26% faster than the DSFLA tracker. The results also show that the predictions of the robust DES model are quite accurate.
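The standard double exponential smoothing (Holt) predictor on which the robust DES model builds can be sketched per coordinate as follows (an illustrative sketch with assumed smoothing constants; the paper's outlier-robust variant differs):

```python
def des_predict(positions, alpha=0.5, beta=0.5):
    """Double exponential smoothing (Holt's method) over a sequence of 1-D
    target coordinates; returns the one-step-ahead position prediction.

    alpha smooths the level (position), beta smooths the trend (velocity)."""
    level = positions[0]
    trend = positions[1] - positions[0]
    for x in positions[1:]:
        last_level = level
        level = alpha * x + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
    return level + trend
```

Applying this to the x and y coordinates of the target gives the center of the reduced search region in the next frame.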

7.
Sensors (Basel) ; 21(6)2021 Mar 15.
Article in English | MEDLINE | ID: mdl-33804187

ABSTRACT

Swarm Robotics, a subarea of artificial intelligence, is a developing field that investigates bio-inspired collaborative control approaches and integrates large collections of agents, relatively simple robots, in a distributed and decentralized manner. It offers an inspiring platform for new researchers to engage, share knowledge, and examine their concepts through analytical and heuristic strategies. This paper introduces an overview of current activities in Swarm Robotics and examines the present literature in this area to bridge the gap between realistic swarm robotic systems and real-world applications. First, we review several Swarm Intelligence concepts to define Swarm Robotics systems, reporting their essential qualities and features and contrasting them with generic multi-robotic systems. Second, we review the principal projects that enable realistic study of Swarm Robotics, presenting current hardware platforms and multi-robot simulators. Finally, promising future applications and the obstacles that must be overcome to achieve them are described and analyzed.

8.
Heliyon ; 6(6): e04136, 2020 Jun.
Article in English | MEDLINE | ID: mdl-32548328

ABSTRACT

This article presents a multivariable optimization of the energy and exergetic performance of a power generation system composed of a supercritical Brayton cycle using carbon dioxide and a simple organic Rankine cycle (SORC) using toluene, with a reheater (S-CO2-RH-SORC) and without a reheater (S-CO2-NRH-SORC), using the PSO algorithm. A thermodynamic model of the integrated system was developed by applying mass, energy, and exergy balances to each component, which allowed the calculation of the exergy destruction fraction of each piece of equipment, the power generated, and the thermal and exergetic efficiencies of the system. In addition, a sensitivity analysis studied the effect of the main operational and design variables on thermal efficiency and total exergy destroyed, the objective functions selected for the proposed optimization. The results show that the greatest exergy destruction occurs at the thermal source, with a value of 97 kW for the system without reheater (NRH), but this is reduced by 92.28% for the system with reheater (RH). In addition, by optimizing the integrated cycle with a particle number of 25, a maximum thermal efficiency of 55.53% was achieved for the NRH system and 56.95% for the RH system. Likewise, for particle numbers of 15 and 20 in the PSO algorithm, exergy destruction was minimized to 60.72 kW (NRH) and 112.06 kW (RH), respectively. Comparative analyses of several swarm intelligence optimization algorithms were conducted for the integrated S-CO2-SORC system by evaluating performance indicators; the PSO algorithm performed best in these analyses, making it well suited to this case study.
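A minimal sketch of the PSO loop used for this kind of multivariable cycle optimization, where `objective` would evaluate, e.g., negated thermal efficiency or total exergy destruction from the thermodynamic model over the operational and design variables (all names and coefficient values are illustrative assumptions):

```python
import numpy as np

def pso(objective, bounds, n_particles=25, n_iter=200,
        w=0.7, c1=1.5, c2=1.5, rng=None):
    """Minimal PSO minimizing `objective` over the box `bounds` ((dim, 2) array).

    w is the inertia weight; c1 and c2 are the cognitive and social factors."""
    rng = np.random.default_rng(rng)
    lo, hi = bounds[:, 0], bounds[:, 1]
    x = rng.uniform(lo, hi, (n_particles, len(bounds)))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_fit = np.array([objective(p) for p in x])
    gbest = pbest[pbest_fit.argmin()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        # Velocity update: inertia + pull toward personal and global bests.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        fit = np.array([objective(p) for p in x])
        improved = fit < pbest_fit
        pbest[improved], pbest_fit[improved] = x[improved], fit[improved]
        gbest = pbest[pbest_fit.argmin()].copy()
    return gbest, float(pbest_fit.min())
```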

9.
Sensors (Basel) ; 20(10)2020 May 18.
Article in English | MEDLINE | ID: mdl-32443435

ABSTRACT

The Industrial Internet of Things (IIoT) network generates great economic benefits in processes, system installation, maintenance, reliability, scalability, and interoperability. Wireless sensor networks (WSNs) allow the IIoT network to collect, process, and share data on different parameters among Industrial IoT sense Nodes (IISNs). ESP8266 devices are IISNs connected to the Internet by means of a hub to share their information. In this article, a light-diffusion algorithm is designed to connect all the IISNs in a WSN, based on the Peano fractal and swarm intelligence, i.e., without using a hub: each IISN simply shares parameters with its two adjacent IISNs, so that any IISN learns the parameters of the rest of these devices, even those that are not adjacent. We simulated the performance of our algorithm and compared it with other state-of-the-art protocols, finding that our proposal yields a longer lifetime of the IIoT network. When few IISNs are connected, the energy saving is approximately 5%, but with 64 nodes the saving exceeds 20%, because the IIoT network can grow as 3^n while the proposed topology scales not linearly but as log base 3, which balances energy consumption throughout the IIoT network.
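The hub-free, neighbors-only sharing principle described above can be illustrated with a simple diffusion sketch; this is not the paper's Peano-fractal algorithm, only an illustration of how repeated local averaging spreads information to every node without a hub:

```python
def diffuse(values, n_rounds):
    """Neighbor-only diffusion sketch on a ring of nodes: in each round every
    node replaces its value with the average of itself and its two adjacent
    nodes, so all values converge to the network-wide mean with no hub."""
    n = len(values)
    for _ in range(n_rounds):
        values = [
            (values[(i - 1) % n] + values[i] + values[(i + 1) % n]) / 3.0
            for i in range(n)
        ]
    return values
```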

10.
Sensors (Basel) ; 19(6)2019 Mar 23.
Article in English | MEDLINE | ID: mdl-30909621

ABSTRACT

Wireless sensor networks (WSNs) consist of a large number of small devices, or nodes, called microcontroller units (MCUs), located in homes and/or offices and operated through the Internet from anywhere, making these devices smarter and more efficient. Quality-of-service routing is one of the critical challenges in WSNs, especially in surveillance systems. To improve the efficiency of the network, in this article we propose a distributed learning fractal algorithm (DLFA) to design the control topology of a WSN whose nodes are MCUs distributed in a physical space and connected to share sensor parameters, such as CO2 concentration, humidity, and temperature within the space, or to adjust the intensity of light inside and outside the home or office. For this, we start by defining the production rules of the L-systems that generate the Hilbert fractal, since these rules facilitate the generation of this fractal, which is a space-filling curve. Then, we model the optimization of a centralized control topology of WSNs and propose a DLFA to find the two best nodes between which a device can establish a highly reliable link. Thus, we propose a software-defined network (SDN) with strong mobility, since it can be reconfigured depending on the number of nodes; we also employ target coverage, because the DLFA considers only reliable links among devices. Finally, through laboratory tests and computer simulations, we demonstrate the effectiveness of our approach by means of fractal routing in WSNs, using a large number of WSN devices (from 16 to 64 sensors) for real-time monitoring of different parameters, in order to make WSNs efficient and applicable in a forthcoming Smart City.
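The L-system production rules for the Hilbert curve mentioned in the abstract can be sketched by string rewriting; these are the standard textbook rules, which may differ in notation from the ones the paper uses:

```python
def hilbert_lsystem(order):
    """Expand the classic L-system for the Hilbert curve.

    Axiom 'A', rules A -> +BF-AFA-FB+ and B -> -AF+BFB+FA-.
    In the result, 'F' means move forward, '+'/'-' mean turn left/right by
    90 degrees, and 'A'/'B' are rewriting symbols with no drawing action."""
    rules = {"A": "+BF-AFA-FB+", "B": "-AF+BFB+FA-"}
    s = "A"
    for _ in range(order):
        s = "".join(rules.get(c, c) for c in s)
    return s
```

An order-n expansion contains 4^n - 1 forward moves, tracing a space-filling path through a 2^n x 2^n grid, which is what makes the curve useful for covering a node layout.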
