Results 1 - 20 of 47
1.
Entropy (Basel) ; 25(3)2023 Mar 21.
Article in English | MEDLINE | ID: mdl-36981429

ABSTRACT

Recent advances in quantum hardware offer new approaches to solving various optimization problems that are computationally expensive for classical algorithms. We propose a hybrid quantum-classical algorithm for a dynamic asset allocation problem in which a target return and a target risk metric (expected shortfall) are specified. The iterative algorithm treats the target return as a constraint in a Markowitz portfolio optimization model and dynamically adjusts the target return to satisfy the targeted expected shortfall. The Markowitz optimization is formulated as a Quadratic Unconstrained Binary Optimization (QUBO) problem. The use of the expected shortfall risk metric enables the modeling of extreme market events. We compare the results from D-Wave's 2000Q and Advantage quantum annealers using real-world financial data. Both quantum annealers are able to generate portfolios achieving more than 80% of the return of the classical optimal solutions while satisfying the expected shortfall. We observe that experiments on assets with higher correlations tend to perform better, which may help in designing practical quantum applications in the near term.
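As an editorial illustration of the QUBO formulation described above: a minimal sketch, assuming a quadratic penalty turns the target-return constraint into an unconstrained objective; the data, penalty weight `lam`, and brute-force solver stand in for the paper's real inputs and the D-Wave sampler.

```python
import numpy as np

def markowitz_qubo(mu, sigma, r_target, lam=10.0):
    """QUBO for binary asset selection x in {0,1}^n:
    minimize  x' Sigma x + lam * (mu' x - r_target)^2  (risk + return penalty)."""
    Q = sigma.copy()
    Q += lam * np.outer(mu, mu)                 # quadratic part of the penalty
    Q -= np.diag(2.0 * lam * r_target * mu)     # linear part (x_i^2 == x_i for binaries)
    return Q

rng = np.random.default_rng(0)
n = 4
mu = rng.uniform(0.0, 0.02, n)                  # toy expected returns
A = rng.normal(size=(n, n))
sigma = A @ A.T * 1e-4                          # toy positive semidefinite covariance
Q = markowitz_qubo(mu, sigma, r_target=0.01)

# Brute-force the 2^n bitstrings; a quantum annealer would sample this Q instead.
best_x, best_e = None, np.inf
for b in range(2 ** n):
    x = np.array([(b >> i) & 1 for i in range(n)])
    e = float(x @ Q @ x)
    if e < best_e:
        best_x, best_e = x, e
print("selection:", best_x, "energy:", round(best_e, 6))
```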

2.
Neth Heart J ; 31(3): 117-123, 2023 Mar.
Article in English | MEDLINE | ID: mdl-36445615

ABSTRACT

INTRODUCTION: In a Dutch heart centre, a dedicated chronic total occlusion (CTO) team was implemented in June 2017. The aim of this study was to evaluate treatment success and clinical outcomes before and after this implementation. METHODS: A total of 662 patients who underwent percutaneous coronary intervention (PCI) for a CTO between January 2013 and June 2020 were included and divided into pre- and post-CTO team groups. The primary endpoint was the angiographic success rate of CTO-PCI. Secondary endpoints included angiographic success stratified by complexity using the J­CTO score and the following clinical outcomes: in-hospital complications and myocardial infarction, target vessel revascularisation, all-cause mortality, quality of life (QoL) and major adverse cardiac events (MACE) at 30-day and 1­year follow-up. RESULTS: Compared with the pre-CTO team group, the success rate in the post-CTO team group was higher after the first attempt (81.4% vs 62.7%; p < 0.001) and final attempt (86.7% vs 73.8%; p = 0.001). This was mainly driven by higher success rates for difficult and very difficult CTO lesions according to the J­CTO score. The MACE rate at 1 year was lower in the post-CTO team group than in the pre-CTO team group (6.4% vs 16.0%; p < 0.01), while it was comparable at 30-day follow-up (0.1% vs 1.7%; p = 0.74). Angina symptoms were significantly reduced at 30-day and 1­year follow-up, and QoL scores were higher after 1 year. CONCLUSION: This study demonstrated higher success rates of CTO-PCI and improved clinical outcomes and QoL at 1­year follow-up after implementation of a dedicated CTO team using the hybrid algorithm.

3.
BMC Bioinformatics ; 23(1): 122, 2022 Apr 07.
Article in English | MEDLINE | ID: mdl-35392798

ABSTRACT

BACKGROUND: The assembly task is an indispensable step in sequencing genomes of new organisms and studying structural genomic changes. In recent years, the dynamic development of next-generation sequencing (NGS) methods has raised hopes of making whole-genome sequencing a fast and reliable tool used, for example, in medical diagnostics. However, this is hampered by the slowness and computational requirements of the current processing algorithms, which raises the need to develop more efficient algorithms. One possible approach, still little explored, is the use of quantum computing. RESULTS: We present a proof of concept of a de novo assembly algorithm that uses the Genomic Signal Processing approach, detecting overlaps between DNA reads by calculating the Pearson correlation coefficient and formulating the assembly problem as an optimization task (the Traveling Salesman Problem). Computations performed on a classical computer were compared with the results achieved by a hybrid method combining CPU and QPU calculations. For this purpose, a quantum annealer by D-Wave was used. The experiments were performed with artificially generated data and DNA reads coming from a simulator, with actual organism genomes used as input sequences. To our knowledge, this work is one of the few in which actual sequences of organisms were used to study the de novo assembly task on a quantum annealer. CONCLUSIONS: Our proof of concept showed that the use of a quantum annealer (QA) for the de novo assembly task might be a promising alternative to computations performed in the classical model. The current computing power of the available devices requires a hybrid approach (combining CPU and QPU computations). The next step may be developing a hybrid algorithm strictly dedicated to the de novo assembly task, exploiting its specificity (e.g. the sparsity and bounded degree of the overlap-layout-consensus graph).
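A minimal sketch of the overlap-to-TSP idea in this abstract, assuming a simple numeric base encoding and a fixed suffix/prefix window; the encoding, window length `w`, and toy reads are illustrative, not the paper's.

```python
import numpy as np

# Illustrative base-to-number mapping (one common GSP choice; the paper's
# exact encoding may differ).
ENC = {"A": 1.0, "C": 2.0, "G": 3.0, "T": 4.0}

def signal(read):
    return np.array([ENC[b] for b in read])

def overlap_score(a, b, w=8):
    """Pearson correlation between the suffix of read a and the prefix of read b."""
    sa, sb = signal(a)[-w:], signal(b)[:w]
    if sa.std() == 0 or sb.std() == 0:
        return 0.0
    return float(np.corrcoef(sa, sb)[0, 1])

reads = ["ACGTACGTGG", "ACGTGGTTAC", "GGTTACCATG"]
n = len(reads)
# TSP "distance": high correlation -> short edge, so a shortest tour
# visits reads in a plausible assembly order.
D = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        if i != j:
            D[i, j] = 1.0 - overlap_score(reads[i], reads[j])
print(np.round(D, 2))   # this matrix would be handed to a (quantum) TSP solver
```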


Subject(s)
Computational Methodologies, Quantum Theory, Algorithms, Base Sequence, DNA/genetics, High-Throughput Nucleotide Sequencing/methods, DNA Sequence Analysis/methods
4.
Sensors (Basel) ; 22(3)2022 Jan 24.
Article in English | MEDLINE | ID: mdl-35161630

ABSTRACT

Waste mine water is produced during coal mining and is a main cause of mine flooding and environmental pollution. Economical treatment and efficient reuse of mine water are therefore among the main research directions in mining areas, and using an intelligent algorithm to achieve optimal allocation and economical reuse of mine water is an urgent problem. To solve it, this paper first designs a mathematical reuse model based on the mine water treatment system, covering the mine water reuse rate, the reuse cost at different stages, and the operational efficiency of the whole treatment system. A hybrid optimization algorithm, GAPSO, is then proposed by combining the genetic algorithm (GA) and particle swarm optimization (PSO), and an adaptive improvement (TSA-GAPSO) is introduced for the two optimization stages. Finally, simulation analysis and tests on actual data of the mine water reuse model are carried out using four algorithms. The results show that the hybrid improved algorithm has better convergence speed and precision in solving the mine water scheduling problem. The TSA-GAPSO algorithm performs best and is superior to the other three algorithms: the cost of mine water reuse is reduced by 9.09% and the treatment efficiency of the whole system is improved by 5.81%, which demonstrates the practicability and superiority of the algorithm.
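A sketch of the generic GA+PSO hybridization pattern that GAPSO names (PSO velocity updates interleaved with GA crossover and mutation on one population); the water-reuse objective is stubbed with a toy `cost` function, and the TSA adaptive stage is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)

def cost(x):                      # stand-in for the mine-water reuse cost model
    return float(np.sum((x - 0.3) ** 2))

n, dim, iters = 30, 5, 200
X = rng.uniform(0, 1, (n, dim)); V = np.zeros((n, dim))
P = X.copy(); Pf = np.array([cost(x) for x in X])      # personal bests
g = P[Pf.argmin()].copy()                               # global best

for t in range(iters):
    # PSO step: standard inertia + cognitive + social velocity update
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    V = 0.7 * V + 1.5 * r1 * (P - X) + 1.5 * r2 * (g - X)
    X = np.clip(X + V, 0, 1)
    # GA step: arithmetic crossover of adjacent pairs + sparse gaussian mutation
    for i in range(0, n - 1, 2):
        if rng.random() < 0.6:
            a = rng.random()
            X[i], X[i + 1] = a * X[i] + (1 - a) * X[i + 1], a * X[i + 1] + (1 - a) * X[i]
    X = np.clip(X + rng.normal(0, 0.01, X.shape) * (rng.random(X.shape) < 0.05), 0, 1)
    # bookkeeping: refresh personal and global bests
    f = np.array([cost(x) for x in X])
    better = f < Pf
    P[better], Pf[better] = X[better], f[better]
    g = P[Pf.argmin()].copy()

print("best cost:", Pf.min())
```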


Subject(s)
Algorithms, Theoretical Models, Computer Simulation, Environmental Pollution, Floods
5.
Sensors (Basel) ; 22(21)2022 Oct 24.
Article in English | MEDLINE | ID: mdl-36365820

ABSTRACT

Impact force is the most common form of load acting on engineering structures and presents a great hidden risk to the healthy operation of machinery. Therefore, the identification or monitoring of impact forces is a significant issue in structural health monitoring. Conventional optimisation schemes based on inversion techniques require a significant amount of time to identify random impact forces (impact force localisation and time-history reconstruction) and are not suitable for engineering applications. Recently, a pattern recognition method combined with a similarity metric, PRMCSM, has been proposed, which is fast in practical engineering applications. This study proposes a novel scheme for identifying unknown random impact forces that hybridises the two existing methods and combines the advantages of both. The experimental results indicate that the localisation accuracy of the proposed algorithm (100%) is higher than that of PRMCSM (92%), and that the calculation time of the hybrid algorithm (179 s) for 25 validation cases is approximately one nineteenth of that of the traditional optimisation strategy (3446 s).
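The pattern-recognition half of such a hybrid can be illustrated as nearest-neighbour matching against a dictionary of training responses; the zero-lag correlation metric and the random "responses" below are assumptions, not the PRMCSM definition.

```python
import numpy as np

def similarity(a, b):
    """Normalized cross-correlation at zero lag (one common similarity metric)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(2)
# Dictionary: sensor responses recorded for impacts at known locations.
locations = ["node_1", "node_2", "node_3"]
dictionary = {loc: rng.normal(size=256) for loc in locations}

measured = dictionary["node_2"] + 0.1 * rng.normal(size=256)  # noisy new impact
best = max(locations, key=lambda loc: similarity(measured, dictionary[loc]))
print("estimated impact location:", best)   # -> node_2
```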


Subject(s)
Algorithms, Steel, Mechanical Phenomena
6.
J Environ Manage ; 303: 114252, 2022 Feb 01.
Article in English | MEDLINE | ID: mdl-34894493

ABSTRACT

Many companies and organizations are pursuing "carbon footprint" projects to estimate their own contribution to global climate change, given growing concerns about climate change and carbon emissions. Measures such as carbon taxes are among the most powerful means of dealing with the threats of climate change, and in recent years researchers have shown particular interest in modelling supply chain networks under this scheme. The disorganized disposal of by-products from sugarcane mills is the inspiration for this research. To connect the problem with the real world, the proposed sustainable sugarcane supply chain network considers carbon taxes on emissions from industries and from the transportation of goods. The presented mixed-integer linear programming model is a location-allocation problem and, due to its inherent complexity, is considered Non-Polynomial hard (NP-hard). To deal with the model, three leading metaheuristics, the Genetic Algorithm (GA), Simulated Annealing (SA) and the Social Engineering Optimizer (SEO), as well as hybrid methods based on them, namely Genetic-Simulated Annealing (GASA) and the Genetic-Social Engineering Optimizer (GASEO), are employed. The control parameters of the algorithms are tuned using the Taguchi approach. Subsequently, one-way ANOVA is used to elucidate the performance of the proposed algorithms, which confirms the superior performance of the proposed GASEO.


Subject(s)
Theoretical Models, Saccharum, Algorithms, Carbon Footprint, Transportation
7.
BMC Public Health ; 21(1): 1375, 2021 07 12.
Article in English | MEDLINE | ID: mdl-34247609

ABSTRACT

BACKGROUND: This article aims to understand the prevalence of hyperlipidemia and its related factors in Shanxi Province. Building on multivariate logistic regression analysis to identify the influencing factors closely related to hyperlipidemia, the complex network of connections between variables was presented through Bayesian networks (BNs). METHODS: Logistic regression was used to screen for hyperlipidemia-related variables, and the connections between variables were then presented through BNs. Since the Max-Min Hill-Climbing (MMHC) hybrid algorithm has notable drawbacks, additional hybrid algorithms are proposed to construct the BN structure: MMPC-Tabu, Fast.iamb-Tabu and Inter.iamb-Tabu. To assess their performance, we compared these three hybrid algorithms with the widely used MMHC hybrid algorithm on randomly generated datasets. The optimized BN was then used to study factors related to hyperlipidemia, and the BN model was also compared with the logistic regression model. RESULTS: The BN constructed by the Inter.iamb-Tabu hybrid algorithm fitted the benchmark networks best and was used to construct the BN model of hyperlipidemia. Multivariate logistic regression analysis suggested that gender, smoking, central obesity, average daily salt intake, average daily oil intake, diabetes mellitus, hypertension and physical activity were associated with hyperlipidemia. The BN model of hyperlipidemia further showed that gender, BMI, and physical activity were directly related to the occurrence of hyperlipidemia, and that hyperlipidemia was directly related to the occurrence of diabetes mellitus and hypertension; average daily salt intake, average daily oil consumption, smoking, and central obesity were indirectly related to hyperlipidemia. CONCLUSIONS: The BN of hyperlipidemia constructed by the Inter.iamb-Tabu hybrid algorithm is more reasonable, captures the overall linking effects between factors and disease, and reveals the direct and indirect factors associated with hyperlipidemia as well as the correlations between related variables, providing a new approach to the study of chronic diseases and their associated factors.
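For readers who want to reproduce the two-stage idea (constraint-based skeleton, then score-based search with a tabu list), here is a sketch using the pgmpy library; estimator names and `estimate()` arguments vary across pgmpy versions, and the toy survey-style data is invented.

```python
import numpy as np
import pandas as pd
from pgmpy.estimators import HillClimbSearch, BicScore, MmhcEstimator

# Toy binary health-survey-style data; a real analysis would use the
# hyperlipidemia survey variables (gender, smoking, BMI, ...).
rng = np.random.default_rng(3)
smoking = rng.integers(0, 2, 500)
bmi_high = rng.integers(0, 2, 500)
hyperlip = ((smoking + bmi_high + rng.integers(0, 2, 500)) >= 2).astype(int)
df = pd.DataFrame({"smoking": smoking, "bmi_high": bmi_high, "hyperlipidemia": hyperlip})

# Both stages in one call: MMPC skeleton, then score-based search (MMHC).
mmhc_dag = MmhcEstimator(df).estimate()

# Score-based stage with an explicit tabu list, the ingredient the
# "*-Tabu" hybrids in the abstract swap in.
tabu_dag = HillClimbSearch(df).estimate(scoring_method=BicScore(df), tabu_length=50)
print(sorted(mmhc_dag.edges()), sorted(tabu_dag.edges()))
```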


Subject(s)
Hyperlipidemias, Algorithms, Bayes Theorem, Cross-Sectional Studies, Humans, Hyperlipidemias/epidemiology, Logistic Models
8.
Genomics ; 112(6): 4777-4787, 2020 11.
Article in English | MEDLINE | ID: mdl-33348478

ABSTRACT

A growing body of research shows that long non-coding RNA (lncRNA) plays a key role in many important biological processes. However, the number of disease-related lncRNAs identified so far remains relatively small, and experimental identification is time consuming and labor intensive. In this study, we propose a novel method, HAUBRW, to predict undiscovered lncRNA-disease associations. First, a hybrid algorithm combining the heat spread algorithm and the probability diffusion algorithm redistributes the resources. Second, an unbalanced bi-random walk is used to infer undiscovered lncRNA-disease associations. Seven advanced models, i.e. BRWLDA, DSCMF, RWRlncD, IDLDA, KATZ, Ping's, and Yang's, were compared with our method, and simulation results show that the AUC of our method is higher than that of the other models. In addition, case studies have shown that HAUBRW can effectively predict candidate lncRNAs for breast cancer, osteosarcoma and cervical cancer. Therefore, our approach may be a good choice for future biomedical research.
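A minimal sketch of an unbalanced bi-random walk of the kind this abstract describes: different numbers of steps on the lncRNA side and the disease side, each blended back toward the known association matrix; the restart weight `alpha`, the step counts, and the toy matrices are assumptions.

```python
import numpy as np

def unbalanced_birw(A, Sl, Sd, alpha=0.8, left_steps=2, right_steps=3):
    """Unbalanced bi-random walk on an lncRNA-disease network.

    A  : known lncRNA x disease association matrix (0/1)
    Sl : row-normalized lncRNA similarity, Sd : disease similarity
    """
    R = A / A.sum()                      # initial resource distribution
    for t in range(max(left_steps, right_steps)):
        stepped = []
        if t < left_steps:               # walk on the lncRNA similarity graph
            stepped.append(alpha * Sl @ R + (1 - alpha) * A)
        if t < right_steps:              # walk on the disease similarity graph
            stepped.append(alpha * R @ Sd + (1 - alpha) * A)
        R = sum(stepped) / len(stepped)  # average whichever sides stepped
    return R                             # scores for candidate associations

rng = np.random.default_rng(4)
A = (rng.random((6, 4)) < 0.3).astype(float)          # toy known associations
Sl = rng.random((6, 6)); Sl = Sl / Sl.sum(1, keepdims=True)
Sd = rng.random((4, 4)); Sd = Sd / Sd.sum(1, keepdims=True)
print(np.round(unbalanced_birw(A, Sl, Sd), 3))
```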


Subject(s)
Algorithms, Computational Biology, Genetic Predisposition to Disease, Long Non-Coding RNA/genetics, Computer Simulation, Genetic Association Studies, Humans
9.
Sensors (Basel) ; 21(15)2021 Aug 03.
Article in English | MEDLINE | ID: mdl-34372483

ABSTRACT

The power system planning problem considering the system loss function, voltage profile function, cost function of FACTS (flexible alternating current transmission system) devices, and stability function is investigated in this paper. With advances in power-electronic technologies, FACTS devices have enabled improved stability and more reliable reactive power (RP) planning. In addition, in modern power systems, renewable resources have an unavoidable effect on planning, and wind resources in particular complicate the problem through conflicting objectives and non-linear constraints: the cost, loss, and voltage functions are stochastic in nature and cannot be reduced to a single function. A multi-objective hybrid algorithm combining particle swarm optimization (PSO) and virus colony search (VCS) is proposed to solve this problem under both linear and non-linear constraints. VCS is a recent optimization method based on the search behaviour of viruses, which destroy host cells and allow the fittest virus to penetrate a cell for reproduction. In the proposed model, PSO is used to enhance local and global search, and non-dominated sorting based on the Pareto criterion is used to rank solutions. The optimization results for different scenarios reveal that the proposed hybrid method improves parameters such as convergence time, the voltage stability index, and the absolute magnitude of voltage deviation, and reduces total transmission line losses. In addition, the presence of wind resources has a positive effect on these results.
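The Pareto machinery mentioned at the end (non-dominated sorting of candidate plans under conflicting objectives) in a minimal O(n²) form; the three-objective toy data is illustrative.

```python
import numpy as np

def non_dominated_sort(F):
    """Rank rows of F (one row = one solution's objective vector, all
    minimized) into Pareto fronts; returns a list of index lists."""
    n = len(F)
    dominated_by = [set() for _ in range(n)]
    dom_count = np.zeros(n, dtype=int)
    for i in range(n):
        for j in range(n):
            if i != j and np.all(F[i] <= F[j]) and np.any(F[i] < F[j]):
                dominated_by[i].add(j)      # i dominates j
                dom_count[j] += 1
    fronts, current = [], [i for i in range(n) if dom_count[i] == 0]
    while current:
        fronts.append(current)
        nxt = []
        for i in current:
            for j in dominated_by[i]:
                dom_count[j] -= 1
                if dom_count[j] == 0:
                    nxt.append(j)
        current = nxt
    return fronts

# Toy: three objectives (loss, voltage deviation, FACTS cost) for 5 plans
F = np.array([[1, 2, 3], [2, 1, 3], [3, 3, 3], [1, 1, 1], [2, 2, 2]])
print(non_dominated_sort(F))   # [[3], [0, 1, 4], [2]]
```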


Subject(s)
Algorithms, Wind, Electricity, Goals
10.
Catheter Cardiovasc Interv ; 95(1): 97-104, 2020 01.
Article in English | MEDLINE | ID: mdl-30919577

ABSTRACT

BACKGROUND: Percutaneous recanalization of coronary chronic total occlusions (CTOs) traditionally relies on the use of dual access and large-bore catheters, with a trans-femoral approach adopted in most cases. OBJECTIVES: The aim of this manuscript is to describe the outcomes of an alternative hybrid algorithm, called the "Minimalistic Hybrid Algorithm," which minimizes the use of double access, large-bore catheters, and the femoral approach in order to reduce the risk of vascular complications and patient discomfort without compromising efficacy. METHODS: In this single-center registry, a "minimalistic" approach was attempted in consecutive patients undergoing CTO PCI between March 2016 and October 2017. Data regarding the applicability of this algorithm and the related procedural success rates were collected, together with common demographic and angiographic characteristics. RESULTS: Of the 100 CTO PCIs performed in the study period, 91 (91%) were successfully approached according to the novel algorithm. The mean J-CTO score of all minimalistic procedures was 1.9 ± 1.2, with 31 (34%) patients presenting with a J-CTO score ≥3. In 52 procedures, the approach consisted of single-catheter access, 49 (94.2%) of which were trans-radial. Of the 39 patients approached with dual catheters, 26 (69.2%) were bi-radial and 8 (21%) radial-femoral. Procedural success in patients approached with the minimalistic algorithm was 89%, in line with the results of the large multicenter experiences available today. CONCLUSIONS: Our results show that an alternative algorithm limiting the routine use of large-bore catheters and the trans-femoral approach is feasible in clinical practice and yields good procedural outcomes.


Subject(s)
Algorithms, Peripheral Catheterization, Clinical Protocols, Coronary Occlusion/therapy, Femoral Artery, Percutaneous Coronary Intervention, Radial Artery, Aged, Peripheral Catheterization/adverse effects, Peripheral Catheterization/instrumentation, Chronic Disease, Coronary Occlusion/diagnostic imaging, Coronary Occlusion/physiopathology, Decision Trees, Equipment Design, Female, Humans, Male, Middle Aged, Percutaneous Coronary Intervention/adverse effects, Registries, Retrospective Studies, Risk Factors, Treatment Outcome, Vascular Access Devices
11.
Sensors (Basel) ; 20(7)2020 Apr 10.
Article in English | MEDLINE | ID: mdl-32290193

ABSTRACT

Grey wolf optimizer (GWO) is a meta-heuristic algorithm inspired by the social hierarchy of grey wolves (Canis lupus). The fireworks algorithm (FWA) is a nature-inspired optimization method mimicking the explosion process of fireworks. Both have strong search capabilities; however, in some cases GWO converges to a local optimum, and FWA converges slowly. In this paper, a new hybrid algorithm (named FWGWO) is proposed that fuses the advantages of these two algorithms to reach global optima effectively. The proposed algorithm combines the exploration ability of the fireworks algorithm with the exploitation ability of the grey wolf optimizer by setting a balance coefficient. To test the competence of the proposed hybrid FWGWO, 16 well-known benchmark functions with a wide range of dimensions and varied complexities are used. The results of the proposed FWGWO are compared with those of nine other algorithms, including the standard FWA, the native GWO, the enhanced grey wolf optimizer (EGWO), and the augmented grey wolf optimizer (AGWO). The experimental results show that FWGWO effectively improves the global search capability and convergence speed of both GWO and FWA.
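A sketch of the balance-coefficient idea, assuming each solution follows either an FWA-style explosion (exploration) or the standard GWO leader-guided update (exploitation) with a probability that decays over iterations; the amplitude rule and the decay schedule are simplified guesses at the paper's design.

```python
import numpy as np

rng = np.random.default_rng(5)
f = lambda x: float(np.sum(x ** 2))                 # benchmark: sphere function
n, dim, iters = 20, 10, 300
X = rng.uniform(-5, 5, (n, dim))

for t in range(iters):
    fit = np.array([f(x) for x in X])
    alpha, beta, delta = X[np.argsort(fit)[:3]]     # three best wolves
    a = 2 * (1 - t / iters)                         # GWO control parameter
    b = 1 - t / iters                               # balance coefficient: explore early
    for i in range(n):
        if rng.random() < b:
            # FWA-style explosion: worse solutions get larger spark amplitudes
            amp = 5.0 * fit[i] / (fit.sum() + 1e-12)
            X[i] = np.clip(X[i] + rng.uniform(-amp, amp, dim), -5, 5)
        else:
            # GWO update: average of moves toward alpha, beta and delta
            new = np.zeros(dim)
            for leader in (alpha, beta, delta):
                A = a * (2 * rng.random(dim) - 1)
                C = 2 * rng.random(dim)
                new += leader - A * np.abs(C * leader - X[i])
            X[i] = np.clip(new / 3, -5, 5)

print("best value:", min(f(x) for x in X))
```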


Subject(s)
Algorithms, Biomimetics, Computer Simulation
12.
Entropy (Basel) ; 21(3)2019 Mar 23.
Article in English | MEDLINE | ID: mdl-33267032

ABSTRACT

In this paper, a new hybrid whale optimization algorithm (WOA) called WOA-DE is proposed to better balance the exploitation and exploration phases of optimization. Differential evolution (DE) is adopted as a local search strategy to enhance exploitation capability. The WOA-DE algorithm is then applied to multilevel color image segmentation, which can be considered a challenging optimization task. Kapur's entropy is used to obtain an efficient image segmentation method. To evaluate the performance of the proposed algorithm, different images are selected for the experiments, including natural images, satellite images and magnetic resonance (MR) images. The experimental results are compared with those of state-of-the-art meta-heuristic algorithms as well as conventional approaches, using several performance measures: average fitness values, standard deviation (STD), peak signal-to-noise ratio (PSNR), structural similarity index (SSIM), feature similarity index (FSIM), Wilcoxon's rank-sum test, and the Friedman test. The results indicate that the WOA-DE algorithm is superior to the other meta-heuristic algorithms. In addition, to show the effectiveness of the proposed technique, the Otsu method is used for comparison.
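Kapur's entropy itself is easy to state in code: the objective below sums the entropies of the histogram segments induced by a threshold set, which the WOA-DE search would maximize; the 256-bin grayscale histogram is a toy stand-in for the paper's color channels.

```python
import numpy as np

def kapur_entropy(hist, thresholds):
    """Kapur's objective: total entropy of the classes induced by the
    thresholds on a 256-bin grayscale histogram (to be maximized)."""
    p = hist / hist.sum()
    edges = [0, *sorted(thresholds), 256]
    total = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        w = p[lo:hi].sum()              # class probability mass
        if w > 0:
            q = p[lo:hi] / w
            q = q[q > 0]
            total += -np.sum(q * np.log(q))
    return total

rng = np.random.default_rng(6)
img = rng.integers(0, 256, (64, 64))                 # toy grayscale image
hist = np.bincount(img.ravel(), minlength=256)
print(kapur_entropy(hist, [85, 170]))                # 2 thresholds -> 3 classes
```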

13.
Entropy (Basel) ; 20(8)2018 Aug 20.
Article in English | MEDLINE | ID: mdl-33265709

ABSTRACT

Bayesian network structure learning from data has been proved to be an NP-hard (Non-deterministic Polynomial-hard) problem. An effective way of improving the accuracy of a learned Bayesian network structure is to use experts' knowledge rather than data alone. Some experts' knowledge (here called explicit knowledge) can make the causal relationships between nodes in a Bayesian network (BN) structure clear, while other knowledge (here called vague knowledge) cannot. Previous algorithms for BN structure learning used only explicit knowledge, but vague knowledge, which was ignored, is also valuable and often exists in the real world. We therefore propose a new method for using more comprehensive experts' knowledge based on a hybrid structure learning algorithm, a kind of two-stage algorithm. Two types of experts' knowledge are defined and incorporated into the hybrid algorithm. We formulate rules to generate a better initial network structure and to improve the scoring function, and we take differences in expert level and conflicts of opinion into account. Experimental results show that our proposed method can improve structure learning performance.
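One concrete way to inject explicit expert knowledge into the score-based stage is to fix known edges and blacklist impossible ones; the sketch uses pgmpy's hill-climbing search (argument names vary by pgmpy version) on invented data, and vague knowledge could analogously seed the initial structure via `start_dag`.

```python
import numpy as np
import pandas as pd
from pgmpy.estimators import HillClimbSearch, BicScore

rng = np.random.default_rng(7)
a = rng.integers(0, 2, 400)
b = (a ^ rng.integers(0, 2, 400)).astype(int)   # B depends on A
c = (b & rng.integers(0, 2, 400)).astype(int)   # C depends on B
df = pd.DataFrame({"A": a, "B": b, "C": c})

hc = HillClimbSearch(df)
dag = hc.estimate(
    scoring_method=BicScore(df),
    fixed_edges=[("A", "B")],        # explicit knowledge: A causes B
    black_list=[("C", "A")],         # explicit knowledge: C cannot cause A
)
print(sorted(dag.edges()))
```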

14.
Plant Cell Environ ; 38(11): 2299-312, 2015 Nov.
Article in English | MEDLINE | ID: mdl-25850935

ABSTRACT

Emissions of biogenic volatile organic compounds (BVOC) by boreal evergreen trees have strong seasonality, with low emission rates during photosynthetically inactive winter and increasing rates towards summer. Yet, the regulation of this seasonality remains unclear. We measured in situ monoterpene emissions from Scots pine shoots during several spring periods and analysed their dynamics in connection with the spring recovery of photosynthesis. We found high emission peaks caused by enhanced monoterpene synthesis consistently during every spring period (monoterpene emission bursts, MEB). The timing of the MEBs varied relatively little between the spring periods. The timing of the MEBs showed good agreement with the photosynthetic spring recovery, which was studied with simultaneous measurements of chlorophyll fluorescence, CO2 exchange and a simple, temperature history-based proxy for state of photosynthetic acclimation, S. We conclude that the MEBs were related to the early stages of photosynthetic recovery, when the efficiency of photosynthetic carbon reactions is still low whereas the light harvesting machinery actively absorbs light energy. This suggests that the MEBs may serve a protective functional role for the foliage during this critical transitory state and that these high emission peaks may contribute to atmospheric chemistry in the boreal forest in springtime.


Subject(s)
Monoterpenes/metabolism, Photosynthesis, Pinus sylvestris/metabolism, Seasons, Carbon Dioxide/metabolism, Chlorophyll/metabolism, Temperature
15.
Heliyon ; 10(7): e29006, 2024 Apr 15.
Article in English | MEDLINE | ID: mdl-38601575

ABSTRACT

The estimation of groundwater levels (GWLs) is a crucial step in ensuring the sustainable management of water resources. In this paper, selected piezometers of the Hamedan-Bahar plain, located in the west of Iran, were studied. The main objective is to compare the effect of various pre-processing methods applied to the input data of different artificial intelligence (AI) models for predicting GWLs. Observed GWL, evaporation, precipitation, and temperature were used as input variables to the AI algorithms. First, 126 data pre-processing methods were implemented in Python, classified into three classes: (1) statistical methods, (2) wavelet transform methods and (3) decomposition methods. The pre-processed data were then fed to four widely used types of AI model with different kernels, grouped into three classes: (1) machine learning (Support Vector Regression (SVR) and Artificial Neural Network (ANN)), (2) deep learning (Long Short-Term Memory (LSTM)) and (3) hybrid ML (Pelican Optimization Algorithm-Artificial Neural Network (POA-ANN)). The Akaike Information Criterion (AIC) was used to evaluate and validate the predictive accuracy of the algorithms. According to the results, based on the summed (train and test phase) AIC values of 1778 models, the average AIC values for the ML, DL and hybrid-ML classes decreased by 25.3%, 29.6% and 57.8%, respectively. The results therefore show that not all data pre-processing methods improve prediction accuracy, and they should be selected very carefully by trial and error. In conclusion, the wavelet-ANN model with Daubechies 13 and 25 neurons (db13_ANN_25) is the best model for predicting GWL, with an AIC of -204.9, an improvement of 5.23% over the model without any pre-processing (ANN_Relu_25, AIC -194.7).
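The winning combination (db13 wavelet pre-processing feeding a 25-neuron ANN) can be sketched with pywt and scikit-learn; the denoising choice, the lag features, and the synthetic GWL series are assumptions, not the paper's pipeline.

```python
import numpy as np
import pywt
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(8)
t = np.arange(2000)
gwl = 50 + 3 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 0.3, t.size)  # toy GWL series

# db13 wavelet decomposition, then reconstruct a smoothed signal from the
# approximation coefficients only (one simple pre-processing choice).
coeffs = pywt.wavedec(gwl, "db13", level=3)
coeffs_denoised = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
smooth = pywt.waverec(coeffs_denoised, "db13")[: t.size]

# Supervised setup: predict tomorrow's level from the last 7 smoothed days.
lag = 7
Xf = np.stack([smooth[i : i + lag] for i in range(t.size - lag)])
y = gwl[lag:]
model = MLPRegressor(hidden_layer_sizes=(25,), max_iter=2000, random_state=0)
model.fit(Xf[:-200], y[:-200])
print("R^2 on held-out tail:", round(model.score(Xf[-200:], y[-200:]), 3))
```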

16.
Sci Rep ; 14(1): 11259, 2024 May 17.
Article in English | MEDLINE | ID: mdl-38755222

ABSTRACT

As the terminal of the power system, the distribution network is the main area where failures occur. In addition, with the integration of distributed generation, the traditional distribution network becomes more complex, rendering conventional fault location algorithms based on a single power supply obsolete. It is therefore necessary to seek a new algorithm for locating faults in distribution networks with distributed power sources. In existing fault localization algorithms for distribution networks, a line fault has only two states, which can be represented by 0 and 1, so most algorithms exploit this characteristic with discrete iterative optimization. This paper instead combines the advantages of the particle swarm algorithm and the genetic algorithm and iterates over continuous real numbers to construct a successive particle swarm genetic algorithm (SPSO-GA) that differs from previous algorithms. The accuracy, speed, and fault tolerance of SPSO-GA, a discrete particle swarm genetic algorithm, and the artificial fish swarm algorithm are compared on an IEEE 33-node distribution network with distributed power supply. The simulation results show that SPSO-GA has high optimization accuracy and stability for single, double, or triple faults. Furthermore, SPSO-GA converges rapidly, requires fewer particles, and can locate the faulty segment accurately in distribution networks containing distorted information.
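The continuous-encoding trick can be shown in a few lines: particles stay real-valued and are sigmoid-decoded to a binary fault vector only inside the fitness, which compares reported overcurrent flags with those a candidate fault set would produce on a radial feeder; the feeder model and weights are illustrative, and the random swarm stands in for the SPSO-GA loop.

```python
import numpy as np

def decode(x):
    """Sigmoid-decode a continuous particle to a binary fault vector."""
    return (1 / (1 + np.exp(-x)) > 0.5).astype(int)

def expected_flags(faults):
    """Radial single-source feeder: a switch sees overcurrent iff some
    faulted section lies at or downstream of it."""
    return (np.cumsum(faults[::-1])[::-1] > 0).astype(int)

def fitness(x, reported):
    faults = decode(x)
    # mismatch with FTU reports, plus a small sparsity term so the
    # minimal fault set is preferred
    return np.abs(reported - expected_flags(faults)).sum() + 0.3 * faults.sum()

# 8-section feeder, true fault in section 5 -> switches 0..5 see overcurrent
true = np.zeros(8, dtype=int); true[5] = 1
reported = expected_flags(true)

rng = np.random.default_rng(9)
swarm = rng.normal(0, 2, (40, 8))
# best decoded candidate from a random swarm; the PSO/GA loop would refine this
best = min(swarm, key=lambda x: fitness(x, reported))
print("decoded fault vector:", decode(best))
```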

17.
J Contam Hydrol ; 265: 104385, 2024 Jul.
Article in English | MEDLINE | ID: mdl-38878553

ABSTRACT

This study aims to develop a multi-objective quantitative-qualitative reservoir operation model (MOQQROM) using a simulation-optimization approach. The main challenge of such models is their computational complexity. The simulation-optimization method used in this study consists of CE-QUAL-W2 as a hydrodynamic and water quality simulation model and a multi-objective firefly algorithm-k-nearest-neighbor (MOFA-KNN) method as the optimization algorithm, an efficient way to overcome the computational burden of simulation-optimization approaches by decreasing simulation model calls. MOFA-KNN was extended for this study, and its performance was evaluated in the MOQQROM. Three objectives were considered: (1) the sum of the squared mass of total dissolved solids (TDS) and (2) the sum of the squared temperature difference between reservoir inflow and outflow as water quality objectives, and (3) the vulnerability index as a water quantity objective. The Aidoghmoush reservoir was employed as a case study, and the model was investigated under three scenarios: normal, wet, and dry years. Results showed that the extended MOFA-KNN reduced the number of original simulation model calls by more than 99% compared with the total number of simulations in the MOQQROM, indicating its efficacy in significantly reducing execution time. The three most desirable operating policies for meeting each objective were selected for investigation. The operation policy with the best value for the second objective could be chosen as a compromise policy to balance the two conflicting goals of improving quality and supplying demand in the normal and wet scenarios. In terms of contamination mass, this policy was on average 16% worse than the first policy and 40% better than the third policy in the normal scenario; in the wet scenario, it was on average 55% worse than the first policy and 16% better than the third policy. The outflow temperature of this policy differed from the inflow temperature by on average only 8.35% in the normal scenario and 0.93% in the wet scenario. The performance of the developed model is satisfactory.
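The call-saving mechanism can be sketched as a KNN surrogate over an archive of truly simulated designs, with the expensive model invoked only when no archived neighbours are close enough; the toy "simulator", `k`, and the trust radius are assumptions, not MOFA-KNN's exact rule.

```python
import numpy as np

def true_simulator(x):               # stand-in for a CE-QUAL-W2 run
    return float(np.sum(np.sin(3 * x) + x ** 2))

archive_X, archive_y = [], []

def surrogate_eval(x, k=3, trust_radius=0.3):
    """Answer with a KNN estimate when the archive is dense near x,
    otherwise pay for a real simulation and grow the archive."""
    if len(archive_X) >= k:
        d = np.linalg.norm(np.array(archive_X) - x, axis=1)
        nn = np.argsort(d)[:k]
        if d[nn].max() < trust_radius:
            return float(np.mean([archive_y[i] for i in nn])), False
    y = true_simulator(x)
    archive_X.append(x.copy()); archive_y.append(y)
    return y, True

rng = np.random.default_rng(10)
real_calls = 0
for _ in range(500):                  # fitness queries an optimizer would issue
    _, was_real = surrogate_eval(rng.uniform(0, 1, 2))
    real_calls += was_real
print(f"real simulator calls: {real_calls} / 500")
```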


Subject(s)
Theoretical Models, Water Quality, Water Supply, Algorithms, Computer Simulation
18.
Front Big Data ; 7: 1358486, 2024.
Article in English | MEDLINE | ID: mdl-38449564

ABSTRACT

As the volume and velocity of Big Data continue to grow, traditional cloud computing approaches struggle to meet the demands of real-time processing and low latency. Fog computing, with its distributed network of edge devices, emerges as a compelling solution. However, efficient task scheduling in fog computing remains a challenge due to its inherently multi-objective nature, balancing factors such as execution time, response time, and resource utilization. This paper proposes a hybrid Genetic Algorithm (GA)-Particle Swarm Optimization (PSO) algorithm to optimize multi-objective task scheduling in fog computing environments. The hybrid approach combines the strengths of GA and PSO, achieving effective exploration and exploitation of the search space and improved performance compared with traditional single-algorithm approaches. For varying task inputs, the proposed hybrid algorithm improved execution time by 85.68% compared with GA, 84% compared with Hybrid PWOA and 51.03% compared with PSO; response time by 67.28% compared with GA, 54.24% compared with Hybrid PWOA and 75.40% compared with PSO; and completion time by 68.69% compared with GA, 98.91% compared with Hybrid PWOA and 75.90% compared with PSO. For varying numbers of fog nodes, it improved execution time by 84.87% compared with GA, 88.64% compared with Hybrid PWOA and 85.07% compared with PSO; response time by 65.92% compared with GA, 80.51% compared with Hybrid PWOA and 85.26% compared with PSO; and completion time by 67.60% compared with GA, 81.34% compared with Hybrid PWOA and 85.23% compared with PSO.
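A minimal version of the scheduling objective such a GA-PSO hybrid would optimize (the hybrid loop itself looks like the GAPSO sketch under record 4): makespan and mean response time for a task-to-fog-node assignment, with invented task sizes and node speeds.

```python
import numpy as np

rng = np.random.default_rng(11)
task_size = rng.uniform(1, 10, 50)      # million instructions, illustrative
node_speed = rng.uniform(2, 8, 6)       # MIPS per fog node, illustrative

def schedule_cost(assign):
    """assign[i] = fog node executing task i. Returns (makespan,
    mean response time) assuming tasks on a node run back-to-back."""
    finish = np.zeros(len(task_size))
    node_clock = np.zeros(len(node_speed))
    for i in np.argsort(task_size):     # shorter tasks first per node
        node = assign[i]
        node_clock[node] += task_size[i] / node_speed[node]
        finish[i] = node_clock[node]
    return node_clock.max(), finish.mean()

assign = rng.integers(0, 6, 50)          # a random candidate schedule
makespan, mean_resp = schedule_cost(assign)
print(f"makespan={makespan:.2f}, mean response={mean_resp:.2f}")
```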

19.
Comput Oper Res ; 40(1): 490-497, 2013 Jan.
Article in English | MEDLINE | ID: mdl-23471127

ABSTRACT

Demographic change towards an ever-aging population entails an increasing demand for specialized transportation systems to complement the traditional public means of transportation. Typically, users place transportation requests specifying a pickup and a drop-off location, and a fleet of minibuses or taxis is used to serve these requests. The underlying optimization problem can be modeled as a dial-a-ride problem. In the dial-a-ride problem considered in this paper, total routing costs are minimized while respecting time-window, maximum user ride time, maximum route duration, and vehicle capacity restrictions. We propose a hybrid column generation and large neighborhood search algorithm and compare different hybridization strategies on a set of benchmark instances from the literature.
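The large-neighborhood-search half of the hybrid in skeleton form: repeatedly remove a few requests, greedily reinsert them, and keep improvements. The single-vehicle, pickup-immediately-followed-by-drop-off simplification and the absence of time-window/capacity checks are deliberate, and the column-generation half is omitted.

```python
import numpy as np

rng = np.random.default_rng(12)
# Each request: (pickup_xy, dropoff_xy); one vehicle serves them in sequence,
# each pickup immediately followed by its drop-off (a deliberate simplification).
reqs = [(rng.uniform(0, 10, 2), rng.uniform(0, 10, 2)) for _ in range(12)]
depot = np.zeros(2)

def route_cost(order):
    pos, total = depot, 0.0
    for r in order:
        p, d = reqs[r]
        total += np.linalg.norm(p - pos) + np.linalg.norm(d - p)
        pos = d
    return total + np.linalg.norm(depot - pos)

def greedy_insert(order, r):
    costs = [route_cost(order[:i] + [r] + order[i:]) for i in range(len(order) + 1)]
    i = int(np.argmin(costs))
    return order[:i] + [r] + order[i:]

order = list(range(len(reqs)))
best = route_cost(order)
for _ in range(300):                          # LNS loop: destroy + repair
    removed = list(rng.choice(order, size=3, replace=False))
    partial = [r for r in order if r not in removed]
    for r in removed:
        partial = greedy_insert(partial, r)
    if (c := route_cost(partial)) < best:     # accept only improvements
        order, best = partial, c
print("best route cost:", round(best, 2))
```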

20.
Stud Health Technol Inform ; 305: 190-193, 2023 Jun 29.
Article in English | MEDLINE | ID: mdl-37386993

ABSTRACT

Process Mining is a technique for the analysis and mining of existing process flows. Machine Learning, on the other hand, is a data science field and a sub-branch of Artificial Intelligence whose main purpose is to replicate human behavior through algorithms. The separate application of Process Mining and Machine Learning for healthcare purposes has been widely explored, with numerous published works discussing their use. However, the simultaneous application of Process Mining and Machine Learning algorithms is still a growing field, with ongoing studies on its application. This paper proposes a feasible framework in which Process Mining and Machine Learning can be used in combination within the healthcare environment.


Subject(s)
Artificial Intelligence, Chronic Renal Insufficiency, Humans, Machine Learning, Patients, Chronic Renal Insufficiency/diagnosis, Chronic Renal Insufficiency/therapy, Disease Progression