Results 1 - 6 of 6
1.
Sci Rep ; 14(1): 1328, 2024 Jan 15.
Article in English | MEDLINE | ID: mdl-38225371

ABSTRACT

Quantum computers have the unique ability to operate relatively quickly in high-dimensional spaces; this is expected to give them a competitive advantage over classical computers. In this work, we propose a novel quantum machine learning model called the Quantum Discriminator, which leverages the ability of quantum computers to operate in high-dimensional spaces. The quantum discriminator is trained using a quantum-classical hybrid algorithm in [Formula: see text] time, and inference is performed on a universal quantum computer in [Formula: see text] time. The quantum discriminator takes as input the binary features extracted from a given datum along with a prediction qubit and outputs the predicted label. We analyze its performance on the Iris and the Bars and Stripes data sets and show that it can attain 99% accuracy in simulation.
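The abstract above describes only the discriminator's interface: binary features plus a dedicated prediction qubit go in, and a measured label comes out. The sketch below illustrates that interface in Python with Qiskit, using basis encoding for the features and a placeholder parity-style CNOT layer as the discriminator unitary. The parity layer is purely an assumption for illustration, not the trained unitary from the paper, and the example assumes qiskit and qiskit-aer are installed.

```python
# Minimal sketch of the quantum-discriminator *interface* (not the paper's trained model):
# one qubit per binary feature, one prediction qubit, and a placeholder unitary.
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator


def discriminator_circuit(features):
    """Build a circuit with one qubit per binary feature plus one prediction qubit."""
    n = len(features)
    qc = QuantumCircuit(n + 1, 1)
    for i, bit in enumerate(features):
        if bit:
            qc.x(i)          # basis-encode the binary feature vector
    for i in range(n):
        qc.cx(i, n)          # placeholder discriminator unitary: parity into the prediction qubit
    qc.measure(n, 0)         # the measured prediction qubit is the predicted label
    return qc


counts = AerSimulator().run(discriminator_circuit([1, 0, 1, 1]), shots=1024).result().get_counts()
print(counts)                # {'1': 1024}: the placeholder circuit outputs the feature parity
```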

2.
Sci Rep ; 13(1): 10975, 2023 Jul 06.
Article in English | MEDLINE | ID: mdl-37414838

ABSTRACT

Neuromorphic computers emulate the human brain while being extremely power efficient for computing tasks. In fact, they are poised to be critical for energy-efficient computing in the future. Neuromorphic computers are primarily used in spiking neural network-based machine learning applications. However, they are known to be Turing-complete and can, in theory, perform all general-purpose computation. One of the biggest bottlenecks in realizing general-purpose computation on neuromorphic computers today is the inability to efficiently encode data on them. To fully realize the potential of neuromorphic computers for energy-efficient general-purpose computing, efficient mechanisms must be devised for encoding numbers. Current encoding mechanisms (e.g., binning, rate-based encoding, and time-based encoding) have limited applicability and are not suited for general-purpose computation. In this paper, we present the virtual neuron abstraction as a mechanism for encoding and adding integers and rational numbers using spiking neural network primitives. We evaluate the performance of the virtual neuron on physical and simulated neuromorphic hardware. We estimate that the virtual neuron could perform an addition operation using just 23 nJ of energy on average with a mixed-signal, memristor-based neuromorphic processor. We also demonstrate the utility of the virtual neuron by using it in some of the µ-recursive functions, which are the building blocks of general-purpose computation.


Subjects
Computers; Neural Networks, Computer; Humans; Neurons/physiology; Machine Learning; Brain/physiology
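As a rough illustration of the encoding idea in entry 2, the following toy sketch (plain Python, not the paper's actual virtual neuron or any neuromorphic SDK) represents non-negative integers as spikes on binary-weighted "neurons" and adds them with simple threshold units; the thresholds and the ripple-carry wiring are assumptions chosen for readability.

```python
# Toy sketch of binary place-value encoding and addition with threshold "neurons".
# Not the paper's virtual neuron; just an illustration of the underlying idea.

def fires(weighted_input, threshold):
    """A neuron emits a spike (1) iff its accumulated input reaches its threshold."""
    return 1 if weighted_input >= threshold else 0

def encode(value, n_bits):
    """Encode a non-negative integer as spikes on n_bits binary-weighted neurons (LSB first)."""
    return [(value >> i) & 1 for i in range(n_bits)]

def decode(spikes):
    return sum(bit << i for i, bit in enumerate(spikes))

def add(a_spikes, b_spikes):
    """Ripple-carry addition built only from threshold neurons."""
    carry, out = 0, []
    for a, b in zip(a_spikes, b_spikes):
        total = a + b + carry
        carry = fires(total, 2)                   # carry neuron: fires if at least 2 inputs spiked
        out.append(fires(total - 2 * carry, 1))   # sum neuron: inhibited (weight -2) by the carry
    out.append(carry)
    return out

print(decode(add(encode(23, 8), encode(42, 8))))  # 65
```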
3.
Nat Comput Sci ; 2(1): 10-19, 2022 Jan.
Article in English | MEDLINE | ID: mdl-38177712

ABSTRACT

Neuromorphic computing technologies will be important for the future of computing, but much of the work in neuromorphic computing has focused on hardware development. Here, we review recent results in neuromorphic computing algorithms and applications. We highlight the characteristics of neuromorphic computing technologies that make them attractive for the future of computing, and we discuss opportunities for the future development of algorithms and applications on these systems.

4.
Sci Rep ; 11(1): 21905, 2021 Nov 09.
Article in English | MEDLINE | ID: mdl-34754050

ABSTRACT

A major challenge in machine learning is the computational expense of training models. Model training can be viewed as a form of optimization that fits a machine learning model to a set of data, and it can take a significant amount of time on classical computers. Adiabatic quantum computers have been shown to excel at solving optimization problems and therefore, we believe, present a promising alternative for improving machine learning training times. In this paper, we present an adiabatic quantum computing approach for training a linear regression model. To do so, we formulate the regression problem as a quadratic unconstrained binary optimization (QUBO) problem. We analyze our quantum approach theoretically, test it on the D-Wave adiabatic quantum computer, and compare its performance to a classical approach that uses the Scikit-learn library in Python. Our analysis shows that the quantum approach attains up to a [Formula: see text] speedup over the classical approach on larger datasets and performs on par with the classical approach on the regression error metric. The quantum approach used the D-Wave 2000Q adiabatic quantum computer, whereas the classical approach used a desktop workstation with an 8-core Intel i9 processor. As such, the results obtained in this work must be interpreted within the context of the specific hardware and software implementations of these machines.
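To make the construction described above concrete, here is a minimal sketch of how least-squares regression can be written as a QUBO. It assumes a fixed-point binary expansion of the weights with an illustrative precision vector and uses exhaustive search in place of the D-Wave annealer; it is not the paper's exact formulation or code.

```python
# Sketch: least-squares regression as a QUBO via a fixed-point binary expansion w = P @ b.
# ||Xw - y||^2 = b^T (A^T A) b - 2 (A^T y)^T b + const, with A = X @ P; linear terms
# go on the QUBO diagonal because b_i^2 = b_i for binary b_i.
import itertools
import numpy as np

def regression_qubo(X, y, prec):
    d = X.shape[1]
    P = np.kron(np.eye(d), prec)                   # maps the binary vector b to real weights w
    A = X @ P
    Q = A.T @ A
    Q[np.diag_indices_from(Q)] -= 2 * (A.T @ y)    # linear terms sit on the diagonal
    return Q, P

def brute_force_qubo(Q):
    """Enumerate all binary assignments; a stand-in for the annealer on toy sizes."""
    n = Q.shape[0]
    return min((np.array(b) for b in itertools.product([0, 1], repeat=n)),
               key=lambda b: b @ Q @ b)

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))
y = X @ np.array([1.5, 0.5])                       # noiseless toy data
prec = np.array([2.0, 1.0, 0.5, 0.25])             # assumed fixed-point basis (non-negative weights only)
Q, P = regression_qubo(X, y, prec)
print(P @ brute_force_qubo(Q))                     # ~[1.5, 0.5]
```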

5.
Sci Rep ; 11(1): 10029, 2021 May 11.
Article in English | MEDLINE | ID: mdl-33976283

ABSTRACT

Training machine learning models on classical computers is usually a time- and compute-intensive process. With Moore's law nearing its inevitable end and an ever-increasing demand for large-scale data analysis using machine learning, we must leverage non-conventional computing paradigms, such as quantum computing, to train machine learning models efficiently. Adiabatic quantum computers can approximately solve NP-hard problems, such as quadratic unconstrained binary optimization (QUBO), faster than classical computers. Since many machine learning problems are also NP-hard, we believe adiabatic quantum computers might be instrumental in training machine learning models efficiently in the post-Moore's law era. To be solved on adiabatic quantum computers, problems must first be formulated as QUBO problems, which is itself a challenging task. In this paper, we formulate the training problems of three machine learning models, namely linear regression, support vector machine (SVM), and balanced k-means clustering, as QUBO problems, making them amenable to training on adiabatic quantum computers. We also analyze the computational complexities of our formulations and compare them to those of the corresponding state-of-the-art classical approaches. We show that the time and space complexities of our formulations are better than (in the case of SVM and balanced k-means clustering) or equivalent to (in the case of linear regression) those of their classical counterparts.
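As a complement to the regression sketch above, here is a toy version of one of the other two formulations mentioned in the abstract: balanced k-means as a QUBO with quadratic penalties. The penalty structure and weights are a standard textbook-style choice assumed for illustration, not necessarily the paper's exact formulation, and brute force again replaces the annealer.

```python
# Toy sketch: balanced k-means as a QUBO. x[i, c] = 1 means point i belongs to cluster c.
# Same-cluster pairs pay their squared distance; quadratic penalties enforce
# "one cluster per point" and "n/k points per cluster".
import itertools
import numpy as np

def balanced_kmeans_qubo(points, k, penalty=1000.0):
    n = len(points)
    d = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)   # pairwise squared distances
    idx = lambda i, c: i * k + c                                    # flatten x[i, c] to one index
    Q = np.zeros((n * k, n * k))
    for c in range(k):                                              # intra-cluster distance cost
        for i, j in itertools.combinations(range(n), 2):
            Q[idx(i, c), idx(j, c)] += d[i, j]
    for i in range(n):                                              # penalty * (sum_c x[i,c] - 1)^2
        for c in range(k):
            Q[idx(i, c), idx(i, c)] -= penalty
            for c2 in range(c + 1, k):
                Q[idx(i, c), idx(i, c2)] += 2 * penalty
    for c in range(k):                                              # penalty * (sum_i x[i,c] - n/k)^2
        for i in range(n):
            Q[idx(i, c), idx(i, c)] += penalty * (1 - 2 * n / k)
            for i2 in range(i + 1, n):
                Q[idx(i, c), idx(i2, c)] += 2 * penalty
    return Q

def brute_force_qubo(Q):
    n = Q.shape[0]
    return min((np.array(b) for b in itertools.product([0, 1], repeat=n)),
               key=lambda b: b @ Q @ b)

points = np.array([[0.0, 0.0], [0.0, 1.0], [10.0, 10.0], [10.0, 11.0]])
x = brute_force_qubo(balanced_kmeans_qubo(points, k=2)).reshape(4, 2)
print(x.argmax(1))   # the two nearby points share one label; the two distant points share the other
```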
