Results 1 - 4 of 4

1.
Nature ; 608(7923): 504-512, 2022 Aug.
Article in English | MEDLINE | ID: mdl-35978128

ABSTRACT

Realizing increasingly complex artificial intelligence (AI) functionalities directly on edge devices calls for unprecedented energy efficiency of edge hardware. Compute-in-memory (CIM) based on resistive random-access memory (RRAM) [1] promises to meet this demand by storing AI model weights in dense, analogue and non-volatile RRAM devices, and by performing AI computation directly within RRAM, thus eliminating power-hungry data movement between separate compute and memory [2-5]. Although recent studies have demonstrated in-memory matrix-vector multiplication on fully integrated RRAM-CIM hardware [6-17], it remains a goal for an RRAM-CIM chip to simultaneously deliver high energy efficiency, versatility to support diverse models and software-comparable accuracy. Although efficiency, versatility and accuracy are all indispensable for broad adoption of the technology, the inter-related trade-offs among them cannot be addressed by isolated improvements on any single abstraction level of the design. Here, by co-optimizing across all hierarchies of the design, from algorithms and architecture to circuits and devices, we present NeuRRAM, an RRAM-based CIM chip that simultaneously delivers versatility in reconfiguring CIM cores for diverse model architectures, energy efficiency twice that of previous state-of-the-art RRAM-CIM chips across various computational bit-precisions, and inference accuracy comparable to software models quantized to four-bit weights across various AI tasks, including 99.0% accuracy on MNIST [18] and 85.7% on CIFAR-10 [19] image classification, 84.7% accuracy on Google speech command recognition [20], and a 70% reduction in image-reconstruction error on a Bayesian image-recovery task.
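
The core operation described above is analog matrix-vector multiplication performed inside an RRAM crossbar with low-bit-precision weights. The Python snippet below is a minimal numerical sketch of that idea only, not NeuRRAM's actual implementation: weights are quantized to 4 bits, signed values are mapped onto assumed differential conductance pairs, and column currents are summed with a multiplicative read-noise term standing in for device variation. All parameter values (g_max, read_noise, array shapes) are illustrative assumptions.

```python
import numpy as np

def quantize_weights(w, bits=4):
    """Uniform symmetric weight quantization (illustrative, not NeuRRAM's scheme)."""
    levels = 2 ** (bits - 1) - 1              # e.g. 7 levels for signed 4-bit weights
    step = np.max(np.abs(w)) / levels
    return np.round(w / step) * step

def crossbar_mvm(weights, x, g_max=1e-4, read_noise=0.02, rng=None):
    """Idealized analog matrix-vector multiply on an RRAM crossbar.

    Signed weights map onto assumed differential conductance pairs (G+, G-),
    inputs map to row voltages, and each output is the summed column current.
    A multiplicative read-noise term stands in for device-to-device variation.
    """
    rng = rng or np.random.default_rng(0)
    w_max = np.max(np.abs(weights)) or 1.0
    g_pos = np.clip(weights, 0, None) / w_max * g_max    # positive-weight devices
    g_neg = np.clip(-weights, 0, None) / w_max * g_max   # negative-weight devices
    noise_p = 1 + read_noise * rng.standard_normal(weights.shape)
    noise_n = 1 + read_noise * rng.standard_normal(weights.shape)
    currents = (g_pos * noise_p - g_neg * noise_n) @ x   # Kirchhoff current summation
    return currents * w_max / g_max                      # rescale back to weight units

rng = np.random.default_rng(1)
w = quantize_weights(rng.standard_normal((8, 16)))       # small 4-bit weight matrix
x = rng.standard_normal(16)
print("digital MVM:", (w @ x)[:3])
print("analog  MVM:", crossbar_mvm(w, x)[:3])
```

Comparing the analog output against the exact digital dot product gives a rough feel for how device noise perturbs the activations of a quantized model in such a toy setting.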

2.
ACS Nano ; 17(13): 11994-12039, 2023 Jul 11.
Article in English | MEDLINE | ID: mdl-37382380

ABSTRACT

Memristive technology has been rapidly emerging as a potential alternative to traditional CMOS technology, which is facing fundamental limitations in its development. Since oxide-based resistive switches were demonstrated as memristors in 2008, memristive devices have garnered significant attention due to their biomimetic memory properties, which promise significant reductions in the power consumption of computing applications. Here, we provide a comprehensive overview of recent advances in memristive technology, including memristive devices, theory, algorithms, architectures, and systems. In addition, we discuss research directions for various applications of memristive technology, including hardware accelerators for artificial intelligence, in-sensor computing, and probabilistic computing. Finally, we provide a forward-looking perspective on the future of memristive technology, outlining the challenges and opportunities for further research and innovation in this field. By providing an up-to-date overview of the state of the art in memristive technology, this review aims to inform and inspire further research.

3.
Sci Rep ; 10(1): 13404, 2020 Aug 04.
Article in English | MEDLINE | ID: mdl-32747716

ABSTRACT

An amendment to this paper has been published and can be accessed via a link at the top of the paper.

4.
Sci Rep ; 10(1): 6831, 2020 Apr 22.
Article in English | MEDLINE | ID: mdl-32322007

ABSTRACT

Exponential growth in data generation and large-scale data science has created an unprecedented need for inexpensive, low-power, low-latency, high-density information storage. This need has motivated significant research into multi-level memory devices that are capable of storing multiple bits of information per device. The memory state of these devices is intrinsically analog. Furthermore, much of the data they will store, along with many of the subsequent operations on that data, is also intrinsically analog-valued. Ironically, though, in the current storage paradigm both the devices and the data are quantized for use with digital systems and digital error-correcting codes. Here, we recast the storage problem as a communication problem. This allows us to use ideas from analog coding and to show, using phase-change memory as a prototypical multi-level storage technology, that analog-valued emerging memory devices can achieve higher capacities when paired with analog codes. Further, we show that storing analog signals directly through joint coding can achieve low distortion with reduced coding complexity. Specifically, by jointly optimizing for signal statistics, device statistics, and a distortion metric, we demonstrate that single-symbol analog codes can perform comparably to digital codes with asymptotically large code lengths. These results show that end-to-end analog memory systems have the potential not only to reach higher storage capacities than discrete systems but also to significantly lower coding complexity, leading to faster and more energy-efficient data storage.
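
As a rough companion to the abstract's argument, the sketch below compares, in Python, a single-symbol analog storage scheme against a simple digital baseline on one noisy multi-level cell. It is an illustration under assumed device parameters (noise level, programming range, bit depth), not the paper's phase-change-memory model or its jointly optimized codes: a unit-variance Gaussian source is either scaled directly onto the cell's analog range or quantized to 3 bits and stored on evenly spaced cell levels, and the mean-squared distortion of the two routes is compared.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
source = rng.standard_normal(n)               # analog-valued data, unit variance
noise_std = 0.1                               # assumed cell write/read noise (std dev)
cell_range = 3.0                              # assumed usable programming range (+/-)

# Single-symbol analog coding: scale each value directly onto the cell's range.
scale = cell_range / 3.0                      # maps ~99.7% of the source into range
stored = np.clip(source * scale, -cell_range, cell_range)
read = stored + noise_std * rng.standard_normal(n)
analog_mse = np.mean((read / scale - source) ** 2)

# Digital baseline: quantize to b bits, store the index on evenly spaced cell levels.
b = 3
levels = 2 ** b
edges = np.linspace(-3.0, 3.0, levels + 1)
centers = (edges[:-1] + edges[1:]) / 2
idx = np.clip(np.digitize(source, edges) - 1, 0, levels - 1)
cell_levels = np.linspace(-cell_range, cell_range, levels)
noisy = cell_levels[idx] + noise_std * rng.standard_normal(n)
spacing = 2 * cell_range / (levels - 1)
read_idx = np.clip(np.round((noisy + cell_range) / spacing), 0, levels - 1).astype(int)
digital_mse = np.mean((centers[read_idx] - source) ** 2)

print(f"analog  MSE: {analog_mse:.4f}")       # distortion of the direct analog mapping
print(f"digital MSE: {digital_mse:.4f}")      # quantization + (rare) level-flip errors
```

In this toy setup the direct analog mapping incurs less distortion than the 3-bit digital path, which mirrors the qualitative point that analog signals stored through joint coding can reach low distortion with very little coding machinery.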
