Lightening-Transformer: A Dynamically-Operated Optically-Interconnected Photonic Transformer Accelerator

Hanqing Zhu, Jiaqi Gu, Hanrui Wang, Zixuan Jiang, Zhekai Zhang, Rongxing Tang, Chenghao Feng, Song Han, Ray T. Chen, David Z. Pan
UT Austin, MIT, ASU
(* indicates equal contribution)



Abstract

The wide adoption and significant computing demands of attention-based Transformers, e.g., Vision Transformers and large language models (LLMs), have driven the demand for efficient hardware accelerators. There is growing interest in exploring photonics as an alternative to digital electronics due to its high energy efficiency and ultra-fast processing speed. Photonic accelerators have shown promising results for CNNs, which mainly rely on weight-static linear operations. However, they struggle to efficiently support Transformer architectures, calling into question the applicability of photonics to advanced ML tasks. The primary hurdle lies in their inefficiency in handling the unique workloads in Transformers, i.e., dynamic and full-range tensor multiplication. In this work, we propose Lightening-Transformer, the first light-empowered, high-performance, and energy-efficient photonic Transformer accelerator. To overcome the fundamental limitations of prior designs, we introduce a novel dynamically-operated photonic tensor core, DPTC, a crossbar array of interference-based optical vector dot-product engines supporting highly parallel, dynamic, and full-range matrix multiplication. Furthermore, we design a dedicated accelerator that integrates our novel photonic computing cores with photonic interconnects for inter-core data broadcast, fully unleashing the power of optics. Comprehensive evaluations show that our design achieves >2.6x energy and >12x latency reductions compared to prior photonic accelerators, and delivers the lowest energy cost and 2 to 3 orders of magnitude lower energy-delay product compared to electronic Transformer accelerators, all while maintaining digital-comparable accuracy. Our work highlights the immense potential of photonics for advanced ML workloads, such as Transformer-backboned LLMs.
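To illustrate the principle behind an interference-based optical dot-product engine, the sketch below numerically models how coherent interference can yield signed (full-range) products. This is a simplified illustration, not the paper's implementation: it assumes each operand pair is encoded as optical field amplitudes (sign carried by a 0/pi phase) and combined in an ideal 50:50 coupler, with balanced photodetection recovering the product term.

```python
import numpy as np

def interference_dot(a, b):
    """Toy model of a coherent-interference dot product (illustrative only).

    Each pair (a_i, b_i) is encoded as optical field amplitudes; an ideal
    50:50 coupler produces output fields (a_i + b_i)/sqrt(2) and
    (a_i - b_i)/sqrt(2). Balanced photodetection subtracts the two output
    powers, giving a photocurrent proportional to 2 * a_i * b_i, which is
    signed even though detected powers themselves are non-negative.
    """
    plus = (a + b) / np.sqrt(2)
    minus = (a - b) / np.sqrt(2)
    photocurrent = plus**2 - minus**2  # elementwise, equals 2 * a * b
    # Accumulating partial products (e.g., across wavelengths or time)
    # yields the full dot product.
    return 0.5 * np.sum(photocurrent)

a = np.array([0.5, -0.3, 0.8])
b = np.array([-0.2, 0.7, 0.1])
print(interference_dot(a, b), a @ b)  # the two values agree
```

Because both operands are encoded dynamically as optical inputs (rather than one being programmed as a static weight), such an engine can handle the activation-activation products in attention, which is the workload property the abstract refers to as "dynamic and full-range" multiplication.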


Citation

@inproceedings{zhu2024lightening,
 title={Lightening-transformer: A dynamically-operated optically-interconnected photonic transformer accelerator},
 author={Zhu, Hanqing and Gu, Jiaqi and Wang, Hanrui and Jiang, Zixuan and Zhang, Zhekai and Tang, Rongxing and Feng, Chenghao and Han, Song and Chen, Ray T and Pan, David Z},
 booktitle={2024 IEEE International Symposium on High-Performance Computer Architecture (HPCA)},
 pages={686--703},
 year={2024},
 organization={IEEE}
}


Acknowledgment

This work is supported in part by the Multidisciplinary University Research Initiative (MURI) program through the Air Force Office of Scientific Research (AFOSR) under contract #FA9550-17-1-0071 and AFOSR project #FA9550-23-1-0452. We thank all anonymous HPCA reviewers for their insightful comments. We also thank Zhixing Jiang and Shupeng Ning from the University of Texas at Austin for their helpful suggestions and assistance during the artifact evaluation process.
