Gekko Budiutama, Hirofumi Nishi, and Yu-ichiro Matsushita of Quemix, together with Shunsuke Daimon (1), Ryui Kaneko (2, 3), and Tomi Ohtsuki (3), have posted a paper on arXiv titled
"Channel Attention for Quantum Convolutional Neural Networks".
1: Quantum Materials and Applications Research Center, National Institutes for Quantum Science and Technology
2: Waseda Research Institute for Science and Engineering, Waseda University
3: Physics Division, Sophia University
Drawing inspiration from the human ability to concentrate on specific details, the attention mechanism has become a crucial component in today's most advanced AI systems, including generative AI, large language models, and advanced computer vision. In this paper, we introduce the world's first channel attention mechanism for quantum neural networks.
Inspired by the classical machine learning literature, our attention mechanism assigns different importance weights to the probabilities of different measurement outcomes.
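The idea of reweighting outcome probabilities can be illustrated with a short classical sketch. The function name, shapes, and softmax gating below are illustrative assumptions for exposition, not the authors' actual implementation:

```python
import numpy as np

def channel_attention(probs, weights):
    """Hypothetical sketch: reweight measurement-outcome probabilities.

    probs   : (n_outcomes,) probability distribution obtained by
              measuring a quantum circuit (non-negative, sums to 1).
    weights : (n_outcomes,) learnable parameters, one per outcome
              ("channel"), trained alongside the circuit parameters.
    Returns a reweighted, renormalized distribution.
    """
    attn = np.exp(weights) / np.exp(weights).sum()  # softmax attention gate
    scored = attn * probs                           # per-channel reweighting
    return scored / scored.sum()                    # renormalize to a distribution

# Example: with uniform (zero) weights the distribution is unchanged;
# non-uniform weights emphasize some outcomes over others.
probs = np.array([0.5, 0.3, 0.15, 0.05])
out = channel_attention(probs, np.zeros(4))  # equals probs
```

Note that the attention weights add only one extra parameter per measured outcome, which is consistent with the paper's claim of a small parameter overhead.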
The novel mechanism was employed in quantum convolutional neural networks, the quantum counterpart of the widely adopted convolutional neural networks. The effect of channel attention was then investigated on the task of classifying the quantum phases of a material.
As a result, the proposed method substantially reduces training cost while adding only a small number of learning parameters, and it significantly improves the performance of quantum convolutional neural networks, making them both cost-effective and useful for a broad range of applications. This may enable quantum neural networks to be applied to tasks requiring a large number of qubits, such as drug and chemical modeling, computer vision, and financial risk forecasting.
Quemix is also advancing research in the field of quantum machine learning, and this innovative technology opens up possibilities for future expansion into the realm of quantum AI products.