Presentation Information
[15p-M_123-3]Adaptive Interpolating Quantum Transform (AIQT): Efficient Data-Adaptive Quantum Transform Learning
〇Gekko Patria Budiutama1,2, Shunsuke Daimon3, Xinchi Huang1,2, Hirofumi Nishi1,2, Ryui Kaneko4, Tomi Ohtsuki4, Yu-ichiro Matsushita1,2,3 (1.Quemix, 2.UTokyo, 3.QST, 4.Sophia)
Keywords:
quantum machine learning, quantum algorithm, quantum computing
Data-driven learning on quantum computers is an important goal, yet many quantum algorithms are still designed with fixed, hand-crafted structures rather than being learned from data. A common current approach is to use quantum neural networks (QNNs), in which parameterized quantum circuits are trained to fit a target objective. However, practical QNN training often faces major obstacles: optimization can become difficult due to poor trainability (e.g., barren-plateau-like behavior), parameter counts can grow quickly, and circuit depth and connectivity can scale unfavorably for near-term implementations.
In this talk, we present the Adaptive Interpolating Quantum Transform (AIQT) as an alternative framework for quantum learning that emphasizes structured inductive bias and parameter efficiency. AIQT is a trainable unitary operator designed to smoothly interpolate between well-defined unitary transforms, where the interpolation is controlled by a small set of global or structured parameters. Rather than learning an arbitrary unitary, AIQT restricts learning to a constrained, meaningful subspace of unitary operations, aiming to improve trainability and scalability while remaining compatible with quantum circuit realizations. We outline the AIQT concept and highlight how this framework can serve as a practical building block for data-driven quantum models.
