QUACK: Quantum Aligned Centroid Kernel

Saved in:
Detailed bibliography
Published in: arXiv.org (Jul 24, 2024), p. n/a
Main author: Kilian Tscharke
Other authors: Issel, Sebastian, Debus, Pascal
Published:
Cornell University Library, arXiv.org
Subjects:
Online access: Citation/Abstract
Full text outside of ProQuest

MARC

LEADER 00000nab a2200000uu 4500
001 3049906788
003 UK-CbPIL
022 |a 2331-8422 
035 |a 3049906788 
045 0 |b d20240724 
100 1 |a Kilian Tscharke 
245 1 |a QUACK: Quantum Aligned Centroid Kernel 
260 |b Cornell University Library, arXiv.org  |c Jul 24, 2024 
513 |a Working Paper 
520 3 |a Quantum computing (QC) shows potential for application in machine learning (ML). In particular, quantum kernel methods (QKMs) exhibit promising properties for use in supervised ML tasks. However, a major disadvantage of kernel methods is their unfavorable quadratic scaling with the number of training samples. Together with the limits imposed by currently available quantum hardware (NISQ devices), with their short qubit coherence times, small numbers of qubits, and high error rates, the use of QC in ML at an industrially relevant scale is currently impossible. As a small step toward improving the potential applications of QKMs, we introduce QUACK, a quantum kernel algorithm whose time complexity scales linearly with the number of samples during training and is independent of the number of training samples in the inference stage. In the training process, only the kernel entries between the samples and the centers of the classes are calculated, i.e., the maximum shape of the kernel for n samples and c classes is (n, c). During training, the parameters of the quantum kernel and the positions of the centroids are optimized iteratively. In the inference stage, for every new sample the circuit is evaluated only once per centroid, i.e., c times. We show that the QUACK algorithm nevertheless provides satisfactory results and can perform at a similar level as classical kernel methods with quadratic scaling during training. In addition, our (simulated) algorithm is able to handle high-dimensional datasets such as MNIST with 784 features without any dimensionality reduction. 
653 |a Algorithms 
653 |a Quantum computing 
653 |a Machine learning 
653 |a Kernel functions 
653 |a Centroids 
653 |a Inference 
653 |a Qubits (quantum computing) 
700 1 |a Issel, Sebastian 
700 1 |a Debus, Pascal 
773 0 |t arXiv.org  |g (Jul 24, 2024), p. n/a 
786 0 |d ProQuest  |t Engineering Database 
856 4 1 |3 Citation/Abstract  |u https://www.proquest.com/docview/3049906788/abstract/embedded/L8HZQI7Z43R0LA5T?source=fedsrch 
856 4 0 |3 Full text outside of ProQuest  |u http://arxiv.org/abs/2405.00304
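The abstract describes the key scaling idea: evaluate the kernel only between samples and class centroids, yielding an (n, c) matrix during training and c kernel evaluations per sample at inference. The following is a minimal classical sketch of that centroid-kernel structure; the RBF kernel, the example data, and the nearest-centroid decision rule are illustrative stand-ins (not the paper's parameterized quantum kernel or its training loop).

```python
import math

# Hypothetical stand-in: a classical RBF kernel plays the role of the
# parameterized quantum kernel described in the abstract.
def kernel(x, y, gamma=1.0):
    sq = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * sq)

def centroid_kernel_matrix(samples, centroids):
    # Shape (n, c): one kernel entry per (sample, centroid) pair,
    # instead of the (n, n) matrix of standard kernel methods.
    return [[kernel(x, m) for m in centroids] for x in samples]

def predict(x, centroids):
    # Inference evaluates the kernel only c times (once per centroid)
    # and assigns the class of the most similar centroid.
    sims = [kernel(x, m) for m in centroids]
    return max(range(len(sims)), key=lambda i: sims[i])

samples = [[0.0, 0.1], [0.2, 0.0], [1.0, 0.9], [0.9, 1.1]]
centroids = [[0.1, 0.05], [0.95, 1.0]]  # one centroid per class
K = centroid_kernel_matrix(samples, centroids)
print(len(K), len(K[0]))                 # n rows, c columns
print(predict([1.0, 1.0], centroids))
```

The point of the sketch is the cost profile: building K touches each sample c times (linear in n), and each prediction is O(c) regardless of the training-set size.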