Vector Quantization with Sorting Transformation
Saved in:
| Published in: | The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings (2024) |
|---|---|
| Main Author: | Wang, Hongzhi |
| Other Authors: | Syeda-Mahmood, Tanveer |
| Published: | The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
| Subjects: | Similarity, Codes, Error reduction, Big Data, Permutations, Data compression, Representations, Vector spaces, Environmental |
| Online Access: | Citation/Abstract |
MARC
| Field | Ind1 | Ind2 | Subfields |
|---|---|---|---|
| LEADER | | | 00000nab a2200000uu 4500 |
| 001 | | | 3156643173 |
| 003 | | | UK-CbPIL |
| 024 | 7 | | $a 10.1109/BigData62323.2024.10825761 $2 doi |
| 035 | | | $a 3156643173 |
| 045 | 2 | | $b d20240101 $b d20241231 |
| 084 | | | $a 228229 $2 nlm |
| 100 | 1 | | $a Wang, Hongzhi $u IBM Almaden Research Center, San Jose, CA, USA |
| 245 | 1 | | $a Vector Quantization with Sorting Transformation |
| 260 | | | $b The Institute of Electrical and Electronics Engineers, Inc. (IEEE) $c 2024 |
| 513 | | | $a Conference Proceedings |
| 520 | 3 | | $a Conference Title: 2024 IEEE International Conference on Big Data (BigData). Conference Start Date: 2024, Dec. 15. Conference End Date: 2024, Dec. 18. Conference Location: Washington, DC, USA. Vector quantization is a nearest-neighbor-representation-based compression technique for vector data. It creates a collection of codewords to represent the entire vector space; each data vector is then represented by its nearest codeword, and the distance between them is the compression error. To improve the nearest neighbor representation used in vector quantization, we propose applying a sorting transformation to vector data so that the members within each vector are sorted. We show that, among all permutation transformations, the sorting transformation minimizes L2 distance and maximizes similarity measures such as cosine similarity and Pearson correlation for vector data. Applying the sorting transformation with vector quantization can substantially reduce compression errors; meanwhile, it incurs storage overhead for saving the sorting permutation of each compressed vector. Through experimental validation on compression and nearest neighbor retrieval, we show that this is a beneficial trade-off for vector quantization on low-dimensional vectors, a common scenario for vector quantization applications. |
| 653 | | | $a Similarity |
| 653 | | | $a Codes |
| 653 | | | $a Error reduction |
| 653 | | | $a Big Data |
| 653 | | | $a Permutations |
| 653 | | | $a Data compression |
| 653 | | | $a Representations |
| 653 | | | $a Vector spaces |
| 653 | | | $a Environmental |
| 700 | 1 | | $a Syeda-Mahmood, Tanveer $u IBM Almaden Research Center, San Jose, CA, USA |
| 773 | 0 | | $t The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings $g (2024) |
| 786 | 0 | | $d ProQuest $t Science Database |
| 856 | 4 | 1 | $3 Citation/Abstract $u https://www.proquest.com/docview/3156643173/abstract/embedded/75I98GEZK8WCJMPQ?source=fedsrch |
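
The abstract (field 520 above) describes a simple pipeline: sort the members of each vector, remember the permutation needed to undo the sort, quantize the sorted vector against a codebook, and accept a per-vector storage overhead for that permutation. The sketch below is only an illustration of that idea, not the authors' implementation: the helper names (sort_transform, build_codebook, quantize, reconstruct) are invented here, and plain k-means (Lloyd's algorithm) stands in for whatever codebook construction the paper actually uses.

```python
import numpy as np


def sort_transform(vectors):
    """Sort each vector's members and keep the permutation needed to undo it."""
    perms = np.argsort(vectors, axis=1)                        # per-row sorting permutation
    return np.take_along_axis(vectors, perms, axis=1), perms


def inverse_sort_transform(sorted_vectors, perms):
    """Undo the per-vector sorting using the stored permutations."""
    out = np.empty_like(sorted_vectors)
    np.put_along_axis(out, perms, sorted_vectors, axis=1)
    return out


def build_codebook(train_vectors, k, iters=20, seed=0):
    """Plain k-means (Lloyd's algorithm) as a stand-in codebook builder."""
    rng = np.random.default_rng(seed)
    codebook = train_vectors[rng.choice(len(train_vectors), size=k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(train_vectors[:, None, :] - codebook[None, :, :], axis=2)
        assign = dists.argmin(axis=1)                          # nearest codeword per vector
        for j in range(k):
            members = train_vectors[assign == j]
            if len(members):
                codebook[j] = members.mean(axis=0)             # move codeword to cluster mean
    return codebook


def quantize(vectors, codebook):
    """Sort-transform each vector, then store its nearest codeword index and its permutation."""
    sorted_vectors, perms = sort_transform(vectors)
    dists = np.linalg.norm(sorted_vectors[:, None, :] - codebook[None, :, :], axis=2)
    return dists.argmin(axis=1), perms


def reconstruct(codes, perms, codebook):
    """Look up the codewords and undo the per-vector sorting permutation."""
    return inverse_sort_transform(codebook[codes], perms)


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    data = rng.normal(size=(2000, 8))                          # low-dimensional vectors

    # Sorted VQ: codebook trained on sorted vectors; each code carries a permutation.
    sorted_codebook = build_codebook(sort_transform(data)[0], k=64)
    codes, perms = quantize(data, sorted_codebook)
    approx = reconstruct(codes, perms, sorted_codebook)
    sorted_err = np.linalg.norm(data - approx, axis=1).mean()

    # Baseline VQ: the same k-means codebook trained on the raw, unsorted vectors.
    raw_codebook = build_codebook(data, k=64)
    raw_err = np.linalg.norm(
        data[:, None, :] - raw_codebook[None, :, :], axis=2).min(axis=1).mean()

    print(f"mean L2 error, plain VQ : {raw_err:.4f}")
    print(f"mean L2 error, sorted VQ: {sorted_err:.4f}")
    print(f"overhead per vector     : {data.shape[1]} permutation indices")
```

In this sketch the permutation costs one small integer per vector member, so the overhead grows with dimensionality; that is consistent with the abstract's observation that the trade-off is beneficial mainly for low-dimensional vectors.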