Vector Quantization with Sorting Transformation

Bibliographic Details
Published in: The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Conference Proceedings (2024)
Main author: Wang, Hongzhi
Other authors: Syeda-Mahmood, Tanveer
Publisher: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Online access: Citation/Abstract
Description
Abstract:
Conference Title: 2024 IEEE International Conference on Big Data (BigData)
Conference Start Date: Dec. 15, 2024
Conference End Date: Dec. 18, 2024
Conference Location: Washington, DC, USA

Vector quantization is a nearest-neighbor-representation-based compression technique for vector data. It creates a collection of codewords to represent the entire vector space. Each data vector is then represented by its nearest neighbor codeword, and the distance between them is the compression error. To improve the nearest neighbor representation in vector quantization, we propose applying a sorting transformation to vector data so that the elements within each vector are sorted. We show that, among all permutation transformations, the sorting transformation minimizes the L2 distance and maximizes similarity measures such as cosine similarity and Pearson correlation for vector data. Applying the sorting transformation with vector quantization can substantially reduce compression errors. Meanwhile, it incurs a storage overhead for saving the sorting permutation of each compressed vector. Through experimental validation on compression and nearest neighbor retrieval, we show that this is a beneficial trade-off for vector quantization on low-dimensional vectors, a common scenario in vector quantization applications.
DOI: 10.1109/BigData62323.2024.10825761
Source: Science Database
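
The abstract does not include code, but the idea it describes can be sketched in a few lines: sort each vector, quantize the sorted vector against a codebook, and store the sorting permutation next to the codeword index so the original ordering can be restored at decompression. The following is a minimal Python/NumPy sketch under assumptions not stated in the abstract: a plain k-means trainer stands in for whatever codebook construction the paper uses, and all function names (sort_transform, train_codebook, quantize, reconstruct) and parameters are illustrative, not the authors' implementation.

import numpy as np

def sort_transform(X):
    # Sort each vector's elements; keep the permutation for reconstruction.
    perm = np.argsort(X, axis=1)                 # permutation that sorts each row
    return np.take_along_axis(X, perm, axis=1), perm

def inverse_sort(codewords, perm):
    # Undo the sorting: scatter codeword entries back to original positions.
    inv = np.argsort(perm, axis=1)               # inverse of each row permutation
    return np.take_along_axis(codewords, inv, axis=1)

def train_codebook(X_sorted, k, iters=25, seed=0):
    # Plain k-means on the sorted vectors (stand-in for any VQ trainer).
    rng = np.random.default_rng(seed)
    codebook = X_sorted[rng.choice(len(X_sorted), k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X_sorted[:, None, :] - codebook[None, :, :], axis=2)
        assign = d.argmin(axis=1)
        for j in range(k):
            members = X_sorted[assign == j]
            if len(members):
                codebook[j] = members.mean(axis=0)
    return codebook

def quantize(X, codebook):
    # Compress: sort, pick nearest sorted codeword, keep (index, permutation).
    X_sorted, perm = sort_transform(X)
    d = np.linalg.norm(X_sorted[:, None, :] - codebook[None, :, :], axis=2)
    return d.argmin(axis=1), perm

def reconstruct(indices, perm, codebook):
    # Decompress: look up codeword, then invert the per-vector sort.
    return inverse_sort(codebook[indices], perm)

# Tiny demo on random low-dimensional vectors.
X = np.random.default_rng(1).normal(size=(1000, 8))
Xs, _ = sort_transform(X)
cb = train_codebook(Xs, k=32)
idx, perm = quantize(X, cb)
X_hat = reconstruct(idx, perm, cb)
print("mean L2 compression error:", np.linalg.norm(X - X_hat, axis=1).mean())

The stored permutation is the overhead the abstract mentions: for d-dimensional vectors it costs roughly log2(d!) bits per vector, which is why the approach is framed as a trade-off that pays off mainly at low dimension.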