Displays external impact metrics associated with the publication. For further detail:
| Field | Value |
|---|---|
| Indexed | |
| DOI | 10.1007/978-3-031-80084-9_11 |
| Year | 2025 |
| Type | proceedings paper |
In this paper, we propose a novel quantization technique for Bayesian deep learning aimed at enhancing efficiency without compromising performance. Our approach leverages post-training quantization to significantly reduce the memory footprint of stochastic gradient samplers, particularly Stochastic Gradient Markov Chain Monte Carlo (SG-MCMC) methods. This technique achieves a level of compression comparable to optimal thinning, which traditionally necessitates not only the original samples in single precision floating-point representation but also the gradients, resulting in substantial computational overhead. In contrast, our quantization method requires only the original samples and can accurately recover posterior modes through a simple affine transformation. This process incurs minimal additional memory or computational costs, making it a highly efficient alternative for Bayesian deep learning applications.
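The abstract describes storing SG-MCMC posterior samples in a low-precision integer format and recovering them through a simple affine transformation. The sketch below is not the paper's implementation; it is a minimal illustration assuming a standard scale/zero-point affine quantization scheme (the function names and the int8 target are illustrative choices), showing the 4x memory reduction relative to single-precision storage.

```python
import numpy as np

def affine_quantize(samples, num_bits=8):
    """Map float32 samples to signed integers via an affine transform:
    q = round(x / scale) + zero_point, clipped to the integer range."""
    qmin, qmax = -(2 ** (num_bits - 1)), 2 ** (num_bits - 1) - 1
    lo, hi = float(samples.min()), float(samples.max())
    scale = (hi - lo) / (qmax - qmin)
    zero_point = int(round(qmin - lo / scale))
    q = np.clip(np.round(samples / scale) + zero_point, qmin, qmax)
    return q.astype(np.int8), scale, zero_point

def affine_dequantize(q, scale, zero_point):
    """Invert the affine map to recover approximate float samples."""
    return scale * (q.astype(np.float32) - zero_point)

# Mock posterior samples standing in for SG-MCMC draws.
rng = np.random.default_rng(0)
samples = rng.normal(0.0, 1.0, size=10_000).astype(np.float32)

q, scale, zp = affine_quantize(samples)
recovered = affine_dequantize(q, scale, zp)

# int8 storage is 4x smaller than the original float32 samples,
# and the per-sample reconstruction error is bounded by the step size.
assert q.nbytes * 4 == samples.nbytes
```

Post-training quantization in this style needs only the stored samples (no gradients), which is the contrast the abstract draws with optimal thinning.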
| No. | Author | Gender | Institution - Country |
|---|---|---|---|
| 1 | Hernández, Sergio | - | Universidad Católica del Maule - Chile |
| 2 | López-Cortes, Xaviera | - | Universidad Católica del Maule - Chile |
| 3 | Guerrero, G | - | |
| 4 | SanMartin, J | - | |
| 5 | Meneses, E | - | |
| 6 | Hernandez, CJB | - | |
| 7 | Osthoff, C | - | |
| 8 | Diaz, JMM | - | |
| Acknowledgment |
|---|
| Xaviera López gratefully acknowledges financial support from Research Project ANID FONDECYT Iniciación en Investigación No. 11220897. |