| Field | Value |
| --- | --- |
| title | Coreset Markov chain Monte Carlo |
| software | |
| abstract | A Bayesian coreset is a small, weighted subset of data that replaces the full dataset during inference in order to reduce computational cost. However, state-of-the-art methods for tuning coreset weights are expensive, require nontrivial user input, and impose constraints on the model. In this work, we propose a new method, coreset MCMC, that simulates a Markov chain targeting the coreset posterior while simultaneously updating the coreset weights using those same draws. Coreset MCMC is simple to implement and tune, and can be used with any existing MCMC kernel. We analyze coreset MCMC in a representative setting to obtain key insights about the convergence behaviour of the method. Empirical results demonstrate that coreset MCMC provides higher-quality posterior approximations and reduced computational cost compared with other coreset construction methods. Further, compared with other general subsampling MCMC methods, we find that coreset MCMC offers higher sampling efficiency with competitively accurate posterior approximations. |
| layout | inproceedings |
| series | Proceedings of Machine Learning Research |
| publisher | PMLR |
| issn | 2640-3498 |
| id | chen24f |
| month | 0 |
| tex_title | {C}oreset {M}arkov chain {M}onte {C}arlo |
| firstpage | 4438 |
| lastpage | 4446 |
| page | 4438-4446 |
| order | 4438 |
| cycles | false |
| bibtex_author | Chen, Naitong and Campbell, Trevor |
| author | |
| date | 2024-04-18 |
| address | |
| container-title | Proceedings of The 27th International Conference on Artificial Intelligence and Statistics |
| volume | 238 |
| genre | inproceedings |
| issued | |
| extras | |
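The sketch below is not part of the record above; it is a minimal, hypothetical illustration of the alternating scheme the abstract describes: an MCMC kernel targets the weighted-coreset posterior while the coreset weights are adjusted from the same draws. The Gaussian mean model, the Metropolis-Hastings kernel, the minibatch score-matching weight update, and all names and parameter values are assumptions made for illustration; this is not the estimator analyzed in the paper.

```python
# Toy illustration of coreset MCMC's alternating structure (assumed details):
# one MCMC step on the coreset posterior, then one weight update from that draw.
import numpy as np

rng = np.random.default_rng(0)

# Full dataset: N observations from a 1-D Gaussian with unknown mean.
N, M = 10_000, 50                      # full data size, coreset size (assumed)
data = rng.normal(loc=2.0, scale=1.0, size=N)
core = data[rng.choice(N, size=M, replace=False)]
w = np.full(M, N / M)                  # uniform initial weights summing to N

def core_log_post(theta, w):
    # log N(0, 10^2) prior plus weighted coreset log-likelihood (unit variance)
    return -0.5 * theta**2 / 100.0 - 0.5 * np.sum(w * (core - theta) ** 2)

theta, step, lr = 0.0, 0.05, 1e-4
samples = []
for t in range(5_000):
    # --- MCMC step targeting the current coreset posterior (random-walk MH) ---
    prop = theta + step * rng.normal()
    if np.log(rng.uniform()) < core_log_post(prop, w) - core_log_post(theta, w):
        theta = prop
    samples.append(theta)

    # --- weight update using the same draw (simplified surrogate update) ---
    # Nudge the weighted coreset score toward a minibatch estimate of the
    # full-data score at the current draw; this stands in for the paper's
    # weight-tuning step and is only an assumption for this sketch.
    batch = data[rng.choice(N, size=200, replace=False)]
    full_score = (N / batch.size) * np.sum(batch - theta)  # subsampled d/dtheta log-lik
    core_scores = core - theta                              # per-point coreset scores
    resid = full_score - np.dot(w, core_scores)
    w = np.maximum(w + lr * resid * core_scores, 0.0)       # keep weights nonnegative

print("coreset MCMC posterior mean estimate:", np.mean(samples[1_000:]))
print("full-data posterior mean (analytic):", np.sum(data) / (N + 1e-2))
```

The point of the sketch is the loop body: because the kernel only evaluates the weighted coreset log-likelihood, each iteration costs O(M) rather than O(N), and the weights are refined from the very draws the chain produces, with no separate optimization phase.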