Hi,
I have been following the VAE MNIST tutorial in BoTorch and was wondering whether it is possible to replace the combination of the VAE and the single-task GP model with the SVDKL (Stochastic Variational Deep Kernel Learning) model from the GPyTorch tutorial.
Specifically, I'd like to integrate the SVDKL model in place of the VAE-GP combination. Additionally, I aim to train the DKL model dynamically during the Bayesian Optimization (BO) step, instead of relying on a static latent space from a pretrained model, as is the case with the VAE.
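For concreteness, here is a rough sketch of the kind of model I have in mind: a GPyTorch `ApproximateGP` with a neural-network feature extractor, exposed to BoTorch through the `GPyTorchModel` mixin and refit with a variational ELBO at each BO iteration. The names (`FeatureExtractor`, `SVDKLModel`, `fit_svdkl`), the architecture, and the training settings are my own assumptions, not something taken from either tutorial:

```python
import torch
import gpytorch
from gpytorch.models import ApproximateGP
from gpytorch.variational import CholeskyVariationalDistribution, VariationalStrategy
from botorch.models.gpytorch import GPyTorchModel


class FeatureExtractor(torch.nn.Sequential):
    """Small MLP mapping the original inputs into a low-dimensional feature space."""

    def __init__(self, input_dim: int, feature_dim: int = 2):
        super().__init__(
            torch.nn.Linear(input_dim, 64),
            torch.nn.ReLU(),
            torch.nn.Linear(64, feature_dim),
        )


class SVDKLModel(ApproximateGP, GPyTorchModel):
    _num_outputs = 1  # tells BoTorch this is a single-output model

    def __init__(self, inducing_points: torch.Tensor, feature_dim: int = 2):
        # Inducing points live in the *input* space; they pass through the
        # feature extractor inside forward(), like any other inputs.
        variational_distribution = CholeskyVariationalDistribution(inducing_points.size(0))
        variational_strategy = VariationalStrategy(
            self, inducing_points, variational_distribution, learn_inducing_locations=True
        )
        super().__init__(variational_strategy)
        self.feature_extractor = FeatureExtractor(inducing_points.size(-1), feature_dim)
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(
            gpytorch.kernels.RBFKernel(ard_num_dims=feature_dim)
        )
        self.likelihood = gpytorch.likelihoods.GaussianLikelihood()

    def forward(self, x):
        # Deep kernel: project through the NN, then apply a standard GP in feature space.
        z = self.feature_extractor(x)
        return gpytorch.distributions.MultivariateNormal(
            self.mean_module(z), self.covar_module(z)
        )


def fit_svdkl(model, train_X, train_Y, epochs=200, lr=0.01):
    """Jointly refit the NN and the GP on the data gathered so far (called at every BO step)."""
    model.train()
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    mll = gpytorch.mlls.VariationalELBO(model.likelihood, model, num_data=train_Y.size(0))
    for _ in range(epochs):
        optimizer.zero_grad()
        output = model(train_X)
        loss = -mll(output, train_Y.squeeze(-1))
        loss.backward()
        optimizer.step()
    model.eval()
    return model
```

The idea would be to call `fit_svdkl` on the accumulated data at every BO iteration and then optimize the acquisition function directly in the original input space, so the latent representation is learned jointly with the GP rather than fixed by a pretrained VAE. Possibly wrapping the variational GP in BoTorch's `ApproximateGPyTorchModel` (or starting from `SingleTaskVariationalGP`) would be a cleaner route than the mixin shown above, but I am not sure which is recommended.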
Could you provide guidance on how this could be implemented or if there are any potential challenges or limitations with such an approach?
Thank you!