Dear authors,
Thank you for sharing your inspiring work and providing the codebase.
However, while trying to reproduce the results reported in the paper, I noticed that the detailed hyperparameter settings for each dataset are not provided. The search ranges mentioned are quite broad, so finding the optimal settings without further guidance is challenging and time-consuming.
For the sake of reproducibility, and to help the community understand and validate the contributions of your work, could you please share the specific hyperparameter configuration (e.g., learning rate, batch size, and loss weight) used for each dataset (e.g., OfficeHome, VLCS, and Terra Incognita) in the experiments? Providing these details would make it much easier to reproduce the reported results and to build further research on your work.
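For concreteness, a simple per-dataset mapping along the following lines would be ideal. To be clear, every name and value below is a placeholder of my own illustrating the requested format, not a figure from the paper:

```python
# Hypothetical per-dataset hyperparameters -- all values are placeholders
# showing the requested format, not numbers taken from the paper.
HPARAMS = {
    "OfficeHome":     {"learning_rate": 5e-5, "batch_size": 32, "loss_weight": 0.1},
    "VLCS":           {"learning_rate": 1e-5, "batch_size": 32, "loss_weight": 0.5},
    "TerraIncognita": {"learning_rate": 5e-5, "batch_size": 24, "loss_weight": 1.0},
}
```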
Thank you for your time and consideration!