Hi @avsolatorio,
I would like to ask whether there is a limit on the maximum number of columns that can be passed to a tabular model. Is there an upper bound, and will it fail if there are many columns?
Of course, I am talking about the case of using the classic early-stopping mechanism and not the critic one, because in the past we have seen that the critic metric with high-dimensional data can lead to large memory consumption and many errors.
So, my use case is fitting a tabular model with simple early stopping (no critic-sensitivity metric). Will GPT-2 fail with many columns at its input during training or generation? My dataset has mixed types: text, float, datetime, int, etc. The text columns are not very lengthy; they are just categorical values.
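For concreteness, here is a minimal sketch of the use case described above: a mixed-type table fitted with the REaLTabFormer tabular model. The column names, sizes, and values are hypothetical, and how exactly the critic-sensitivity check is disabled is not shown here; only `model_type="tabular"`, `fit`, and `sample` are taken from the REaLTabFormer README.

```python
import numpy as np
import pandas as pd

# Hypothetical mixed-type dataset, matching the description above:
# short categorical text, float, int, and datetime columns.
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "category": rng.choice(["low", "mid", "high"], size=n),       # short text / categorical
    "amount": rng.normal(100.0, 15.0, size=n),                    # float
    "count": rng.integers(0, 50, size=n),                         # int
    "created": pd.date_range("2020-01-01", periods=n, freq="D"),  # datetime
})

try:
    from realtabformer import REaLTabFormer

    # Tabular model as shown in the REaLTabFormer README; training
    # settings are left at their defaults in this sketch.
    model = REaLTabFormer(model_type="tabular")
    model.fit(df)
    synthetic = model.sample(n_samples=100)
except ImportError:
    # REaLTabFormer not installed; the DataFrame above still
    # illustrates the input shape being asked about.
    synthetic = None
```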
Lastly, is it possible that the tabular model has such a limitation while the relational one does not? If so, maybe I could fit a relational model instead, as described here
https://github.com/worldbank/REaLTabFormer/issues/22#issuecomment-1598082977
by providing no parent. Also, in the past (#11) I remember you told me that the relational model has no limitation like the pre-trained GPT-2, but perhaps that was only for the generation part?

Thanks!