I have been actively working with scGPT over the past few months and am thoroughly impressed with its capabilities and potential in advancing single-cell analysis.
As part of my efforts to evaluate and optimize scGPT for various applications, I am particularly interested in understanding the technical benchmarks related to its development. Specifically, I would like to know:
The computational resources required to train scGPT from scratch, including GPU/TPU specifications, CPU requirements, and memory needs.
The approximate training time and dataset size used for the pretraining process.
Whether the datasets used for training are publicly available for further exploration or benchmarking.
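While waiting for official numbers, a rough back-of-envelope estimate can help frame the first question. The sketch below counts transformer parameters and the memory needed to hold weights, gradients, and Adam optimizer states. The configuration values (12 layers, 512-dim embeddings, a 60k-token vocabulary) are illustrative assumptions for a model of this scale, not confirmed scGPT hyperparameters, and the estimate deliberately ignores activation memory, which usually dominates at large batch sizes.

```python
# Back-of-envelope GPU memory estimate for pretraining a transformer
# encoder. All hyperparameter values are illustrative assumptions,
# NOT the actual scGPT configuration.

def param_count(n_layers: int, d_model: int, d_ff: int, vocab_size: int) -> int:
    """Rough parameter count: token embeddings plus, per layer,
    the four attention projection matrices and the two FFN matrices
    (biases and layer norms omitted for simplicity)."""
    embed = vocab_size * d_model
    per_layer = 4 * d_model * d_model + 2 * d_model * d_ff
    return embed + n_layers * per_layer

def training_mem_gb(n_params: int, bytes_per_param: int = 4,
                    state_factor: int = 4) -> float:
    """Memory for weights + gradients + Adam first/second moments
    (4x the weights in fp32), excluding activations."""
    return n_params * bytes_per_param * state_factor / 1024**3

n = param_count(n_layers=12, d_model=512, d_ff=512, vocab_size=60_000)
print(f"~{n / 1e6:.0f}M params, ~{training_mem_gb(n):.1f} GB before activations")
```

Under these assumed settings the model lands near 50M parameters and under 1 GB of optimizer-state memory, which suggests activation memory and dataset throughput, rather than raw parameter storage, would drive the hardware requirements.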
This information would be immensely helpful in guiding my own implementations and evaluations.
Thank you for your time and support.