
[FEA] Refactor AutoTuner into separate classes for Qualification and Profiling Tools #1470

Open
parthosa opened this issue Dec 18, 2024 · 0 comments · May be fixed by #1471
Assignees: parthosa
Labels: core_tools (Scope the core module (scala)), feature request (New feature or request)

Comments

@parthosa (Collaborator)

Description
In #1397, we identified that many AutoTuner recommendations need to differ based on the context in which the tool is run:

  • When the AutoTuner processes GPU event logs (via the Profiling Tool), its recommendations aim at improving existing GPU-based runs.
  • When the AutoTuner processes CPU event logs (via the Qualification Tool), different recommendations are needed to guide the transition to a first GPU run.

Currently, there is no clean separation between these two cases.

Proposed Solution

  • Introduce a class-based design with specialized AutoTuner implementations for the Qualification and Profiling Tools; a minimal sketch follows this list.
  • A QualificationAutoTuner subclass will override specific configurations or logic as needed.
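
A minimal sketch of what the proposed split could look like. The class names (BaseAutoTuner, ProfilingAutoTuner, QualificationAutoTuner), the Recommendation case class, and the example property values are illustrative assumptions, not the actual identifiers or defaults in the codebase.

```scala
// Illustrative sketch only: names and example values are assumptions,
// not the real spark-rapids-tools identifiers.
case class Recommendation(property: String, value: String, comment: String)

abstract class BaseAutoTuner {
  // Recommendations shared by both tools, regardless of the event-log source.
  protected def commonRecommendations: Seq[Recommendation] = Seq(
    Recommendation("spark.plugins", "com.nvidia.spark.SQLPlugin",
      "Enable the RAPIDS Accelerator plugin")
  )

  // Hook overridden by each tool with its context-specific recommendations.
  protected def contextRecommendations: Seq[Recommendation]

  final def recommendations: Seq[Recommendation] =
    commonRecommendations ++ contextRecommendations
}

// Profiling Tool: GPU event logs, so recommendations refine an existing GPU run.
class ProfilingAutoTuner extends BaseAutoTuner {
  override protected def contextRecommendations: Seq[Recommendation] = Seq(
    Recommendation("spark.sql.shuffle.partitions", "200",
      "Example: refined from metrics observed in the GPU run")
  )
}

// Qualification Tool: CPU event logs, so recommendations target the first GPU run.
class QualificationAutoTuner extends BaseAutoTuner {
  override protected def contextRecommendations: Seq[Recommendation] = Seq(
    Recommendation("spark.rapids.sql.concurrentGpuTasks", "2",
      "Example: conservative default for an initial CPU-to-GPU migration")
  )
}
```

Keeping the shared logic in a base class and isolating the context-specific behavior in subclasses would let the Qualification Tool diverge from the Profiling Tool without duplicating the common tuning code.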
@parthosa parthosa added the core_tools and feature request labels Dec 18, 2024
@parthosa parthosa self-assigned this Dec 18, 2024