Metrics Orchestrator #236

Open
jasonlyik opened this issue Jul 31, 2024 · 1 comment
jasonlyik commented Jul 31, 2024

We should break metrics orchestration (initialization, hooks, accumulation, etc.) out into a separate MetricsOrchestrator managing class. Currently it is built into the Benchmark functionality, which limits extension and user-customized metrics.

User-defined custom metrics should have an abstract base class that describes how to implement the metric. The current AccumulatedMetric is already a good example.

  • We may want separate abstract base classes for StaticMetric, a non-accumulated workload metric, and the existing AccumulatedMetric.

All workload metrics share the same interface (model, preds, data), and all static metrics share the interface (model).
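A minimal sketch of what those abstract base classes could look like, based on the interfaces described above (class and method names here are illustrative assumptions, not the current NeuroBench API):

```python
from abc import ABC, abstractmethod


class StaticMetric(ABC):
    """Computed once from the model alone (e.g. a parameter count)."""

    @abstractmethod
    def __call__(self, model):
        ...


class WorkloadMetric(ABC):
    """Computed per batch from (model, preds, data), with no cross-batch state."""

    @abstractmethod
    def __call__(self, model, preds, data):
        ...


class AccumulatedMetric(WorkloadMetric):
    """Workload metric that accumulates state across batches."""

    @abstractmethod
    def __call__(self, model, preds, data):
        """Update internal state with one batch; return the running value."""
        ...

    @abstractmethod
    def compute(self):
        """Return the final value accumulated over all batches."""
        ...

    def reset(self):
        """Clear accumulated state before a new benchmark run."""
        pass
```

A user would then subclass the appropriate base and implement only the abstract methods, while the orchestrator calls each metric through its uniform interface.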

ben9809 self-assigned this Aug 14, 2024
jasonlyik (author) commented:
Notes 9/12

Refactor metrics so they are all objects. Add legacy support: if all metrics are passed as strings, they will still work, but a warning is emitted that string support will end.

The user interface could look like:

```python
from neurobench.metrics import ...
# import all from .metrics, but the directory structure can split static/workload
from neurobench.metrics import StaticMetric, WorkloadMetric  # custom objects
```
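The legacy-support behavior described above could be sketched as follows (the registry, class names, and `resolve_metrics` helper are hypothetical, not the actual NeuroBench implementation):

```python
import warnings


class WorkloadMetric:
    """Hypothetical base class for workload metrics (interface: model, preds, data)."""

    def __call__(self, model, preds, data):
        raise NotImplementedError


class Accuracy(WorkloadMetric):
    """Toy metric used only to illustrate the string-to-object mapping."""

    def __call__(self, model, preds, data):
        _, labels = data
        correct = sum(int(p == y) for p, y in zip(preds, labels))
        return correct / len(labels)


# Hypothetical lookup table mapping legacy string names to metric objects
LEGACY_REGISTRY = {"accuracy": Accuracy()}


def resolve_metrics(metrics):
    """Accept metric objects directly; translate legacy strings with a warning."""
    resolved = []
    for m in metrics:
        if isinstance(m, str):
            warnings.warn(
                f"Passing the metric '{m}' as a string is deprecated; "
                "pass a metric object instead.",
                DeprecationWarning,
            )
            resolved.append(LEGACY_REGISTRY[m])
        else:
            resolved.append(m)
    return resolved
```

This keeps existing all-string configurations working during the transition while nudging users toward metric objects, matching the "works but warns" plan in the notes.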
