Export the loadModel method #22

Open
lawvs opened this issue Mar 4, 2024 · 0 comments
lawvs commented Mar 4, 2024

Loading the model can be time-consuming because it may need to be downloaded over the network.

It would be beneficial to make modelOperations.loadModel public so that developers can preload the model when needed. Currently, the only way to trigger loading early is the workaround of calling modelOperations.runModel('').

private async loadModel() {
  if (this._model) {
    return;
  }
  // These two env settings just suppress some logged warnings that
  // are not applicable for this use case.
  const tfEnv = env();
  tfEnv.set('IS_NODE', false);
  tfEnv.set('PROD', true);
  if (!(await setBackend('cpu'))) {
    throw new Error('Unable to set backend to CPU.');
  }
  const resolvedModelJSON = await this.getModelJSON();
  const resolvedWeights = await this.getWeights();
  this._model = await loadGraphModel(new InMemoryIOHandler(resolvedModelJSON, resolvedWeights));
}

Furthermore, a new isReady method would be very helpful for developers.

public isReady() {
  return !!this._model;
}
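To illustrate the request, here is a minimal, self-contained sketch (not the library's actual code; the class and model object here are simplified stand-ins) of how a public loadModel plus isReady would let callers preload eagerly instead of paying the cost on the first runModel call:

```typescript
// Hypothetical sketch: loadModel made public so callers can preload,
// with isReady to check whether the model has finished loading.
class ModelOperations {
  private _model: object | undefined;

  // Public so callers can preload at startup instead of on first use.
  public async loadModel(): Promise<void> {
    if (this._model) {
      return; // already loaded; repeated calls are cheap no-ops
    }
    // Stand-in for the expensive download/parse step in the real library.
    this._model = await Promise.resolve({ loaded: true });
  }

  public isReady(): boolean {
    return !!this._model;
  }
}

// Usage: preload eagerly, then check readiness before running the model.
async function main() {
  const ops = new ModelOperations();
  console.log(ops.isReady()); // false
  await ops.loadModel();      // preload while the app is still starting up
  console.log(ops.isReady()); // true
}

main();
```

Since loadModel returns early when the model is already cached, calling it defensively before every run would also be safe.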

By the way, this package is awesome! I deployed it in guesslang-worker, tried it in blocksuite, and its performance was excellent.
