
Add an LLM generation UI #684

Open
JulienVig opened this issue Jun 13, 2024 · 3 comments

@JulienVig (Collaborator)

After training an LLM such as for the wikitext task, users would expect to be able to prompt the model and generate some text. Right now, no such UI is available: we can only test and predict on connected text files.

We can implement a simple text box where users input free text, tweak the generation parameters, and see the model's completion in real time in another text field below.
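A minimal sketch of how that UI could be wired up. The model API used here (a `generate` method taking a prompt, parameters, and a per-token callback) is an assumption for illustration, not disco's actual interface; a stub model stands in so the wiring is runnable.

```javascript
// Stub model: the real model's generation API is assumed, not disco's actual one.
// It emits a canned completion token by token via a callback, which is the shape
// a streaming "completion in real time" UI would need.
const model = {
  generate(prompt, { temperature, maxNewTokens }, onToken) {
    const tokens = ["Lorem", "ipsum", "dolor"].slice(0, maxNewTokens);
    for (const t of tokens) onToken(t);
  },
};

// Handler for the proposed UI: called when the user submits the prompt text box,
// it clears the output field and appends tokens as they arrive.
function onGenerate(prompt, params, outputField) {
  outputField.text = "";
  model.generate(prompt, params, (token) => {
    outputField.text += (outputField.text ? " " : "") + token;
  });
}

const output = { text: "" };
onGenerate("Once upon a time", { temperature: 0.8, maxNewTokens: 3 }, output);
console.log(output.text); // "Lorem ipsum dolor"
```

With a real model, `onToken` would be invoked asynchronously after each sampling step, so the bound output field updates as generation progresses.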

@JulienVig JulienVig added the feature New feature or request label Jun 13, 2024
@tharvik (Collaborator) commented Jun 19, 2024

I'm having an issue with generating with the gpt-tfjs model: somewhere inside its implementation, it is failing deep in TensorFlow.js code.

TypeError: info is undefined
    moveData engine.ts:426
    get backend.ts:55
    reshape4 Reshape.ts:44
    gatherV22 GatherV2.ts:57
    kernelFunc engine.ts:647
    runKernelFunc engine.ts:713
    [...]
    predictLoop training.ts:1035
    predict training.ts:1116
    idxNext model.js:160
    [...]
    generateOnce model.js:152
    generate model.js:140
    generate index.js:71
    generate TextInput.vue:70

Because of the weird way gpt-tfjs is coded (hijacking tfjs internals, reimplementing chunks of tfjs), I'm not really able to find where and how it is triggered. Maybe @JulienVig you know a bit more about that, as you were the last one to try to fix some of its issues.
As previously stated, this gpt-tfjs code is bad. It will need to be rewritten or replaced with ONNX once we have that. Putting this feature on hold in the meantime (code on the 684-add-chat-with-llm-tharvik branch).

@JulienVig (Collaborator, Author)

Just looking at the stack trace I can't tell what the error is; I'll look into it when it becomes a priority.

@JulienVig JulienVig added this to the v4.0.0 milestone Jul 23, 2024
@tharvik (Collaborator) commented Aug 27, 2024

After hitting TypeError: info is undefined elsewhere, it seems it was caused by Vue's proxy wrapping of the model, which does not behave nicely with tfjs. Calling toRaw on the model before training with it should work around it.
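An illustrative sketch of why the proxy breaks things (this is not the actual disco or tfjs code; the registry below stands in for tfjs's internal per-tensor bookkeeping, and the proxy stands in for what Vue's reactive() does). tfjs looks up tensor state by object identity, and a Proxy has a different identity than the object it wraps, so the lookup misses and the retrieved `info` is undefined:

```javascript
// Stand-in for tfjs's internal data registry, which is keyed by object identity.
const dataRegistry = new Map();

const tensor = { id: 1 };
dataRegistry.set(tensor, { values: [0.5] }); // registered under the raw object

// Vue's reactive() wraps state in a Proxy, analogous to this:
const proxied = new Proxy(tensor, {});

// Lookup through the proxy misses: the Proxy is a different key than the
// raw object, so `info` comes back undefined (cf. "TypeError: info is undefined")
const info = dataRegistry.get(proxied);
console.log(info); // undefined

// Unwrapping first (what Vue's toRaw() does for reactive objects)
// restores the original identity, so the lookup succeeds
console.log(dataRegistry.get(tensor).values); // [ 0.5 ]
```

This is why calling toRaw on the model before handing it to training code sidesteps the error: tfjs then only ever sees the raw objects it registered.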

@walidabn walidabn self-assigned this Aug 28, 2024