
Loading chat history from LM Studio? #152

Open
hornetDC opened this issue Dec 13, 2024 · 3 comments

Comments

@hornetDC

How can I load a chat history generated in LM Studio? I see JSON files at C:\Users\USER\.cache\lm-studio\conversations, but when I try to load one with applyPromptTemplate I get an error:

import fs from "fs";
import path from "path";
import { LMStudioClient } from "@lmstudio/sdk";

const lmstudio = new LMStudioClient();

async function main() {
  const model = await lmstudio.llm.load("hermes-3-llama-3.2-3b");

  const history = JSON.parse(
    fs.readFileSync(path.join(__dirname, "chat_history.json"), "utf-8")
  );

  model.applyPromptTemplate(history); // throws: history: Invalid input

  // Create a text completion prediction
  const prediction = model.complete("The meaning of life is");

  // Stream the response
  for await (const { content } of prediction) {
    process.stdout.write(content);
  }
}

main();
@ryan-the-crayon
Contributor

Hi. Currently, you will need to read the conversation file and convert it to a ChatHistory yourself so it can be fed to the model. We will look into allowing conversations to be exported very soon.
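
For reference, a minimal sketch of what that conversion might look like. The on-disk conversation format is internal and undocumented, so the messages array and the role/text field names below are assumptions you will need to check against your own JSON file; the sketch also assumes respond() accepts an array of { role, content } messages, as the SDK's chat examples show.

import fs from "fs";
import path from "path";
import { LMStudioClient } from "@lmstudio/sdk";

// Assumed shape of a message inside the LM Studio conversation file.
// The real structure is internal and may change between versions.
interface RawConversationMessage {
  role: "user" | "assistant" | "system";
  text: string;
}

const lmstudio = new LMStudioClient();

async function main() {
  const model = await lmstudio.llm.load("hermes-3-llama-3.2-3b");

  const raw = JSON.parse(
    fs.readFileSync(path.join(__dirname, "chat_history.json"), "utf-8")
  );

  // Map the raw messages into the plain { role, content } shape that the
  // SDK's chat-style APIs accept.
  const history = (raw.messages as RawConversationMessage[]).map((m) => ({
    role: m.role,
    content: m.text,
  }));

  // Continue the conversation from the imported history.
  const prediction = model.respond(history);

  for await (const { content } of prediction) {
    process.stdout.write(content);
  }
}

main();

If the shape of your file differs, only the mapping step needs to change.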

@hornetDC
Author

> Hi. Currently, you will need to read the conversation file and convert it to a ChatHistory yourself so it can be fed to the model. We will look into allowing conversations to be exported very soon.

I thought I was dumb and somehow missed it in the docs 😅
Are there any examples of how to convert conversations?

@ryan-the-crayon
Contributor

Sorry, not at the moment. Conversation JSON files are considered internal data structures of LM Studio, and we do not recommend building on them since we might change their structure. However, we will definitely add a feature to export conversations in a format that can be consumed by, or continued from, an LLM.
