Fix commands don't work if they woke LLM service (#119)
ryan-the-crayon authored Dec 6, 2024
1 parent ee056c0 commit 39c044b
Showing 4 changed files with 15 additions and 5 deletions.
README.md (2 additions, 2 deletions)
@@ -52,8 +52,8 @@ Here are some frequently used commands:
 - `lms ls --json` - To list all downloaded models in machine-readable JSON format.
 - `lms ps` - To list all loaded models available for inferencing.
 - `lms ps --json` - To list all loaded models available for inferencing in machine-readable JSON format.
-- `lms load --gpu max` - To load a model with maximum GPU acceleration
-- `lms load <model path> --gpu max -y` - To load a model with maximum GPU acceleration without confirmation
+- `lms load` - To load a model
+- `lms load <model path> -y` - To load a model with maximum GPU acceleration without confirmation
 - `lms unload <model identifier>` - To unload a model
 - `lms unload --all` - To unload all models
 - `lms create` - To create a new project with LM Studio SDK
src/createClient.ts (11 additions, 1 deletion)
@@ -106,6 +106,7 @@ export async function wakeUpService(logger: SimpleLogger): Promise<boolean> {
 }
 
 export interface CreateClientOpts {}
+const lmsKey = "<LMS-CLI-LMS-KEY>";
 
 export async function createClient(
   logger: SimpleLogger,
@@ -134,7 +135,6 @@ export async function createClient(
   };
 } else {
   // Not remote. We need to check if this is a production build.
-  const lmsKey = "<LMS-CLI-LMS-KEY>";
   if (lmsKey.startsWith("<") && !process.env.LMS_FORCE_PROD) {
     // lmsKey not injected and we did not force prod, this is not a production build.
     logger.warnText`
@@ -176,6 +176,16 @@ export async function createClient(
   if (localPort !== null) {
     const baseUrl = `ws://${host}:${localPort}`;
     logger.debug(`Found local API server at ${baseUrl}`);
+
+    if (auth.clientIdentifier === "lms-cli") {
+      // We need to refetch the lms key due to the possibility of a new key being generated.
+      const lmsKey2 = (await readFile(lmsKey2Path, "utf-8")).trim();
+      auth = {
+        ...auth,
+        clientPasskey: lmsKey + lmsKey2,
+      };
+    }
+
     return new LMStudioClient({ baseUrl, logger, ...auth });
   }
 }
src/subcommands/list.ts (1 addition, 1 deletion)
@@ -330,7 +330,7 @@ export const ps = command({
       To load a model, run:
 
-          ${chalk.yellow("lms load --gpu max")}${"\n"}
+          ${chalk.yellow("lms load")}${"\n"}
     `,
   );
   return;
src/subcommands/unload.ts (1 addition, 1 deletion)
@@ -97,7 +97,7 @@ export const unload = command({
   logger.info(`Model "${identifier}" unloaded.`);
 } else {
   if (models.length === 0) {
-    logger.error(`You don't have any models loaded. Use "lms load --gpu max" to load a model.`);
+    logger.error(`You don't have any models loaded. Use "lms load" to load a model.`);
     process.exit(1);
   }
   console.info();
