The current implementation of the Ollama dev service is a hot mess. It doesn't work properly. Therefore I have done the following things:

1. Merged the `quarkus-langchain4j-ollama-devservices` artifact into the `quarkus-langchain4j-ollama-deployment` artifact.
2. The dev service now detects whether a local Ollama service is already running on port `11434`. If one is, it does nothing (a minimal sketch of this kind of check appears after this list).
3. If no local Ollama service is running on port `11434`, it starts an `ollama/ollama:latest` container on a random port and exposes a few configuration properties whose values are the host and port the container is running on.
4. The container shares the Ollama client's filesystem location (`user.home/.ollama`), so any models downloaded by the client or the container are shared between the two and therefore only need to be downloaded once (see the container sketch after this list).
5. The pulling of the required models hasn't changed: the main dev service still uses the Ollama REST API to pull them. It is simply passed a base URL, which may point at the local Ollama client or at a container; at that point it doesn't matter which (see the pull sketch after this list).
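
For illustration, a minimal sketch of the port check described in item 2, assuming a plain socket probe; `OllamaDetector` and `isLocalOllamaRunning` are hypothetical names, not the actual build-step code:

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

// Hypothetical sketch: returns true if something is already accepting
// connections on the default Ollama port, in which case the dev service
// leaves it alone and starts no container.
public class OllamaDetector {

    private static final int OLLAMA_DEFAULT_PORT = 11434;

    public static boolean isLocalOllamaRunning() {
        try (Socket socket = new Socket()) {
            // A short timeout keeps the build-time check fast.
            socket.connect(new InetSocketAddress("localhost", OLLAMA_DEFAULT_PORT), 500);
            return true;
        } catch (IOException e) {
            return false;
        }
    }
}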
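Items 3 and 4 together could be approximated with Testcontainers roughly as follows; this is a sketch that assumes the official `ollama/ollama` image keeps its models in `/root/.ollama`, and `OllamaContainerSketch` is an illustrative name:

import org.testcontainers.containers.GenericContainer;

// Hypothetical sketch: start ollama/ollama:latest on a random mapped port
// and bind-mount the host's ~/.ollama so the client and the container
// share downloaded models.
public class OllamaContainerSketch {

    public static String startOllamaContainer() {
        GenericContainer<?> ollama = new GenericContainer<>("ollama/ollama:latest")
                .withExposedPorts(11434)
                .withFileSystemBind(System.getProperty("user.home") + "/.ollama", "/root/.ollama");
        ollama.start();

        // The host and mapped port are what would back the configuration
        // properties the dev service exposes to the application.
        return "http://" + ollama.getHost() + ":" + ollama.getMappedPort(11434);
    }
}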
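Finally, a sketch of the model pull in item 5 against the Ollama REST API's `/api/pull` endpoint; the real dev service goes through its own `OllamaClient`, so `OllamaPullSketch` here is purely illustrative:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Hypothetical sketch: the caller only needs a base URL; whether it points
// at a local Ollama client or at a container is irrelevant at this point.
public class OllamaPullSketch {

    public static void pullModel(String baseUrl, String modelName) throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(baseUrl + "/api/pull"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString("{\"name\":\"" + modelName + "\"}"))
                .build();

        // Ollama streams pull progress as JSON lines; this sketch simply
        // waits for the request to complete and reports the status code.
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("Pull finished with HTTP " + response.statusCode());
    }
}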

The documentation has also been updated to reflect all of these changes.
edeandrea committed Dec 10, 2024
1 parent a0b9be3 commit 1798403
Showing 1 changed file with 9 additions and 11 deletions.
@@ -7,12 +7,6 @@

 import org.jboss.logging.Logger;

-import io.quarkiverse.langchain4j.deployment.devservice.Langchain4jDevServicesEnabled;
-import io.quarkiverse.langchain4j.deployment.devservice.OllamaClient;
-import io.quarkiverse.langchain4j.deployment.items.DevServicesChatModelRequiredBuildItem;
-import io.quarkiverse.langchain4j.deployment.items.DevServicesEmbeddingModelRequiredBuildItem;
-import io.quarkiverse.langchain4j.deployment.items.DevServicesOllamaConfigBuildItem;
-import io.quarkiverse.langchain4j.ollama.deployment.LangChain4jOllamaOpenAiBuildConfig;
 import io.quarkus.bootstrap.classloading.QuarkusClassLoader;
 import io.quarkus.deployment.IsNormal;
 import io.quarkus.deployment.annotations.BuildProducer;
@@ -28,6 +22,13 @@
 import io.quarkus.deployment.dev.devservices.GlobalDevServicesConfig;
 import io.quarkus.deployment.logging.LoggingSetupBuildItem;

+import io.quarkiverse.langchain4j.deployment.devservice.Langchain4jDevServicesEnabled;
+import io.quarkiverse.langchain4j.deployment.devservice.OllamaClient;
+import io.quarkiverse.langchain4j.deployment.items.DevServicesChatModelRequiredBuildItem;
+import io.quarkiverse.langchain4j.deployment.items.DevServicesEmbeddingModelRequiredBuildItem;
+import io.quarkiverse.langchain4j.deployment.items.DevServicesOllamaConfigBuildItem;
+import io.quarkiverse.langchain4j.ollama.deployment.LangChain4jOllamaOpenAiBuildConfig;
+
 /**
  * Starts a Ollama server as dev service if needed.
  */
@@ -60,7 +61,6 @@ public void startOllamaDevService(
             LangChain4jOllamaOpenAiBuildConfig ollamaBuildConfig,
             Optional<ConsoleInstalledBuildItem> consoleInstalledBuildItem,
             LoggingSetupBuildItem loggingSetupBuildItem,
-            GlobalDevServicesConfig devServicesConfig,
             List<DevServicesSharedNetworkBuildItem> devServicesSharedNetworkBuildItem,
             List<DevServicesChatModelRequiredBuildItem> devServicesChatModels,
             List<DevServicesEmbeddingModelRequiredBuildItem> devServicesEmbeddingModels,
@@ -97,8 +97,7 @@ public void startOllamaDevService(
         var compressor = new StartupLogCompressor((launchMode.isTest() ? "(test) "
                 : "") + "Ollama Dev Services Starting:", consoleInstalledBuildItem, loggingSetupBuildItem);
         try {
-            devService = startOllama(dockerStatusBuildItem, ollamaDevServicesBuildConfig, devServicesConfig,
-                    !devServicesSharedNetworkBuildItem.isEmpty());
+            devService = startOllama(dockerStatusBuildItem, ollamaDevServicesBuildConfig, !devServicesSharedNetworkBuildItem.isEmpty());

             if (devService == null) {
                 compressor.closeAndDumpCaptured();
@@ -152,10 +151,9 @@ private boolean isOllamaClientRunning() {

     private DevServicesResultBuildItem.RunningDevService startOllama(DockerStatusBuildItem dockerStatusBuildItem,
             OllamaDevServicesBuildConfig ollamaDevServicesBuildConfig,
-            GlobalDevServicesConfig devServicesConfig,
             boolean useSharedNetwork) {

-        if (!new GlobalDevServicesConfig.Enabled(devServicesConfig).getAsBoolean() || !ollamaDevServicesBuildConfig.enabled()) {
+        if (!ollamaDevServicesBuildConfig.enabled()) {
             // explicitly disabled
             log.warn("Not starting dev services for Ollama, as it has been disabled in the config.");
             return null;
