From 50e3b7fcab611416546f682f5d31dfa7df17df61 Mon Sep 17 00:00:00 2001 From: MadcowD Date: Wed, 20 Nov 2024 21:16:50 +0000 Subject: [PATCH] deploy: d66f4c71f0bbaac93fbf6fc6c08dc9157b4907ee --- _sources/core_concepts/configuration.rst.txt | 8 +- _sources/core_concepts/tool_usage.rst.txt | 2 +- _sources/installation.rst.txt | 79 +- core_concepts/configuration.html | 105 +- core_concepts/ell_simple.html | 4 +- core_concepts/message_api.html | 28 +- core_concepts/tool_usage.html | 2 +- genindex.html | 110 +- installation.html | 69 +- objects.inv | Bin 902 -> 1011 bytes reference/index.html | 1231 +++++++++++++++++- searchindex.js | 2 +- 12 files changed, 1530 insertions(+), 110 deletions(-) diff --git a/_sources/core_concepts/configuration.rst.txt b/_sources/core_concepts/configuration.rst.txt index eb1afd7e..cc32dc89 100644 --- a/_sources/core_concepts/configuration.rst.txt +++ b/_sources/core_concepts/configuration.rst.txt @@ -5,6 +5,7 @@ Configuration ell provides various configuration options to customize its behavior. .. autofunction:: ell.init + :no-index: This ``init`` function is a convenience function that sets up the configuration for ell. It is a thin wrapper around the ``Config`` class, which is a Pydantic model. @@ -12,9 +13,10 @@ You can modify the global configuration using the ``ell.config`` object which is .. 
autopydantic_model:: ell.Config :members: - :exclude-members: default_client, registry, store + :exclude-members: default_client, registry, store, providers :model-show-json: false :model-show-validator-members: false :model-show-config-summary: false - :model-show-field-summary: false - :model-show-validator-summary: false \ No newline at end of file + :model-show-field-summary: true + :model-show-validator-summary: false + :no-index: \ No newline at end of file diff --git a/_sources/core_concepts/tool_usage.rst.txt b/_sources/core_concepts/tool_usage.rst.txt index 3d6df4a1..83f49cc3 100644 --- a/_sources/core_concepts/tool_usage.rst.txt +++ b/_sources/core_concepts/tool_usage.rst.txt @@ -271,7 +271,7 @@ This is accomplished by a language model program that takes the source code of a .. code-block:: python - @ell.simple(model="claude-3-5-sonnet", temperature=0.0) + @ell.simple(model="claude-3-5-sonnet-20241022", temperature=0.0) def generate_tool_spec(tool_source: str): ''' You are a helpful assistant that takes in source code for a python function and produces a JSON schema for the function. diff --git a/_sources/installation.rst.txt b/_sources/installation.rst.txt index 80a66135..dbd93638 100644 --- a/_sources/installation.rst.txt +++ b/_sources/installation.rst.txt @@ -10,13 +10,9 @@ Installing ell .. code-block:: bash - pip install -U ell-ai + pip install -U 'ell-ai[all]' - By default, this installs only the OpenAI client SDK. If you want to include the Anthropic client SDK, use the "anthropic" extra like so: - - .. code-block:: bash - - pip install -U 'ell-ai[anthropic]' + This installs ``ell``, ``ell-studio``, versioning and tracing with SQLite, and the default provider clients. 2. Verify installation: @@ -24,6 +20,77 @@ Installing ell python -c "import ell; print(ell.__version__)" +Custom Installation +------------------- + +You can create a custom ``ell`` installation with the following options. 
+ +Install ``ell`` without storage or ``ell-studio`` and with the default OpenAI client: + +.. code-block:: bash + + pip install -U ell-ai + +Supported options: + +``anthropic`` +~~~~~~~~~~~~~ +Adds the Anthropic client. + +.. code-block:: bash + + pip install -U 'ell-ai[anthropic]' + + +``groq`` +~~~~~~~~ +Adds the Groq client. + +.. code-block:: bash + + pip install -U 'ell-ai[groq]' + + +``studio`` +~~~~~~~~~~ +Adds ``ell-studio``. + +.. code-block:: bash + + pip install -U 'ell-ai[studio]' + + +``sqlite`` +~~~~~~~~~~ +SQLite storage for versioning and tracing. + +.. code-block:: bash + + pip install -U 'ell-ai[sqlite]' + + +``postgres`` +~~~~~~~~~~~~ +Postgres storage for versioning and tracing. + +Include this option if you'd like to use ``ell-studio`` with Postgres. + +.. code-block:: bash + + pip install -U 'ell-ai[postgres]' +Combining options +~~~~~~~~~~~~~~~~~ + +All options are additive and can be combined as needed. + +Example: Install ``ell`` with ``ell-studio``, Postgres, and the Anthropic client: + +.. code-block:: bash + + pip install -U 'ell-ai[studio, postgres, anthropic]' + + API Key Setup ------------- diff --git a/core_concepts/configuration.html b/core_concepts/configuration.html index 516e9f1a..13d89169 100644 --- a/core_concepts/configuration.html +++ b/core_concepts/configuration.html @@ -350,8 +350,8 @@

Configuration

ell provides various configuration options to customize its behavior.

-
-ell.init(store: Store | str | None = None, verbose: bool = False, autocommit: bool = True, lazy_versioning: bool = True, default_api_params: Dict[str, Any] | None = None, default_client: Any | None = None, autocommit_model: str = 'gpt-4o-mini') None
+
+ell.init(store: None | str = None, verbose: bool = False, autocommit: bool = True, lazy_versioning: bool = True, default_api_params: Dict[str, Any] | None = None, default_client: Any | None = None, autocommit_model: str = 'gpt-4o-mini') None

Initialize the ELL configuration with various settings.

Parameters:
@@ -370,51 +370,64 @@

Configuration This init function is a convenience function that sets up the configuration for ell. It is a thin wrapper around the Config class, which is a Pydantic model.

You can modify the global configuration using the ell.config object which is an instance of Config:

-
-pydantic model ell.Config
-
-
-field autocommit: bool = False
+
+pydantic model ell.Config
+

Configuration class for ELL.

+
+
Fields:
+
    +
  • autocommit (bool)

  • +
  • autocommit_model (str)

  • +
  • default_api_params (Dict[str, Any])

  • +
  • default_client (openai.OpenAI | None)

  • +
  • lazy_versioning (bool)

  • +
  • override_wrapped_logging_width (int | None)

  • +
  • providers (Dict[Type, ell.provider.Provider])

  • +
  • registry (Dict[str, ell.configurator._Model])

  • +
  • store (None)

  • +
  • verbose (bool)

  • +
  • wrapped_logging (bool)

  • +
+
+
+
+
+field autocommit: bool = False

If True, enables automatic committing of changes to the store.

-
-field autocommit_model: str = 'gpt-4o-mini'
+
+field autocommit_model: str = 'gpt-4o-mini'

When set, changes the default autocommit model from GPT 4o mini.

-
-field default_api_params: Dict[str, Any] [Optional]
+
+field default_api_params: Dict[str, Any] [Optional]

Default parameters for language models.

-
-field lazy_versioning: bool = True
+
+field lazy_versioning: bool = True

If True, enables lazy versioning for improved performance.

-
-field override_wrapped_logging_width: int | None = None
+
+field override_wrapped_logging_width: int | None = None

If set, overrides the default width for wrapped logging.

-
-field providers: Dict[Type, Provider] [Optional]
-

A dictionary mapping client types to provider classes.

-
-
-
-field verbose: bool = False
+
+field verbose: bool = False

If True, enables verbose logging.

-
-field wrapped_logging: bool = True
+
+field wrapped_logging: bool = True

If True, enables wrapped logging for better readability.

-
-get_client_for(model_name: str) Tuple[OpenAI | None, bool]
+
+get_client_for(model_name: str) Tuple[OpenAI | None, bool]

Get the OpenAI client for a specific model name.

Parameters:
@@ -429,8 +442,8 @@

Configuration -
-get_provider_for(client: Type[Any] | Any) Provider | None
+
+get_provider_for(client: Type[Any] | Any) Provider | None

Get the provider instance for a specific client instance.

Parameters:
@@ -445,8 +458,8 @@

Configuration -
-model_registry_override(overrides: Dict[str, _Model])
+
+model_registry_override(overrides: Dict[str, _Model])

Temporarily override the model registry with new model configurations.

Parameters:
@@ -455,13 +468,13 @@

Configuration -
-register_model(name: str, default_client: OpenAI | Any | None = None, supports_streaming: bool | None = None) None
+
+register_model(name: str, default_client: OpenAI | Any | None = None, supports_streaming: bool | None = None) None

Register a model with its configuration.

-
-register_provider(provider: Provider, client_type: Type[Any]) None
+
+register_provider(provider: Provider, client_type: Type[Any]) None

Register a provider class for a specific client type.

Parameters:
@@ -488,29 +501,7 @@

Configuration - +
diff --git a/core_concepts/ell_simple.html b/core_concepts/ell_simple.html index dad69369..5dcfc2ce 100644 --- a/core_concepts/ell_simple.html +++ b/core_concepts/ell_simple.html @@ -585,8 +585,8 @@

Reference -
  • ell.complex(): For LMPs that preserve full structure of model responses, including multimodal outputs.

  • -
  • ell.tool(): For defining tools that can be used within complex LMPs.

  • +
  • ell.complex(): For LMPs that preserve full structure of model responses, including multimodal outputs.

  • +
  • ell.tool(): For defining tools that can be used within complex LMPs.

  • ell.studio: For visualizing and analyzing LMP executions.

  • diff --git a/core_concepts/message_api.html b/core_concepts/message_api.html index 6a44d08f..e1e4c226 100644 --- a/core_concepts/message_api.html +++ b/core_concepts/message_api.html @@ -399,6 +399,21 @@

    The ell Message API +
    +classmethod model_validate(obj: Any) Message
    +

    Custom validation to handle deserialization

    +

    +
    +
    +classmethod model_validate_json(json_str: str) Message
    +

    Custom validation to handle deserialization from JSON string

    +
    +
    +
    +serialize_content(content: List[ContentBlock])
    +

    Serialize content blocks to a format suitable for JSON

    +

    @@ -462,7 +477,7 @@

    Common roles
    -ell.system(content: ContentBlock | str | ToolCall | ToolResult | ImageContent | ndarray | Image | BaseModel | List[ContentBlock | str | ToolCall | ToolResult | ImageContent | ndarray | Image | BaseModel]) Message
    +ell.system(content: ContentBlock | str | ToolCall | ToolResult | ImageContent | ndarray | Image | BaseModel | List[ContentBlock | str | ToolCall | ToolResult | ImageContent | ndarray | Image | BaseModel]) Message

    Create a system message with the given content.

    Args: content (str): The content of the system message.

    @@ -471,7 +486,7 @@

    Common roles
    -ell.user(content: ContentBlock | str | ToolCall | ToolResult | ImageContent | ndarray | Image | BaseModel | List[ContentBlock | str | ToolCall | ToolResult | ImageContent | ndarray | Image | BaseModel]) Message
    +ell.user(content: ContentBlock | str | ToolCall | ToolResult | ImageContent | ndarray | Image | BaseModel | List[ContentBlock | str | ToolCall | ToolResult | ImageContent | ndarray | Image | BaseModel]) Message

    Create a user message with the given content.

    Args: content (str): The content of the user message.

    @@ -480,7 +495,7 @@

    Common roles
    -ell.assistant(content: ContentBlock | str | ToolCall | ToolResult | ImageContent | ndarray | Image | BaseModel | List[ContentBlock | str | ToolCall | ToolResult | ImageContent | ndarray | Image | BaseModel]) Message
    +ell.assistant(content: ContentBlock | str | ToolCall | ToolResult | ImageContent | ndarray | Image | BaseModel | List[ContentBlock | str | ToolCall | ToolResult | ImageContent | ndarray | Image | BaseModel]) Message

    Create an assistant message with the given content.

    Args: content (str): The content of the assistant message.

    @@ -670,7 +685,12 @@

    Solving the parsing problemChallenges with LLM APIs
  • The ell Message API