[inferno-ml] New testing route and other improvements (#146)
Several improvements to `inferno-ml-server` (and its types package):
- Adds a new testing route with the ability to override scripts, models,
  and inputs
- Purges the last vestiges of user IDs (everything now uses group IDs)
- Switches to UUIDs instead of integers for IDs
- Makes `/status` much clearer
- Removes some unnecessary parts of model caching; model versions are
now saved to a filepath corresponding to their ID
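
The last bullet describes caching model versions at a path derived from their ID. A minimal self-contained sketch of that idea (the type and function names here are illustrative, not taken from the repo):

```haskell
import System.FilePath ((</>))

-- Hypothetical stand-in for the UUID-based model version ID; the real
-- type lives in inferno-ml-server-types and wraps Data.UUID.UUID.
newtype ModelVersionId = ModelVersionId String

-- Each cached model version lands at a path named after its ID, so the
-- cache needs no separate lookup table mapping IDs to files.
modelCachePath :: FilePath -> ModelVersionId -> FilePath
modelCachePath cacheDir (ModelVersionId uuid) = cacheDir </> uuid
```

For example, `modelCachePath "/var/cache/models" (ModelVersionId "123e4567")` yields `"/var/cache/models/123e4567"`.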
ngua authored Oct 22, 2024
1 parent e918183 commit 890b786
Showing 18 changed files with 405 additions and 317 deletions.
4 changes: 4 additions & 0 deletions inferno-ml-server-types/CHANGELOG.md
@@ -1,6 +1,10 @@
# Revision History for inferno-ml-server-types
*Note*: we use https://pvp.haskell.org/ (MAJOR.MAJOR.MINOR.PATCH)

## 0.10.0
* Change `Id` to `UUID`
* Add a new testing endpoint to override models, scripts, etc.

## 0.9.1
* `Ord`/`VCHashUpdate` instances for `ScriptInputType`

2 changes: 1 addition & 1 deletion inferno-ml-server-types/inferno-ml-server-types.cabal
@@ -1,6 +1,6 @@
cabal-version: 2.4
name: inferno-ml-server-types
-version: 0.9.1
+version: 0.10.0
synopsis: Types for Inferno ML server
description: Types for Inferno ML server
homepage: https://github.com/plow-technologies/inferno.git#readme
33 changes: 22 additions & 11 deletions inferno-ml-server-types/src/Inferno/ML/Server/Client.hs
@@ -1,29 +1,35 @@
{-# LANGUAGE DataKinds #-}
{-# LANGUAGE ScopedTypeVariables #-}
{-# LANGUAGE NoMonomorphismRestriction #-}

module Inferno.ML.Server.Client
  ( statusC,
    inferenceC,
    inferenceTestC,
    cancelC,
  )
where

import Data.Aeson (ToJSON)
import Data.Int (Int64)
import Data.Proxy (Proxy (Proxy))
import Data.UUID (UUID)
import Inferno.ML.Server.Types
import Servant ((:<|>) ((:<|>)))
import Servant.Client.Streaming (ClientM, client)

-- | Get the status of the server. @Nothing@ indicates that an inference job
-- is being evaluated. @Just ()@ means the server is idle
-statusC :: ClientM (Maybe ())
+statusC :: ClientM ServerStatus
statusC = client $ Proxy @StatusAPI

-- | Cancel the existing inference job, if it exists
cancelC :: ClientM ()
cancelC = client $ Proxy @CancelAPI

-- | Run an inference parameter
inferenceC ::
  forall gid p s.
  -- | SQL identifier of the inference parameter to be run
-  Id (InferenceParam uid gid p s) ->
+  Id (InferenceParam gid p s) ->
  -- | Optional resolution for scripts that use e.g. @valueAt@; defaults to
  -- the param\'s stored resolution if not provided. This lets users override
  -- the resolution on an ad-hoc basis without needing to alter the stored
@@ -38,11 +44,16 @@ inferenceC ::
  -- (not defined in this repository) to verify this before directing
  -- the writes to their final destination
  ClientM (WriteStream IO)
+inferenceC = client $ Proxy @(InferenceAPI gid p s)

--- | Cancel the existing inference job, if it exists
-cancelC :: ClientM ()
-statusC :<|> inferenceC :<|> cancelC =
-  client api
-
-api :: Proxy (InfernoMlServerAPI uid gid p s t)
-api = Proxy
-- | Run an inference parameter in test mode, with the ability to override
-- models, scripts, and inputs
inferenceTestC ::
  forall gid p s.
  ToJSON p =>
  -- | SQL identifier of the inference parameter to be run
  Id (InferenceParam gid p s) ->
  Maybe Int64 ->
  UUID ->
  EvaluationEnv gid p ->
  ClientM (WriteStream IO)
inferenceTestC = client $ Proxy @(InferenceTestAPI gid p s)
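
The switch from integer to UUID identifiers keeps the phantom type parameter on `Id`, visible in the `Id (InferenceParam gid p s)` arguments above. A minimal self-contained sketch of that pattern, with `String` standing in for `Data.UUID.UUID` and stub entity types rather than the real definitions:

```haskell
-- Simplified stand-ins: the real Id in inferno-ml-server-types wraps
-- Data.UUID.UUID; String is used here so the sketch has no dependencies.
newtype Id a = Id String
  deriving (Eq, Show)

-- Stub entity types; the real ones carry fields and type parameters.
data InferenceParam = InferenceParam
data ModelVersion = ModelVersion

-- The phantom parameter ties each identifier to its entity, so an
-- @Id InferenceParam@ cannot be passed where an @Id ModelVersion@ is
-- expected, even though both wrap the same UUID representation.
paramId :: Id InferenceParam
paramId = Id "123e4567-e89b-12d3-a456-426614174000"
```

Passing `paramId` to a function expecting `Id ModelVersion` is a compile-time type error, which is the point of keeping the phantom parameter through the UUID migration.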

0 comments on commit 890b786