[inferno-ml] New testing route and other improvements #146

Merged · 14 commits · Oct 22, 2024
4 changes: 4 additions & 0 deletions inferno-ml-server-types/CHANGELOG.md
```diff
@@ -1,6 +1,10 @@
 # Revision History for inferno-ml-server-types
 *Note*: we use https://pvp.haskell.org/ (MAJOR.MAJOR.MINOR.PATCH)
 
+## 0.10.0
+* Change `Id` to `UUID`
+* Add new testing endpoint to override models, script, etc...
+
 ## 0.9.1
 * `Ord`/`VCHashUpdate` instances for `ScriptInputType`
```
2 changes: 1 addition & 1 deletion inferno-ml-server-types/inferno-ml-server-types.cabal
```diff
@@ -1,6 +1,6 @@
 cabal-version: 2.4
 name: inferno-ml-server-types
-version: 0.9.1
+version: 0.10.0
 synopsis: Types for Inferno ML server
 description: Types for Inferno ML server
 homepage: https://github.com/plow-technologies/inferno.git#readme
```
33 changes: 22 additions & 11 deletions inferno-ml-server-types/src/Inferno/ML/Server/Client.hs
```diff
@@ -1,29 +1,35 @@
 {-# LANGUAGE DataKinds #-}
 {-# LANGUAGE ScopedTypeVariables #-}
-{-# LANGUAGE NoMonomorphismRestriction #-}
 
 module Inferno.ML.Server.Client
   ( statusC,
     inferenceC,
+    inferenceTestC,
     cancelC,
   )
 where
 
+import Data.Aeson (ToJSON)
 import Data.Int (Int64)
 import Data.Proxy (Proxy (Proxy))
 import Data.UUID (UUID)
 import Inferno.ML.Server.Types
-import Servant ((:<|>) ((:<|>)))
 import Servant.Client.Streaming (ClientM, client)
 
 -- | Get the status of the server. @Nothing@ indicates that an inference job
 -- is being evaluated. @Just ()@ means the server is idle
-statusC :: ClientM (Maybe ())
+statusC :: ClientM ServerStatus
+statusC = client $ Proxy @StatusAPI
 
+-- | Cancel the existing inference job, if it exists
+cancelC :: ClientM ()
+cancelC = client $ Proxy @CancelAPI
+
 -- | Run an inference parameter
 inferenceC ::
+  forall gid p s.
   -- | SQL identifier of the inference parameter to be run
-  Id (InferenceParam uid gid p s) ->
+  Id (InferenceParam gid p s) ->
   -- | Optional resolution for scripts that use e.g. @valueAt@; defaults to
   -- the param\'s stored resolution if not provided. This lets users override
   -- the resolution on an ad-hoc basis without needing to alter the stored
@@ -38,11 +44,16 @@ inferenceC ::
   -- (not defined in this repository) to verify this before directing
   -- the writes to their final destination
   ClientM (WriteStream IO)
+inferenceC = client $ Proxy @(InferenceAPI gid p s)
 
--- | Cancel the existing inference job, if it exists
-cancelC :: ClientM ()
-statusC :<|> inferenceC :<|> cancelC =
-  client api
-
-api :: Proxy (InfernoMlServerAPI uid gid p s t)
-api = Proxy
+-- | Run an inference parameter
+inferenceTestC ::
+  forall gid p s.
+  ToJSON p =>
+  -- | SQL identifier of the inference parameter to be run
+  Id (InferenceParam gid p s) ->
+  Maybe Int64 ->
+  UUID ->
+  EvaluationEnv gid p ->
+  ClientM (WriteStream IO)
+inferenceTestC = client $ Proxy @(InferenceTestAPI gid p s)
```
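Deriving each client from its own sub-API `Proxy` (instead of pattern-binding all of them against the whole `InfernoMlServerAPI`) is what lets the module drop `NoMonomorphismRestriction`, and it means each endpoint can be called on its own. A minimal usage sketch, assuming a server at `localhost:8080` and a usable `Show` instance for `ServerStatus`:

```haskell
{-# LANGUAGE LambdaCase #-}

module Main (main) where

import Inferno.ML.Server.Client (statusC)
import Network.HTTP.Client (defaultManagerSettings, newManager)
import Servant.Client.Streaming (mkClientEnv, parseBaseUrl, withClientM)

main :: IO ()
main = do
  manager <- newManager defaultManagerSettings
  -- Assumed address of a locally running inferno-ml-server
  baseUrl <- parseBaseUrl "http://localhost:8080"
  let env = mkClientEnv manager baseUrl
  -- Streaming client computations are run with withClientM so the
  -- connection stays open only for the continuation's scope
  withClientM statusC env $ \case
    Left err -> print err
    Right status -> print status
```

`inferenceTestC` is invoked the same way, passing the param's `Id`, an optional resolution, the job `UUID`, and the `EvaluationEnv` holding the model/script overrides described in the changelog; the resulting `WriteStream IO` should then be consumed inside the `withClientM` continuation.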