ashishawasthi/infer

Model Inference API example

A self-contained, minimal ML model inference service example. The application demonstrates a thin API wrapper over models that serves predictions in batches, and includes end-to-end tests covering requests and prediction responses.

The application expects each provided model object to expose an sklearn-style predict function.
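For illustration, any object satisfying that contract works; the class below is a hypothetical stand-in, not a model shipped with this repository:

```python
import numpy as np

class ThresholdModel:
    """Toy stand-in for a pickled sklearn estimator: the service only
    needs an sklearn-style predict(X) -> array-like of labels."""
    def predict(self, X):
        X = np.asarray(X)
        # label a row 1 if its feature sum exceeds 5, else 0
        return (X.sum(axis=1) > 5).astype(int)

model = ThresholdModel()
preds = model.predict([[1, 2, 3, 4], [1, 1, 1, 1]])  # one label per row
```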

Local Development Environment

Setup

pip install -r requirements.txt

Test

python -m pytest
coverage run -m pytest
coverage report -m
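One of the end-to-end tests picked up by pytest might look roughly like the sketch below, which stands up a stub Flask app and drives it through the test client; the route, parameter names, and response keys are assumptions, not the repository's actual test code:

```python
import json
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/infer")
def infer():
    # stub endpoint standing in for the real service
    rows = json.loads(request.args["model_inputs"])
    return jsonify(predictions=[0] * len(rows))

def test_infer_returns_one_prediction_per_input():
    resp = app.test_client().get(
        "/infer",
        query_string={"model_id": "iris_svm_v1",
                      "model_inputs": "[[1,2,3,4],[1,1,1,1]]"})
    assert resp.status_code == 200
    assert len(resp.get_json()["predictions"]) == 2
```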

Run

flask run

Try

<flask_url>/infer?model_id=iris_svm_v1&model_inputs=[[1,2,3,4],[1,1,1,1]]
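A request like the one above could be served by a handler along these lines; the model registry, error handling, and response shape here are illustrative assumptions, not the repository's actual code:

```python
import json
from flask import Flask, request, jsonify

app = Flask(__name__)

class _StubModel:
    # placeholder for a real object with an sklearn-style predict
    def predict(self, X):
        return [0 for _ in X]

MODELS = {"iris_svm_v1": _StubModel()}  # hypothetical model registry

@app.route("/infer")
def infer():
    model = MODELS.get(request.args.get("model_id"))
    if model is None:
        return jsonify(error="unknown model_id"), 404
    inputs = json.loads(request.args.get("model_inputs", "[]"))
    return jsonify(predictions=list(model.predict(inputs)))
```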

Deploy Docker Image

docker build -t infer .
docker run -it infer

Deploy on Google Cloud Run

gcloud builds submit --tag gcr.io/<project-id>/infer
gcloud run deploy --image gcr.io/<project-id>/infer

Deployed at

https://infer-vz7lve6tka-as.a.run.app/infer?model_id=iris_svm_v1&model_inputs=[[1,2,3,4],[1,1,1,1]]
