
docs: changed every instance of default_bucket to bucket #507

Merged
merged 3 commits into kserve:main
Jun 7, 2024

Conversation

aayushsss1
Contributor

Motivation

Replace `default_bucket` with `bucket` everywhere in this repo to keep it consistent with KServe.

Modifications

Replaced every instance of `default_bucket` with `bucket`; a sketch of the affected storage entry is shown below.
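
For illustration, a minimal sketch of what the renamed field looks like in the quickstart `storage-config` Secret's `localMinIO` entry; the credential, endpoint, and bucket values here are placeholders, not the actual quickstart contents:

```
# Hypothetical sketch of the storage-config Secret after the rename:
# the s3-style entry now uses "bucket" instead of "default_bucket",
# matching KServe. All values below are placeholders.
kubectl apply -n modelmesh-serving -f - <<EOF
apiVersion: v1
kind: Secret
metadata:
  name: storage-config
stringData:
  localMinIO: |
    {
      "type": "s3",
      "access_key_id": "<minio-access-key>",
      "secret_access_key": "<minio-secret-key>",
      "endpoint_url": "http://minio:9000",
      "bucket": "<bucket-name>"
    }
EOF
```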

Result

Tested the [quickstart install](https://github.com/kserve/modelmesh-serving/blob/main/docs/quickstart.md) after modifying [quickstart.yaml](https://github.com/kserve/modelmesh-serving/blob/6c86da9473d50de63f9ea3af8a4d7c223849547e/config/dependencies/quickstart.yaml#L127).
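
The install itself followed the linked quickstart guide; roughly the following, a sketch under the assumption that the repo's install script and its quickstart flag were used (verify against docs/quickstart.md for the exact steps):

```
# Assumed quickstart install commands; namespace and flag names are an
# assumption taken from the quickstart guide, not verified here.
kubectl create namespace modelmesh-serving
./scripts/install.sh --namespace modelmesh-serving --quickstart
```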

Pods up and running:

```
kubectl get pods
NAME                                              READY   STATUS    RESTARTS   AGE
etcd-6fdc487479-m9pkx                             1/1     Running   0          32m
minio-6b5c846587-8bwdv                            1/1     Running   0          32m
modelmesh-controller-5cd8d68bc-9ls9p              1/1     Running   0          31m
modelmesh-serving-mlserver-1.x-66bb94dcf6-hvgzj   4/4     Running   0          26m
modelmesh-serving-mlserver-1.x-66bb94dcf6-qtdzw   4/4     Running   0          26m
```

Model deployed and InferenceService is Ready:

```
kubectl get isvc
NAME                   URL                                               READY   PREV   LATEST   PREVROLLEDOUTREVISION   LATESTREADYREVISION   AGE
example-sklearn-isvc   grpc://modelmesh-serving.modelmesh-serving:8033   True
```

```
kubectl describe isvc example-sklearn-isvc
Name:         example-sklearn-isvc
Namespace:    modelmesh-serving
Labels:       <none>
Annotations:  serving.kserve.io/deploymentMode: ModelMesh
API Version:  serving.kserve.io/v1beta1
Kind:         InferenceService
Metadata:
  Creation Timestamp:  2024-05-28T07:19:00Z
  Generation:          1
  Resource Version:    5950
  UID:                 db71cf11-7842-4bc1-af97-647282e6b9b9
Spec:
  Predictor:
    Model:
      Model Format:
        Name:  sklearn
      Storage:
        Key:   localMinIO
        Path:  sklearn/mnist-svm.joblib
Status:
  Components:
    Predictor:
      Grpc URL:  grpc://modelmesh-serving.modelmesh-serving:8033
      Rest URL:  http://modelmesh-serving.modelmesh-serving:8008
      URL:       grpc://modelmesh-serving.modelmesh-serving:8033
  Conditions:
    Last Transition Time:  2024-05-28T07:25:07Z
    Status:                True
    Type:                  PredictorReady
    Last Transition Time:  2024-05-28T07:25:07Z
    Status:                True
    Type:                  Ready
  Model Status:
    Copies:
      Failed Copies:  0
      Total Copies:   1
    States:
      Active Model State:  Loaded
      Target Model State:  
    Transition Status:     UpToDate
  URL:                     grpc://modelmesh-serving.modelmesh-serving:8033
Events:                    <none>
```
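
For reference, the deployed predictor corresponds to roughly the following manifest, reconstructed from the `describe` output above (a sketch; the `apply` command is only illustrative, the PR itself does not change this resource):

```
# InferenceService reconstructed from the describe output: sklearn model
# format, storage key localMinIO, path sklearn/mnist-svm.joblib, served
# in ModelMesh deployment mode.
kubectl apply -n modelmesh-serving -f - <<EOF
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: example-sklearn-isvc
  annotations:
    serving.kserve.io/deploymentMode: ModelMesh
spec:
  predictor:
    model:
      modelFormat:
        name: sklearn
      storage:
        key: localMinIO
        path: sklearn/mnist-svm.joblib
EOF
```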

Inference Request successful:

```
MODEL_NAME=example-sklearn-isvc
grpcurl \
  -plaintext \
  -proto fvt/proto/kfs_inference_v2.proto \
  -d '{ "model_name": "'"${MODEL_NAME}"'", "inputs": [{ "name": "predict", "shape": [1, 64], "datatype": "FP32", "contents": { "fp32_contents": [0.0, 0.0, 1.0, 11.0, 14.0, 15.0, 3.0, 0.0, 0.0, 1.0, 13.0, 16.0, 12.0, 16.0, 8.0, 0.0, 0.0, 8.0, 16.0, 4.0, 6.0, 16.0, 5.0, 0.0, 0.0, 5.0, 15.0, 11.0, 13.0, 14.0, 0.0, 0.0, 0.0, 0.0, 2.0, 12.0, 16.0, 13.0, 0.0, 0.0, 0.0, 0.0, 0.0, 13.0, 16.0, 16.0, 6.0, 0.0, 0.0, 0.0, 0.0, 16.0, 16.0, 16.0, 7.0, 0.0, 0.0, 0.0, 0.0, 11.0, 13.0, 12.0, 1.0, 0.0] }}]}' \
  localhost:8033 \
  inference.GRPCInferenceService.ModelInfer

Handling connection for 8033
{
  "modelName": "example-sklearn-isvc__isvc-6b2eb0b8bf",
  "outputs": [
    {
      "name": "predict",
      "datatype": "INT64",
      "shape": [
        "1",
        "1"
      ],
      "contents": {
        "int64Contents": [
          "8"
        ]
      }
    }
  ]
}
```
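
The `Handling connection for 8033` line in the output comes from a local port-forward through which the grpcurl request was sent; a sketch of the assumed command, with the service name taken from the InferenceService URL above:

```
# Assumed port-forward to reach the ModelMesh gRPC port locally on 8033.
kubectl port-forward -n modelmesh-serving service/modelmesh-serving 8033
```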

This PR closes #456

Signed-off-by: Aayush Subramaniam <[email protected]>
@aayushsss1
Contributor Author

Hi @rafvasq, could you have a look at this PR? I've fixed the linting errors from the previous build.

Member

@rafvasq left a comment


Thanks @aayushsss1!

/lgtm


oss-prow-bot bot commented Jun 7, 2024

[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: aayushsss1, rafvasq

The full list of commands accepted by this bot can be found here.

The pull request process is described here

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment


oss-prow-bot bot added the approved label Jun 7, 2024
rafvasq merged commit c99e0ee into kserve:main Jun 7, 2024
7 checks passed
openshift-merge-bot bot pushed a commit to opendatahub-io/modelmesh-serving that referenced this pull request Aug 20, 2024
* docs: changed every instance of default_bucket to bucket (kserve#507)


* ci: Add nightly build twice a week (kserve#513)

Signed-off-by: Christian Kadner <[email protected]>

* chore: Use ubi8/go-toolset:1.21 for dev image (kserve#515)

Signed-off-by: Spolti <[email protected]>

---------

Signed-off-by: Aayush Subramaniam <[email protected]>
Signed-off-by: Christian Kadner <[email protected]>
Signed-off-by: Spolti <[email protected]>
Co-authored-by: Aayush Subramaniam <[email protected]>
Co-authored-by: Christian Kadner <[email protected]>
Co-authored-by: Filippe Spolti <[email protected]>
openshift-merge-bot bot pushed a commit to opendatahub-io/modelmesh-serving that referenced this pull request Aug 22, 2024
Successfully merging this pull request may close these issues.

storage-secret-config should have the same parameters as Kserve