Commit

Deployed dcbbcae to 0.12 with MkDocs 1.5.3 and mike 2.0.0
github-actions[bot] committed Apr 6, 2024
1 parent 0a14c34 commit 10692ba
Showing 10 changed files with 426 additions and 198 deletions.
16 changes: 7 additions & 9 deletions 0.12/admin/kubernetes_deployment/index.html
@@ -1234,7 +1234,7 @@ <h2 id="3-install-kserve">3. Install KServe<a class="headerlink" href="#3-instal
<div class="tabbed-set tabbed-alternate" data-tabs="2:1"><input checked="checked" id="__tabbed_2_1" name="__tabbed_2" type="radio"><div class="tabbed-labels"><label for="__tabbed_2_1">kubectl</label></div>
<div class="tabbed-content">
<div class="tabbed-block">
<div class="highlight"><pre><span></span><code>kubectl<span class="w"> </span>apply<span class="w"> </span>-f<span class="w"> </span>https://github.com/kserve/kserve/releases/download/v0.12.0/kserve-runtimes.yaml
<div class="highlight"><pre><span></span><code>kubectl<span class="w"> </span>apply<span class="w"> </span>-f<span class="w"> </span>https://github.com/kserve/kserve/releases/download/v0.12.0/kserve-cluster-resources.yaml
</code></pre></div>
</div>
</div>
@@ -1249,14 +1249,12 @@ <h2 id="3-install-kserve">3. Install KServe<a class="headerlink" href="#3-instal
</div>
</div>
</input></div>
<p>then modify the <code>ingressClassName</code> in <code>ingress</code> section to point to <code>IngressClass</code> name created in step 1.</p>
<pre><code>```yaml
ingress: |-
{
"ingressClassName" : "your-ingress-class",
}
```
</code></pre>
<p>then modify the <code>ingressClassName</code> in the <code>ingress</code> section to point to the <code>IngressClass</code> name created in <a href="#1-install-istio">step 1</a>.
<div class="highlight"><pre><span></span><code><span class="nt">ingress</span><span class="p">:</span><span class="w"> </span><span class="p p-Indicator">|-</span>
<span class="p p-Indicator">{</span>
<span class="w"> </span><span class="s">"ingressClassName"</span><span class="nt"> </span><span class="p">:</span><span class="w"> </span><span class="s">"your-ingress-class"</span><span class="p p-Indicator">,</span>
<span class="p p-Indicator">}</span>
</code></pre></div></p>
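For context, the edited <code>ingress</code> entry lives in KServe's configuration ConfigMap. The sketch below shows what the relevant part of that ConfigMap might look like after the edit; the ConfigMap name (`inferenceservice-config`), its namespace (`kserve`), and the class name `your-ingress-class` are assumptions based on a default KServe install, not taken from this diff:

```yaml
# Hypothetical excerpt of the KServe configuration ConfigMap after
# pointing ingressClassName at the IngressClass created in step 1.
# Name, namespace, and class name are assumptions for illustration.
apiVersion: v1
kind: ConfigMap
metadata:
  name: inferenceservice-config
  namespace: kserve
data:
  ingress: |-
    {
      "ingressClassName" : "your-ingress-class"
    }
```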
</article>
</div>
</div>
2 changes: 1 addition & 1 deletion 0.12/admin/serverless/serverless/index.html
@@ -1254,7 +1254,7 @@ <h2 id="5-install-kserve-built-in-clusterservingruntimes">5. Install KServe Buil
<div class="tabbed-set tabbed-alternate" data-tabs="2:1"><input checked="checked" id="__tabbed_2_1" name="__tabbed_2" type="radio"><div class="tabbed-labels"><label for="__tabbed_2_1">kubectl</label></div>
<div class="tabbed-content">
<div class="tabbed-block">
<div class="highlight"><pre><span></span><code>kubectl<span class="w"> </span>apply<span class="w"> </span>-f<span class="w"> </span>https://github.com/kserve/kserve/releases/download/v0.12.0/kserve-runtimes.yaml
<div class="highlight"><pre><span></span><code>kubectl<span class="w"> </span>apply<span class="w"> </span>-f<span class="w"> </span>https://github.com/kserve/kserve/releases/download/v0.12.0/kserve-cluster-resources.yaml
</code></pre></div>
</div>
</div>
7 changes: 6 additions & 1 deletion 0.12/assets/_mkdocstrings.css
@@ -106,4 +106,9 @@ code.doc-symbol-module {

code.doc-symbol-module::after {
content: "mod";
}
}

.doc-signature .autorefs {
color: inherit;
border-bottom: 1px dotted currentcolor;
}
6 changes: 6 additions & 0 deletions 0.12/get_started/first_isvc/index.html
@@ -1254,6 +1254,12 @@ <h3 id="2-create-an-inferenceservice">2. Create an <code>InferenceService</code>
</div>
</div>
</input></input></div>
<div class="admonition warning">
<p class="admonition-title">Warning</p>
<p>Do not deploy <code>InferenceServices</code> in control plane namespaces (i.e. namespaces with the <code>control-plane</code> label). The webhook is configured to skip these namespaces to avoid any privilege escalation. Deploying an <code>InferenceService</code> to these namespaces results in the storage initializer not being
injected into the pod, causing the pod to fail with the error <code>No such file or directory: '/mnt/models'</code>.</p>
</div>
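The namespace-skipping behavior described in the warning can be sketched as a small predicate; the exact label selector the KServe webhook uses is an assumption based on the description above, not lifted from its manifests:

```python
# Sketch: mimic how an admission webhook's namespaceSelector might skip
# namespaces carrying a control-plane label (assumption from the warning text).
def webhook_applies(namespace_labels: dict) -> bool:
    """Return True if the webhook would mutate pods in this namespace."""
    return "control-plane" not in namespace_labels

# A control-plane namespace is skipped, so no storage initializer is injected:
assert not webhook_applies({"control-plane": "true"})
# An ordinary user namespace such as kserve-test is handled normally:
assert webhook_applies({"app": "kserve-test"})
```

In short: if your pod fails with `No such file or directory: '/mnt/models'`, check whether its namespace carries a `control-plane` label.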
<h3 id="3-check-inferenceservice-status">3. Check <code>InferenceService</code> status.<a class="headerlink" href="#3-check-inferenceservice-status" title="Permanent link"></a></h3>
<div class="highlight"><pre><span></span><code>kubectl<span class="w"> </span>get<span class="w"> </span>inferenceservices<span class="w"> </span>sklearn-iris<span class="w"> </span>-n<span class="w"> </span>kserve-test
</code></pre></div>
12 changes: 8 additions & 4 deletions 0.12/get_started/index.html
@@ -1191,9 +1191,9 @@ <h3 id="install-the-kubernetes-cli">Install the Kubernetes CLI<a class="headerli
<h2 id="install-the-kserve-quickstart-environment">Install the KServe "Quickstart" environment<a class="headerlink" href="#install-the-kserve-quickstart-environment" title="Permanent link"></a></h2>
<ol>
<li>
<p>After having kind installed, create a <code>kind</code> cluster with:</p>
<div class="highlight"><pre><span></span><code>kind<span class="w"> </span>create<span class="w"> </span>cluster
</code></pre></div>
<p>After having kind installed, create a <code>kind</code> cluster with:
<div class="highlight"><pre><span></span><code>kind<span class="w"> </span>create<span class="w"> </span>cluster
</code></pre></div></p>
</li>
<li>
<p>Then run:</p>
@@ -1206,8 +1206,8 @@ <h2 id="install-the-kserve-quickstart-environment">Install the KServe "Quickstar
</li>
<li>
<p>You can then get started with a local deployment of KServe by using the <em>KServe Quick installation script on Kind</em>:</p>
<div class="highlight"><pre><span></span><code>curl<span class="w"> </span>-s<span class="w"> </span><span class="s2">"https://raw.githubusercontent.com/kserve/kserve/release-0.11/hack/quick_install.sh"</span><span class="w"> </span><span class="p">|</span><span class="w"> </span>bash
<div class="highlight"><pre><span></span><code>curl<span class="w"> </span>-s<span class="w"> </span><span class="s2">"https://raw.githubusercontent.com/kserve/kserve/release-0.12/hack/quick_install.sh"</span><span class="w"> </span><span class="p">|</span><span class="w"> </span>bash
</code></pre></div>
<p>or install via our published Helm Charts:
<div class="highlight"><pre><span></span><code>helm<span class="w"> </span>install<span class="w"> </span>kserve-crd<span class="w"> </span>oci://ghcr.io/kserve/charts/kserve-crd<span class="w"> </span>--version<span class="w"> </span>v0.12.0
helm<span class="w"> </span>install<span class="w"> </span>kserve<span class="w"> </span>oci://ghcr.io/kserve/charts/kserve<span class="w"> </span>--version<span class="w"> </span>v0.12.0
</code></pre></div></p>
</li>
</ol>
</article>
30 changes: 24 additions & 6 deletions 0.12/modelserving/v1beta1/serving_runtime/index.html
@@ -1137,7 +1137,7 @@ <h1 id="model-serving-runtimes">Model Serving Runtimes<a class="headerlink" href
The KServe prediction protocol is noted as either "v1" or "v2". Some serving runtimes also support their own prediction protocol; these are noted with an <code>*</code>.
The default serving runtime version column defines the source and version of the serving runtime: MLServer, KServe, or the runtime's own.
These versions can also be found in the <a href="https://github.com/kserve/kserve/blob/master/config/runtimes/kustomization.yaml">runtime kustomization YAML</a>.
All KServe native model serving runtimes use the current KServe release version (v0.11). The supported framework version column lists the <strong>major</strong> version of the model that is supported.
All KServe native model serving runtimes use the current KServe release version (v0.12). The supported framework version column lists the <strong>major</strong> version of the model that is supported.
These can also be found in the respective <a href="https://github.com/kserve/kserve/tree/master/config/runtimes">runtime YAML</a> under the <code>supportedModelFormats</code> field.
For model frameworks using the KServe serving runtime, the specific default version can be found in <a href="https://github.com/kserve/kserve/tree/master/python">kserve/python</a>.
In a given serving runtime directory the pyproject.toml file contains the exact model framework version used. For example, in <a href="https://github.com/kserve/kserve/tree/master/python/lgbserver">kserve/python/lgbserver</a> the <a href="https://github.com/kserve/kserve/blob/master/python/lgbserver/pyproject.toml">pyproject.toml</a> file sets the model framework version to 3.3.2, <code>lightgbm ~= 3.3.2</code>.</p>
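The compatible-release specifier mentioned above, <code>lightgbm ~= 3.3.2</code>, can be unpacked with a short sketch; per PEP 440, <code>~= 3.3.2</code> is equivalent to <code>&gt;= 3.3.2, == 3.3.*</code> (the version strings below are illustrative, not an exhaustive support matrix):

```python
# Sketch of which versions the PEP 440 compatible-release specifier
# `~= 3.3.2` accepts: equivalent to `>= 3.3.2, == 3.3.*`.
def satisfies_lightgbm_pin(version: str) -> bool:
    major, minor, patch = (int(x) for x in version.split("."))
    return (major, minor) == (3, 3) and patch >= 2

assert satisfies_lightgbm_pin("3.3.2")      # the exact pin
assert satisfies_lightgbm_pin("3.3.5")      # later patch release: accepted
assert not satisfies_lightgbm_pin("3.4.0")  # minor bump: rejected
assert not satisfies_lightgbm_pin("4.0.0")  # major bump: rejected
```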
@@ -1177,7 +1177,7 @@ <h1 id="model-serving-runtimes">Model Serving Runtimes<a class="headerlink" href
<td><a href="https://lightgbm.readthedocs.io/en/latest/pythonapi/lightgbm.Booster.html#lightgbm.Booster.save_model">Saved LightGBM Model</a></td>
<td>v1, v2</td>
<td>v2</td>
<td>v0.11 (KServe)</td>
<td>v0.12 (KServe)</td>
<td>3</td>
<td><a href="../lightgbm/">LightGBM Iris</a></td>
</tr>
@@ -1195,7 +1195,7 @@ <h1 id="model-serving-runtimes">Model Serving Runtimes<a class="headerlink" href
<td><a href="http://dmg.org/pmml/v4-4-1/GeneralStructure.html">PMML</a></td>
<td>v1, v2</td>
<td>v2</td>
<td>v0.11 (KServe)</td>
<td>v0.12 (KServe)</td>
<td>3, 4 (<a href="https://github.com/autodeployai/pypmml">PMML4.4.1</a>)</td>
<td><a href="../pmml/">SKLearn PMML</a></td>
</tr>
@@ -1213,7 +1213,7 @@ <h1 id="model-serving-runtimes">Model Serving Runtimes<a class="headerlink" href
<td><a href="https://scikit-learn.org/stable/modules/model_persistence.html">Pickled Model</a></td>
<td>v1, v2</td>
<td>v2</td>
<td>v0.11 (KServe)</td>
<td>v0.12 (KServe)</td>
<td>1.3</td>
<td><a href="../sklearn/v2/">SKLearn Iris</a></td>
</tr>
@@ -1231,7 +1231,7 @@ <h1 id="model-serving-runtimes">Model Serving Runtimes<a class="headerlink" href
<td><a href="https://pytorch.org/docs/master/generated/torch.save.html">Eager Model/TorchScript</a></td>
<td>v1, v2, *torchserve</td>
<td>*torchserve</td>
<td>0.8.0 (TorchServe)</td>
<td>0.8.2 (TorchServe)</td>
<td>2</td>
<td><a href="../torchserve/">TorchServe mnist</a></td>
</tr>
@@ -1258,10 +1258,28 @@ <h1 id="model-serving-runtimes">Model Serving Runtimes<a class="headerlink" href
<td><a href="https://xgboost.readthedocs.io/en/latest/tutorials/saving_model.html">Saved Model</a></td>
<td>v1, v2</td>
<td>v2</td>
<td>v0.11 (KServe)</td>
<td>v0.12 (KServe)</td>
<td>1</td>
<td><a href="../xgboost/">XGBoost Iris</a></td>
</tr>
<tr>
<td><a href="https://github.com/kserve/kserve/tree/master/python/huggingfaceserver">HuggingFace ModelServer</a></td>
<td><a href="https://huggingface.co/docs/transformers/v4.39.2/en/main_classes/model#transformers.PreTrainedModel.save_pretrained">Saved Model</a> / <a href="https://huggingface.co/models">Huggingface Hub Model_Id</a></td>
<td>v1, v2</td>
<td>--</td>
<td>v0.12 (KServe)</td>
<td>4 (<a href="https://pypi.org/project/transformers/4.37.2/">Transformers</a>)</td>
<td>--</td>
</tr>
<tr>
<td><a href="https://github.com/kserve/kserve/tree/master/python/huggingfaceserver">HuggingFace VLLM ModelServer</a></td>
<td><a href="https://huggingface.co/docs/transformers/v4.39.2/en/main_classes/model#transformers.PreTrainedModel.save_pretrained">Saved Model</a> / <a href="https://huggingface.co/models">Huggingface Hub Model_Id</a></td>
<td>v2</td>
<td>--</td>
<td>v0.12 (KServe)</td>
<td>0 (<a href="https://pypi.org/project/vllm/0.2.7/">vLLM</a>)</td>
<td>--</td>
</tr>
</tbody>
</table>
<p>*tensorflow - Tensorflow implements its own prediction protocol in addition to KServe's. See: <a href="https://github.com/tensorflow/serving/blob/master/tensorflow_serving/apis/prediction_service.proto">Tensorflow Serving Prediction API</a> documentation</p>