Commit f437b75: Merge branch 'main' into jtesar/chapter1-web-console-install

jtesar-rh authored Nov 13, 2023 (2 parents: 85e0db8 + a311d28)
Showing 14 changed files with 180 additions and 24 deletions.
77 changes: 76 additions & 1 deletion README.md
@@ -5,5 +5,80 @@ This course focuses on the installation, upgrades and administration of the pro

Refer to the quick courses [contributor guide](https://redhatquickcourses.github.io/welcome/1/guide/overview.html) to work with this repository.

# Creating Course Content

We use a system called Antora (https://antora.org) to publish courses. Antora expects the files and folders in a source repository to be arranged in a certain opinionated way to simplify the process of writing course content using asciidoc, and then converting the asciidoc source to HTML.

Refer to the quick courses [contributor guide](https://redhatquickcourses.github.io/welcome/1/guide/overview.html) for a detailed guide on how to work with Antora tooling and publish courses.

## TL;DR Quickstart

This section is a quick-start guide for technically experienced contributors. The contributor guide remains the canonical reference for the course content creation process, with detailed explanations, commands, video demonstrations, and screenshots.

### Prerequisites

* You have a macOS or Linux workstation. Windows is neither tested nor supported. You can try using a WSL2-based environment to run these steps - YMMV!
* You have a recent version of the Git client installed on your workstation.
* You have a recent Node.js LTS release (Node.js 16+) installed locally.
* You have a recent version of Visual Studio Code installed. Other editors with asciidoc editing support may work - YMMV, and you are on your own...

### Antora Files and Folder Structure

The *antora.yml* file lists the chapters/modules/units that make up the course.

Each chapter entry points to a *nav.adoc* file that lists the sections in that chapter. The home page of the course is rendered from *modules/ROOT/pages/index.adoc*.

Each chapter lives in a separate folder under the *modules* directory. All asciidoc source files live under the *modules/CHAPTER/pages* folder.

To create a new chapter in the course, create a new folder under *modules*.

To add a new section under a chapter, create an entry in the *modules/CHAPTER/nav.adoc* file, and then create the asciidoc file in the *modules/CHAPTER/pages* folder.
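For example, a minimal course layout and the matching navigation entries in *antora.yml* might look like this (the chapter names are illustrative, not this repository's exact contents):

```
modules/
  ROOT/
    nav.adoc
    pages/
      index.adoc        <- course home page
  chapter1/
    nav.adoc            <- lists the sections in chapter 1
    pages/
      index.adoc
      section1.adoc
```

```
nav:
  - modules/ROOT/nav.adoc
  - modules/chapter1/nav.adoc
```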

### Steps

1. Clone the course repository.
```
$ git clone [email protected]:RedHatQuickCourses/rhods-admin.git
```

2. Install the npm dependencies for the course tooling.
```
$ cd rhods-admin
$ npm install
```

3. Start the asciidoc to HTML compiler in the background. This command watches for changes to the asciidoc source content in the **modules** folder and automatically regenerates the HTML content.
```
$ npm run watch:adoc
```
4. Start a local web server to serve the generated HTML files. Navigate to the URL printed by this command to preview the generated HTML content in a web browser.
```
$ npm run serve
```

5. Before you make any content changes, create a local Git branch based on the **main** branch. As a good practice, prefix the branch name with your GitHub ID. Use a suitable branch naming scheme that reflects the content you are creating or changing.
```
$ git checkout -b rsriniva/ch01s01
```

6. Make your changes to the asciidoc files. Preview the generated HTML and verify that there are no rendering errors. Commit your changes to the local Git branch and push the branch to GitHub.
```
$ git add .
$ git commit -m "Added lecture content for chapter 1 section 1"
$ git push -u origin rsriniva/ch01s01
```

7. Create a GitHub pull request (PR) for your changes using the GitHub web UI.

8. Request a review of the PR from your technical peers and/or a member of the PTL team.

9. Make any changes requested by the reviewer in the **same** branch as the PR, and then commit and push your changes to GitHub. If other team members have made changes to the PR, then do not forget to do a **git pull** before committing your changes.
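The review-fix cycle described above typically looks like this (the branch name is the example from step 5):

```
$ git checkout rsriniva/ch01s01
$ git pull origin rsriniva/ch01s01
$ git add .
$ git commit -m "Addressed review feedback"
$ git push origin rsriniva/ch01s01
```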

10. Once the reviewers approve your PR, merge it into the **main** branch.

11. Wait a few minutes while the automated GitHub Actions workflow publishes your changes to the production GitHub Pages website.

12. Verify that your changes have been published to the production GitHub Pages website at https://redhatquickcourses.github.io/rhods-admin

## Problems and Feedback

If you run into any issues, report bugs, suggestions, and improvements for this course at https://github.com/RedHatQuickCourses/rhods-admin/issues
10 changes: 6 additions & 4 deletions modules/ROOT/pages/index.adoc
@@ -1,6 +1,8 @@
= An Example Quick Course
:navtitle: Home
= Introduction
:navtitle: Introduction

== Introduction
This course covers the basics of Red{nbsp}Hat OpenShift Data Science administration. Mainly, it covers the following topics:

This is an example quick course demonstrating the usage of Antora for authoring and publishing quick courses.
. Red{nbsp}Hat OpenShift Data Science Installation
. Red{nbsp}Hat OpenShift Data Science User Management
. Creating and Configuring a Custom Notebook Image
17 changes: 15 additions & 2 deletions modules/chapter1/pages/index.adoc
@@ -1,3 +1,16 @@
= Chapter 1
= Installation of Red{nbsp}Hat OpenShift Data Science

== Supported configurations
OpenShift Data Science is supported in two configurations:

* A managed cloud service add-on for *Red Hat OpenShift Dedicated* (with a Customer Cloud Subscription for AWS or GCP) or for Red Hat OpenShift Service on Amazon Web Services (ROSA).
For information about OpenShift Data Science on a Red Hat managed environment, see https://access.redhat.com/documentation/en-us/red_hat_openshift_data_science/1[Product Documentation for Red Hat OpenShift Data Science.]

* Self-managed software that you can install on-premises or on the public cloud in a self-managed environment, such as *OpenShift Container Platform*.
For information about OpenShift Data Science as self-managed software on your OpenShift cluster in a connected or a disconnected environment, see https://access.redhat.com/documentation/en-us/red_hat_openshift_data_science_self-managed[Product Documentation for Red Hat OpenShift Data Science self-managed.]

In this course, we cover the installation of *OpenShift Data Science self-managed* using two different methods:

. Installation using the OpenShift Web Console.
. Installation using the Command-Line Interface.

This is the home page of _Chapter_ 1 in the *hello* quick course...
18 changes: 17 additions & 1 deletion modules/chapter1/pages/section1.adoc
@@ -1 +1,17 @@
= Section 1
= General Information about Installation

Red{nbsp}Hat OpenShift Data Science can be installed either as a self-managed operator through OperatorHub, or as a fully managed solution through the OpenShift Marketplace.

Depending on which features and components of *Red{nbsp}Hat OpenShift Data Science* you want to install and use, you may need to install some additional operators.

https://www.redhat.com/en/technologies/cloud-computing/openshift/pipelines[Red{nbsp}Hat OpenShift Pipelines Operator]::
This operator is required if you want to install the *Red{nbsp}Hat OpenShift Data Science Pipelines* component.

https://docs.nvidia.com/datacenter/cloud-native/gpu-operator/latest/index.html[NVIDIA GPU Operator]::
The *NVIDIA GPU Operator* is required for GPU support in Red{nbsp}Hat OpenShift Data Science.

https://docs.openshift.com/container-platform/4.13/hardware_enablement/psap-node-feature-discovery-operator.html[Node Feature Discovery Operator]::
The *Node Feature Discovery Operator* is a prerequisite for the *NVIDIA GPU Operator*.

[IMPORTANT]
The *Red{nbsp}Hat OpenShift Data Science* operator is currently available in two main versions: *V1* and *V2*. There are some differences in installation between the two. Mainly, the V1 operator installs all of the *Red{nbsp}Hat OpenShift Data Science* components automatically, whereas the V2 operator requires you to select the components to be installed and managed by the operator after the installation. In this course, we focus on the V2 operator, as it will soon become the stable version. At the moment, it is available through the _alpha_ and _embedded_ channels.
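If you want to check which channels the operator is published through before installing, you can query the OperatorHub catalog. A sketch (the package name and catalog namespace are assumptions and may differ in your cluster):

```
$ oc get packagemanifest rhods-operator -n openshift-marketplace \
    -o jsonpath='{.status.channels[*].name}'
```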
3 changes: 2 additions & 1 deletion modules/chapter2/nav.adoc
@@ -1,2 +1,3 @@
* xref:index.adoc[]
** xref:section1.adoc[]
** xref:users.adoc[]
** xref:resources.adoc[]
12 changes: 10 additions & 2 deletions modules/chapter2/pages/index.adoc
@@ -1,3 +1,11 @@
= Chapter 2
= Controlling User Access and Managing Resources

This is the home page of _Chapter 2_ in the *hello* quick course....
This chapter teaches you how to manage users and groups within RHODS. You also learn how to control resource (CPU, GPU, memory, disk space, and more) allocation for data science projects.

Goals:

* Manage Red{nbsp}Hat OpenShift Data Science users and groups
* Control access to resources
* Enable and configure GPU acceleration of workloads
* Configure storage for data science projects
* Configure policies for cleaning up idle resources
14 changes: 14 additions & 0 deletions modules/chapter2/pages/resources.adoc
@@ -0,0 +1,14 @@
= Managing Resources

Blah....

== Notes
* What kind of resources are created in a RHODS DS project/workbench
* How to control resource allocation - CPU, Storage and other resources
* How to clean up after a user is removed from the group and from RHODS
* Idle culler configuration - free up resources and kill idle notebooks
* Enabling and configuring GPUs for workload acceleration
* Why is GPU acceleration beneficial and how to configure it
* How to provision for certain groups of users and not all
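As a preview of the idle culler item above, culling is configured through a ConfigMap along these lines (a sketch based on the RHODS documentation; verify the name, namespace, and keys against your installed version):

```
apiVersion: v1
kind: ConfigMap
metadata:
  name: notebook-controller-culler-config
  namespace: redhat-ods-applications
data:
  ENABLE_CULLING: "true"
  CULL_IDLE_TIME: "240"        # minutes of inactivity before a notebook is culled
  IDLENESS_CHECK_PERIOD: "1"   # minutes between idleness checks
```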


3 changes: 0 additions & 3 deletions modules/chapter2/pages/section1.adoc

This file was deleted.

10 changes: 10 additions & 0 deletions modules/chapter2/pages/users.adoc
@@ -0,0 +1,10 @@
= User Types and Permissions

== Notes
* Brief coverage of how default RHODS authentication uses the underlying OCP OAuth
* Cover how to create “special” RHODS users and groups
* [Trevor] We should be recommending configuring users in the default RHODS admin group instead of relying on the cluster-admin role.
* Implications of users and groups on DS projects, workbenches, data connections, storage
* How to manage different groups of teams working in a large organization - some work needs to be shared, some needs to be isolated from others
* [Trevor] Might be worth at least touching on how group sync works a bit in OCP. Probably a much deeper topic than we want to go in this training, but large organizations should be using group sync and managing those groups outside of the cluster.
* [Trevor] We need to cover how an admin can create data science projects for users when Self Provisioning is disabled on the cluster.
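The admin-group recommendation above can be sketched with standard `oc` commands (the group name `rhods-admins` is the RHODS default admin group; adjust for your installation):

```
$ oc adm groups new rhods-admins
$ oc adm groups add-users rhods-admins user1 user2
```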
3 changes: 1 addition & 2 deletions modules/chapter3/nav.adoc
@@ -1,3 +1,2 @@
* xref:index.adoc[]
** xref:section1.adoc[]
** xref:section2.adoc[]
** xref:custom.adoc[]
22 changes: 22 additions & 0 deletions modules/chapter3/pages/custom.adoc
@@ -0,0 +1,22 @@
= Building Custom Workbench Images

Intro blah...

== Notes

* Why would you need a custom image?
* Provide links to Containerfiles for the set of images that we ship with RHODS. Encourage students to build on UBI based base images
* [Trevor] It is important to cover the minimum requirements that the notebook controller requires for the images to work. Expected packages, port requirements, and startup scripts. Additionally it may be good to cover some topics such as the non-root requirement for OCP and use that as a jumping off point to talk about using a RH base image.
* The general recommendation is usually to build off of the upstream notebook base image: https://github.com/opendatahub-io/notebooks/blob/main/base/ubi9-python-3.9/Dockerfile.
* We need to discuss how to actually do that and how to manage the packages when building off those images because it is a non-standard process with pipenv where you need to preserve the upstream packages in order to not break the image.
* How to build and make the custom image available in the RHODS web console
* Ensure you cover standard container image building best practices - security, minimal images, built for OCP with user and group permissions, SELinux, etc.
* Look for good custom image use cases:
** Python 3.11/3.12 (performance improvements)
** Custom Python libs on top of standard RHODS images
** R based custom images?
** Julia?
* [Trevor] - The most common use cases from the field....
** My team needs version xyz of a specific package and the default images ship with zyx.
** My team needs package xyz and it is not available in any default images.
** My team needs a specific driver in the image that can only be installed with root access to connect to xyz (e.g. DB2 drivers to connect to oracle databases)
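A minimal sketch of building and publishing a custom image on top of a UBI-based notebook base image (the base image reference and registry names are placeholders, not the shipped defaults):

```
# Containerfile
FROM <ubi9-based-notebook-base-image>
USER 0
# Add the extra packages your team needs
RUN pip install --no-cache-dir scikit-learn
USER 1001
```

```
$ podman build -t quay.io/YOUR_ORG/custom-workbench:1.0 .
$ podman push quay.io/YOUR_ORG/custom-workbench:1.0
```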
9 changes: 7 additions & 2 deletions modules/chapter3/pages/index.adoc
@@ -1,3 +1,8 @@
= Chapter 3
= Building Custom Workbench Images

This is the home page of _Chapter 3_ in the *hello* quick course....
In this chapter, you learn how to build custom workbench images.

Goals:

* Create custom workbench images
* Import custom workbench images into RHODS
3 changes: 0 additions & 3 deletions modules/chapter3/pages/section1.adoc

This file was deleted.

3 changes: 0 additions & 3 deletions modules/chapter3/pages/section2.adoc

This file was deleted.
