AI assistant for the GLPI IT Helpdesk. It integrates with the GLPI REST API to open incident reports (a sketch of such a call is shown after the list below).
- Login & Auth validation using Google OAuth (TODO) (login)
- Open an incident ticket on GLPI (open_incident)
- Get the current status of a ticket (get_incident_status)
- Reset my password (password_reset)
- Issues with email (problem_email)
- Request biometrics report (request_biometrics_report)
- Request a VM (request_vm)
- FAQ: How to connect to the WiFi? (connect_wifi)
- FAQ: How to create a user? (create_user)
- greet
- goodbye
- thank
- bot_challenge
- help
- show_menu
- inform
- confirm
- deny
- out_of_scope
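For illustration only (this is not the repository's actual action code), the sketch below shows how a custom action backing open_incident could call the GLPI REST API to open a ticket. It assumes the GLPI_API_URI, GLPI_APP_TOKEN and GLPI_AUTH_TOKEN environment variables described in the Docker Compose section further down; the class, action and slot names are hypothetical.

```python
# Illustrative sketch: a Rasa custom action that opens a GLPI ticket
# via the GLPI REST API (initSession + POST /Ticket).
import os

import requests
from rasa_sdk import Action, Tracker
from rasa_sdk.executor import CollectingDispatcher


class ActionOpenIncident(Action):
    def name(self) -> str:
        return "action_open_incident"  # hypothetical action name

    def run(self, dispatcher: CollectingDispatcher, tracker: Tracker, domain: dict):
        api_uri = os.environ["GLPI_API_URI"]  # e.g. https://glpi.example.org/apirest.php
        headers = {"App-Token": os.environ["GLPI_APP_TOKEN"]}

        # 1. Open an API session with the user's API token to obtain a session token
        session = requests.get(
            f"{api_uri}/initSession",
            headers={**headers, "Authorization": f"user_token {os.environ['GLPI_AUTH_TOKEN']}"},
        ).json()
        headers["Session-Token"] = session["session_token"]

        # 2. Create the incident ticket with details collected from the conversation
        ticket = requests.post(
            f"{api_uri}/Ticket",
            headers=headers,
            json={"input": {
                "name": tracker.get_slot("incident_title") or "Incident reported via chatbot",
                "content": tracker.latest_message.get("text", ""),
            }},
        ).json()

        dispatcher.utter_message(text=f"Ticket #{ticket['id']} has been opened.")
        return []
```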
- Python 3.7+
- Pip3
- Virtualenv or Conda (recommended for creating an isolated environment)
If you haven’t installed Rasa NLU and Rasa Core yet, you can do it by navigating to the project directory and running:
pip install -r requirements.txt
You also need to install the spaCy Spanish language model. You can install it by running:
python -m spacy download es_core_news_md
python -m spacy link es_core_news_md es
rasa data validate
The following command trains both the Core and NLU models
rasa train
The following command performs a model evaluation of the latest trained NLU model under the models directory
rasa data split nlu
rasa test nlu -u train_test_split/test_data.md --out nlu_metrics/
# or run cross-validation (default 5 folds; configurable with -f)
rasa test nlu -u data/nlu.md --cross-validation --out nlu_metrics/
Finally, check the following files for results:
- Intent Confusion Matrix
- [Intent Confidence Prediction](nlu_metrics/hist.png)
- Intent/Entity Metrics Report
Within the nlu_metrics folder there are also other reports for each of the NLU pipeline components (e.g. DIETClassifier report and errors)
The following command performs a model evaluation of the latest trained dialogue model under the models directory
rasa test core --stories tests/e2e-stories.md --out core_metrics/
Finally, check the output directory (core_metrics/) for a summary of the performed evaluation
rasa visualize -d domain.yml --stories data/stories.md -u data/nlu.md --out core_metrics/graph.html
rasa run actions --actions actions -p 5055
or as a Docker container
docker build . --tag glpi-action-server
docker run -p 5055:5055 glpi-action-server
or as a background process
sh scripts/startActions.sh
docker run -p 8000:8000 rasa/duckling
rasa shell --endpoints endpoints.yml
Rasa X is a tool designed to make it easier to deploy and improve Rasa-powered assistants by learning from real conversations
rasa x --data data/train/ --endpoints endpoints.yml --cors '*' --enable-api --port 5005 --rasa-x-port 5002
or as a background process
sh scripts/startRasaX.sh
Update admin password
python scripts/manage_user.py create me $RASA_X_PASSWORD admin --update
rasa run --endpoints endpoints.yml --credentials credentials.yml --enable-api --cors "*" --port 5005
cd client
npm install
npm start
or as a background process
sh scripts/startClient.sh
- Install Rasa + Rasa X using the Docker Compose Quick Start Guide
- To upgrade the components to their latest release, follow these instructions
- To update the admin password, run the following command within RASA_HOME:
python rasa_x_commands.py create --update admin me glpi@dmin
- Build the Action Server Docker Image (in case you want to build your own locally)
export GLPI_DOCKER_IMAGE=santteegt/glpi-chatbot-actions:latest
docker build -t $GLPI_DOCKER_IMAGE .
- Each time a commit is pushed to the develop branch, the DockerHub registry deploys a new image with the latest changes
- To do a manual push to DockerHub (Optional):
docker login --username santteegt
docker push $GLPI_DOCKER_IMAGE
- Copy the socketio settings from credentials.yml to the credentials.yml file in the RASA_HOME directory
- Copy the docker-compose.override.yml file to the RASA_HOME directory
- Set the following environment variables in the .env file within the RASA_HOME directory (an example fragment is shown after this list):
- GLPI_API_URI: (Setup > General > API) <- where you can find this info on your GLPI instance
- GLPI_APP_TOKEN: (Setup > General > API)
- GLPI_AUTH_TOKEN: (User Preferences > API Token)
- GLPI_LOCALMODE: false (set it to true if you don't want the action server to call a live GLPI instance)
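As an example, the corresponding fragment of the .env file could look like the following (all values below are placeholders; use the URL and tokens from your own GLPI instance):

```
# placeholder values: replace with the settings from your GLPI instance
GLPI_API_URI=https://glpi.example.org/apirest.php
GLPI_APP_TOKEN=changeme-app-token
GLPI_AUTH_TOKEN=changeme-user-api-token
GLPI_LOCALMODE=false
```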
- Start Docker Compose:
docker-compose up -d
- If you started the service for the first time or just updated it, you need to install the Spanish (es) spaCy language model in both the rasa-production and rasa-worker containers:
docker-compose exec -u 0 rasa-production bash -c "python -m spacy download es_core_news_md && python -m spacy link es_core_news_md es"
docker-compose exec -u 0 rasa-worker bash -c "python -m spacy download es_core_news_md && python -m spacy link es_core_news_md es"
- Access the rasa-x container shell:
docker exec -u 0 -it <rasa-x-container-id> bash
- Add an SSH config file at /app/.ssh/config with the following:
Host github.com
Hostname ssh.github.com
Port 443
- Add -F /app/.ssh/config as a parameter to the ssh command in the executable script generated by /usr/local/lib/python3.7/site-packages/rasax/community/services/integrated_version_control/git_service.py:
def _save_ssh_executable(path: Path, path_to_key: Path) -> None:
    command = f"""
#!/bin/sh
ID_RSA={path_to_key}
# Kubernetes tends to reset file permissions on restart. Hence, re-apply the required
# permissions whenever using the key.
chmod 600 $ID_RSA
exec /usr/bin/ssh -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -i $ID_RSA -F /app/.ssh/config "$@"
"""
- Restart the compose environment
See this issue. A possible workaround is to install Rasa from source:
pip install git+git://github.com/RasaHQ/rasa.git
You may experience the following error on an on-premise and/or cloud server (deployed with Kubernetes): The TensorFlow library was compiled to use AVX instructions
In order to fix this issue, execute the following commands after installing rasa dependencies as shown above:
pip uninstall tensorflow -y
conda create --name glpi-rasax python=3.6.8
conda activate glpi-rasax
conda install -c anaconda -n glpi-rasax tensorflow==1.15.0
conda deactivate
export PYTHONPATH="${HOME}/anaconda3/envs/glpi-rasax/lib/python3.6/site-packages"
If you're (re-)generating data using Chatito, paste the related files under the data/nlu_chatito folder and then execute the following commands:
rasa data convert nlu --data data/nlu_chatito/train/ --out data/train/nlu.md -l es -f md
rasa data convert nlu --data data/nlu_chatito/test/ --out data/test/nlu.md -l es -f md