Update requirements and Docker-related documents
DoraDong-2023 committed Oct 17, 2024
1 parent 1823535 commit 8feb800
Showing 33 changed files with 80 additions and 911 deletions.
18 changes: 14 additions & 4 deletions Dockerfile
@@ -13,6 +13,7 @@ RUN apt-get update && apt-get install -y \
curl \
git \
wget \
nano \
gfortran \
&& apt-get clean && rm -rf /var/lib/apt/lists/*

@@ -53,19 +54,28 @@ COPY src/scripts /app/src/scripts
COPY src/retrievers /app/src/retrievers
COPY images /app/images
#COPY src/tmp /app/src/tmp
COPY data/standard_process/${LIB}/ /app/data/standard_process/${LIB}/
COPY data/standard_process/scanpy/ /app/data/standard_process/scanpy/
COPY data/standard_process/squidpy/ /app/data/standard_process/squidpy/
COPY data/standard_process/ehrapy/ /app/data/standard_process/ehrapy/
COPY data/standard_process/snapatac2/ /app/data/standard_process/snapatac2/
COPY data/standard_process/base/ /app/data/standard_process/base/
COPY data/autocoop/${LIB}/ /app/data/autocoop/${LIB}/
COPY data/autocoop/scanpy/ /app/data/autocoop/scanpy/
COPY data/autocoop/squidpy/ /app/data/autocoop/squidpy/
COPY data/autocoop/ehrapy/ /app/data/autocoop/ehrapy/
COPY data/autocoop/snapatac2/ /app/data/autocoop/snapatac2/
COPY data/conversations/ /app/data/conversations/
COPY data/others-data/ /app/data/others-data/
COPY hugging_models/retriever_model_finetuned/${LIB}/ /app/hugging_models/retriever_model_finetuned/${LIB}/
COPY hugging_models/retriever_model_finetuned/scanpy/ /app/hugging_models/retriever_model_finetuned/scanpy/
COPY hugging_models/retriever_model_finetuned/squidpy/ /app/hugging_models/retriever_model_finetuned/squidpy/
COPY hugging_models/retriever_model_finetuned/ehrapy/ /app/hugging_models/retriever_model_finetuned/ehrapy/
COPY hugging_models/retriever_model_finetuned/snapatac2/ /app/hugging_models/retriever_model_finetuned/snapatac2/
COPY docker_utils/ /app/docker_utils/

# mkdir tmp
RUN mkdir -p /app/src/tmp

# Install Python dependencies
RUN python3.10 -m pip install --no-cache-dir -r /app/docker_utils/${LIB}/requirements.txt
RUN python3.10 -m pip install --no-cache-dir -r /app/requirements.txt --verbose

# Install dependencies from environment.yml if it exists
RUN if [ -f /app/docker_utils/${LIB}/environment.yml ]; then \
25 changes: 14 additions & 11 deletions README.md
@@ -6,12 +6,12 @@
</div>

[![Demo](https://img.shields.io/badge/Demo-BioMANIA-blue?style=flat&logo=appveyor)](https://biomania.ngrok.io/en)
[![Docker Version](https://img.shields.io/badge/Docker-v1.1.12-blue?style=flat&logo=docker)](https://hub.docker.com/repositories/chatbotuibiomania)
[![Paper](https://img.shields.io/badge/Paper-burgundy?style=flat&logo=arxiv)](https://www.biorxiv.org/content/10.1101/2023.10.29.564479)
[![GitHub stars](https://img.shields.io/github/stars/batmen-lab/BioMANIA?style=social)](https://github.com/batmen-lab/BioMANIA)
[![Documentation Status](https://img.shields.io/readthedocs/biomania/latest?style=flat&logo=readthedocs&label=Doc)](https://biomania.readthedocs.io/en/latest/?badge=latest)
[![License](https://img.shields.io/badge/license-Apache%203.0-blue?style=flat&logo=open-source-initiative)](https://github.com/batmen-lab/BioMANIA/blob/main/LICENSE)
[![Docker Version](https://img.shields.io/badge/Docker-v1.1.9-blue?style=flat&logo=docker)](https://hub.docker.com/repositories/chatbotuibiomania)
[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/14K4562oeesEz5qMoXmjv9gW_4VeLh6_U?usp=sharing)
[![Documentation Status](https://img.shields.io/readthedocs/biomania/latest?style=flat&logo=readthedocs&label=Doc)](https://biomania.readthedocs.io/en/latest/?badge=latest)
<!--[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/14K4562oeesEz5qMoXmjv9gW_4VeLh6_U?usp=sharing)-->
<!--[![Railway](https://img.shields.io/badge/Railway-purple?style=flat&logo=railway)](https://railway.app/template/qaQEvv)-->
<!--[![Python unit tests](https://github.com/batmen-lab/BioMANIA/actions/workflows/python-test-unit.yml/badge.svg)](https://github.com/batmen-lab/BioMANIA/actions/workflows/python-test-unit.yml)-->

@@ -79,27 +79,28 @@ export PYTHONPATH=$PYTHONPATH:$(pwd)

```bash
# CLI service quick start!
pip install gradio
python -m BioMANIA.deploy.cli_demo
# or run the gradio app (TODO 240509: image display is still under development)
#python -m BioMANIA.deploy.cli_gradio
```

## Run with Docker

For ease of use, we provide Docker images for several tools. You can find the detailed tool list on [dockerhub](https://hub.docker.com/repositories/chatbotuibiomania).
For ease of use, we provide a Docker image containing scanpy, squidpy, ehrapy, and snapatac2. You can find the detailed tool list on [dockerhub](https://hub.docker.com/repositories/chatbotuibiomania).

```bash
# Pull back-end service and front-end UI service with:
# 241001 updated
sudo docker pull chatbotuibiomania/biomania-together:v1.1.12-${LIB}-cuda12.6-ubuntu22.04
# 241016 updated
sudo docker pull chatbotuibiomania/biomania-together:v1.1.12-cuda12.6-ubuntu22.04
```

Start the service with:
```bash
# run on gpu
sudo docker run -e LIB=${LIB} -e OPENAI_API_KEY=[your_openai_api_key] -e GITHUB_TOKEN=[github_pat_xxx] --gpus all -d -p 3000:3000 chatbotuibiomania/biomania-together:v1.1.12-${LIB}-cuda12.6-ubuntu22.04
sudo docker run -e LIB=scanpy -e OPENAI_API_KEY=[your_openai_api_key] -e GITHUB_TOKEN=[github_pat_xxx] --gpus all -d -p 3000:3000 chatbotuibiomania/biomania-together:v1.1.12-cuda12.6-ubuntu22.04
# or on cpu
sudo docker run -e LIB=${LIB} -e OPENAI_API_KEY=[your_openai_api_key] -e GITHUB_TOKEN=[github_pat_xxx] -d -p 3000:3000 chatbotuibiomania/biomania-together:v1.1.12-${LIB}-cuda12.6-ubuntu22.04
sudo docker run -e LIB=scanpy -e OPENAI_API_KEY=[your_openai_api_key] -e GITHUB_TOKEN=[github_pat_xxx] -d -p 3000:3000 chatbotuibiomania/biomania-together:v1.1.12-cuda12.6-ubuntu22.04
```

Then check the UI service at `http://localhost:3000/en`.
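
To quickly confirm the service is up before opening the browser, a minimal check like the following should work (assuming the container was started with one of the commands above and the default `3000:3000` port mapping):

```bash
# List running containers started from the BioMANIA image (tag taken from the pull command above)
sudo docker ps --filter "ancestor=chatbotuibiomania/biomania-together:v1.1.12-cuda12.6-ubuntu22.04"
# Probe the UI endpoint; any HTTP response indicates the front end is serving
curl -I http://localhost:3000/en
```
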
@@ -237,8 +238,8 @@ model.run_pipeline(user_input, library, top_k=1, files=[], conversation_started=

Please refer to the separate READMEs for tutorials that support converting different coding tools into our APP.
- [For PyPI Tools](./docs/PyPI2APP.md)
- [For Python Source Code from Git Repo](./docs/Git2APP.md) (240925: under development)
- [For R Package](./docs/R2APP.md) (231123: under development)
- [For Python Source Code from Git Repo](./docs/Git2APP.md)
- [For R Package](./docs/R2APP.md)

## Share your APP!

@@ -275,9 +276,11 @@ Thank you for choosing BioMANIA. We hope this guide assists you in navigating th


## **Version History**
- v1.1.12 (2024-10-01)
- v1.1.12 (2024-10-16)
- Update code scripts, upload data and models, and update the Docker image to align with the paper.
- Will renew the report-generation scripts and the Git2APP and R2APP documents soon.
- Update report generation.
- Update the R2APP and Git2APP documents.

view [version_history](./docs/version_history.md) for more details!

12 changes: 0 additions & 12 deletions docker_utils/MIOSTONE/docker_start_script.sh

This file was deleted.

58 changes: 0 additions & 58 deletions docker_utils/MIOSTONE/requirements.txt

This file was deleted.

12 changes: 0 additions & 12 deletions docker_utils/biopython/docker_start_script.sh

This file was deleted.

58 changes: 0 additions & 58 deletions docker_utils/biopython/requirements.txt

This file was deleted.

12 changes: 0 additions & 12 deletions docker_utils/biotite/docker_start_script.sh

This file was deleted.

58 changes: 0 additions & 58 deletions docker_utils/biotite/requirements.txt

This file was deleted.

12 changes: 0 additions & 12 deletions docker_utils/deap/docker_start_script.sh

This file was deleted.

