This folder contains the following examples for Code Llama 13B models:
File | Description | Model Used | GPU Minimum Requirement |
---|---|---|---|
01_load_inference | Environment setup and suggested configurations for running inference with Code Llama 13B models on Databricks. | CodeLlama-13b-hf CodeLlama-13b-Instruct-hf CodeLlama-13b-Python-hf | 2x A10-24GB |
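
For orientation, here is a minimal sketch of what loading the base model for inference can look like with Hugging Face `transformers`; the `01_load_inference` notebook contains the Databricks-specific setup and suggested configurations, and the exact arguments below (dtype, device placement, generation settings) are illustrative assumptions rather than the notebook's definitive settings.

```python
# Minimal sketch: load CodeLlama-13b-hf and run a short completion.
# Assumes transformers, torch, and accelerate are installed and 2x A10-24GB GPUs are available.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "codellama/CodeLlama-13b-hf"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision so the 13B weights fit across the two A10s
    device_map="auto",           # shard the model across available GPUs
)

prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```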