Validation (#901)
* add validation script

* update

* change token count function

* reorganize cells

* Add unit tests

* Add a printout for CPT

* update question

* Add questions

* Fix lints

* update format

* update

* nb source

* Remove license insert for validation notebook

* Add validation utils

* Minor cleanups (#858)

* nits

* logger

* add log

* lint

* update utils/__init__.py to include extra validation functions

* update notebook

* update

* update

* Read UC delta table (#773)

* initial commit

* use databricks-sql to read delta table and convert to json

* update

* update

* update

* add mocked unittest

* Fix lints

* update

* update

* restructure code

* Add timer for optimizing

* Add db-connect

* add wrapper

* update

* add install dbconnect

* update

* update

* patch dbconnect to allow multiple return formats

* update

* add arrow

* use compression

* clean up

* Add cluster rt check

* Fix lints

* remove patch.py for CI

* update

* update

* update

* update

* fix tests

* fix lint

* update

* update

* Add more tests

* update

* update

* update

* change to download_json

* update

* fix lints

* Add decompressed option for arrow

* format json to jsonl

* Add comments

* Make cf_collect_type global option

* fix comments

* fix lints

* fix comments

* Fix lints

* change to use workspaceclient

* Add CPT support

* Rewire method assignment logic

* Fix bug in stripping https

* Add tests for rewired method assignment logic

* Fix lints

* Fix lints

* Removed logger set_level

* Remove pyspark. It conflicts with databricks-connect

* Update the comment

* skip cluster version check when cluster_id is serverless

* Add use_serverless flag

* update tests with use_serverless flag

* Fix lints

---------

Co-authored-by: Xiaohan Zhang <[email protected]>

* Add download remote function to util

* update

* remove fused layernorm (#859)

* update

* update

* update

* update

* update

* update

* update

* update

* update

* Remove hardcoded combined.jsonl with a flag (#861)

* Remove hardcoded combined.jsonl with a flag

* update

* change output_json_path to output_json_folder

---------

Co-authored-by: Xiaohan Zhang <[email protected]>

* bump (#828)

* Add dask and dataframe_to_mds

* update

* update

* update

* update

* Add notebook

* update

* update

* remove script and tests, keep notebook

* update

* update

* update

* update

* Always initialize dist (#864)

* fix dev

* lint

* remove gpu

* updated notebook

* remove scripts keep notebook

* update notebook. rephrase.

* update

* Add response tokens

* update

* update

* Disable MDSWrite, return token counts

* Change plot settings

* update notebook

* update

* update notebook

* update

---------

Co-authored-by: Xiaohan Zhang <[email protected]>
Co-authored-by: xiaohanzhan-db <xiaohanzhan-db>
Co-authored-by: Mihir Patel <[email protected]>
3 people authored Jan 23, 2024
1 parent 8498662 commit 205e405
Showing 1 changed file with 19 additions and 1 deletion.
20 changes: 19 additions & 1 deletion in notebooks/validate_and_tokenize_data.ipynb
@@ -232,7 +232,25 @@
"**Temporary Data Path Configuration:**\n",
"\n",
"- temporary_jsonl_data_path: Defines a filesystem path where temporary data related to the training process will be stored.\n",
"- Environment variables for Hugging Face caches (HF_DATASETS_CACHE) are set to '/tmp/', directing dataset caching to a temporary directory."
"- Environment variables for Hugging Face caches (HF_DATASETS_CACHE) are set to '/tmp/', directing dataset caching to a temporary directory.\n",
"\n",
"**[Supported Models by FT API](https://docs.mosaicml.com/projects/mcli/en/latest/finetuning/finetuning.html#supported-models):**. \n",
"\n",
"You need to specify context length based on the model mapping below.\n",
"```\n",
"ft_models = {\n",
" 'mosaicml/mpt-7b-8k': 8192, \n",
" 'mosaicml/mpt-7b': 2048,\n",
" 'mosaicml/mpt-30b': 8192,\n",
" 'meta-llama/Llama-2-13b-hf': 4096,\n",
" 'meta-llama/Llama-2-7b-hf': 4096,\n",
" 'meta-llama/Llama-2-70b-hf': 4096,\n",
" 'codellama/CodeLlama-7b-hf': 16384,\n",
" 'codellama/CodeLlama-13b-hf': 16384,\n",
" 'codellama/CodeLlama-34b-hf': 16384,\n",
" 'mistralai/Mistral-7B-v0.1': 32768,\n",
"}\n",
"```"
]
},
{
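For orientation, here is a minimal, hypothetical sketch of how the settings described in the cell above might be wired up. The names `temporary_jsonl_data_path`, `HF_DATASETS_CACHE`, and the `ft_models` mapping come from the diff; the concrete values, the abbreviated mapping, and the surrounding code are illustrative assumptions, not the notebook's actual implementation.

```python
import os

# Illustrative only: a temporary path for intermediate JSONL training data.
temporary_jsonl_data_path = '/tmp/ft_data/train'
os.makedirs(temporary_jsonl_data_path, exist_ok=True)

# Direct Hugging Face dataset caching to a temporary directory, as described above.
os.environ['HF_DATASETS_CACHE'] = '/tmp/'

# Context length follows the chosen FT API model (abbreviated mapping; full list in the diff above).
ft_models = {
    'mosaicml/mpt-7b-8k': 8192,
    'mosaicml/mpt-7b': 2048,
    'meta-llama/Llama-2-7b-hf': 4096,
    'mistralai/Mistral-7B-v0.1': 32768,
}

model = 'mosaicml/mpt-7b'          # hypothetical model choice
context_length = ft_models[model]  # 2048
```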
