Explore one-click windows installer support #8

Open
wawawario2 opened this issue Apr 3, 2023 · 4 comments

@wawawario2 (Owner)

I don't normally use Windows, but this is also a common request.

wawawario2 self-assigned this Apr 3, 2023

RebornZA commented Apr 9, 2023

A workaround for one-click installer users, for now.

Open a CMD window in the same location as your batch files (the ones you use to launch the web UI), for example "A:\oobabooga-windows", then run the following commands in order:

set MAMBA_ROOT_PREFIX=%cd%\installer_files\mamba
set INSTALL_ENV_DIR=%cd%\installer_files\env
call "%MAMBA_ROOT_PREFIX%\condabin\micromamba.bat" activate "%INSTALL_ENV_DIR%" || ( echo MicroMamba hook not found. && goto end )
cd text-generation-webui
git clone https://github.com/wawawario2/long_term_memory extensions/long_term_memory
pip install -r extensions/long_term_memory/requirements.txt
python -m pytest -v extensions/long_term_memory/

If you want, add --extensions long_term_memory to your .bat file; otherwise, tick the extension yourself in the UI.
Enjoy.
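
For convenience, the same steps can be collected into a single batch file placed in the oobabooga-windows folder. This is a minimal sketch, assuming the default layout used above (installer_files\mamba and installer_files\env); the file name install_ltm.bat is just an example, not part of the official installer:

@echo off
rem install_ltm.bat -- one-shot install of the long_term_memory extension (sketch)
rem Assumes it is run from the oobabooga-windows folder created by the one-click installer.

set MAMBA_ROOT_PREFIX=%cd%\installer_files\mamba
set INSTALL_ENV_DIR=%cd%\installer_files\env

rem Activate the micromamba environment created by the installer
call "%MAMBA_ROOT_PREFIX%\condabin\micromamba.bat" activate "%INSTALL_ENV_DIR%" || ( echo MicroMamba hook not found. && goto end )

cd text-generation-webui

rem Fetch the extension and install its Python dependencies
git clone https://github.com/wawawario2/long_term_memory extensions/long_term_memory
pip install -r extensions/long_term_memory/requirements.txt

rem Optional: run the extension's test suite to verify the install
python -m pytest -v extensions/long_term_memory/

:end
pause

After it finishes, launch the web UI as usual with --extensions long_term_memory in your start .bat, or tick the extension in the UI.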


2blackbar commented Apr 11, 2023

Use Bing Chat with the code above to make a cmd file for you. Yes, I intentionally did not paste the code, so you can do it on your own there and learn to use it in the future.


Trahloc commented Apr 14, 2023

I don't normally use Windows, but this is also a common request.

So, I gave it a whirl today and it's not working for me. I'm on Windows 11 with an AMD CPU and 2x 3090s. I'm using conda/mambaforge with Python 3.9.16 for my base environment, but the .bat loads the micromamba Python 3.10.9. Everything loads fine; the error occurs the moment I chat with the bot. I pasted chat.py and parts of extensions.py into GPT-4 and asked it for a bug report to include with my logs, in hopes it makes sense / helps:

Description:
An issue has been identified in the chat application related to one or more extension modifier functions. The apply_extensions function itself appears to be functioning correctly, but it is suspected that one of the extension modifier functions (e.g., input_modifier, output_modifier, or bot_prefix_modifier) is causing unexpected behavior during the chat.

Analysis:
Upon reviewing the provided code, no issues were found with the apply_extensions function. However, the function iterates over a list of extensions and applies their respective modifier functions to the text. Since the unexpected behavior occurs during the chat, it is possible that one of these modifier functions has a bug or is not handling certain cases correctly.

Recommendation:
To resolve this issue, the developer should review the list of extensions and their respective modifier functions, paying close attention to their implementations. The issue may be caused by one of these functions not handling specific cases or input text properly. By identifying and addressing any issues in these functions, the overall chat behavior should return to the expected output.

(base) S:\ai\oobabooga-windows>set MAMBA_ROOT_PREFIX=%cd%\installer_files\mamba

(base) S:\ai\oobabooga-windows>set INSTALL_ENV_DIR=%cd%\installer_files\env

(base) S:\ai\oobabooga-windows>call "%MAMBA_ROOT_PREFIX%\condabin\micromamba.bat" activate "%INSTALL_ENV_DIR%" || ( echo MicroMamba hook not found. && goto end )

(S:\ai\oobabooga-windows\installer_files\env) S:\ai\oobabooga-windows>cd text-generation-webui

(S:\ai\oobabooga-windows\installer_files\env) S:\ai\oobabooga-windows\text-generation-webui>pip install -r extensions/long_term_memory/requirements.txt
Collecting numpy==1.24.2
  Using cached numpy-1.24.2-cp310-cp310-win_amd64.whl (14.8 MB)
Requirement already satisfied: pytest==7.2.2 in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from -r extensions/long_term_memory/requirements.txt (line 2)) (7.2.2)
Requirement already satisfied: scikit-learn==1.2.2 in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from -r extensions/long_term_memory/requirements.txt (line 3)) (1.2.2)
Requirement already satisfied: sentence-transformers==2.2.2 in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from -r extensions/long_term_memory/requirements.txt (line 4)) (2.2.2)
Requirement already satisfied: zarr==2.14.2 in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from -r extensions/long_term_memory/requirements.txt (line 5)) (2.14.2)
Requirement already satisfied: attrs>=19.2.0 in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from pytest==7.2.2->-r extensions/long_term_memory/requirements.txt (line 2)) (22.2.0)
Requirement already satisfied: colorama in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from pytest==7.2.2->-r extensions/long_term_memory/requirements.txt (line 2)) (0.4.6)
Requirement already satisfied: exceptiongroup>=1.0.0rc8 in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from pytest==7.2.2->-r extensions/long_term_memory/requirements.txt (line 2)) (1.1.1)
Requirement already satisfied: pluggy<2.0,>=0.12 in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from pytest==7.2.2->-r extensions/long_term_memory/requirements.txt (line 2)) (1.0.0)
Requirement already satisfied: iniconfig in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from pytest==7.2.2->-r extensions/long_term_memory/requirements.txt (line 2)) (2.0.0)
Requirement already satisfied: tomli>=1.0.0 in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from pytest==7.2.2->-r extensions/long_term_memory/requirements.txt (line 2)) (2.0.1)
Requirement already satisfied: packaging in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from pytest==7.2.2->-r extensions/long_term_memory/requirements.txt (line 2)) (23.0)
Requirement already satisfied: threadpoolctl>=2.0.0 in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from scikit-learn==1.2.2->-r extensions/long_term_memory/requirements.txt (line 3)) (3.1.0)
Requirement already satisfied: scipy>=1.3.2 in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from scikit-learn==1.2.2->-r extensions/long_term_memory/requirements.txt (line 3)) (1.10.1)
Requirement already satisfied: joblib>=1.1.1 in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from scikit-learn==1.2.2->-r extensions/long_term_memory/requirements.txt (line 3)) (1.2.0)
Requirement already satisfied: torchvision in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from sentence-transformers==2.2.2->-r extensions/long_term_memory/requirements.txt (line 4)) (0.15.0)
Requirement already satisfied: tqdm in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from sentence-transformers==2.2.2->-r extensions/long_term_memory/requirements.txt (line 4)) (4.65.0)
Requirement already satisfied: transformers<5.0.0,>=4.6.0 in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from sentence-transformers==2.2.2->-r extensions/long_term_memory/requirements.txt (line 4)) (4.29.0.dev0)
Requirement already satisfied: nltk in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from sentence-transformers==2.2.2->-r extensions/long_term_memory/requirements.txt (line 4)) (3.8.1)
Requirement already satisfied: sentencepiece in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from sentence-transformers==2.2.2->-r extensions/long_term_memory/requirements.txt (line 4)) (0.1.98)
Requirement already satisfied: torch>=1.6.0 in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from sentence-transformers==2.2.2->-r extensions/long_term_memory/requirements.txt (line 4)) (2.0.0)
Requirement already satisfied: huggingface-hub>=0.4.0 in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from sentence-transformers==2.2.2->-r extensions/long_term_memory/requirements.txt (line 4)) (0.13.4)
Requirement already satisfied: fasteners in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from zarr==2.14.2->-r extensions/long_term_memory/requirements.txt (line 5)) (0.18)
Requirement already satisfied: numcodecs>=0.10.0 in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from zarr==2.14.2->-r extensions/long_term_memory/requirements.txt (line 5)) (0.11.0)
Requirement already satisfied: asciitree in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from zarr==2.14.2->-r extensions/long_term_memory/requirements.txt (line 5)) (0.3.3)
Requirement already satisfied: requests in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from huggingface-hub>=0.4.0->sentence-transformers==2.2.2->-r extensions/long_term_memory/requirements.txt (line 4)) (2.28.2)
Requirement already satisfied: filelock in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from huggingface-hub>=0.4.0->sentence-transformers==2.2.2->-r extensions/long_term_memory/requirements.txt (line 4)) (3.11.0)
Requirement already satisfied: pyyaml>=5.1 in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from huggingface-hub>=0.4.0->sentence-transformers==2.2.2->-r extensions/long_term_memory/requirements.txt (line 4)) (6.0)
Requirement already satisfied: typing-extensions>=3.7.4.3 in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from huggingface-hub>=0.4.0->sentence-transformers==2.2.2->-r extensions/long_term_memory/requirements.txt (line 4)) (4.5.0)
Requirement already satisfied: entrypoints in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from numcodecs>=0.10.0->zarr==2.14.2->-r extensions/long_term_memory/requirements.txt (line 5)) (0.4)
Requirement already satisfied: sympy in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from torch>=1.6.0->sentence-transformers==2.2.2->-r extensions/long_term_memory/requirements.txt (line 4)) (1.11.1)
Requirement already satisfied: networkx in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from torch>=1.6.0->sentence-transformers==2.2.2->-r extensions/long_term_memory/requirements.txt (line 4)) (3.1)
Requirement already satisfied: jinja2 in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from torch>=1.6.0->sentence-transformers==2.2.2->-r extensions/long_term_memory/requirements.txt (line 4)) (3.1.2)
Requirement already satisfied: tokenizers!=0.11.3,<0.14,>=0.11.1 in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from transformers<5.0.0,>=4.6.0->sentence-transformers==2.2.2->-r extensions/long_term_memory/requirements.txt (line 4)) (0.13.3)
Requirement already satisfied: regex!=2019.12.17 in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from transformers<5.0.0,>=4.6.0->sentence-transformers==2.2.2->-r extensions/long_term_memory/requirements.txt (line 4)) (2023.3.23)
Requirement already satisfied: click in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from nltk->sentence-transformers==2.2.2->-r extensions/long_term_memory/requirements.txt (line 4)) (8.1.3)
Requirement already satisfied: pillow!=8.3.*,>=5.3.0 in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from torchvision->sentence-transformers==2.2.2->-r extensions/long_term_memory/requirements.txt (line 4)) (9.5.0)
Requirement already satisfied: MarkupSafe>=2.0 in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from jinja2->torch>=1.6.0->sentence-transformers==2.2.2->-r extensions/long_term_memory/requirements.txt (line 4)) (2.1.2)
Requirement already satisfied: urllib3<1.27,>=1.21.1 in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from requests->huggingface-hub>=0.4.0->sentence-transformers==2.2.2->-r extensions/long_term_memory/requirements.txt (line 4)) (1.26.15)
Requirement already satisfied: charset-normalizer<4,>=2 in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from requests->huggingface-hub>=0.4.0->sentence-transformers==2.2.2->-r extensions/long_term_memory/requirements.txt (line 4)) (3.1.0)
Requirement already satisfied: certifi>=2017.4.17 in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from requests->huggingface-hub>=0.4.0->sentence-transformers==2.2.2->-r extensions/long_term_memory/requirements.txt (line 4)) (2022.12.7)
Requirement already satisfied: idna<4,>=2.5 in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from requests->huggingface-hub>=0.4.0->sentence-transformers==2.2.2->-r extensions/long_term_memory/requirements.txt (line 4)) (3.4)
Requirement already satisfied: mpmath>=0.19 in s:\ai\oobabooga-windows\installer_files\env\lib\site-packages (from sympy->torch>=1.6.0->sentence-transformers==2.2.2->-r extensions/long_term_memory/requirements.txt (line 4)) (1.3.0)
Installing collected packages: numpy
  Attempting uninstall: numpy
    Found existing installation: numpy 1.23.5
    Uninstalling numpy-1.23.5:
      Successfully uninstalled numpy-1.23.5
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
numba 0.56.4 requires numpy<1.24,>=1.18, but you have numpy 1.24.2 which is incompatible.
Successfully installed numpy-1.24.2

(S:\ai\oobabooga-windows\installer_files\env) S:\ai\oobabooga-windows\text-generation-webui>python -m pytest -v extensions/long_term_memory
==================================== test session starts ====================================
platform win32 -- Python 3.10.9, pytest-7.2.2, pluggy-1.0.0 -- S:\ai\oobabooga-windows\installer_files\env\python.exe
cachedir: .pytest_cache
rootdir: S:\ai\oobabooga-windows\text-generation-webui
plugins: anyio-3.6.2
collected 9 items

extensions/long_term_memory/core/_test/test_memory_database.py::test_typical_usage PASSED [ 11%]
extensions/long_term_memory/core/_test/test_memory_database.py::test_duplicate_messages PASSED [ 22%]
extensions/long_term_memory/core/_test/test_memory_database.py::test_inconsistent_state PASSED [ 33%]
extensions/long_term_memory/core/_test/test_memory_database.py::test_extended_usage PASSED [ 44%]
extensions/long_term_memory/core/_test/test_memory_database.py::test_reload_embeddings_from_disk PASSED [ 55%]
extensions/long_term_memory/core/_test/test_memory_database.py::test_destroy_fake_memories PASSED [ 66%]
extensions/long_term_memory/core/_test/test_memory_database.py::test_multi_fetch PASSED [ 77%]
extensions/long_term_memory/utils/_test/test_chat_parsing.py::test_clean_character_message PASSED [ 88%]
extensions/long_term_memory/utils/_test/test_timestamp_parsing.py::test_get_time_difference_message PASSED [100%]

====================================== warnings summary ======================================
extensions/long_term_memory/core/_test/test_memory_database.py::test_typical_usage
  S:\ai\oobabooga-windows\installer_files\env\lib\site-packages\pkg_resources\__init__.py:121: DeprecationWarning: pkg_resources is deprecated as an API
    warnings.warn("pkg_resources is deprecated as an API", DeprecationWarning)

extensions/long_term_memory/core/_test/test_memory_database.py::test_typical_usage
  S:\ai\oobabooga-windows\installer_files\env\lib\site-packages\pkg_resources\__init__.py:2870: DeprecationWarning: Deprecated call to `pkg_resources.declare_namespace('mpl_toolkits')`.
  Implementing implicit namespace packages (as specified in PEP 420) is preferred to `pkg_resources.declare_namespace`. See https://setuptools.pypa.io/en/latest/references/keywords.html#keyword-namespace-packages
    declare_namespace(pkg)

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
=============================== 9 passed, 2 warnings in 18.68s ===============================

(S:\ai\oobabooga-windows\installer_files\env) S:\ai\oobabooga-windows\text-generation-webui>

The warnings don't look as dire as the numba error, but I gave it a shot and ran the start .bat file:

Starting the web UI...

===================================BUG REPORT===================================
Welcome to bitsandbytes. For bug reports, please submit your error trace to: https://github.com/TimDettmers/bitsandbytes/issues
================================================================================
CUDA SETUP: CUDA runtime path found: S:\ai\oobabooga-windows\installer_files\env\bin\cudart64_110.dll
CUDA SETUP: Highest compute capability among GPUs detected: 8.6
CUDA SETUP: Detected CUDA version 117
CUDA SETUP: Loading binary S:\ai\oobabooga-windows\installer_files\env\lib\site-packages\bitsandbytes\libbitsandbytes_cuda117.dll...
Loading Alpaca-30B-Int4-128G-Safetensors...
Found the following quantized model: models\Alpaca-30B-Int4-128G-Safetensors\alpaca-30b-128g-4bit.safetensors
Loading model ...
S:\ai\oobabooga-windows\installer_files\env\lib\site-packages\safetensors\torch.py:99: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly.  To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
  with safe_open(filename, framework="pt", device=device) as f:
S:\ai\oobabooga-windows\installer_files\env\lib\site-packages\torch\_utils.py:776: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly.  To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
  return self.fget.__get__(instance, owner)()
S:\ai\oobabooga-windows\installer_files\env\lib\site-packages\torch\storage.py:899: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly.  To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
  storage = cls(wrap_storage=untyped_storage)
Done.
Using the following device map for the 4-bit model: {'model.embed_tokens': 0, 'model.layers.0': 0, 'model.layers.1': 0, 'model.layers.2': 0, 'model.layers.3': 0, 'model.layers.4': 0, 'model.layers.5': 0, 'model.layers.6': 0, 'model.layers.7': 0, 'model.layers.8': 0, 'model.layers.9': 0, 'model.layers.10': 0, 'model.layers.11': 1, 'model.layers.12': 1, 'model.layers.13': 1, 'model.layers.14': 1, 'model.layers.15': 1, 'model.layers.16': 1, 'model.layers.17': 1, 'model.layers.18': 1, 'model.layers.19': 1, 'model.layers.20': 1, 'model.layers.21': 1, 'model.layers.22': 1, 'model.layers.23': 1, 'model.layers.24': 1, 'model.layers.25': 1, 'model.layers.26': 1, 'model.layers.27': 1, 'model.layers.28': 1, 'model.layers.29': 1, 'model.layers.30': 1, 'model.layers.31': 1, 'model.layers.32': 1, 'model.layers.33': 1, 'model.layers.34': 1, 'model.layers.35': 1, 'model.layers.36': 1, 'model.layers.37': 1, 'model.layers.38': 1, 'model.layers.39': 1, 'model.layers.40': 1, 'model.layers.41': 1, 'model.layers.42': 1, 'model.layers.43': 1, 'model.layers.44': 1, 'model.layers.45': 1, 'model.layers.46': 1, 'model.layers.47': 1, 'model.layers.48': 1, 'model.layers.49': 1, 'model.layers.50': 1, 'model.layers.51': 1, 'model.layers.52': 1, 'model.layers.53': 1, 'model.layers.54': 1, 'model.layers.55': 1, 'model.layers.56': 1, 'model.layers.57': 1, 'model.layers.58': 1, 'model.layers.59': 1, 'model.norm': 1, 'lm_head': 1}
Loaded the model in 7.61 seconds.
Loading the extension "long_term_memory"...
-----------------------------------------
IMPORTANT LONG TERM MEMORY NOTES TO USER:
-----------------------------------------
Please remember that LTM-stored memories will only be visible to the bot during your NEXT session. This prevents the loaded memory from being flooded with messages from the current conversation which would defeat the original purpose of this module. This can be overridden by pressing 'Force reload memories'
----------
LTM CONFIG
----------
change these values in ltm_config.json
{'ltm_context': {'injection_location': 'BEFORE_NORMAL_CONTEXT',
                 'memory_context_template': "{name2}'s memory log:\n"
                                            '{all_memories}\n'
                                            'During conversations between '
                                            '{name1} and {name2}, {name2} will '
                                            'try to remember the memory '
                                            'described above and naturally '
                                            'integrate it with the '
                                            'conversation.',
                 'memory_template': '{time_difference}, {memory_name} said:\n'
                                    '"{memory_message}"'},
 'ltm_reads': {'max_cosine_distance': 0.6,
               'memory_length_cutoff_in_chars': 1000,
               'num_memories_to_fetch': 2},
 'ltm_writes': {'min_message_length': 100}}
----------
-----------------------------------------
Ok.
Loading the extension "gallery"... Ok.
Running on local URL:  http://0.0.0.0:8005

To create a public link, set `share=True` in `launch()`.
Traceback (most recent call last):
  File "S:\ai\oobabooga-windows\installer_files\env\lib\site-packages\gradio\routes.py", line 393, in run_predict
    output = await app.get_blocks().process_api(
  File "S:\ai\oobabooga-windows\installer_files\env\lib\site-packages\gradio\blocks.py", line 1108, in process_api
    result = await self.call_function(
  File "S:\ai\oobabooga-windows\installer_files\env\lib\site-packages\gradio\blocks.py", line 929, in call_function
    prediction = await anyio.to_thread.run_sync(
  File "S:\ai\oobabooga-windows\installer_files\env\lib\site-packages\anyio\to_thread.py", line 31, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "S:\ai\oobabooga-windows\installer_files\env\lib\site-packages\anyio\_backends\_asyncio.py", line 937, in run_sync_in_worker_thread
    return await future
  File "S:\ai\oobabooga-windows\installer_files\env\lib\site-packages\anyio\_backends\_asyncio.py", line 867, in run
    result = context.run(func, *args)
  File "S:\ai\oobabooga-windows\installer_files\env\lib\site-packages\gradio\utils.py", line 490, in async_iteration
    return next(iterator)
  File "S:\ai\oobabooga-windows\text-generation-webui\modules\chat.py", line 224, in cai_chatbot_wrapper
    for history in chatbot_wrapper(text, generate_state, name1, name2, context, mode, end_of_turn):
  File "S:\ai\oobabooga-windows\text-generation-webui\modules\chat.py", line 146, in chatbot_wrapper
    prompt = custom_generate_chat_prompt(text, generate_state['max_new_tokens'], name1, name2, context, generate_state['chat_prompt_size'], **kwargs)
TypeError: custom_generate_chat_prompt() takes 2 positional arguments but 6 were given

Works fine without the extension.


Kozonak commented May 12, 2023

I fixed it by adding this line once, before the web UI call:

call "%CONDA_ROOT_PREFIX%\_conda.exe" install -c conda-forge zarr

INFO:Loading the extension "long_term_memory"...
No existing memories found, will create a new database.
INFO:Load pretrained SentenceTransformer: sentence-transformers/all-mpnet-base-v2
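
For anyone applying this fix, here is a minimal sketch of where the line can go in the start script. CONDA_ROOT_PREFIX is assumed to already be set by the stock oobabooga-windows batch file, and the surrounding comments stand in for the existing lines:

rem ...after CONDA_ROOT_PREFIX is set, and before the line that launches server.py...
rem One-time install of zarr into the packaged conda environment (append -y to skip the confirmation prompt)
call "%CONDA_ROOT_PREFIX%\_conda.exe" install -c conda-forge zarr
rem ...existing activation and web UI launch lines continue here...

Once zarr is present, the line can be left in place; on later runs conda should detect the existing install and report that all requested packages are already installed.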
