Loading the extension "long_term_memory"... Fail. #25

Open
highjohnconquer opened this issue Apr 16, 2023 · 9 comments

Comments

@highjohnconquer

I keep getting this error every time I try to load the extension:

Starting the web UI...
Loading the extension "long_term_memory"... Fail.
Traceback (most recent call last):
  File "E:\Documents\AI\one-click-installers-oobabooga-windows\text-generation-webui\modules\extensions.py", line 19, in load_extensions
    exec(f"import extensions.{name}.script")
  File "<string>", line 1, in <module>
  File "E:\Documents\AI\one-click-installers-oobabooga-windows\text-generation-webui\extensions\long_term_memory\script.py", line 14, in <module>
    from extensions.long_term_memory.core.memory_database import LtmDatabase
  File "E:\Documents\AI\one-click-installers-oobabooga-windows\text-generation-webui\extensions\long_term_memory\core\memory_database.py", line 8, in <module>
    from sentence_transformers import SentenceTransformer
  File "C:\Users\raziq\AppData\Roaming\Python\Python310\site-packages\sentence_transformers\__init__.py", line 3, in <module>
    from .datasets import SentencesDataset, ParallelSentencesDataset
  File "C:\Users\raziq\AppData\Roaming\Python\Python310\site-packages\sentence_transformers\datasets\__init__.py", line 1, in <module>
    from .DenoisingAutoEncoderDataset import DenoisingAutoEncoderDataset
  File "C:\Users\raziq\AppData\Roaming\Python\Python310\site-packages\sentence_transformers\datasets\DenoisingAutoEncoderDataset.py", line 5, in <module>
    import nltk
ModuleNotFoundError: No module named 'nltk'
Loading the extension "gallery"... Ok.
Running on local URL: http://127.0.0.1:7861

To create a public link, set share=True in launch().

@Samjack1533

I also get the same error.

@Writekenny

Writekenny commented Apr 17, 2023

I just figured this out. I guess installing the components from requirements.txt doesn't install them in the environment that actually runs text-generation-webui. There is probably a better way to do this (I'm very new to all of this), but I just modified my start-webui.bat file to include a pip install command for everything in the long-term-memory extension's requirements.txt, like this. I only needed to run it once; after that I didn't need those commands anymore:

@echo off

@echo Starting the web UI...

cd /D "%~dp0"

set MAMBA_ROOT_PREFIX=%cd%\installer_files\mamba
set INSTALL_ENV_DIR=%cd%\installer_files\env

if not exist "%MAMBA_ROOT_PREFIX%\condabin\micromamba.bat" (
call "%MAMBA_ROOT_PREFIX%\micromamba.exe" shell hook >nul 2>&1
)
call "%MAMBA_ROOT_PREFIX%\condabin\micromamba.bat" activate "%INSTALL_ENV_DIR%" || ( echo MicroMamba hook not found. && goto end )
cd text-generation-webui
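rem one-time install of the long_term_memory extension's requirements into this environment
rem (these pip lines can be removed again after the first successful run)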
pip install numpy
pip install pytest
pip install scikit-learn
pip install sentence-transformers
pip install zarr

call python server.py --auto-devices --chat --wbits 4 --groupsize 128 --pre_layer 20 --model gpt4-x-alpaca-13b-native-4bit-128g --extensions long_term_memory --no-stream
:end
pause

@BarfingLemurs

Thanks ^ I use the one-click Linux install, and adding the packages to the start script was a breeze.
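For anyone else on the Linux one-click install, the change is roughly the following (a sketch only; the exact start script name and layout vary between installer versions, and the package list simply mirrors the extension requirements quoted above):

# inside the existing start script, after the line that activates the
# bundled conda/micromamba environment and before the line that runs server.py, add:
pip install numpy pytest scikit-learn sentence-transformers zarr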

To create a public link, set `share=True` in `launch()`.
Closing server running on port: 7860
Loading the extension "gallery"... Ok.
Loading the extension "long_term_memory"... No existing memories found, will create a new database.
Downloading (…)a8e1d/.gitattributes: 100%|█| 1.18k/1.18k [00:00<00:00, 14.3MB/s]
Downloading (…)_Pooling/config.json: 100%|█████| 190/190 [00:00<00:00, 2.74MB/s]
Downloading (…)b20bca8e1d/README.md: 100%|█| 10.6k/10.6k [00:00<00:00, 5.23MB/s]
Downloading (…)0bca8e1d/config.json: 100%|█████| 571/571 [00:00<00:00, 5.65MB/s]
Downloading (…)ce_transformers.json: 100%|█████| 116/116 [00:00<00:00, 1.51MB/s]
Downloading (…)e1d/data_config.json: 100%|█| 39.3k/39.3k [00:00<00:00, 2.47MB/s]
Downloading pytorch_model.bin: 100%|█████████| 438M/438M [00:20<00:00, 21.5MB/s]
Downloading (…)nce_bert_config.json: 100%|████| 53.0/53.0 [00:00<00:00, 567kB/s]
Downloading (…)cial_tokens_map.json: 100%|█████| 239/239 [00:00<00:00, 1.92MB/s]
Downloading (…)a8e1d/tokenizer.json: 100%|███| 466k/466k [00:00<00:00, 8.98MB/s]
Downloading (…)okenizer_config.json: 100%|█████| 363/363 [00:00<00:00, 2.92MB/s]
Downloading (…)8e1d/train_script.py: 100%|█| 13.1k/13.1k [00:00<00:00, 65.2MB/s]
Downloading (…)b20bca8e1d/vocab.txt: 100%|███| 232k/232k [00:00<00:00, 17.0MB/s]
Downloading (…)bca8e1d/modules.json: 100%|█████| 349/349 [00:00<00:00, 2.77MB/s]

-----------------------------------------
IMPORTANT LONG TERM MEMORY NOTES TO USER:
-----------------------------------------
Please remember that LTM-stored memories will only be visible to the bot during your NEXT session. This prevents the loaded memory from being flooded with messages from the current conversation which would defeat the original purpose of this module. This can be overridden by pressing 'Force reload memories'
----------
LTM CONFIG
----------
change these values in ltm_config.json
{'ltm_context': {'injection_location': 'BEFORE_NORMAL_CONTEXT',
                 'memory_context_template': "{name2}'s memory log:\n"
                                            '{all_memories}\n'
                                            'During conversations between '
                                            '{name1} and {name2}, {name2} will '
                                            'try to remember the memory '
                                            'described above and naturally '
                                            'integrate it with the '
                                            'conversation.',
                 'memory_template': '{time_difference}, {memory_name} said:\n'
                                    '"{memory_message}"'},
 'ltm_reads': {'max_cosine_distance': 0.6,
               'memory_length_cutoff_in_chars': 1000,
               'num_memories_to_fetch': 2},
 'ltm_writes': {'min_message_length': 100}}
----------
-----------------------------------------
Ok.
Running on local URL:  http://127.0.0.1:7860

@Woisek

Woisek commented Jun 18, 2023

I got a slightly different problem:

2023-06-18 20:06:53 INFO:Loading the extension "long_term_memory"...
2023-06-18 20:06:53 ERROR:Failed to load the extension "long_term_memory".
Traceback (most recent call last):
  File "F:\Programme\oobabooga_windows\text-generation-webui\modules\extensions.py", line 34, in load_extensions
    exec(f"import extensions.{name}.script")
  File "<string>", line 1, in <module>
  File "F:\Programme\oobabooga_windows\text-generation-webui\extensions\long_term_memory\script.py", line 14, in <module>
    from extensions.long_term_memory.core.memory_database import LtmDatabase
  File "F:\Programme\oobabooga_windows\text-generation-webui\extensions\long_term_memory\core\memory_database.py", line 10, in <module>
    import zarr
ModuleNotFoundError: No module named 'zarr'
2023-06-18 20:06:53 INFO:Loading the extension "gallery"...

But zarr seems to be installed:

F:\Programme\oobabooga_windows>pip install zarr
Requirement already satisfied: zarr in f:\programme\python310\lib\site-packages (2.14.2)
Requirement already satisfied: numpy>=1.20 in f:\programme\python310\lib\site-packages (from zarr) (1.24.2)
Requirement already satisfied: fasteners in f:\programme\python310\lib\site-packages (from zarr) (0.18)
Requirement already satisfied: numcodecs>=0.10.0 in f:\programme\python310\lib\site-packages (from zarr) (0.11.0)
Requirement already satisfied: asciitree in f:\programme\python310\lib\site-packages (from zarr) (0.3.3)
Requirement already satisfied: entrypoints in f:\programme\python310\lib\site-packages (from numcodecs>=0.10.0->zarr) (0.4)

[notice] A new release of pip available: 22.3.1 -> 23.1.2
[notice] To update, run: python.exe -m pip install --upgrade pip

F:\Programme\oobabooga_windows>

Any suggestions? 🤔

@miles-du

It's there, but Ooba is not finding it. If you run a command to show the version of zarr, does it work? Essentially, is zarr installed in a location that the Python running the web UI can actually import from?
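For example, something like this, run from the same console/launcher you start the web UI with so it hits the same Python environment (standard Python one-liners, nothing specific to the extension):

python -c "import zarr; print(zarr.__version__, zarr.__file__)"
python -c "import sys; print(sys.executable)"

If the first command fails with ModuleNotFoundError, or the second one prints a different interpreter than the one pip put zarr into, the web UI is simply running a different Python than the one that has zarr installed.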

@Woisek

Woisek commented Jun 19, 2023

Uh... a short reminder on how I do this, please? 😐
And in my $PATH I can't see anything about zarr ... 🤔

@Wisdawn

Wisdawn commented Jun 19, 2023

I'm having the same zarr error message despite zarr being installed. I also have C:\ProgramData\anaconda3\lib\site-packages\ and C:\ProgramData\anaconda3\lib\site-packages\zarr\ in $PATH, but I added those manually and I'm not sure they're the paths that are actually needed. That is where zarr is installed, though, as this is the line I see when I try to pip install zarr:
Requirement already satisfied: zarr in c:\programdata\anaconda3\lib\site-packages (2.14.2)

I also have a zarr-2.15.0-py3-none-any.whl file in text-generation-webui\modules.
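One way to check whether that anaconda copy of zarr is even visible to the web UI (a sketch using only standard pip/Python commands; note that the one-click installer normally runs its own Python under installer_files\env, as the start-webui.bat earlier in this thread shows, not the system anaconda):

rem which interpreter does the bare "pip" command install into?
pip -V
rem which interpreter actually runs the web UI? Run this from the same prompt/launcher you start it with:
python -c "import sys; print(sys.executable)"
rem if the two locations differ, install zarr into that specific interpreter:
python -m pip install zarr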

Here is the whole error message:

2023-06-19 22:14:03 ERROR:Failed to load the extension "long_term_memory".
Traceback (most recent call last):
  File "C:\AI\oobabooga\oobabooga_windows\text-generation-webui\modules\extensions.py", line 34, in load_extensions
    exec(f"import extensions.{name}.script")
  File "<string>", line 1, in <module>
  File "C:\AI\oobabooga\oobabooga_windows\text-generation-webui\extensions\long_term_memory\script.py", line 14, in <module>
    from extensions.long_term_memory.core.memory_database import LtmDatabase
  File "C:\AI\oobabooga\oobabooga_windows\text-generation-webui\extensions\long_term_memory\core\memory_database.py", line 10, in <module>
    import zarr
ModuleNotFoundError: No module named 'zarr'

The strange thing is that I was able to run the long_term_memory extension just fine right after installing it. I only started seeing that error message after shutting down oobabooga and rerunning it.

Does anyone know what's going on here?

@Wisdawn

Wisdawn commented Jun 20, 2023

Summary: Skip start_windows.bat and run text-generation-webui using the command prompt within the textgen conda environment and the text-generation-webui folder.

Guys, I think I might've hopefully found solutions that can work for you too:

The first thing to try is to NOT run text-generation-webui using the provided start_windows.bat file. Instead, every time you run text-generation-webui and want the long_term_memory extension to work, open a command prompt, cd into your oobabooga\oobabooga_windows\text-generation-webui folder, and activate the textgen conda environment inside that folder (this step is crucial) with the conda activate textgen command. Then, finally, run python server.py --chat --extensions long_term_memory to start text-generation-webui.
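Concretely, the sequence looks something like this (the folder path is just an example; use your own install location):

cd C:\AI\oobabooga\oobabooga_windows\text-generation-webui
conda activate textgen
python server.py --chat --extensions long_term_memory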

Consider deleting or renaming the start_windows.bat file to something like start_windows_not_compatible_with_LTM.bat so you don't forget later on, and always run text-generation-webui with the textgen conda environment activated in a command prompt and within the text-generation-webui subfolder.

This should hopefully work for the majority of users. If it doesn't, then try the below.


Secondly, one major root cause of this problem may be a blank space somewhere in Python's paths. This typically happens when Python was installed with the default method (for the current user only) and your Windows username contains a space, so Python's paths begin with something like C:\Users\John Smith\

The easiest way I can think of to resolve this issue, if it applies to you, is to uninstall Python and then reinstall it into C:\Python311 or whatever the current version is when you read these words.

After that, make sure that Edit the system environment variables -> Path -> Edit has C:\Python311 and C:\Python311\Scripts at the top.

Next, try running text-generation-webui again with the long_term_memory extension (remember: NOT using the start_windows.bat file!). If it still doesn't work, then keep reading.

You may need to change your Windows username to one without spaces if Python insists on keeping some of its components in your Windows user folder. For this, check the top answer on this StackOverflow question and read all of the comments on it; in my own comment on that answer I give a tip on how to speed up replacing occurrences of your old username with the new space-free one. You can probably keep reading without attempting this complex process for now, in case things work for you without changing your username; I just wanted to mention it in case you have already confirmed that something is trying to access Python components in your space-ridden user directory.

Next, rename your old long_term_memory subfolder or simply back up its user_data subfolder, then reinstall the long_term_memory extension following the exact instructions on its GitHub page. For example, do NOT attempt any of the steps outside the textgen conda environment; they may appear to work elsewhere but will likely create problems. After following the instructions, keep trying the python server.py --chat --extensions long_term_memory command. If you get error messages about missing modules, keep running pip install <module_name_here> until every missing module from the error messages is installed (or install them all at once, as sketched below). Keep at it until python server.py --chat --extensions long_term_memory works from within the text-generation-webui subdirectory with the textgen conda environment activated in the command prompt.
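Instead of chasing modules one at a time, you can also try installing the extension's full requirements file in one shot (the path assumes the standard layout where the extension lives under extensions\long_term_memory and ships a requirements.txt, as mentioned earlier in this thread; run it inside the textgen environment from the text-generation-webui folder):

pip install -r extensions\long_term_memory\requirements.txt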

Hopefully, all this should work for the vast majority of users.

And remember to not use the default provided start_windows.bat file and always run text-generation-webui with the textgen conda environment activated in a command prompt and within the text-generation-webui subfolder.

@luancyrne

luancyrne commented Jul 20, 2023

(quoting the nltk ModuleNotFoundError traceback from the first post in this thread)

For those who want to launch through start-webui.bat: for the module errors that appear after installing the extension, just run pip install <module_name> inside the root folder of text-generation-webui. In my case the error was about zarr, so I just ran pip install zarr. Remember to run these commands in the root folder of text-generation-webui to avoid more errors.
