
[bug]ValueError:Cannot use llm_chat_callback on an instance without a callback_manager attribute. #40

Open
CapitalWilliam opened this issue Feb 15, 2024 · 6 comments


@CapitalWilliam

Problem description

While following the project's Windows instructions step by step, I ran into a problem.

Environment

  • OS: Windows 11
  • IDE: VSCode
  • Python version: 3.9.12
  • llama-index version: 0.9.39

Steps performed

  1. Cloned the project locally.

  2. Created a virtual environment in .venv (Python 3.9.12).

  3. Ran pip install, which automatically pulled in the latest llama-index (0.10.x).

  4. Used pip to manually pin it back to 0.9.39.

  5. Before the first run, VSCode reported clashes between module names and .py file names; I had to change 3 places before python cli.py would run. The changes were:

    • from llama_index.core.query_pipeline.components import (...) (components.py inside the components module is not recognized)

    • from llama_index.core.llms.llm import LLM (llm.py inside the llm module is not recognized)

    • from llama_index.core.llms.llm import ChatMessage, MessageRole

    • from llama_index.core import global_handler (the llama_index.core module has no global_handler)

  6. The build txt step completed without problems.

  7. ask -d returned a correct answer, but a ValueError followed:

Traceback (most recent call last):
  File "C:\Users\A\VscodeProjects\history_rag\cli.py", line 120, in <module>
    cli.run()
  File "C:\Users\A\VscodeProjects\history_rag\cli.py", line 53, in run
    self.parse_input(command_text)
  File "C:\Users\A\VscodeProjects\history_rag\cli.py", line 74, in parse_input
    self.question_answer()
  File "C:\Users\A\VscodeProjects\history_rag\cli.py", line 109, in question_answer
    self.query(question)
  File "C:\Users\A\VscodeProjects\history_rag\cli.py", line 86, in query
    ans = self._executor.query(question)
  File "C:\Users\A\VscodeProjects\history_rag\executor.py", line 237, in query
    response = self.query_engine.query(question)
  File "c:\Users\A\VscodeProjects\history_rag\.venv\lib\site-packages\llama_index\core\base_query_engine.py", line 40, in query
    return self._query(str_or_query_bundle)
  File "c:\Users\A\VscodeProjects\history_rag\.venv\lib\site-packages\llama_index\query_engine\retriever_query_engine.py", line 172, in _query 
    response = self._response_synthesizer.synthesize(
  File "c:\Users\A\VscodeProjects\history_rag\.venv\lib\site-packages\llama_index\response_synthesizers\base.py", line 168, in synthesize      
    response_str = self.get_response(
  File "c:\Users\A\VscodeProjects\history_rag\.venv\lib\site-packages\llama_index\response_synthesizers\compact_and_refine.py", line 38, in get_response
    return super().get_response(
  File "c:\Users\A\VscodeProjects\history_rag\.venv\lib\site-packages\llama_index\response_synthesizers\refine.py", line 146, in get_response  
    response = self._give_response_single(
  File "c:\Users\A\VscodeProjects\history_rag\.venv\lib\site-packages\llama_index\response_synthesizers\refine.py", line 202, in _give_response_single
    program(
  File "c:\Users\A\VscodeProjects\history_rag\.venv\lib\site-packages\llama_index\response_synthesizers\refine.py", line 64, in __call__       
    answer = self._llm.predict(
  File "c:\Users\A\VscodeProjects\history_rag\.venv\lib\site-packages\llama_index\core\llms\llm_.py", line 239, in predict
    chat_response = self.chat(messages)
  File "c:\Users\A\VscodeProjects\history_rag\.venv\lib\site-packages\llama_index\core\llms\callbacks.py", line 84, in wrapped_llm_chat        
    with wrapper_logic(_self) as callback_manager:
  File "C:\Users\A\AppData\Local\Programs\Python\Python39\lib\contextlib.py", line 119, in __enter__
    return next(self.gen)
  File "c:\Users\A\VscodeProjects\history_rag\.venv\lib\site-packages\llama_index\core\llms\callbacks.py", line 30, in wrapper_logic
    raise ValueError(
ValueError: Cannot use llm_chat_callback on an instance without a callback_manager attribute.
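For context on what the guard at the bottom of that traceback is checking: llama-index's `llm_chat_callback` wrapper refuses to run a chat call on an LLM instance that has no `callback_manager` attribute. A minimal stand-alone sketch of that pattern (the class names and the plain `object()` stand-in for a real `CallbackManager` are illustrative, not llama-index's actual internals):

```python
from functools import wraps


def llm_chat_callback(fn):
    """Simplified stand-in for llama-index's decorator: it requires the
    instance to expose a callback_manager before the chat call may run."""
    @wraps(fn)
    def wrapped(self, *args, **kwargs):
        if not hasattr(self, "callback_manager"):
            raise ValueError(
                "Cannot use llm_chat_callback on an instance without a "
                "callback_manager attribute."
            )
        return fn(self, *args, **kwargs)
    return wrapped


class BrokenLLM:
    # No callback_manager is ever set, so any decorated method raises.
    @llm_chat_callback
    def chat(self, messages):
        return f"echo: {messages}"


class WorkingLLM:
    def __init__(self):
        # The instance must carry this attribute (llama-index normally
        # injects a real CallbackManager here) for the decorator to pass.
        self.callback_manager = object()

    @llm_chat_callback
    def chat(self, messages):
        return f"echo: {messages}"
```

Mixing 0.9.x code with 0.10.x-style `llama_index.core` imports (as in the edits from step 5) can plausibly produce exactly this situation: the LLM object that reaches `predict` never has a `callback_manager` set on it.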
@wxywb
Owner

wxywb commented Feb 15, 2024

from llama_index.llms import OpenAI

from llama_index.prompts import ChatPromptTemplate, ChatMessage, MessageRole, PromptTemplate

Let me make sure I understand the problem: are you saying these lines in executor.py fail to run, and that this is under llama-index 0.9.39?

@CapitalWilliam
Author

CapitalWilliam commented Feb 15, 2024

from llama_index.llms import OpenAI

from llama_index.prompts import ChatPromptTemplate, ChatMessage, MessageRole, PromptTemplate

Let me make sure I understand the problem: are you saying these lines in executor.py fail to run, and that this is under llama-index 0.9.39?

What I'm hitting seems to be two separate things. The first is the conflict in VSCode when a module name and a .py file name are identical; this may be related to VSCode's Python extension or something else. I planned to try PyCharm and update here if that worked.
[Update] I tried PyCharm once and ran into both problems again.

The second is that I modified the following places, either replacing the same-named imports or borrowing similar usage from other versions in the llama-index repo (global_handler):

from llama_index.core.query_pipeline.components import (...) (components.py inside the components module is not recognized)

from llama_index.core.llms.llm import LLM (llm.py inside the llm module is not recognized)

from llama_index.core.llms.llm import ChatMessage, MessageRole

from llama_index.core import global_handler (the llama_index.core module has no global_handler)

After those changes, the following all run cleanly:
1. Run python cli.py in a cmd window
2. Run milvus in the project window
3. Run build *.txt in the project window
4. Enter ask -d in the project window
5. Enter 华雄是被谁杀死的 ("Who killed Hua Xiong?") in the project window

After those steps I hit the ValueError from the title.

I also tried adding breakpoints to the lines above to catch it, but with debug mode enabled in VSCode, the first breakpoint triggers under Raised Exceptions, inside executor.py.

The exact exception location is:

response = self.query_engine.query(question)

Exception has occurred: ValueError
Cannot use llm_chat_callback on an instance without a callback_manager attribute.
  File "C:\Users\A\VscodeProjects\history_rag\executor.py", line 237, in query
    response = self.query_engine.query(question)
  File "C:\Users\A\VscodeProjects\history_rag\cli.py", line 86, in query
    ans = self._executor.query(question)
  File "C:\Users\A\VscodeProjects\history_rag\cli.py", line 109, in question_answer
    self.query(question)
  File "C:\Users\A\VscodeProjects\history_rag\cli.py", line 74, in parse_input
    self.question_answer()
  File "C:\Users\A\VscodeProjects\history_rag\cli.py", line 53, in run
    self.parse_input(command_text)
  File "C:\Users\A\VscodeProjects\history_rag\cli.py", line 120, in <module>
    cli.run()
ValueError: Cannot use llm_chat_callback on an instance without a callback_manager attribute.

@wxywb
Owner

wxywb commented Feb 15, 2024

Double-check your llama-index version (pip list | grep llama), clone a fresh copy of the original project,
and try running it directly from the command line rather than from an IDE (VSCode, PyCharm, etc.).

@CapitalWilliam
Author

Double-check your llama-index version (pip list | grep llama), clone a fresh copy of the original project, and try running it directly from the command line rather than from an IDE (VSCode, PyCharm, etc.).

I tried the built-in Windows 11 terminal, and the call succeeded.

So this looks like an IDE configuration problem on my machine; I'll post an update if I figure out a fix.

@IshitaArora-246

I am using llama-index version 0.10.0 and getting the same error. The command pip list | grep llama also didn't work.

@CapitalWilliam
Author

I am using llama-index version 0.10.0 and getting the same error. The command pip list | grep llama also didn't work.

Hi Ishita. I looked into this, and it seems the llama-index library was rewritten starting with version 0.10.x.

I believe llama-index works well at versions >= 0.9.4x.

You can run pip install llama-index==0.9.39 --upgrade and (if you hit the same problems I mentioned) run history_rag from cmd instead of an IDE.
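A note on the Windows side of this (assuming cmd or PowerShell): `grep` is not a built-in command there, which is likely why `pip list | grep llama` failed for you; `findstr` is the closest built-in equivalent. The commands below are environment-setup steps, so run them inside the project's virtual environment:

```shell
# Windows cmd/PowerShell: filter installed packages with findstr instead of grep
pip list | findstr llama

# Pin llama-index to the 0.9.x line this project targets
pip install llama-index==0.9.39
```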
