A question and a couple of suggestions #139

Open
timechips opened this issue Jun 11, 2024 · 0 comments

Hello, I am new to this plugin, AI, GitHub, and Obsidian in general.

Question

  • I am running Smart2Brain with Ollama (llama3 8B), and it takes a tremendous amount of time (~5 min) for even the simplest queries. I consider my PC powerful enough (R7 5700X, RX 6700 XT, 32 GB RAM, SSD with ~6 GB/s read/write), and the AI responds blazing fast (<1 s) if I uncheck the cute octopus. Is the problem (see the timing sketch after this list):
    • the large number of notes (~1500),
    • that they are in Slovenian (a language not supported by nomic-embed-text; I am still waiting for multilingual-e5-large-instruct), or
    • that I am doing something else wrong (using llama3, which is mostly trained on English content)?
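In case it helps narrow this down, here is a minimal timing sketch (my own assumption about where the time goes, not the plugin's code) that measures how long nomic-embed-text takes per call through Ollama's /api/embeddings endpoint. The Slovenian sample text and the run count are made up, and it assumes Ollama is running locally on its default port with the model already pulled (Node 18+ for the built-in fetch):

```ts
// Rough, standalone timing check for the embedding step, independent of the plugin.
// Assumes `ollama pull nomic-embed-text` has been done and Ollama listens on :11434.
const OLLAMA_URL = "http://localhost:11434/api/embeddings";

async function embed(text: string): Promise<number[]> {
  const res = await fetch(OLLAMA_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "nomic-embed-text", prompt: text }),
  });
  if (!res.ok) throw new Error(`Ollama returned HTTP ${res.status}`);
  const data = await res.json();
  return data.embedding; // the endpoint responds with { embedding: number[] }
}

async function main() {
  const sample = "Kratka testna beležka v slovenščini."; // hypothetical sample note text
  const runs = 20;
  const start = Date.now();
  for (let i = 0; i < runs; i++) {
    await embed(`${sample} (${i})`);
  }
  const perCallMs = (Date.now() - start) / runs;
  console.log(`~${perCallMs.toFixed(0)} ms per embedding call`);
  console.log(`~${((perCallMs * 1500) / 60000).toFixed(1)} min to embed 1500 notes sequentially`);
}

main().catch(console.error);
```

If a single call already takes a few hundred milliseconds, embedding or re-indexing ~1500 notes sequentially could plausibly account for the ~5 min delay.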

Suggestions

  • Custom commands that consider only the current note (similar to what you can do with the Ollama plugin): you could add commands with custom prompts (for example, translate my note, summarise my note, spellcheck, writing assistance, etc.):

    • An icon toolbar at the bottom of the Obsidian note (not the side menu). In settings you could assign custom icons to your commands.
    • In the settings where you add custom commands, you could also choose to ignore/copy certain content in the note (e.g. ignore links, or copy images so they are used in the summary without the links being butchered, ignore LaTeX, ...).
    • A spicy UI idea: a drop-down summary (or anything else, configurable in settings) that would be added to the note as a rendered code block (not sure of the proper term, but similar to the function plot plugin).
      (Attached sketch: Smart2Brain 2024-06-11 18.04.49.excalidraw.png)
  • A context chooser (in the side menu, where the chat is) as a drop-down menu with some default options, e.g. current note, current folder, the parent folder of the current folder, and a "choose folder" option.

  • For Ollama, the recommended models should be sorted, e.g. as follows (you could even add their system requirements, and do the same for embedding models):

    • best English-only light model - llama3,
    • best English-only ultralight model - gemma,
    • best multi-language but heavy model - mixtral:8x7b

Some slightly more far-fetched ideas

  • Add GPT Researcher so that a model can do online research for you.
  • Add image recognition (ChatGPT already has this option, and Ollama has it too, e.g. with llava); a rough sketch of the Ollama side follows below.
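Just to illustrate that the Ollama side of this already exists: this is not the plugin's API, only a sketch of Ollama's standard /api/generate endpoint, which accepts base64-encoded images for multimodal models like llava (the image path below is hypothetical).

```ts
// Sketch: ask a local llava model to describe an image via Ollama's /api/generate.
// Assumes `ollama pull llava`; the image path is just a placeholder.
import { readFileSync } from "node:fs";

async function describeImage(path: string): Promise<string> {
  const imageBase64 = readFileSync(path).toString("base64");
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llava",
      prompt: "Describe this image in one sentence.",
      images: [imageBase64], // multimodal models take base64-encoded images here
      stream: false,         // return a single JSON object instead of a stream
    }),
  });
  if (!res.ok) throw new Error(`Ollama returned HTTP ${res.status}`);
  const data = await res.json();
  return data.response;
}

describeImage("attachments/sketch.png").then(console.log).catch(console.error);
```

So the feature would mostly be about wiring vault attachments into a call like this.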