Clico is a command-line utility that lets you use AI to manipulate output, generate commands, or query an LLM with contextual data. It's not just an AI assistant; it's the CLI COmpanion tool for your shell.
It's designed to let you use natural language to perform complex operations on data, assist you in building shell commands, and help you troubleshoot all kinds of issues on any system. A simple, AI-powered Swiss Army knife, if you will.
Clico was built with old school shell scripting nerds in mind. It aims to be as minimal as possible and work seamlessly with classic tools like grep, sed and awk. It's also designed to be used in automation.
No ncurses, no fancy UI, just plain old command line.
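For example, a script or cron job could pipe log output straight through clico pipe and append the result to a report. The log path and prompt here are purely illustrative:

# summarise errors from a log file non-interactively (hypothetical paths)
grep -i error /var/log/app.log | clico pipe "summarise these errors in one short paragraph" >> error-report.txt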
Clico started as a personal project to learn more about AI, LLMs and the like. It's still just a PoC; I wouldn't recommend using it for anything serious. I'm pretty sure there are much better, more mature alternatives around.
At the moment, only Ollama is supported.
$ ls -l | clico pipe "print table with just mode, size and name columns in json format" | jq
[
{
"mode": "-rw-r--r--",
"size": 43,
"name": "Makefile"
},
{
"mode": "-rw-r--r--",
"size": 148,
"name": "flags.go"
},
{
"mode": "-rw-r--r--",
"size": 185,
"name": "go.mod"
},
{
"mode": "-rw-r--r--",
"size": 2297,
"name": "go.sum"
},
...
]
$ clico run "list all .go files in the current directory and their sizes"
ls -l *.go | awk '{print $9, $5}'
$ clico run --execute "list all .go files in the current directory and their sizes"
executing `ls -l *.go | awk '{print $9, $5}'`:
api.go 1372
cmd_explain.go 2156
cmd_pipe.go 2074
cmd_run.go 1912
main.go 2519
$ cat /etc/sudoers
cat: /etc/sudoers: Permission denied
$ cat /etc/sudoers 2>&1 | clico explain "what does this mean?"
The command "cat" was used to try and display the contents of the file "/etc/sudoers",
but it failed because the user running the command doesn't have permission to access that
file. The error message indicates that the file is protected by sudo, which means only
users with superuser privileges can read or modify its contents.
All that's required is an Ollama server running and a working Go installation.
There are no prebuilt binaries available for download at the moment, so you'll need to build the binary from source:
go get -u github.com/torvall/clico
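Note that on Go 1.17 and later, go get no longer builds or installs binaries, so if the command above doesn't produce one, go install with the same import path should do the trick:

go install github.com/torvall/clico@latest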
If your Ollama server is running locally and listening on port 11434, you're good to go. Otherwise, set the CLICO_SERVER env var to the address of your Ollama server:
export CLICO_SERVER=http://192.168.1.5:11434
If you don't have the llama3.1 model or want to use a different one, you can set the CLICO_MODEL env var:
export CLICO_MODEL=codellama
Clico can work in three different ways:
- clico pipe - takes the data from stdin and feeds it to the LLM together with the prompt; useful to manipulate data
- clico run - generates a command line from a natural language prompt, and optionally executes it
- clico explain - takes the data from stdin and runs it through the LLM along with a specific question
The command line of Clico is pretty simple. It takes the following form:
clico <command> "prompt"
Where <command> is one of the three commands above, and "prompt" is a natural language prompt that describes what you want to do. It's recommended to wrap the prompt in double quotes.
The model can be specified using the --model global flag. The default model is llama3.1. You can use the CLICO_MODEL environment variable to override this value. Other environment variables are available: CLICO_SERVER, to set the address of the Ollama server, and CLICO_TEMPERATURE, to set the temperature of the LLM.
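For example, to generate a command with a different model and a lower temperature for more deterministic output (the model name and temperature are just example values; check clico --help for exact flag placement):

$ export CLICO_TEMPERATURE=0.2
$ clico run --model codellama "list all open TCP ports"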
By default, Clico will pass the OS, architecture and shell to the LLM. You can override these values using the --os, --arch or --shell global flags. This can be helpful to generate commands or get explanations for a different system than the one you're on. Run clico --help to see the values detected for your system.
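For instance, to generate a command for a different platform than the one you're on (the target values below are just examples, and the generated command will depend on the model):

$ clico run --os freebsd --arch arm64 --shell csh "show disk usage for the root filesystem"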
- Expose more model options (context size, top_k/top_p, etc.)
- Add support for "no solution" from the LLM
- Customise internal prompts according to available data
- Support image upload in requests
- Add option to include shell history in explain requests
- Add ANSI support
- Add some sort of testing
- Add a custom system prompt
- Allow custom prompts
- Support providers other than Ollama
MIT License
Copyright (c) 2024 António Maria Torre do Valle
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.