25 commits
34918c0 docs: fix formatting in README (wincent, Oct 2, 2023)
9d42b2f chore: remove debugging `println!` (wincent, Jul 19, 2024)
91be267 feat: refactor health check to allow spaces in SHELLBOT (wincent, Jul 19, 2024)
7d5ad27 style: make some cosmetic improvements (wincent, Oct 20, 2023)
1a5ca7d refactor!: use NBSP at front of headers as well (wincent, Oct 21, 2023)
841a2ec docs: fix outdated instruction in README.md (wincent, Jul 19, 2024)
d98d049 chore: try GPT-4o (wincent, Jul 19, 2024)
d95b524 refactor: move Neovim files into `lua/` subdirectory (wincent, Oct 11, 2024)
0682fbd feat: add syntax file (wincent, Oct 11, 2024)
3201989 feat: add `:ChatGPT` command (wincent, Oct 11, 2024)
a089bc6 docs: add instructions for installation (wincent, Oct 11, 2024)
0809fdb feat: add an ftplugin file (wincent, Oct 11, 2024)
a24dafd refactor: move settings from module into ftplugin (wincent, Oct 11, 2024)
321ea3e refactor: replace deprecated `nvim_buf_set_option()` call (wincent, Oct 11, 2024)
c934b21 refactor: move mappings into ftplugin (wincent, Oct 11, 2024)
7a92356 feat: add ability to set model at runtime via env var (wincent, Nov 20, 2024)
f848d28 fix: suppress system prompt for "o1-preview" and "o1-mini" models (wincent, Nov 20, 2024)
f63da96 refactor!: rename `:ChatGPT` to `:Shellbot` (wincent, Jan 6, 2025)
c9f2d71 chore: default Anthropic to "claude-3-5-sonnet-20241022" (wincent, Jan 6, 2025)
eb6d184 docs: document new environment variables (wincent, Jan 6, 2025)
78a6114 chore: bump default Anthropic model to claude-3-7-sonnet-20250219 (wincent, Feb 24, 2025)
4f06149 feat: teach `chatbot()` to take an `env` table (wincent, Apr 29, 2025)
84b1c35 chore: bump OpenAI model to GPT-4.1 (wincent, May 12, 2025)
f8470e5 chore: bump models to latest (wincent, Sep 29, 2025)
9164ff4 chore: bump ANTHROPIC_MODEL default from old Sonnet to new Opus (wincent, Feb 6, 2026)
82 changes: 70 additions & 12 deletions README.md
@@ -1,31 +1,89 @@
# Streaming ChatGPT for Shell and Neovim
AI assistants are transformational for programmers. However, ChatGPT 4 is also relatively slow. Streaming its responses greatly improves the user experience. These utilities attempts to bring these tools closer to the command-line and editor while preserving streaming. There are three parts here:
# Streaming LLM for Shell and Neovim

AI assistants are transformational for programmers. However, models like ChatGPT 4 are also relatively slow. Streaming their responses greatly improves the user experience. These utilities attempt to bring those tools closer to the command line and the editor while preserving streaming. There are three parts here:

1. A Rust binary that reads input from stdin and streams completion responses to stdout
2. A shell script that builds a little REPL over that binary
3. A Neovim Lua plug-in that brings this functionality into the editor


## Rust program

The Rust program can be built with `cargo build`. It expects an `OPENAI_API_KEY` and/or an `ANTHROPIC_API_KEY` environment variable. If both keys are provided, Anthropic is used. The Rust program can take two kinds of input, read from stdin:
1. Raw input
In this case, a System prompt is provided in the compiled code
2. Transcript
The Rust program also accepts a homegrown "transcript" format in which transcript sections are delineated by lines which look like this

1. **Raw input:** In this case, a System prompt is provided in the compiled code
2. **Transcript:** The Rust program also accepts a homegrown "transcript" format in which transcript sections are delineated by lines that look like this:

```
===USER===
```
If a transcript does not start with a System section, then the default System prompt is used.

## Lua script
The included lua script can be copied to `.config/nvim/lua` and installed with something like
If a transcript does not start with a System section, then the default System prompt is used. The default prompt can be overridden with the contents of a file passed as the first argument to the executable.
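For example, a short transcript might look like this (the `===SYSTEM===` and `===ASSISTANT===` section names are assumptions extrapolated from the `===USER===` delimiter shown above):

```
===SYSTEM===
You are a terse assistant.
===USER===
What does `fold -s` do?
===ASSISTANT===
It wraps long lines, breaking at spaces.
```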

To override the default Anthropic model (`claude-opus-4-6`), specify the desired model via the `ANTHROPIC_MODEL` environment variable.

To override the default OpenAI model (`gpt-5`), set the `OPENAI_MODEL` environment variable to the desired value.
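As a sketch, the model overrides are ordinary environment variables (values taken from the defaults documented above; `$SHELLBOT` is the path to the built binary):

```shell
# Override the default models before invoking the binary.
export ANTHROPIC_MODEL="claude-opus-4-6"   # Anthropic default shown above
export OPENAI_MODEL="gpt-5"                # OpenAI default shown above
# Then run the binary as usual, for example:
#   printf 'hello\n' | "$SHELLBOT"
```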

## Installation

### Using `git clone`

```
mkdir -p ~/.config/nvim/pack/bundle/start
git clone https://github.com/wolffiex/shellbot.git ~/.config/nvim/pack/bundle/start/shellbot
cd ~/.config/nvim/pack/bundle/start/shellbot
cargo build
```

### Using `packer.nvim`

```lua
use {
'wolffiex/shellbot',
run = 'cargo build'
}
```
vim.cmd("command! ChatGPT lua require'chatgpt'.chatgpt()")

### Using `vim-plug`

```vim
Plug 'wolffiex/shellbot', { 'do': 'cargo build' }
```

This command locates the Rust binary through the `SHELLBOT` environment variable. This should be set to the absolute path of the rust binary built in the step above.
### Using `dein.vim`

```vim
call dein#add('wolffiex/shellbot', { 'build': 'cargo build' })
```

### Using `lazy.nvim`

```lua
{
'wolffiex/shellbot',
build = 'cargo build'
}
```

### Using `Vundle`

```vim
Plugin 'wolffiex/shellbot'
```

After installation, run `:!cargo build` in the plugin directory.

## Neovim commands

### `:Shellbot`

The plugin defines a `:Shellbot` command that locates the Rust binary through the `SHELLBOT` environment variable. This should be set to the absolute path of the Rust binary built in the step above.
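For example, assuming the `git clone` installation layout shown above (a hypothetical path; adjust it for your own checkout or a `--release` build):

```shell
# Point SHELLBOT at the debug binary produced by `cargo build`.
export SHELLBOT="$HOME/.config/nvim/pack/bundle/start/shellbot/target/debug/shellbot"
```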

This plugin is optimized for streaming. It keeps new output in view by repositioning the cursor at the end of the buffer as text is appended, and it takes care to keep working if the user switches away from the window where the response is coming in. To turn off the cursor movement while a response is streaming, hit "Enter" or "Space"; this frees the cursor for the rest of the response.

### `:checkhealth shellbot`

Verifies that the file defined by `SHELLBOT` exists and is executable.

## Shell script

`shellbot.sh` can be used from the command line in cases where the editor isn't active. Because it uses `fold` for word wrap, it works best in a narrow window. The first prompt is composed in `$EDITOR`; subsequent prompts are taken with `read`. Hitting Enter on a blank line submits the prompt.
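The loop just described can be sketched roughly as follows. This is a simplified illustration, not the actual `shellbot.sh`; the `wrap` helper and the exact read loop are assumptions:

```shell
# Rough sketch of the REPL loop described above -- NOT the real shellbot.sh.
wrap() { fold -s -w "${COLUMNS:-80}"; }   # word wrap via fold, as described

prompt=""
while IFS= read -r line; do
  [ -z "$line" ] && break                 # Enter on a blank line submits
  prompt="$prompt$line
"
done
# The real script would send the prompt to the shellbot binary here;
# this sketch just prints the wrapped prompt.
printf '%s' "$prompt" | wrap
```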
18 changes: 18 additions & 0 deletions ftplugin/shellbot.lua
@@ -0,0 +1,18 @@
vim.bo.buflisted = true
vim.bo.buftype = 'nofile'
vim.bo.modified = false
vim.bo.textwidth = 0
vim.wo.breakindent = true
vim.wo.linebreak = true
vim.wo.list = false
vim.wo.number = false
vim.wo.relativenumber = false
vim.wo.showbreak = 'NONE'
vim.wo.wrap = true

local has_shellbot = pcall(require, 'chatbot')
if has_shellbot then
vim.keymap.set({ 'i', 'n' }, '<M-CR>', ChatBotSubmit, { buffer = true })
vim.keymap.set({ 'i', 'n' }, '<C-Enter>', ChatBotSubmit, { buffer = true })
vim.keymap.set({ 'i', 'n' }, '<C-o>', ChatBotNewBuf, { buffer = true })
end
48 changes: 22 additions & 26 deletions chatbot.lua → lua/chatbot.lua
@@ -4,11 +4,14 @@ local is_receiving = false
local bot_cmd = os.getenv("SHELLBOT")
local separator = "==="

local nbsp = ' '
local roles = {
USER = "◭🧑 " .. os.getenv('USER'),
ASSISTANT = "◮🤖 vimbot",
USER = nbsp .. "🤓 «" .. os.getenv('USER') .. "»" .. nbsp,
ASSISTANT = nbsp .. "🤖 «vimbot»" .. nbsp,
}

local buffer_env = {}

local buffer_sync_cursor = {}
function ChatBotCancelCursorSync()
local bufnr = vim.api.nvim_get_current_buf()
@@ -43,6 +46,10 @@ function ChatBotSubmit()
vim.cmd("normal! Go")
local winnr = vim.api.nvim_get_current_win()
local bufnr = vim.api.nvim_get_current_buf()
local env = buffer_env[bufnr] and vim.tbl_extend('keep', buffer_env[bufnr], {
SHELLBOT_LOG_FILE = vim.env['SHELLBOT_LOG_FILE'],
})
local clear_env = not not env
buffer_sync_cursor[bufnr] = true
local function receive_stream(_, data, _)
if #data > 1 or data[1] ~= '' then
@@ -95,9 +102,9 @@ function ChatBotSubmit()
local function get_transcript()
local lines = vim.api.nvim_buf_get_lines(bufnr, 0, -1, false)
for i, line in ipairs(lines) do
if line:match("^◭") then -- '^' means start of line
if line:match('^' .. nbsp .. '🤓') then -- '^' means start of line
lines[i] = separator .. "USER" .. separator
elseif line:match("^◮") then
elseif line:match('^' .. nbsp ..'🤖') then
lines[i] = separator .. "ASSISTANT" .. separator
end
end
@@ -112,6 +119,8 @@ function ChatBotSubmit()
local output = {}

local job_id = vim.fn.jobstart(bot_cmd, {
clear_env = clear_env,
env = env,
on_stdout = function(_, data, _)
if data[1] ~= "" then
table.insert(output, data[1])
@@ -154,6 +163,8 @@ function ChatBotSubmit()
end

local job_id = vim.fn.jobstart(bot_cmd, {
clear_env = clear_env,
env = env,
on_stdout = receive_stream,
on_exit = stream_done,
on_stderr = function(_, data, _)
@@ -202,40 +213,25 @@ function ChatBotSubmit()
end

function ChatBotNewBuf()
local bufnr = vim.api.nvim_get_current_buf()
vim.cmd("enew")
ChatBotInit()
ChatBotInit(buffer_env[bufnr])
end

function ChatBotInit()
function ChatBotInit(env)
local winnr = vim.api.nvim_get_current_win()
local bufnr = vim.api.nvim_get_current_buf()
buffer_env[bufnr] = env
buffer_sync_cursor[bufnr] = true
vim.wo.breakindent = true
vim.wo.wrap = true
vim.wo.linebreak = true
vim.api.nvim_buf_set_option(bufnr, 'filetype', 'shellbot')
vim.api.nvim_buf_set_option(bufnr, 'buftype', 'nofile')
vim.api.nvim_buf_set_option(bufnr, 'buflisted', true)
vim.api.nvim_buf_set_option(bufnr, 'modified', false)
vim.api.nvim_set_option_value('filetype', 'shellbot', { buf = bufnr })
add_transcript_header(winnr, bufnr, "USER", 0)
local modes = { 'n', 'i' }
for _, mode in ipairs(modes) do
vim.api.nvim_buf_set_keymap(bufnr, mode, '<C-Enter>', '<ESC>:lua ChatBotSubmit()<CR>',
{ noremap = true, silent = true })
vim.api.nvim_buf_set_keymap(bufnr, mode, '<C-o>', '<ESC>:lua ChatBotNewBuf()<CR>',
{ noremap = true, silent = true })
end
end

function M.chatbot()
function M.chatbot(env)
vim.cmd("botright vnew")
vim.cmd("set winfixwidth")
vim.cmd("vertical resize 60")
ChatBotInit()
end

function M.chatbot_init()
ChatBotInit()
ChatBotInit(env)
end

function ChatBotCancelResponse()
21 changes: 21 additions & 0 deletions lua/shellbot/health.lua
@@ -0,0 +1,21 @@
local health = vim.health -- after: https://github.com/neovim/neovim/pull/18720
or require('health') -- before: v0.8.x

return {
-- Run with `:checkhealth shellbot`
check = function()
local shellbot = vim.env['SHELLBOT']
if shellbot == nil then
health.warn('SHELLBOT environment variable is not set')
else
local executable = vim.fn.split(shellbot, ' ')[1]
if executable == nil then
health.warn('SHELLBOT environment variable is empty')
elseif vim.fn.executable(executable) ~= 1 then
health.warn('SHELLBOT (' .. vim.inspect(shellbot) .. ') is not executable')
else
health.ok('SHELLBOT environment variable is set to an executable')
end
end
end,
}
12 changes: 12 additions & 0 deletions plugin/shellbot.lua
@@ -0,0 +1,12 @@
vim.api.nvim_create_user_command('Shellbot', function()
local shellbot = require('chatbot')
local env = vim.env['SHELLBOT']
if env ~= nil then
local executable = vim.fn.split(env, ' ')[1]
if executable ~= nil and vim.fn.executable(executable) == 1 then
shellbot.chatbot()
return
end
end
vim.api.nvim_err_writeln('error: SHELLBOT does not appear to be executable')
end, {})
16 changes: 0 additions & 16 deletions shellbot/health.lua

This file was deleted.

10 changes: 7 additions & 3 deletions src/anthropic.rs
@@ -5,8 +5,8 @@ use reqwest::header::{HeaderMap, HeaderValue};
use reqwest::{Client, RequestBuilder};
use serde::{Deserialize, Serialize};

const MODEL: &str = "claude-3-opus-20240229";
pub fn get_request(api_key: &str, request: ChatRequest) -> RequestBuilder {
const MODEL: &str = "claude-opus-4-6";
pub fn get_request(api_key: &str, model: &str, request: ChatRequest) -> RequestBuilder {
let client = Client::new();
let url = "https://api.anthropic.com/v1/messages";
let mut headers = HeaderMap::new();
@@ -28,7 +28,11 @@ pub fn get_request(api_key: &str, request: ChatRequest) -> RequestBuilder {
);

let request = RequestJSON {
model: MODEL.to_string(),
model: if model.is_empty() {
MODEL.to_string()
} else {
model.to_string()
},
system: request.system_prompt,
messages: request.transcript,
stream: true,
16 changes: 10 additions & 6 deletions src/api.rs
@@ -11,14 +11,18 @@ use crate::sse::SSEConverter;
use crate::sse::SSEvent;

pub enum ApiProvider {
OpenAI(String),
Anthropic(String),
OpenAI(String, String),
Anthropic(String, String),
}

pub fn stream_response<'a>(provider: ApiProvider, request: ChatRequest) -> Receiver<String> {
let request = match provider {
ApiProvider::OpenAI(ref api_key) => openai::get_request(&api_key, request),
ApiProvider::Anthropic(ref api_key) => anthropic::get_request(&api_key, request),
ApiProvider::OpenAI(ref api_key, ref model) => {
openai::get_request(&api_key, &model, request)
}
ApiProvider::Anthropic(ref api_key, ref model) => {
anthropic::get_request(&api_key, &model, request)
}
};
let (sender, receiver) = mpsc::channel(100);
tokio::spawn(async move { send_response(&provider, request, sender).await });
@@ -98,8 +102,8 @@ fn convert_chunk(chunk: Bytes) -> String {

fn process_sse(provider: &ApiProvider, event: SSEvent) -> Option<String> {
match provider {
ApiProvider::Anthropic(_) => anthropic::convert_sse(event),
ApiProvider::OpenAI(_) => openai::convert_sse(event),
ApiProvider::Anthropic(_, _) => anthropic::convert_sse(event),
ApiProvider::OpenAI(_, _) => openai::convert_sse(event),
}
}

14 changes: 10 additions & 4 deletions src/main.rs
@@ -38,8 +38,16 @@ async fn main() {
}
let request = structure_input();
let provider = std::env::var("ANTHROPIC_API_KEY")
.map(ApiProvider::Anthropic)
.or_else(|_| std::env::var("OPENAI_API_KEY").map(ApiProvider::OpenAI))
.map(|key| {
let model = std::env::var("ANTHROPIC_MODEL").unwrap_or_default();
ApiProvider::Anthropic(key, model)
})
.or_else(|_| {
std::env::var("OPENAI_API_KEY").map(|key| {
let model = std::env::var("OPENAI_MODEL").unwrap_or_default();
ApiProvider::OpenAI(key, model)
})
})
.unwrap_or_else(|_| panic!("No API key provided"));
let mut receiver = stream_response(provider, request);

@@ -71,13 +79,11 @@ fn structure_input() -> ChatRequest {
let args: Vec<String> = std::env::args().collect();
let system_prompt = if args.len() > 1 {
let file_path = &args[1];
println!("FILE {:?}", file_path);
let mut file = File::open(file_path).unwrap_or_else(|_| {
panic!("Failed to open file: {}", file_path);
});
let mut contents = String::new();
file.read_to_string(&mut contents).unwrap();
println!("contents {:?}", contents);
contents
} else {
get_default_prompt()