Introducing Codey: Piping LLMs on the Command Line

2025-06-25 · Ryan X. Charles

Today I’m happy to introduce Codey, a new tool for piping LLMs on the command line.

The basic idea is that you can prompt an LLM and get a streaming response back in your terminal.

Your input looks like this:

codey prompt "What is the capital of France?"

Output:

The capital of France is Paris.

Codey supports pipes, so you can pipe text in and pipe the response onward to another command. For instance:

codey prompt "The capital of France is Paris. \
(Please respond in the form of a question.)" | codey prompt

Output:

The capital of France is Paris.

The first command rephrases the statement as a question, and the second command answers that question, so you end up back where you started.
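Since codey prompt reads from standard input when piped into (as in the example above), it also composes with ordinary Unix tools. For example, to send a file's contents as the prompt (draft.md here is a hypothetical filename):

cat draft.md | codey prompt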

Codey is designed to be configurable and to support many different LLM providers, including OpenAI, Anthropic, and xAI.

Because Codey is a command-line tool, it can easily be integrated into other tools. I am already using Codey in ChatVim, with plans to integrate it into other projects as well.
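Because responses stream to stdout, they can be redirected or captured like the output of any other command. A small sketch (tests.txt is a hypothetical filename):

codey prompt "List three test cases for a function that reverses a string." > tests.txt

This is what makes the tool easy to embed in editors and scripts: the integration surface is just text in and text out.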

Check it out at github.com/codeybeaver/codey.



Copyright © 2025 Ryan X. Charles