llawk is an AWK-like simple text processor that works with an LLM (Large Language Model). It lets you easily perform text conversion, extraction, translation, and other text-processing tasks.
For example:
$ ls -l | llawk -f json "Extract file names and sizes"
{
  "files": [
    {
      "name": "LICENSE",
      "size": 1079
    },
    {
      "name": "README.md",
      "size": 3358
    },
    ...
  ]
}
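Because llawk writes plain JSON to stdout, the result is easy to post-process in any language. A minimal sketch that parses the sample output above (the file names and sizes come from the example, not a live run):

```python
import json

# Sample of llawk's -f json output, copied from the example above.
raw = """
{
  "files": [
    {"name": "LICENSE", "size": 1079},
    {"name": "README.md", "size": 3358}
  ]
}
"""

data = json.loads(raw)
for entry in data["files"]:
    print(entry["name"], entry["size"])
```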
- OpenAI
  - GPT-4.1
  - GPT-4.1-mini
  - GPT-4.1-nano
  - GPT-4o
  - GPT-4o-mini
  - o3
  - o4-mini
- Google
  - gemini-2.0-flash
  - gemini-2.0-flash-lite
  - gemini-1.5-pro
- Anthropic
  - claude-3.7-sonnet
  - claude-3.5-haiku
- Ollama
  - any model that supports chat completion
To install llawk, use the following command:
$ go install github.com/llawk/llawk@latest
llawk takes an instruction from the command line, and an input text from stdin or a file.
$ echo hello | llawk "Translate to Japanese"
       ^^^^^         ^^^^^^^^^^^^^^^^^^^^^^^
       Input         Instruction
$ llawk -i input.txt "Translate to Japanese"
           ^^^^^^^^^ ^^^^^^^^^^^^^^^^^^^^^^^
           InputFile Instruction
It supports a JSON output format using the -f json option:
$ ls -l | llawk -f json "Extract file names and sizes"
You can also specify a JSON Schema for the output:
$ echo hello | llawk -f '{"type":"object","properties":{"japanese":{"type":"string"},"french":{"type":"string"}}}' "Translate to Japanese"
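Hand-writing a JSON Schema inside shell quotes is error-prone; building it programmatically and serializing it avoids quoting mistakes. A sketch that produces the same schema string as the example above (how llawk consumes the -f argument is unchanged):

```python
import json

# The JSON Schema from the example: an object with two string-valued fields.
schema = {
    "type": "object",
    "properties": {
        "japanese": {"type": "string"},
        "french": {"type": "string"},
    },
}

# json.dumps yields a single-line string suitable for the -f argument.
schema_arg = json.dumps(schema, separators=(",", ":"))
print(schema_arg)
```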
You can specify the model using the -m option or the LLAWK_MODEL environment variable:
$ echo hello | llawk -m gemini-2.0-flash "Translate to Japanese"
To see the list of supported models, run llawk -m list.
OpenAI models require an API key to access.
Please set the OPENAI_API_KEY environment variable before using OpenAI models.
$ export OPENAI_API_KEY=sk-xxxxxx
$ echo hello | llawk -m gpt-4o "Translate to Japanese"
You can specify the OpenAI API organization using the OPENAI_ORG_ID environment variable.
Google models require an API key to access.
Please set the GEMINI_API_KEY environment variable before using Google models.
$ export GEMINI_API_KEY=xxxxxx
$ echo hello | llawk -m gemini-2.0-flash "Translate to Japanese"
Anthropic models require an API key to access.
Please set the ANTHROPIC_API_KEY environment variable before using Anthropic models.
$ export ANTHROPIC_API_KEY=xxxxxx
$ echo hello | llawk -m claude-3.5-haiku "Translate to Japanese"
You can use Ollama models without any special settings because Ollama does not require an API key.
Any model supported by Ollama that offers chat completion can be used.
$ echo hello | llawk -m ollama:llama3.2 "Translate to Japanese"
$ echo hello | llawk -m ollama:hf.co/elyza/Llama-3-ELYZA-JP-8B-GGUF "Translate to Japanese"
If you want to connect to a specific Ollama server, you can specify the server URL with the OLLAMA_HOST environment variable.
$ export OLLAMA_HOST=https://192.168.1.123:11434
$ echo hello | llawk -m ollama:llama3.2 "Translate to Japanese"