What is Monty?
Monty combines the intelligence of LLMs with the context of your code using a technique called RAG (retrieval augmented generation). Monty will automatically update its context to find code relevant to the task at hand.
Try Monty for free. After the trial, Monty remains free for personal use with your own LLM API key. Currently Monty works with OpenAI, Azure OpenAI, Anthropic, and Google AI Studio, as well as OpenAI API-compatible providers such as Groq and Ollama.
Monty builds a local index of your codebase, which allows it to automatically find relevant code. The index never leaves your machine; your code is only seen by your LLM.
Codebases are constantly changing. Monty keeps its code index up to date so its knowledge is never stale.
As long as your code is in text files, Monty can read it, no matter what language you use.
Now with manual context control
Monty provides helpers that let you precisely specify the relevant files, functions, classes, methods, and code blocks.
Try Document Mode
Monty can optionally use a markdown document instead of chat history when you’re trying to produce a shareable document, ticket, or specification.
In this mode, Monty uses your document's content and your instructions to search for relevant code. As you update the document, Monty's context updates as well.
Interested in a commercial license? Drop us a line.
Monty requires either Windows (x86_64), Linux (x86_64), or macOS (Apple Silicon) with VSCode and Git installed.
At this time only VSCode is supported, though we plan to support other IDEs in the future (e.g. JetBrains).
Monty works with local repositories and builds local indices. The only code that leaves your system is what's sent to your LLM.
VSCode remote workspaces and WSL (Windows Subsystem for Linux) are both supported.
Monty currently supports OpenAI, Azure OpenAI, Anthropic, and Google AI Studio, as well as OpenAI API-compatible providers such as Groq and Ollama.