How to set up a new LLM?
January 21, 2026 · View on GitHub
If you want to test out a model that isn't yet available in the runner, you can add support for it by following these steps:
- Ensure that the provider of the model is supported by the AI SDK.
- Find the provider for the model in `runner/codegen/ai-sdk`. If the provider doesn't exist, implement it by following the pattern from the existing providers.
- Add your model to the `SUPPORTED_MODELS` array.
- Done! 🎉 You can now run your model by passing `--model=<your model ID>`.