`.with_telnyx()` helper:
## Hosted models
These run on Telnyx infrastructure, so no external API key is needed, just your `TELNYX_API_KEY`:
| Model | Description |
|---|---|
| `moonshotai/Kimi-K2.5` | Moonshot AI's latest model |
| `zai-org/GLM-5` | Zhipu AI's latest model |
| `MiniMaxAI/MiniMax-M2.5` | MiniMax's latest model |
## Proprietary models (BYOK)
For models like GPT-4o or Claude, Telnyx proxies the request using your own API key. Add your provider key in the Telnyx Portal under Inference settings.

## How it works
The `.with_telnyx()` helper points the standard `livekit-plugins-openai` package at the Telnyx inference endpoint. The API shape matches OpenAI's, so tool calling, streaming, and function calls all work unchanged.
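To make the mechanism concrete, here is a minimal sketch of what a `with_telnyx`-style helper configures under the hood. The base URL and the exact helper signature are assumptions for illustration, not the plugin's actual implementation; the real helper returns a configured LLM instance rather than a plain dict.

```python
import os
from typing import Optional

# Assumed Telnyx OpenAI-compatible inference endpoint (illustrative).
TELNYX_BASE_URL = "https://api.telnyx.com/v2/ai"


def with_telnyx(model: str, api_key: Optional[str] = None) -> dict:
    """Sketch of a with_telnyx-style helper: reuse the OpenAI client
    configuration, but swap in the Telnyx endpoint and API key."""
    key = api_key or os.environ.get("TELNYX_API_KEY")
    if not key:
        raise ValueError("TELNYX_API_KEY is not set")
    # In the real plugin this would construct an OpenAI-compatible
    # LLM client; here we just return the resolved configuration.
    return {"base_url": TELNYX_BASE_URL, "api_key": key, "model": model}
```

Because the endpoint speaks the OpenAI chat-completions protocol, the rest of the agent pipeline needs no changes: only the base URL and credential differ.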