Proxy chat completion via OpenRouter
POST /llm.v1.services.gateway.v1.LLMGatewayService/OpenRouterChatCompletion
Transparent pass-through to the OpenRouter chat completion API. Sends the provided messages along with a generation config and returns the model's response unmodified. No business logic is applied; this is a direct proxy for ad-hoc LLM calls outside of conversation flows.
Request
Responses
- 200
Chat completion returned successfully
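As an illustration, a request body for this endpoint might be assembled as in the sketch below. The field names (`model`, `messages`, `generation_config`) and the model identifier are assumptions inferred from the description above, not confirmed by this reference; consult the actual request schema for the real shape.

```python
import json

# Hypothetical request body for OpenRouterChatCompletion.
# Field names and values here are illustrative assumptions, not the
# confirmed schema of the gateway service.
request = {
    # OpenRouter-style model identifier (example value)
    "model": "openai/gpt-4o",
    # Chat messages to pass through to the model
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Say hello in one sentence."},
    ],
    # Generation config forwarded alongside the messages
    "generation_config": {
        "temperature": 0.2,
        "max_tokens": 256,
    },
}

# Serialize to JSON for the POST body
body = json.dumps(request)
print(len(body) > 0)
```

Because the endpoint is a direct proxy, the response body would be the model's completion as returned by OpenRouter, with no gateway-side transformation.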