POST /v1beta/models/{model}:generateContent — Gemini-Compatible Chat
Call Gemini models using the native Google generateContent API format. Use the Qhaigc endpoint as a drop-in replacement for the Google AI API.
Qhaigc supports the native Gemini API format through the /v1beta/models/{model}:generateContent path. Send requests exactly as you would to the Google AI API — Qhaigc proxies and authenticates them for you using your Qhaigc API key, so you do not need a separate Google API key.

Endpoint:

POST https://api.qhaigc.net/v1beta/models/{model}:generateContent

Replace {model} in the path with the model name, such as gemini-2.5-flash or gemini-2.5-pro.
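The endpoint above can be called with a standard Gemini generateContent request body. The sketch below builds the URL, headers, and payload without sending anything; it assumes the Qhaigc API key is passed as a Bearer token, which is not confirmed by this page — check your dashboard for the exact auth header.

```python
import json

BASE_URL = "https://api.qhaigc.net/v1beta/models"

def build_generate_content_request(model: str, prompt: str, api_key: str):
    """Return (url, headers, payload) for a native generateContent call."""
    # {model} goes directly into the path, e.g. gemini-2.5-flash.
    url = f"{BASE_URL}/{model}:generateContent"
    headers = {
        "Content-Type": "application/json",
        # Assumed auth scheme -- adjust if Qhaigc expects a different header.
        "Authorization": f"Bearer {api_key}",
    }
    # Native Gemini request shape: a list of contents, each with role + parts.
    body = {
        "contents": [
            {"role": "user", "parts": [{"text": prompt}]}
        ]
    }
    return url, headers, json.dumps(body)

url, headers, payload = build_generate_content_request(
    "gemini-2.5-flash", "Explain RAG in three steps.", "YOUR_QHAIGC_API_KEY"
)
print(url)
```

Send the resulting payload with any HTTP client; existing Gemini SDK code only needs its base URL and key swapped.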
You can also pass "style": "gemini" in the body of POST /v1/chat/completions to receive a Gemini-format response from the unified endpoint. The native path shown here is recommended when you are migrating existing Gemini SDK code.
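The unified-endpoint alternative can be sketched as follows. Only the "style": "gemini" field is documented above; the surrounding "model"/"messages" shape is an assumption based on the OpenAI-style /v1/chat/completions path, so verify the exact fields against the main endpoint reference.

```python
import json

# Hypothetical body for POST /v1/chat/completions. The "style" key is taken
# from the docs above; the OpenAI-style "model"/"messages" fields are an
# assumption, not confirmed by this page.
body = {
    "model": "gemini-2.5-flash",
    "messages": [{"role": "user", "content": "Explain RAG in three steps."}],
    "style": "gemini",  # ask for a Gemini-format (candidates/parts) response
}
payload = json.dumps(body)
print(payload)
```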
Example response:

{
  "candidates": [
    {
      "content": {
        "role": "model",
        "parts": [
          {
            "text": "RAG works in three steps:\n\n1. **Retrieve** — search a knowledge base for documents relevant to the user's query...\n2. **Augment** — prepend those documents to the prompt as context...\n3. **Generate** — the LLM produces a grounded answer using the retrieved context."
          }
        ]
      },
      "finishReason": "STOP"
    }
  ],
  "usageMetadata": {
    "promptTokenCount": 18,
    "candidatesTokenCount": 76,
    "totalTokenCount": 94
  }
}
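A minimal sketch of reading such a response: join the text of every part in the first candidate, then read token usage from usageMetadata. The JSON here is a trimmed stand-in for the example response.

```python
import json

# Trimmed stand-in for a Gemini-format response body.
raw = """
{
  "candidates": [
    {
      "content": {"role": "model", "parts": [{"text": "RAG works in three steps..."}]},
      "finishReason": "STOP"
    }
  ],
  "usageMetadata": {"promptTokenCount": 18, "candidatesTokenCount": 76, "totalTokenCount": 94}
}
"""
response = json.loads(raw)

# Concatenate the text of every part in the first candidate.
candidate = response["candidates"][0]
answer = "".join(part["text"] for part in candidate["content"]["parts"])
total_tokens = response["usageMetadata"]["totalTokenCount"]

print(answer)        # RAG works in three steps...
print(total_tokens)  # 94
```

Check finishReason before trusting the text: a value other than STOP (e.g. MAX_TOKENS) means the answer may be truncated.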