LobeChat supports Qhaigc through its built-in OpenAI provider. This tutorial covers two scenarios: configuring Qhaigc on the user side in the online or desktop app, and setting it up at the server level for a self-hosted deployment.

Prerequisites

Before you begin, make sure you have:
  • Access to a running LobeChat instance (online, desktop, or self-hosted)
  • A Qhaigc API key — get yours from the API Tokens page
  • For self-hosted deployments: the ability to set environment variables for your deployment

Option 1: Online or Desktop App

Use this path if you are running LobeChat at lobechat.com or as a desktop application.
1. Open LobeChat settings
Launch LobeChat and open the Settings panel.

2. Navigate to Language Model settings
Go to the Language Model (or Model) settings page.

3. Select the OpenAI provider
Find and select OpenAI from the list of providers.

4. Enter your Qhaigc API key
Paste your Qhaigc API key (starting with sk-) into the API Key field.

5. Save and choose a model
Save the configuration, return to the chat page, and select an available model to start chatting.
Note: The online version of LobeChat does not expose a custom Base URL field for the OpenAI provider. If you need to point to a custom endpoint, use the self-hosted path below.

Option 2: Self-Hosted Deployment

Use this path if you deploy LobeChat yourself and can modify environment variables.
1. Set the API key environment variable
In your deployment environment, set:
OPENAI_API_KEY=your-qhaigc-api-key

2. Set the proxy URL environment variable
Point LobeChat to the Qhaigc endpoint:
OPENAI_PROXY_URL=https://api.qhaigc.net/v1
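As a minimal sketch, the two variables above can be exported in the shell that launches your LobeChat process (the key value here is a placeholder, not a real token):

```shell
# Set the Qhaigc credentials before starting LobeChat.
# Replace the placeholder key with your real Qhaigc API key.
export OPENAI_API_KEY="sk-your-qhaigc-key"
export OPENAI_PROXY_URL="https://api.qhaigc.net/v1"

# Confirm both variables are visible to child processes before launching.
printenv OPENAI_API_KEY OPENAI_PROXY_URL
```

If you deploy with Docker or a platform like Vercel, set the same two variables through that platform's environment configuration instead of a shell session.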
3. Restart or redeploy
Restart your service or redeploy so the new environment variables take effect.

4. Verify the configuration
Open LobeChat and go to the model settings page to confirm that the configuration has loaded correctly.

Verify the Connection

  • Open a new chat and send a message.
  • If you receive a response, the connection is working.
  • For self-hosted deployments that remain unresponsive, verify that the environment variables are actually being read by the running instance and not overridden elsewhere.
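To rule out LobeChat itself when debugging, you can query the endpoint directly from the machine running your deployment. This is a sketch that assumes Qhaigc exposes the standard OpenAI-compatible /models route under the base URL configured above:

```shell
# Smoke-test the Qhaigc endpoint with the same key LobeChat uses.
# A JSON model list indicates the key and endpoint are working;
# an auth error points at the key, a timeout at the network.
curl -s "https://api.qhaigc.net/v1/models" \
  -H "Authorization: Bearer $OPENAI_API_KEY"
```

If this command succeeds but LobeChat still fails, the problem is in how the environment variables reach the LobeChat process rather than in the upstream service.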

Troubleshooting

  • Model list is empty: confirm that the OpenAI provider configuration is saved, refresh the page, and check that the upstream service is reachable.
  • Connection failed: check your API key and network connectivity; for self-hosted deployments, confirm that the environment variables took effect after the restart.