Langflow provides OpenAI integration through its OpenAI bundle components. This tutorial shows how to configure Qhaigc for text generation and, separately, for embeddings — because the two component types expose different fields.

Prerequisites

  • A running Langflow instance (local, Docker, or self-hosted)
  • A Qhaigc API key — get one from the API Tokens page
  • For RAG flows: an embedding component configured separately

Configuration Steps

1. Open Langflow and create a flow

Log in to the Langflow console and create a new flow.
2. Add the OpenAI text generation component

From the component library, drag in the OpenAI text generation component from the OpenAI bundle. This is your primary chat/LLM node.
3. Fill in the text component parameters

Configure the following fields in the component panel:
  • OpenAI API Key: Your Qhaigc API key (starts with sk-)
  • Model Name: e.g. gpt-4o, gpt-4o-mini, or another model
  • Temperature: Your preferred value (e.g. 0.7)
  • Max Tokens: Set as needed
Langflow’s OpenAI text generation component may not expose a Base URL field in the current version. Configure the fields listed above; refer to Langflow’s OpenAI bundle documentation for the exact fields available in your version.
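The panel fields above map one-to-one onto an OpenAI-style chat-completions request body. As a rough sketch (assuming Qhaigc follows the OpenAI request schema, which the /v1 base URL used in the embeddings step suggests; the model and prompt strings are just examples), this is the shape of the payload Langflow ultimately sends on the component's behalf:

```python
import json

def chat_payload(model: str, prompt: str,
                 temperature: float = 0.7, max_tokens: int = 512) -> dict:
    """Mirror the Langflow panel fields as an OpenAI-style
    chat-completions request body."""
    return {
        "model": model,                 # Model Name field
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,     # Temperature field
        "max_tokens": max_tokens,       # Max Tokens field
    }

payload = chat_payload("gpt-4o", "Hello from Langflow")
print(json.dumps(payload, indent=2))
```

The API key is not part of this body; it travels in the `Authorization: Bearer` header, which is why it has its own field in the panel.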
4. Add an OpenAI Embeddings component for RAG flows

If your flow includes a knowledge base or vector store, drag in a separate OpenAI Embeddings component and configure it:
  • OpenAI API Key: Your Qhaigc API key
  • OpenAI API Base: https://api.qhaigc.net/v1
  • Model: e.g. bge-m3 or text-embedding-3-large
Unlike the text generation component, the embedding component does expose an API base field, so the Qhaigc endpoint can be entered directly.
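To check the values in this table independently of Langflow, you can assemble the same OpenAI-style embeddings request by hand. A minimal sketch with Python's standard library (the key sk-example is a placeholder; bge-m3 comes from the table above; this builds the request without sending it):

```python
import json
import urllib.request

def embeddings_request(api_key: str, text: str,
                       model: str = "bge-m3") -> urllib.request.Request:
    """Build (not send) an OpenAI-style embeddings request against
    the Qhaigc API base from the table above."""
    body = json.dumps({"model": model, "input": text}).encode()
    return urllib.request.Request(
        "https://api.qhaigc.net/v1/embeddings",   # OpenAI API Base + route
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",  # OpenAI API Key field
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = embeddings_request("sk-example", "hello")
print(req.full_url)  # https://api.qhaigc.net/v1/embeddings
```

Passing the built request to `urllib.request.urlopen` would perform the actual call.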
5. Connect input and output components

Wire the Prompt, Memory, Vector Store, or Chat Output nodes to your main model node as needed by your flow.
6. Run a test

Execute a minimal flow to confirm the components return results from Qhaigc.
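If the flow fails and you want to rule out Langflow itself, a direct call to the chat endpoint isolates the API side. A hedged sketch (assuming the same OpenAI-compatible /v1/chat/completions route; `sanity_check` performs a real network call with your key, while `reply_text` only parses the expected response shape):

```python
import json
import urllib.request

QHAIGC_CHAT_URL = "https://api.qhaigc.net/v1/chat/completions"

def reply_text(response: dict) -> str:
    """Pull the assistant message out of an OpenAI-style response body."""
    return response["choices"][0]["message"]["content"]

def sanity_check(api_key: str, model: str = "gpt-4o-mini") -> str:
    """Send a one-message request and return the model's reply."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": "ping"}],
    }).encode()
    req = urllib.request.Request(
        QHAIGC_CHAT_URL, data=body, method="POST",
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return reply_text(json.load(resp))

# Offline demonstration of the response shape this check expects:
sample = {"choices": [{"message": {"role": "assistant", "content": "pong"}}]}
print(reply_text(sample))  # pong
```

If `sanity_check` returns a reply but the Langflow component fails, the problem is in the component configuration rather than the key or endpoint.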

Verifying the Connection

Your setup is working when:
  • The text generation component runs and returns a reply from Qhaigc
  • If you configured embeddings, the vectorization and retrieval steps complete without errors

Frequently Asked Questions

Why does the text component not have a Base URL field?
Langflow’s OpenAI bundle text generation component lists api_key, model, max_tokens, and temperature as its primary fields. A custom Base URL field may or may not be present depending on your Langflow version; check your version’s component panel or the official docs for confirmation.

Why do I need a separate embedding component for RAG?
Langflow’s knowledge base and retrieval chains require an independent embedding component; the text generation component does not provide embeddings automatically.

References