1. Using the REST API

When making requests via the REST API, use the Flowstack base URL: https://openai.flowstack.ai/v1.

Include the following headers:

  • Authorization: your OpenAI API key, in the form Bearer YOUR_OPENAI_KEY.
  • Flowstack-Auth: your Flowstack key, in the form Bearer YOUR_FLOWSTACK_KEY.

Here’s a sample curl command:

curl --request POST \
     --url https://openai.flowstack.ai/v1/chat/completions \
     --header 'Authorization: Bearer YOUR_OPENAI_KEY' \
     --header 'Content-Type: application/json' \
     --header 'Flowstack-Auth: Bearer YOUR_FLOWSTACK_KEY' \
     --data '{
       "model": "gpt-3.5-turbo",
       "messages": [
           {"role": "user", "content": "Hi"}
       ]
     }'

2. Using the OpenAI Python Library

First, ensure you have the OpenAI library installed:

pip install openai

Point the library at the proxy by setting openai.api_base to the Flowstack base URL, and pass the Flowstack key as an extra header on each request. Note that this example targets the pre-1.0 openai package; openai.api_base and openai.ChatCompletion were removed in v1.0 (see the sketch after the example).

import openai

# Route all requests through Flowstack's proxy instead of api.openai.com
openai.api_base = "https://openai.flowstack.ai/v1"
openai.api_key = "YOUR_OPENAI_KEY"

response = openai.ChatCompletion.create(
    # Per-request header carrying your Flowstack key
    headers={"Flowstack-Auth": "Bearer YOUR_FLOWSTACK_KEY"},
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello!"}],
)

print(response["choices"][0]["message"]["content"])
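
If you are on openai>=1.0, openai.api_base and openai.ChatCompletion no longer exist; a minimal equivalent setup (a sketch, mirroring the Node.js configuration in the next section) uses the client object's base_url and default_headers:

from openai import OpenAI

client = OpenAI(
    api_key="YOUR_OPENAI_KEY",
    base_url="https://openai.flowstack.ai/v1",  # Flowstack proxy base URL
    default_headers={"Flowstack-Auth": "Bearer YOUR_FLOWSTACK_KEY"},
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)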

3. Using the OpenAI Node.js Library

Ensure you have the OpenAI Node.js library installed:

npm install openai

Set the Flowstack URL as the base URL and add the Flowstack-Auth header:

import OpenAI from "openai";

const openai = new OpenAI({
    apiKey: "YOUR_OPENAI_KEY",
    baseURL: "https://openai.flowstack.ai/v1",
    defaultHeaders: {
      "Flowstack-Auth": "Bearer YOUR_FLOWSTACK_KEY",
    },
});

const chatCompletion = await openai.chat.completions.create({
    messages: [{ role: "user", content: "Hello!" }],
    model: "gpt-3.5-turbo",
});

console.log(chatCompletion.choices[0].message.content);

Streaming Responses

Flowstack’s OpenAI proxy supports streaming responses, enabling real-time interaction with the model. To stream in your Python or Node.js application, direct your API calls to Flowstack’s proxy as shown above, include the Flowstack-Auth header, and set the stream option on the request.
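
As an illustration, here is a minimal Python sketch using the same pre-1.0 SDK pattern as section 2; the only changes from the earlier example are stream=True and iterating over the returned chunks:

import openai

openai.api_base = "https://openai.flowstack.ai/v1"
openai.api_key = "YOUR_OPENAI_KEY"

response = openai.ChatCompletion.create(
    # Per-request header carrying your Flowstack key
    headers={"Flowstack-Auth": "Bearer YOUR_FLOWSTACK_KEY"},
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Tell me a short story."}],
    stream=True,  # ask for incremental chunks instead of one final response
)

# Each chunk carries a delta holding the next piece of the message
for chunk in response:
    delta = chunk["choices"][0]["delta"]
    if "content" in delta:
        print(delta["content"], end="", flush=True)

With the Node.js client, the equivalent is passing stream: true to create and consuming the response with for await...of.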