Supported Models
OpenAI
1. Using REST API
When making a request using the REST API, use the Flowstack base URL: https://openai.flowstack.ai/v1.
Include the following headers:
- Authorization: This should contain your OpenAI API key.
- Flowstack-Auth: This should contain your Flowstack key.
Here’s a sample curl command:
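The request below targets the standard chat completions endpoint and assumes OpenAI's usual Bearer scheme for the Authorization header; the model name and the two environment variables holding your keys are illustrative placeholders:

```bash
curl https://openai.flowstack.ai/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Flowstack-Auth: $FLOWSTACK_KEY" \
  -d '{
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": "Say hello."}]
      }'
```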
2. Using OpenAI Python Library
First, ensure you have the OpenAI library installed:
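```bash
pip install openai
```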
Set the proxy by replacing openai.api_base with the Flowstack base URL. Additionally, include the necessary headers in your request.
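A minimal sketch, assuming a pre-1.0 version of the openai package (which exposes openai.api_base and accepts per-request headers) and an illustrative model name; newer releases configure base_url and default_headers on the client object instead:

```python
import openai

openai.api_key = "YOUR_OPENAI_API_KEY"              # your OpenAI key, forwarded by the proxy
openai.api_base = "https://openai.flowstack.ai/v1"  # route requests through Flowstack

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",  # illustrative model name
    messages=[{"role": "user", "content": "Say hello."}],
    # Per-request header carrying your Flowstack key
    headers={"Flowstack-Auth": "YOUR_FLOWSTACK_KEY"},
)

print(response["choices"][0]["message"]["content"])
```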
3. Using OpenAI NodeJS Library
Ensure you have the OpenAI NodeJS library installed:
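```bash
npm install openai
```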
Set the Flowstack URL as the base URL and add the necessary headers:
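A minimal sketch, assuming version 4 of the openai NodeJS package (whose client constructor accepts baseURL and defaultHeaders) and an illustrative model name:

```javascript
import OpenAI from "openai";

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,        // your OpenAI key, forwarded by the proxy
  baseURL: "https://openai.flowstack.ai/v1", // route requests through Flowstack
  defaultHeaders: { "Flowstack-Auth": process.env.FLOWSTACK_KEY },
});

const completion = await openai.chat.completions.create({
  model: "gpt-3.5-turbo", // illustrative model name
  messages: [{ role: "user", content: "Say hello." }],
});

console.log(completion.choices[0].message.content);
```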
Streaming Responses
Flowstack’s OpenAI proxy supports streaming responses, enabling real-time interaction with the model. To integrate streaming in your Python or Node.js application, direct your API calls to Flowstack’s proxy and include the necessary Flowstack authentication header.
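As a sketch in Python (same assumptions as above: pre-1.0 openai package, illustrative model name), streaming only requires passing stream=True and iterating over the returned chunks:

```python
import openai

openai.api_key = "YOUR_OPENAI_API_KEY"
openai.api_base = "https://openai.flowstack.ai/v1"  # route requests through Flowstack

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",  # illustrative model name
    messages=[{"role": "user", "content": "Write a haiku about proxies."}],
    stream=True,  # receive the completion incrementally
    headers={"Flowstack-Auth": "YOUR_FLOWSTACK_KEY"},
)

# Each chunk carries a delta with the next piece of the assistant message
for chunk in response:
    delta = chunk["choices"][0]["delta"]
    print(delta.get("content", ""), end="", flush=True)
print()
```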