Supported Models
Anthropic
1. Using REST API
When making a request using the REST API, use the Flowstack base URL: https://anthropic.flowstack.ai/v1.
Include the following headers:
- Authorization: your Anthropic API key, passed as a Bearer token.
- Flowstack-Auth: your Flowstack key, passed as a Bearer token.
Here’s a sample curl command:
curl --request POST \
  --url https://anthropic.flowstack.ai/v1/complete \
  --header 'Authorization: Bearer YOUR_ANTHROPIC_KEY' \
  --header 'Content-Type: application/json' \
  --header 'Flowstack-Auth: Bearer YOUR_FLOWSTACK_KEY' \
  --data '{
    "model": "claude-2",
    "max_tokens_to_sample": 300,
    "prompt": "\n\nHuman: Hello\n\nAssistant:"
  }'
2. Using Anthropic Python Library
First, ensure you have the Anthropic library installed:
pip install anthropic
Point the client at the proxy by passing the Flowstack base URL as base_url when constructing the client, and include the Flowstack-Auth header via default_headers:
from anthropic import Anthropic, HUMAN_PROMPT, AI_PROMPT

anthropic = Anthropic(
    api_key="YOUR_ANTHROPIC_API_KEY",
    base_url="https://anthropic.flowstack.ai",
    default_headers={
        "Flowstack-Auth": "Bearer YOUR_FLOWSTACK_KEY"
    }
)

completion = anthropic.completions.create(
    model="claude-2",
    max_tokens_to_sample=300,
    prompt=f"{HUMAN_PROMPT} How many toes do dogs have?{AI_PROMPT}",
)
print(completion.completion)
3. Using Anthropic NodeJS Library
Ensure you have the Anthropic NodeJS library installed:
npm install @anthropic-ai/sdk
Set the Flowstack URL as the base URL when constructing the client, and pass the Flowstack-Auth header with the request:
import Anthropic from '@anthropic-ai/sdk';

const anthropic = new Anthropic({
  apiKey: 'YOUR_ANTHROPIC_API_KEY',
  baseURL: 'https://anthropic.flowstack.ai',
});

async function main() {
  const completion = await anthropic.completions.create(
    {
      model: 'claude-2',
      max_tokens_to_sample: 300,
      prompt: `${Anthropic.HUMAN_PROMPT} Hello! ${Anthropic.AI_PROMPT}`,
    },
    {
      headers: { 'Flowstack-Auth': 'Bearer YOUR_FLOWSTACK_KEY' },
    }
  );
  console.log(completion.completion);
}

main().catch(console.error);
Streaming Responses
Flowstack’s Anthropic proxy supports streaming responses, enabling real-time interaction with the model. To integrate streaming in your Python or Node.js application, direct your API calls to Flowstack’s proxy and include the necessary Flowstack authentication header.
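As an illustration, here is a minimal Python sketch of streaming through the proxy. It reuses the client configuration shown above; the stream=True flag and the chunk iteration pattern follow the Anthropic Python SDK's legacy completions interface, and the model name, prompt, and key placeholders are assumptions for the example rather than Flowstack-specific values.

from anthropic import Anthropic, HUMAN_PROMPT, AI_PROMPT

# Same proxy configuration as the non-streaming example above.
anthropic = Anthropic(
    api_key="YOUR_ANTHROPIC_API_KEY",
    base_url="https://anthropic.flowstack.ai",
    default_headers={"Flowstack-Auth": "Bearer YOUR_FLOWSTACK_KEY"},
)

# With stream=True the SDK returns an iterator of completion chunks
# instead of a single response object.
stream = anthropic.completions.create(
    model="claude-2",
    max_tokens_to_sample=300,
    prompt=f"{HUMAN_PROMPT} Tell me a short story.{AI_PROMPT}",
    stream=True,
)

for chunk in stream:
    # Each chunk carries the next piece of generated text.
    print(chunk.completion, end="", flush=True)

The Node.js library follows the same pattern: pass stream: true in the request body and iterate over the returned stream, keeping the Flowstack base URL and Flowstack-Auth header exactly as in the non-streaming example.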