Why Use Prompt Integration?
Instead of hardcoding prompts in your application, reference them by ID:
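For example, with the OpenAI SDK pointed at the gateway. This is a minimal sketch: the gateway base URL, the `prompt_id` field name, and the empty `messages` placeholder are assumptions to verify against your dashboard and the Helicone docs.

```typescript
import OpenAI from "openai";

// Point your existing OpenAI client at the Helicone AI Gateway.
// HELICONE_GATEWAY_URL is a placeholder for your gateway endpoint.
const client = new OpenAI({
  baseURL: process.env.HELICONE_GATEWAY_URL,
  apiKey: process.env.HELICONE_API_KEY,
});

const response = await client.chat.completions.create({
  model: "gpt-4o-mini",
  // Reference the saved prompt by ID instead of hardcoding messages.
  // `prompt_id` is an assumed field name; the cast is needed because the
  // extra field is not in the SDK's request types.
  prompt_id: "YOUR_PROMPT_ID",
  messages: [], // assumed: the gateway fills these in from the saved prompt
} as any);

console.log(response.choices[0].message.content);
```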
Gateway vs SDK Integration

Without the AI Gateway, using managed prompts requires multiple steps:
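Roughly, the manual flow looks like the sketch below. The prompt-fetching endpoint and response shape are hypothetical placeholders, not Helicone's actual API; they only illustrate the extra steps you otherwise have to own yourself.

```typescript
import OpenAI from "openai";

// Hypothetical manual flow without the gateway.
async function manualPromptFlow() {
  // 1. Fetch the stored prompt template (placeholder endpoint, not a real API).
  const res = await fetch("https://example.com/api/prompts/YOUR_PROMPT_ID", {
    headers: { Authorization: `Bearer ${process.env.HELICONE_API_KEY}` },
  });
  const { template } = await res.json(); // e.g. "Hello {{customer_name}}, ..."

  // 2. Compile the template by substituting variables yourself.
  const compiled = template
    .replace("{{customer_name}}", "John")
    .replace("{{issue_type}}", "billing");

  // 3. Send the compiled prompt to the provider in a separate request.
  const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
  return openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [{ role: "user", content: compiled }],
  });
}
```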
Why the gateway is better:

- No extra packages - Works with your existing OpenAI SDK
- Single API call - Gateway fetches and compiles automatically
- Lower latency - Everything happens server-side in one request
- Automatic error handling - Invalid inputs return clear error messages
- Cleaner code - No prompt management logic in your application
Integration Steps
Create prompts in Helicone
Build and test prompts with variables in the dashboard
API Parameters
Use these parameters in your chat completions request to integrate with saved prompts:

- Prompt ID: The ID of your saved prompt from the Helicone dashboard
- Environment: The environment version to use (`development`, `staging`, or `production`)
- Inputs: Variables to fill in your prompt template (e.g., `{"customer_name": "John", "issue_type": "billing"}`)
- Model: Any supported model; works with the unified gateway format