Portkey provides a robust and secure gateway for integrating various Large Language Models (LLMs) into applications, including DeepSeek's models. With Portkey, you can take advantage of features like fast AI gateway access, observability, prompt management, and more, while securely managing API keys through the Model Catalog.

Documentation Index
Fetch the complete documentation index at: https://portkey-docs-log-export-guide-1773064217.mintlify.app/llms.txt
Use this file to discover all available pages before exploring further.
Quick Start
Get DeepSeek working in 3 steps:

Tip: You can also set provider="@deepseek" in Portkey() and use just model="deepseek-chat" in the request.

Add Provider in Model Catalog
- Go to Model Catalog → Add Provider
- Select DeepSeek
- Choose existing credentials or create new by entering your DeepSeek API key
- Name your provider (e.g., deepseek-prod)
Complete Setup Guide →
See all setup options, code examples, and detailed instructions
Advanced Features
Multi-round Conversations
DeepSeek supports multi-turn conversations where context is maintained across messages:
JSON Output
Force structured JSON responses from DeepSeek models:
Managing DeepSeek Prompts
Manage all prompt templates for DeepSeek in the Prompt Library. All current DeepSeek models are supported, and you can easily test different prompts. Use the portkey.prompts.completions.create interface to run a saved prompt in your application.
Supported Endpoints
- Chat Completions
- Streaming Chat Completions
Next Steps
Add Metadata
Add metadata to your DeepSeek requests
Gateway Configs
Add gateway configs to your DeepSeek requests
Tracing
Trace your DeepSeek requests
Fallbacks
Setup fallback from OpenAI to DeepSeek
SDK Reference
Complete Portkey SDK documentation