DeepSeek Plugin

The genkit-plugin-deepseek package includes a pre-configured plugin for DeepSeek models, including the powerful R1 reasoning model and the efficient V3 chat model.

uv add genkit-plugin-deepseek

To use this plugin, import DeepSeek and specify it when you initialize Genkit:

from genkit import Genkit
from genkit.plugins.deepseek import DeepSeek

ai = Genkit(
    plugins=[DeepSeek()],
)

You must provide an API key from DeepSeek. You can get an API key from your DeepSeek account settings.

Configure the plugin to use your API key by doing one of the following:

  • Set the DEEPSEEK_API_KEY environment variable to your API key.

  • Specify the API key when you initialize the plugin:

    DeepSeek(api_key='YOUR_API_KEY')

As always, avoid embedding API keys directly in your code.
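A minimal sketch of the environment-variable approach (the 'YOUR_API_KEY' fallback is only an illustrative placeholder, not something to ship):

```python
import os

# Prefer the DEEPSEEK_API_KEY environment variable; the placeholder
# fallback is for illustration only and should not appear in real code.
api_key = os.environ.get('DEEPSEEK_API_KEY', 'YOUR_API_KEY')
```

You would then pass this value to the plugin, e.g. DeepSeek(api_key=api_key).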

Use the deepseek_name() helper to reference a DeepSeek model.

from genkit import Genkit
from genkit.plugins.deepseek import DeepSeek, deepseek_name

ai = Genkit(
    plugins=[DeepSeek()],
)


@ai.flow()
async def deepseek_flow(subject: str) -> str:
    """Generate information about a subject using DeepSeek.

    Args:
        subject: The subject to generate information about.

    Returns:
        Information about the subject.
    """
    response = await ai.generate(
        model=deepseek_name('deepseek-chat'),
        prompt=f'Tell me something about {subject}.',
    )
    return response.text

Available Models:

The DeepSeek plugin provides access to several models:

  • deepseek-chat: Standard chat model for most conversational tasks
  • deepseek-reasoner: R1 reasoning model with chain-of-thought capabilities
  • deepseek-v3: Latest V3 model with improved performance
  • deepseek-r1: Advanced R1 reasoning model
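As a sketch, a small helper could map task types to the model names above before passing them to deepseek_name() (the task categories here are illustrative, not part of the plugin):

```python
# Illustrative helper: choose a model name from the list above based on
# the kind of task, then pass the result through deepseek_name().
def pick_model_name(task: str) -> str:
    reasoning_tasks = {'math', 'logic', 'coding'}
    return 'deepseek-reasoner' if task in reasoning_tasks else 'deepseek-chat'
```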

The deepseek-reasoner model shows step-by-step reasoning, making it ideal for complex logic, math, and coding problems:

@ai.flow()
async def reasoning_flow(problem: str) -> str:
    """Solve a problem using DeepSeek's reasoning model.

    Args:
        problem: The problem to solve.

    Returns:
        The solution with reasoning steps.
    """
    response = await ai.generate(
        model=deepseek_name('deepseek-reasoner'),
        prompt=f'Solve this problem step by step: {problem}',
    )
    return response.text

Example with a classic reasoning problem:

response = await ai.generate(
    model=deepseek_name('deepseek-reasoner'),
    prompt='What is heavier, one kilo of steel or one kilo of feathers?',
)
print(response.text)  # Shows reasoning steps before the answer

DeepSeek models support tool calling, allowing them to use functions you define:

from pydantic import BaseModel, Field


class WeatherInput(BaseModel):
    """Input for the weather tool."""

    location: str = Field(description='City name')


@ai.tool(description='Get current weather for a location')
def get_weather(input: WeatherInput) -> str:
    """Get the current weather for a location."""
    # In a real implementation, call a weather API
    return f'22°C and sunny in {input.location}'


@ai.flow()
async def weather_flow(location: str) -> str:
    """Get weather information using DeepSeek with tool calling.

    Args:
        location: The location to get weather for.

    Returns:
        Weather information for the location.
    """
    response = await ai.generate(
        model=deepseek_name('deepseek-chat'),
        prompt=f'What is the weather in {location}?',
        tools=['get_weather'],
    )
    return response.text

The plugin supports streaming responses for real-time output:

from genkit import ActionRunContext


@ai.flow()
async def streaming_flow(topic: str, ctx: ActionRunContext | None = None) -> str:
    """Generate content with streaming output.

    Args:
        topic: Topic to generate content about.
        ctx: Action context for streaming chunks.

    Returns:
        The complete generated content.
    """
    response = await ai.generate(
        model=deepseek_name('deepseek-chat'),
        prompt=f'Tell me about {topic}',
        on_chunk=ctx.send_chunk if ctx else None,
    )
    return response.text

Maintain conversation context across multiple turns:

from genkit import Message, Part, Role, TextPart


@ai.flow()
async def chat_flow() -> str:
    """Example of a multi-turn conversation with context.

    Returns:
        The final response.
    """
    history = []

    # First message
    response1 = await ai.generate(
        model=deepseek_name('deepseek-chat'),
        prompt='I love Japanese food, especially ramen.',
        system='You are a helpful assistant.',
    )

    # Build conversation history
    history.append(Message(
        role=Role.USER,
        content=[Part(root=TextPart(text='I love Japanese food, especially ramen.'))],
    ))
    if response1.message:
        history.append(response1.message)

    # Follow-up using context
    response2 = await ai.generate(
        model=deepseek_name('deepseek-chat'),
        messages=[
            *history,
            Message(
                role=Role.USER,
                content=[Part(root=TextPart(text='What food did I mention?'))],
            ),
        ],
        system='You are a helpful assistant.',
    )
    return response2.text

Generate structured data using Pydantic models:

from pydantic import BaseModel, Field
from genkit import Output


class BookRecommendation(BaseModel):
    """A book recommendation."""

    title: str = Field(description='Book title')
    author: str = Field(description='Book author')
    genre: str = Field(description='Primary genre')
    summary: str = Field(description='Brief summary')
    why_recommended: str = Field(description='Why this book is recommended')


@ai.flow()
async def recommend_book(preferences: str) -> BookRecommendation:
    """Get a book recommendation with structured output.

    Args:
        preferences: User's reading preferences.

    Returns:
        A structured book recommendation.
    """
    response = await ai.generate(
        model=deepseek_name('deepseek-chat'),
        prompt=f'Recommend a book for someone who likes: {preferences}',
        output=Output(schema=BookRecommendation),
    )
    return response.output

You can pass configuration options that are not defined in the plugin’s custom configuration schema. This permits you to access new models and features without having to update your Genkit version.

from genkit import Genkit
from genkit.plugins.deepseek import DeepSeek, deepseek_name

ai = Genkit(plugins=[DeepSeek()])

response = await ai.generate(
    prompt='Tell me a cool story',
    model=deepseek_name('deepseek-new'),  # hypothetical new model
    config={
        'new_feature_parameter': ...,  # hypothetical config needed for new model
    },
)

Genkit passes this configuration as-is to the DeepSeek API, giving you access to the new model's features. Note that field names and types are not validated by Genkit and must match the DeepSeek API specification to work.

For JavaScript and TypeScript projects, the @genkit-ai/compat-oai package includes a pre-configured plugin for DeepSeek models.

npm install @genkit-ai/compat-oai

To use this plugin, import deepSeek and specify it when you initialize Genkit.

import { genkit } from 'genkit';
import { deepSeek } from '@genkit-ai/compat-oai/deepseek';

export const ai = genkit({
  plugins: [deepSeek()],
});

You must provide an API key from DeepSeek. You can get an API key from your DeepSeek account settings.

Configure the plugin to use your API key by doing one of the following:

  • Set the DEEPSEEK_API_KEY environment variable to your API key.

  • Specify the API key when you initialize the plugin:

    deepSeek({ apiKey: yourKey });

As always, avoid embedding API keys directly in your code.

Use the deepSeek.model() helper to reference a DeepSeek model.

import { genkit, z } from 'genkit';
import { deepSeek } from '@genkit-ai/compat-oai/deepseek';

const ai = genkit({
  plugins: [deepSeek({ apiKey: process.env.DEEPSEEK_API_KEY })],
});

export const deepseekFlow = ai.defineFlow(
  {
    name: 'deepseekFlow',
    inputSchema: z.object({ subject: z.string() }),
    outputSchema: z.object({ information: z.string() }),
  },
  async ({ subject }) => {
    // Reference a model
    const deepseekChat = deepSeek.model('deepseek-chat');

    // Use it in a generate call
    const llmResponse = await ai.generate({
      model: deepseekChat,
      prompt: `Tell me something about ${subject}.`,
    });

    return { information: llmResponse.text };
  },
);

You can also pass model-specific configuration:

const llmResponse = await ai.generate({
  model: deepSeek.model('deepseek-chat'),
  prompt: 'Tell me something about deep learning.',
  config: {
    temperature: 0.8,
    maxTokens: 1024,
  },
});

You can pass configuration options that are not defined in the plugin’s custom config schema. This permits you to access new models and features without having to update your Genkit version.

import { genkit } from 'genkit';
import { deepSeek } from '@genkit-ai/compat-oai/deepseek';

const ai = genkit({
  plugins: [deepSeek()],
});

const llmResponse = await ai.generate({
  prompt: `Tell me a cool story`,
  model: deepSeek.model('deepseek-new'), // hypothetical new model
  config: {
    new_feature_parameter: ... // hypothetical config needed for new model
  },
});

Genkit passes this configuration as-is to the DeepSeek API, giving you access to the new model's features. Note that field names and types are not validated by Genkit and must match the DeepSeek API specification to work.