
Supercharging AI Agents with Function Calling on DeepSeek!

by Justin Woo
August 19, 2025

Function calling is a key capability of most LLMs: it lets a model invoke pre-defined tools (or functions) and supply the relevant input parameters based on the user's prompt. Many of the models on SambaCloud already support function calling, including Llama 4 Maverick and DeepSeek V3.

Today, we are excited to announce the arrival of function calling for DeepSeek-R1-0528 on SambaCloud! As one of the best open-source reasoning models on the market, DeepSeek-R1-0528 with function calling unlocks better planning for agentic AI applications, which can now run with fast inference powered by SambaCloud. See how DeepSeek R1 with function calling on SambaCloud compares to OpenAI’s o3 when both are run on an agent framework like CrewAI.

What is DeepSeek-R1-0528?

DeepSeek-R1-0528 is the latest checkpoint of the DeepSeek R1 model, which was originally open sourced in January. This checkpoint not only brings its reasoning performance on par with proprietary models from OpenAI and Google, but also enables function calling, which was not available in the original release.

Reasoning models like DeepSeek R1 are LLMs trained with reinforcement learning to reason before they answer, which improves the accuracy of their responses. They excel at complex problem solving, coding, scientific reasoning, and multi-step planning for agentic workflows. Because DeepSeek R1 is open source, we can also see its reasoning tokens, unlike with many proprietary models.
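Those visible reasoning tokens can be separated from the final answer programmatically. As a minimal sketch, assuming (as with the open-source R1 checkpoints) that the reasoning is wrapped in `<think>...</think>` tags inside the message content:

```python
import re

def split_reasoning(content: str) -> tuple:
    """Split an R1-style response into (reasoning, answer).

    Assumes reasoning tokens are wrapped in <think>...</think> tags;
    if no tags are found, the whole content is treated as the answer.
    """
    match = re.search(r"<think>(.*?)</think>", content, re.DOTALL)
    if not match:
        return "", content.strip()
    reasoning = match.group(1).strip()
    answer = content[match.end():].strip()
    return reasoning, answer

raw = "<think>The user asks 2+2. That is 4.</think>The answer is 4."
reasoning, answer = split_reasoning(raw)
print(answer)  # The answer is 4.
```

This is useful for logging the chain of thought separately while showing users only the final answer.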

Here are some benchmarks:

[Benchmark chart: DeepSeek-R1-0528 performance vs. comparable models]

Why is AI function calling useful?

Function calling bridges the gap between your AI model and the real world by letting it trigger your own custom code on demand. Instead of just generating text or audio, the model can decide—based on your system prompt and conversation—to call a specific function, pass it the right parameters, and get results back in real-time.

This means your AI agent can go beyond conversation: It can fetch live data, update databases, run calculations, control external systems, and interact with any API you define. The function executes, returns the output, and the model seamlessly weaves that data into its next response—powering richer, smarter, and more action-oriented applications.
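At its core, this round trip is just a lookup-and-execute step: the model names a function, your code finds it, decodes the JSON arguments, runs it, and serializes the result back. A minimal dispatcher sketch (the `registry` and the fixed temperature here are illustrative; the full SambaCloud example below uses the real API objects):

```python
import json

def dispatch_tool_call(name: str, arguments_json: str, registry: dict) -> str:
    """Look up the function the model asked for, run it with the
    JSON-decoded arguments, and serialize the result for the model."""
    if name not in registry:
        return json.dumps({"error": f"unknown tool: {name}"})
    args = json.loads(arguments_json)
    result = registry[name](**args)
    return json.dumps(result)

# Illustrative registry with a single tool
def get_weather(city: str) -> dict:
    return {"city": city, "temperature_celsius": 21}

registry = {"get_weather": get_weather}
print(dispatch_tool_call("get_weather", '{"city": "Palo Alto"}', registry))
```

Returning an error payload for unknown tool names (instead of raising) lets the model see the failure and recover in its next turn.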

Here’s a simple diagram of AI function calling steps:

[Diagram of AI function calling steps]

How do I use function calling on SambaCloud?

It takes five easy steps:

  1. Log in to SambaCloud (create an account if you haven't already).
  2. Define a JSON schema for your function.
  3. Configure function calling in your request to SambaCloud.
  4. Handle tool calls by extracting the function call details and executing the corresponding function with the provided parameters.
  5. Once the result has been computed, pass it back to the model to continue the conversation or confirm the output.

Here’s a full end-to-end example that shows a fake weather lookup returning a random temperature. In a real application, you would call a proper weather API.

Note: Be sure to set your SAMBANOVA_API_KEY, which you can create on our cloud.
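Rather than hard-coding the key in your script, you can read it from an environment variable (the key name shown below is a placeholder, not a real key):

```python
import os

# Assumes SAMBANOVA_API_KEY has been exported in your shell, e.g.
#   export SAMBANOVA_API_KEY="your-key-here"
api_key = os.environ.get("SAMBANOVA_API_KEY", "")
if not api_key:
    print("SAMBANOVA_API_KEY is not set; create one on SambaCloud first.")
```

The client in the example below can then be constructed with `api_key=api_key` instead of a literal string.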

import openai
import random
import json

# Define the OpenAI client
client = openai.OpenAI(
    base_url="https://api.sambanova.ai/v1", 
    api_key="YOUR SAMBANOVA API KEY"
)

MODEL = 'DeepSeek-R1-0528'

def get_weather(city: str) -> dict:
    """
    Fake weather lookup: returns a random temperature between 20°C and 50°C.
    """
    temp = random.randint(20, 50)
    return {
        "city": city,
        "temperature_celsius": temp
    }

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the weather for a location; the user should supply a location first",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    }
                },
                "required": ["city"]
            },
        }
    },
]

messages = [{"role": "user", "content": "What's the weather like in Palo Alto today?"}]

completion = client.chat.completions.create(
    model=MODEL,
    messages=messages,
    tools=tools
)

print(completion)

tool_call = completion.choices[0].message.tool_calls[0]
args = json.loads(tool_call.function.arguments)

result = get_weather(args["city"])

messages.append(completion.choices[0].message)  # append model's function call message
messages.append({                               # append result message
    "role": "tool",
    "tool_call_id": tool_call.id,
    "content": str(result)
})

completion_2 = client.chat.completions.create(
    model=MODEL,
    messages=messages
)
print(completion_2.choices[0].message.content)
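The example above assumes a single tool call, but a response may contain several entries in `tool_calls`, and each needs its own `tool` message keyed by `tool_call_id`. A sketch of that loop, using `SimpleNamespace` stand-ins for the real response objects (the stub and registry here are illustrative only):

```python
import json
from types import SimpleNamespace

def run_tool_calls(tool_calls, registry):
    """Execute every tool call in a response and build the
    corresponding 'tool' messages to append to the conversation."""
    tool_messages = []
    for call in tool_calls:
        fn = registry[call.function.name]
        args = json.loads(call.function.arguments)
        result = fn(**args)
        tool_messages.append({
            "role": "tool",
            "tool_call_id": call.id,
            "content": json.dumps(result),
        })
    return tool_messages

# Stand-in for completion.choices[0].message.tool_calls
stub = SimpleNamespace(
    id="call_0",
    function=SimpleNamespace(name="get_weather",
                             arguments='{"city": "Palo Alto"}'),
)
registry = {"get_weather": lambda city: {"city": city, "temperature_celsius": 21}}
print(run_tool_calls([stub], registry))
```

With real responses, you would pass `completion.choices[0].message.tool_calls` in place of the stub list and extend `messages` with the returned tool messages before the follow-up request.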

How can I integrate function calling with agents?

Our partner CrewAI is an open-source Python framework that helps developers create autonomous AI agent teams that work together on complex tasks tailored to any scenario. CrewAI tools empower agents with capabilities ranging from web search and data analysis to collaborating and delegating tasks among coworkers.

Here’s a simple example demonstrating how to use tools to get data from your file directory:

import os
from crewai import Agent, Task, Crew
# Importing crewAI tools
from crewai_tools import (
    DirectoryReadTool,
    FileReadTool
)

# Set up API keys
os.environ["SAMBANOVA_API_KEY"] = "Your Key"

# Instantiate tools
docs_tool = DirectoryReadTool(directory='./blog-posts')
file_tool = FileReadTool()

# Create agents

# Note: the original snippet referenced a `researcher` agent without
# defining it; the definition below is added so the example runs, reusing
# the file tools for simplicity (a web-search tool would fit better).
researcher = Agent(
    role='Market Research Analyst',
    goal='Provide up-to-date analysis of the AI industry',
    backstory='An expert analyst with a keen eye for market trends.',
    tools=[docs_tool, file_tool],
    verbose=True
)

writer = Agent(
    role='Content Writer',
    goal='Craft engaging blog posts about the AI industry',
    backstory='A skilled writer with a passion for technology.',
    tools=[docs_tool, file_tool],
    verbose=True
)

# Define tasks
research = Task(
    description='Research the latest trends in the AI industry and provide a summary.',
    expected_output='A summary of the top 3 trending developments in the AI industry with a unique perspective on their significance.',
    agent=researcher
)

write = Task(
    description="Write an engaging blog post about the AI industry, based on the research analyst's summary. Draw inspiration from the latest blog posts in the directory.",
    expected_output='A 4-paragraph blog post formatted in markdown with engaging, informative, and accessible content, avoiding complex jargon.',
    agent=writer,
    output_file='blog-posts/new_post.md'  # The final blog post will be saved here
)

# Assemble a crew with planning enabled
crew = Crew(
    agents=[researcher, writer],
    tasks=[research, write],
    verbose=True,
    planning=True,  # Enable planning feature
)

# Execute tasks
crew.kickoff()

Build with relentless intelligence on SambaCloud

SambaCloud is a powerful platform that enables developers to easily integrate the best open-source models with the fastest inference speeds. Powered by our state-of-the-art AI chip, the SN40L, SambaCloud provides a seamless and efficient way to build AI applications. Get started today and experience the benefits of fast inference speeds, maximum accuracy, and an enhanced developer experience.