AI Agents

Building Durable AI Agents

AI agents are built on a simple primitive: an LLM running in a tool-call loop, often with additional processes for data fetching, resource provisioning, or reacting to external events.

Workflow DevKit makes your agents production-ready by turning them into durable, resumable workflows and by managing your LLM calls, tool executions, and other async operations as retryable, observable steps.
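Stripped of any framework, that primitive can be sketched in a few lines. This is a toy illustration with a stubbed model and a hand-rolled tool map; none of these names come from the AI SDK or Workflow DevKit:

```typescript
// Toy agent loop: call the model, run any requested tool, feed the result
// back, and stop when the model answers with plain text. The model here is a
// stub; a real agent calls an LLM provider.
type ToolCall = { tool: string; input: string };
type ModelReply = { text?: string; toolCall?: ToolCall };

// Stubbed model: requests the "search" tool once, then answers.
function callModel(history: string[]): ModelReply {
  if (!history.some((entry) => entry.startsWith('tool:'))) {
    return { toolCall: { tool: 'search', input: 'workflow devkit' } };
  }
  return { text: 'Here is what I found.' };
}

const toolbox: Record<string, (input: string) => string> = {
  search: (input) => `results for "${input}"`,
};

function runAgentLoop(prompt: string, maxSteps = 5): string {
  const history = [`user: ${prompt}`];
  for (let step = 0; step < maxSteps; step++) {
    const reply = callModel(history);
    if (reply.text) return reply.text; // model is done
    if (reply.toolCall) {
      const result = toolbox[reply.toolCall.tool](reply.toolCall.input);
      history.push(`tool: ${result}`); // feed the tool output back to the model
    }
  }
  return 'step limit reached';
}

console.log(runAgentLoop('find the docs')); // → "Here is what I found."
```

Workflow DevKit keeps this loop shape but turns each model call and tool execution into a durable step.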

(Trace view: a chatWorkflow run alternating agent.stream calls with a searchWeb tool step and a waitForHumanApproval step.)

This guide walks you through converting a basic AI SDK agent into a durable agent using Workflow DevKit.

Why Durable Agents?

Aside from the usual work of making long-running tasks production-ready, building mature AI agents typically requires solving several additional challenges:

  • Durability: Persisting chat sessions and turning every LLM and tool call into a separate async job, with workers, queues, and state management that repeatedly save and reload state from a database.
  • Observability: Using services to collect traces and metrics, storing your messages and user history separately, and then combining them to get a complete picture of your agent's behavior.
  • Resumability: Most LLM calls are streams. Recovering from a mid-call failure without redoing the entire call means piping and persisting those streams separately from your messages, usually in yet another service, on top of storing the messages themselves for replayability.
  • Human-in-the-loop: Your client, API, and async job orchestration need to work together to create, track, route, and display human approval requests and similar webhook operations. If your stack is disjointed, this simple feature becomes a major orchestration challenge.
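The resumability challenge in particular comes down to teeing the stream: one branch goes to the client, the other is persisted chunk by chunk so a reconnecting client can be caught up. A minimal sketch, with an in-memory array standing in for the storage service:

```typescript
// Sketch of the resumability problem: to recover a dropped stream without
// redoing the LLM call, every chunk must be persisted as it is sent.
// `tee()` splits the stream; an in-memory array stands in for real storage.
async function drain(stream: ReadableStream<string>, onChunk: (chunk: string) => void) {
  const reader = stream.getReader();
  while (true) {
    const { done, value } = await reader.read();
    if (done) return;
    if (value !== undefined) onChunk(value);
  }
}

function persistWhileStreaming(source: ReadableStream<string>, storage: string[]) {
  const [toClient, toStorage] = source.tee();
  // Copy one branch into storage in the background; the client gets the other.
  const stored = drain(toStorage, (chunk) => storage.push(chunk));
  return { toClient, stored };
}

async function demo() {
  const storage: string[] = [];
  const source = new ReadableStream<string>({
    start(controller) {
      for (const chunk of ['Hel', 'lo ', 'world']) controller.enqueue(chunk);
      controller.close();
    },
  });

  const { toClient, stored } = persistWhileStreaming(source, storage);
  let received = '';
  await drain(toClient, (chunk) => (received += chunk)); // what the client saw
  await stored; // wait for the storage copy to finish
  return { received, persisted: storage.join('') };
}

demo().then(({ received, persisted }) => console.log(received, persisted));
```

With Workflow DevKit, getWritable() and run.readable take over this plumbing, so you never manage the storage branch yourself.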

Workflow DevKit provides all of these capabilities out of the box. Your agent becomes a workflow, your tools become steps, and the framework handles interplay with your existing infrastructure.

Getting Started

We start with a basic Next.js application that uses the AI SDK's Agent class, a thin wrapper around the streamText function. It comes with a single tool for reading web pages.

app/api/chat/route.ts
import { createUIMessageStreamResponse, Experimental_Agent as Agent } from 'ai';
import { tools } from '@/ai/tools';

export async function POST(req: Request) {
  const { messages, modelId } = await req.json();

  const agent = new Agent({
    model: modelId,
    system: 'You are a helpful assistant.',
    tools,
  });

  const stream = agent.stream({ messages });

  return createUIMessageStreamResponse({
    stream: stream.toUIMessageStream(),
  });
}
ai/tools/index.ts
import { tool } from 'ai';
import { z } from 'zod';

export const tools = {
  searchWeb: tool({
    description: 'Read a web page and return the content',
    inputSchema: z.object({ url: z.string() }),
    execute: async ({ url }: { url: string }) => {
      const response = await fetch(url);
      return response.text();
    },
  }),
};
app/chat.tsx
'use client';

import { useChat } from '@ai-sdk/react';

export function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: '/api/chat',
  });

  return (
    <div>
      {messages.map((m) => (
        <div key={m.id}>
          <strong>{m.role}:</strong>
          {m.parts.map((part, i) => {
            if (part.type === 'text') {
              return <span key={i}>{part.text}</span>;
            }
            if (part.type === 'tool-invocation') {
              return <div key={i}>Calling {part.toolInvocation.toolName}...</div>;
            }
            return null;
          })}
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input
          value={input}
          onChange={handleInputChange}
          placeholder="Type a message..."
        />
      </form>
    </div>
  );
}

Install Dependencies

Add the Workflow DevKit packages to your project:

npm i workflow @workflow/ai

and extend the Next.js compiler to transform the code:

next.config.ts
import { withWorkflow } from 'workflow/next';
import type { NextConfig } from 'next';

const nextConfig: NextConfig = {
  // ... rest of your Next.js config
};

export default withWorkflow(nextConfig);

Create a Workflow Function

Move the agent logic into a separate workflow function:

app/api/chat/workflow.ts
import { DurableAgent } from '@workflow/ai/agent'; 
import { getWritable } from 'workflow'; 
import { stepCountIs } from 'ai';
import { tools } from '@/ai/tools';
import type { ModelMessage, UIMessageChunk } from 'ai';

export async function chatWorkflow({
  messages,
  modelId,
}: {
  messages: ModelMessage[];
  modelId: string;
}) {
  'use workflow'; 

  const writable = getWritable<UIMessageChunk>(); 

  const agent = new DurableAgent({ 
    model: modelId,
    system: 'You are a helpful assistant.',
    tools,
  });

  await agent.stream({ 
    messages,
    writable,
    stopWhen: stepCountIs(20),
  });
}

Key changes:

  • Replace Agent with DurableAgent from @workflow/ai/agent
  • Add the "use workflow" directive to mark this as a workflow function
  • Use getWritable() to get a stream for agent output
  • Pass the writable to agent.stream() instead of returning a stream directly

Update the API Route

Replace the agent call with start() to run the workflow:

app/api/chat/route.ts
import { createUIMessageStreamResponse, convertToModelMessages } from 'ai';
import { start } from 'workflow/api'; 
import { chatWorkflow } from './workflow'; 

export async function POST(req: Request) {
  const { messages, modelId } = await req.json();
  const modelMessages = convertToModelMessages(messages);

  const run = await start(chatWorkflow, [{ messages: modelMessages, modelId }]); 

  return createUIMessageStreamResponse({
    stream: run.readable, 
  });
}

Convert Tools to Steps

Mark tool execution functions with "use step" to make them durable. This enables automatic retries and observability:

ai/tools/search-web.ts
import { tool } from 'ai';
import { z } from 'zod';

async function executeSearch({ query }: { query: string }) {
  'use step'; 

  const response = await fetch(`https://api.search.com?q=${encodeURIComponent(query)}`);
  return response.json();
}

export const searchWeb = tool({
  description: 'Search the web for information',
  inputSchema: z.object({ query: z.string() }),
  execute: executeSearch,
});

With "use step":

  • The tool execution runs in a separate step with full Node.js access
  • Failed tool calls are automatically retried (up to 3 times by default)
  • Each tool execution appears as a discrete step in observability tools
  • Results are persisted, so replays skip already-completed tools
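Conceptually, a step behaves like a retry-and-memoize wrapper around your function. The sketch below illustrates those semantics only; `asStep` is a made-up name, and Workflow DevKit persists results durably rather than in a Map:

```typescript
// Toy illustration of step semantics: retry transient failures, persist the
// result, and let a replay skip work that already completed.
// `asStep` is a made-up name, not part of Workflow DevKit's API.
const results = new Map<string, unknown>();

function asStep<T>(key: string, fn: () => Promise<T>, maxAttempts = 3) {
  return async (): Promise<T> => {
    if (results.has(key)) return results.get(key) as T; // replay: skip completed step
    let lastError: unknown = new Error('step failed');
    for (let attempt = 1; attempt <= maxAttempts; attempt++) {
      try {
        const value = await fn();
        results.set(key, value); // "persist" so future replays skip this step
        return value;
      } catch (err) {
        lastError = err; // transient failure: retry
      }
    }
    throw lastError;
  };
}

// Demo: an operation that fails twice before succeeding.
let calls = 0;
const flaky = asStep('searchWeb:example', async () => {
  calls += 1;
  if (calls < 3) throw new Error('network hiccup');
  return 'page content';
});

flaky().then((value) => console.log(value)); // → "page content" (on the third attempt)
```

In your app you never write this wrapper; adding "use step" to a function gives it these semantics.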

Stream Progress Updates from Tools

Tools can emit progress updates to the same stream the agent uses. This allows the UI to display tool status.

ai/tools/search-web.ts
import { tool } from 'ai';
import { getWritable } from 'workflow';
import { z } from 'zod';
import type { UIMessageChunk } from 'ai';

async function executeSearch(
  { url }: { url: string },
  { toolCallId }: { toolCallId: string }
) {
  'use step';

  const writable = getWritable<UIMessageChunk>(); 
  const writer = writable.getWriter(); 

  // Emit a progress update
  await writer.write({ 
    id: toolCallId, 
    type: 'data-search-web', 
    data: { url, status: 'fetching' }, 
  }); 

  const response = await fetch(url);
  const content = await response.text();

  await writer.write({ 
    id: toolCallId, 
    type: 'data-search-web', 
    data: { url, status: 'done' }, 
  }); 

  writer.releaseLock(); 

  return content;
}

export const searchWeb = tool({
  description: 'Read a web page and return the content',
  inputSchema: z.object({ url: z.string() }),
  execute: executeSearch,
});

Handle the data-search-web chunks in your client to display progress. Data parts are stored in the message, so you can find the latest status directly:

app/chat.tsx
'use client';

import { useChat } from '@ai-sdk/react';

export function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: '/api/chat',
  });

  return (
    <div>
      {messages.map((m) => (
        <div key={m.id}>
          <strong>{m.role}:</strong>
          {m.parts.map((part, i) => {
            if (part.type === 'text') {
              return <span key={i}>{part.text}</span>;
            }
            if (part.type === 'tool-invocation') {
              // Find the latest data part for this tool call
              const dataPart = m.parts.findLast(
                (p) => p.type === 'data-search-web' && p.id === part.toolInvocation.toolCallId
              );
              const status = dataPart?.type === 'data-search-web' ? dataPart.data : null;
              return (
                <div key={i}>
                  {status?.status === 'fetching' 
                    ? `Fetching ${status.url}...` 
                    : `Called ${part.toolInvocation.toolName}`} 
                </div>
              );
            }
            return null;
          })}
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input
          value={input}
          onChange={handleInputChange}
          placeholder="Type a message..."
        />
      </form>
    </div>
  );
}

Running the Workflow

Run your development server, then open the observability dashboard to see your workflow in action:

npx workflow web

This opens a local dashboard showing all workflow runs and their status, along with a trace viewer for inspecting each workflow in detail, including retry attempts and the data passed between steps.

Next Steps

Now that you have a basic durable agent, explore the rest of the documentation for additional features such as human-in-the-loop approvals, resumable streams, and observability.

Complete Example

Here is the complete code for the durable agent after all the steps above:

app/api/chat/route.ts
import { createUIMessageStreamResponse, convertToModelMessages } from 'ai';
import { start } from 'workflow/api';
import { chatWorkflow } from './workflow';

export async function POST(req: Request) {
  const { messages, modelId } = await req.json();
  const modelMessages = convertToModelMessages(messages);

  const run = await start(chatWorkflow, [{ messages: modelMessages, modelId }]);

  return createUIMessageStreamResponse({
    stream: run.readable,
  });
}
app/api/chat/workflow.ts
import { DurableAgent } from '@workflow/ai/agent';
import { getWritable } from 'workflow';
import { stepCountIs } from 'ai';
import { tools } from '@/ai/tools';
import type { ModelMessage, UIMessageChunk } from 'ai';

export async function chatWorkflow({
  messages,
  modelId,
}: {
  messages: ModelMessage[];
  modelId: string;
}) {
  'use workflow';

  const writable = getWritable<UIMessageChunk>();

  const agent = new DurableAgent({
    model: modelId,
    system: 'You are a helpful assistant.',
    tools,
  });

  await agent.stream({
    messages,
    writable,
    stopWhen: stepCountIs(20),
  });
}
ai/tools/index.ts
import { tool } from 'ai';
import { getWritable } from 'workflow';
import { z } from 'zod';
import type { UIMessageChunk } from 'ai';

async function executeSearch(
  { url }: { url: string },
  { toolCallId }: { toolCallId: string }
) {
  'use step';

  const writable = getWritable<UIMessageChunk>();
  const writer = writable.getWriter();

  await writer.write({
    id: toolCallId,
    type: 'data-search-web',
    data: { url, status: 'fetching' },
  });

  const response = await fetch(url);
  const content = await response.text();

  await writer.write({
    id: toolCallId,
    type: 'data-search-web',
    data: { url, status: 'done' },
  });

  writer.releaseLock();

  return content;
}

export const tools = {
  searchWeb: tool({
    description: 'Read a web page and return the content',
    inputSchema: z.object({ url: z.string() }),
    execute: executeSearch,
  }),
};
app/chat.tsx
'use client';

import { useChat } from '@ai-sdk/react';

export function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: '/api/chat',
  });

  return (
    <div>
      {messages.map((m) => (
        <div key={m.id}>
          <strong>{m.role}:</strong>
          {m.parts.map((part, i) => {
            if (part.type === 'text') {
              return <span key={i}>{part.text}</span>;
            }
            if (part.type === 'tool-invocation') {
              // Find the latest data part for this tool call
              const dataPart = m.parts.findLast(
                (p) => p.type === 'data-search-web' && p.id === part.toolInvocation.toolCallId
              );
              const status = dataPart?.type === 'data-search-web' ? dataPart.data : null;
              return (
                <div key={i}>
                  {status?.status === 'fetching'
                    ? `Fetching ${status.url}...`
                    : `Called ${part.toolInvocation.toolName}`}
                </div>
              );
            }
            return null;
          })}
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input
          value={input}
          onChange={handleInputChange}
          placeholder="Type a message..."
        />
      </form>
    </div>
  );
}