The @wattdata/sdk package provides an agentic client built on Vercel's AI SDK v6 and Watt's MCP tools. You provide API keys — the SDK handles model selection, tool discovery, the agent loop, extended thinking, and context management.
Quick Start
```shell
npm install @wattdata/sdk
```

The agent discovers available tools (entity_resolve, entity_enrich, trait_search, etc.) and calls them as needed to fulfill the request.
Usage Patterns
Four methods, from simplest to most flexible:
prompt() — Simple text response
```typescript
const response = await watt.prompt("Find tech executives in San Francisco");
console.log(response);
```

generate() — Full result with steps and usage
```typescript
const result = await watt.generate({
  prompt: "Resolve these emails: alice@acme.com, bob@corp.com",
});

console.log(result.text);  // Final response
console.log(result.steps); // Tool call history
console.log(result.usage); // { promptTokens, completionTokens, totalTokens }
```

stream() — Streaming text
```typescript
const result = await watt.stream({
  prompt: "Find all golf enthusiasts in Austin, TX",
});

for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```

streamChat() — Chat UI integration
Returns a streaming Response compatible with AI SDK's useChat hook.
```tsx
// Client-side (React)
import { useChat } from "@ai-sdk/react";

function ChatUI() {
  const { messages, input, handleInputChange, handleSubmit, status } = useChat();
  return (
    <div>
      {messages.map((m) => (
        <div key={m.id}>
          {m.role}: {m.content}
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input value={input} onChange={handleInputChange} />
        <button type="submit" disabled={status === "streaming"}>
          Send
        </button>
      </form>
    </div>
  );
}
```

Structured Output
Pass output to generate(), stream(), or streamChat() to get typed, validated responses. The agent still runs its full tool loop — structured output only constrains the final response.
Output and z (Zod) are re-exported from @wattdata/sdk so you don't need separate installations.
Available output types: Output.object(), Output.array(), Output.choice(), Output.text(), Output.json()
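The "full tool loop, constrained final response" behavior can be pictured as a validation pass over the run's last message only. A toy sketch under that reading (Step and validateFinal are invented names, and the hand-rolled check stands in for a real Zod schema; this is not the SDK's mechanism):

```typescript
// Toy model of "structured output only constrains the final response":
// intermediate tool steps stay free-form; only the last message must
// parse and validate against the requested shape.
interface Step {
  type: "tool-call" | "final";
  content: string;
}

function validateFinal(steps: Step[]): { response: string; summary: string } {
  const last = steps[steps.length - 1];
  if (last === undefined || last.type !== "final") {
    throw new Error("run ended without a final response");
  }
  const parsed = JSON.parse(last.content);
  if (typeof parsed.response !== "string" || typeof parsed.summary !== "string") {
    throw new Error("final response does not match the requested schema");
  }
  return parsed;
}

const steps: Step[] = [
  { type: "tool-call", content: "entity_resolve(...) raw output" }, // unconstrained
  { type: "final", content: '{"response":"Done","summary":"2 of 2 resolved"}' },
];
console.log(validateFinal(steps).summary); // "2 of 2 resolved"
```

Note that the intermediate tool-call content never has to match the schema; only the final message is checked.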
Structured output also works with streamChat() — the structured JSON streams as the last assistant message:
```typescript
return watt.streamChat(messages, {
  output: Output.object({
    schema: z.object({ response: z.string(), summary: z.string() }),
  }),
});
```

Custom Tools
Pass tools in the client config to inject custom tools into the agent alongside MCP and SDK tools. Custom tools are available in generate(), stream(), and streamChat().
Custom tools take precedence over MCP and SDK tools with the same name. A warning is logged when a collision is detected. Avoid naming custom tools after built-in tools (e.g., upload_file, entity_resolve) unless you intend to override them.
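The collision behavior can be sketched as a name-keyed merge (hypothetical helper, not the SDK's internals; a plain record stands in for the AI SDK's ToolSet type):

```typescript
// Sketch of the precedence rule: custom tools overwrite built-in
// MCP/SDK tools on name collisions, and each collision is logged.
type ToolSet = Record<string, unknown>;

function mergeTools(builtIn: ToolSet, custom: ToolSet): ToolSet {
  for (const name of Object.keys(custom)) {
    if (name in builtIn) {
      console.warn(`Custom tool "${name}" overrides a built-in tool`);
    }
  }
  // Spread order: later (custom) entries win on duplicate keys.
  return { ...builtIn, ...custom };
}

const merged = mergeTools(
  { entity_resolve: "mcp", upload_file: "sdk" },
  { entity_resolve: "custom" }, // collision: warning logged, custom wins
);
console.log(merged.entity_resolve); // "custom"
```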
File Upload and Analysis
Pass file to have the agent upload and analyze a local file. Works with generate(), stream(), and streamChat().
```typescript
const result = await watt.generate({
  prompt: "Analyze my customer list and identify the ideal customer profile",
  file: "/data/customers.csv",
});
```

The agent orchestrates the full pipeline autonomously — uploading the file, inspecting CSV structure, running identity resolution and enrichment, and calling analysis tools as needed. No manual orchestration required.
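One possible run for the request above, expressed as a tool-call trace (the order is hypothetical; the agent picks tools at runtime, and the names come from this page):

```typescript
// Hypothetical tool-call trace for the file-analysis request above.
// The agent decides the actual selection and order at runtime.
const trace = [
  "upload_file",    // SDK-local: upload the CSV to S3 via MCP
  "resources_read", // inspect the CSV's structure
  "entity_resolve", // identity resolution over the rows
  "entity_enrich",  // enrich the resolved entities
  "trait_search",   // analyze traits toward the ideal customer profile
];
console.log(trace.join(" -> "));
```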
Tools
MCP Tools (from the server)
Discovered dynamically at runtime. See the MCP Tool Reference for the full list and parameter details.
SDK-Local Tools
Run in your Node.js environment, not on the MCP server:
| Tool | Description |
|---|---|
| upload_file | Upload local files to S3 via MCP |
| resources_read | Read workflow resources (CSVs, artifacts) |
| resources_list | List workflow resources |
| list_prompts | Discover available workflow templates |
| get_prompt | Retrieve workflow instructions |
Reference
Authentication
| Key | Purpose | Where to get it |
|---|---|---|
| wattApiKey | Authenticates with the Watt MCP server | API Keys dashboard |
| wattMcpUrl | MCP server endpoint: https://api.wattdata.ai/v2/mcp | |
| anthropicApiKey | Claude model access | console.anthropic.com |
Configuration
```typescript
const watt = createWattClient({
  wattApiKey: string,      // Required
  wattMcpUrl: string,      // Required
  anthropicApiKey: string, // Required
  systemPrompt?: string,   // Override default instructions
  tools?: ToolSet,         // Custom tools merged into the agent
});
```

The built-in prompt is exported as SYSTEM_PROMPT so you can extend it:
```typescript
import { createWattClient, SYSTEM_PROMPT } from "@wattdata/sdk";

const watt = createWattClient({
  ...keys,
  systemPrompt: SYSTEM_PROMPT + "\nAlways format results as markdown tables.",
});
```

Agent defaults:
| Setting | Value |
|---|---|
| Model | claude-sonnet-4-6 |
| Max steps | 30 |
| Extended thinking | 10,000 token budget |
| Context management | Clears old tool results at 150K input tokens |
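The context-management default can be sketched as a trimming pass (illustrative only, not the SDK's implementation; Msg and trimToolResults are invented names, and the token counts are stand-ins):

```typescript
// Illustrative sketch of the 150K-token context-management default:
// when the running input-token total exceeds the limit, older tool
// results are cleared first until the history fits again.
interface Msg {
  role: "user" | "assistant" | "tool";
  content: string;
  tokens: number;
}

const TOKEN_LIMIT = 150_000;

function trimToolResults(history: Msg[], limit = TOKEN_LIMIT): Msg[] {
  let total = history.reduce((sum, m) => sum + m.tokens, 0);
  // Copy so the caller's history is untouched.
  const out = history.map((m) => ({ ...m }));
  for (const m of out) {
    if (total <= limit) break; // oldest-first until under the limit
    if (m.role === "tool") {
      total -= m.tokens;
      m.content = "[cleared]";
      m.tokens = 0;
    }
  }
  return out;
}

const trimmed = trimToolResults([
  { role: "tool", content: "old result", tokens: 90_000 },
  { role: "tool", content: "new result", tokens: 80_000 },
  { role: "user", content: "next question", tokens: 50 },
]);
console.log(trimmed[0].content); // "[cleared]" (total was over 150K)
```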
API Reference
| Method | Returns | Description |
|---|---|---|
| prompt(text) | Promise&lt;string&gt; | Simple text response |
| generate(opts) | Promise&lt;GenerateTextResult&gt; | Full result with steps and usage. Pass output for structured responses |
| stream(opts) | Promise&lt;StreamTextResult&gt; | Streaming text with async iterables. Pass output for structured responses |
| streamChat(messages, opts?) | Promise&lt;Response&gt; | Streaming response for chat UIs. Pass output for structured responses |
| close() | Promise&lt;void&gt; | Close MCP connection |
```typescript
interface WattGenerateOptions {
  prompt?: string;           // Simple text prompt
  messages?: ModelMessage[]; // Full message history (multi-turn)
  file?: string;             // Local file path to upload
  output?: Output;           // Structured output (Output.object(), Output.array(), etc.)
  onStepFinish?: (step) => void;
}

interface WattStreamOptions {
  prompt?: string;           // Simple text prompt
  messages?: ModelMessage[]; // Full message history (multi-turn)
  file?: string;             // Local file path to upload
  output?: Output;           // Structured output (Output.object(), Output.array(), etc.)
  onStepFinish?: (step) => void;
}

interface StreamChatOptions {
  sendReasoning?: boolean;   // Include thinking tokens
  file?: string;             // Local file path to upload
  output?: Output;           // Structured output (Output.object(), Output.array(), etc.)
  abortSignal?: AbortSignal; // Cancel on disconnect
}
```

Examples
Backend pipeline:

```typescript
const result = await watt.generate({
  prompt: "Analyze my customer list, build an ICP, and find 50K lookalikes",
  file: "/data/customers.csv",
});
```

Express endpoint:
```typescript
import { Readable } from "node:stream";

app.post("/chat", async (req, res) => {
  const watt = createWattClient({ ...keys });
  // streamChat resolves to a web Response; Express ignores handler return
  // values, so copy the headers and pipe the body to res instead.
  const response = await watt.streamChat(req.body.messages);
  response.headers.forEach((value, name) => res.setHeader(name, value));
  Readable.fromWeb(response.body as any).pipe(res);
});
```

Streaming file analysis:
```typescript
const result = await watt.stream({
  prompt: "Analyze my customers and find patterns",
  file: "./exports/customers.csv",
});

for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```

New to the SDK? Start with the Watt SDK Quickstart for a minimal setup guide.
If you need to call Watt Data tools directly as JSON-RPC without the SDK, see the API Integration guide.