---
title: "Building an Agent with Mastra Framework: TypeScript-First Agent Development"
description: "Learn how to build AI agents using the Mastra framework. This guide covers project setup, agent definition with typed tools, persistent memory, workflow orchestration, and deployment strategies for TypeScript-first agent applications."
canonical: https://callsphere.ai/blog/mastra-framework-typescript-first-agent-development-guide
category: "Learn Agentic AI"
tags: ["Mastra", "TypeScript", "AI Agents", "Framework", "Tool Calling", "Agent Memory"]
author: "CallSphere Team"
published: 2026-03-17T00:00:00.000Z
updated: 2026-05-07T18:53:12.472Z
---

# Building an Agent with Mastra Framework: TypeScript-First Agent Development

> Learn how to build AI agents using the Mastra framework. This guide covers project setup, agent definition with typed tools, persistent memory, workflow orchestration, and deployment strategies for TypeScript-first agent applications.

## What Is Mastra

Mastra is an open-source TypeScript framework designed specifically for building AI agents, workflows, and RAG pipelines. Unlike general-purpose libraries that bolt agent capabilities onto existing chat abstractions, Mastra treats agents as first-class primitives with built-in support for tools, memory, structured outputs, and multi-step workflows.

The framework follows a "TypeScript-first" philosophy — every component is fully typed, schemas are defined with Zod, and the developer experience prioritizes IDE autocompletion and compile-time safety.

## Project Setup

Before setup, it helps to see the runtime loop a Mastra agent executes: the model receives the user message along with the tool schemas, decides whether to call a tool, appends each tool result back into the conversation, and loops until it produces a final reply that passes any guardrails:

```mermaid
flowchart TD
    USER(["User message"])
    LLM["LLM call
with tools schema"]
    DECIDE{"Model wants
to call a tool?"}
    EXEC["Execute tool
sandboxed runtime"]
    RESULT["Append tool_result
to messages"]
    GUARD{"Output passes
guardrails?"}
    DONE(["Final reply"])
    BLOCK(["Refuse and log"])
    USER --> LLM --> DECIDE
    DECIDE -->|Yes| EXEC --> RESULT --> LLM
    DECIDE -->|No| GUARD
    GUARD -->|Yes| DONE
    GUARD -->|No| BLOCK
    style LLM fill:#4f46e5,stroke:#4338ca,color:#fff
    style EXEC fill:#ede9fe,stroke:#7c3aed,color:#1e1b4b
    style GUARD fill:#f59e0b,stroke:#d97706,color:#1f2937
    style DONE fill:#059669,stroke:#047857,color:#fff
    style BLOCK fill:#dc2626,stroke:#b91c1c,color:#fff
```

Scaffold a new Mastra project using the CLI:

```bash
npx create-mastra@latest my-agent-app
cd my-agent-app
```

The CLI prompts you for your preferred LLM provider and generates a project structure:

```
my-agent-app/
  src/
    mastra/
      agents/
        index.ts       # Agent definitions
      tools/
        index.ts       # Tool definitions
      index.ts         # Mastra instance
  .env
  package.json
```

Install dependencies and set your API key:

```bash
npm install
echo "OPENAI_API_KEY=sk-proj-your-key" > .env
```

## Defining Tools

Tools give your agent capabilities beyond text generation. Each tool has a typed input schema, a description for the LLM, and an `execute` function:

```typescript
// src/mastra/tools/index.ts
import { createTool } from "@mastra/core";
import { z } from "zod";

export const searchDocsTool = createTool({
  id: "search_docs",
  description: "Search the documentation for relevant articles",
  inputSchema: z.object({
    query: z.string().describe("The search query"),
    limit: z.number().default(5).describe("Max results to return"),
  }),
  outputSchema: z.object({
    results: z.array(
      z.object({
        title: z.string(),
        snippet: z.string(),
        url: z.string(),
      })
    ),
  }),
  execute: async ({ context }) => {
    const { query, limit } = context;
    // searchKnowledgeBase is your own data-access helper, not part of Mastra
    const results = await searchKnowledgeBase(query, limit);
    return { results };
  },
});

export const createTicketTool = createTool({
  id: "create_support_ticket",
  description: "Create a support ticket for unresolved issues",
  inputSchema: z.object({
    title: z.string(),
    description: z.string(),
    priority: z.enum(["low", "medium", "high"]),
  }),
  execute: async ({ context }) => {
    // ticketService is your own backend client, not part of Mastra
    const ticket = await ticketService.create(context);
    return { ticketId: ticket.id, status: "created" };
  },
});
```

The `inputSchema` serves a dual purpose: it generates the JSON Schema sent to the LLM for function calling, and it validates the arguments at runtime before `execute` runs.
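To see why a single schema is enough for both jobs, here is a hand-rolled miniature of the idea (illustrative only, not Mastra's or Zod's internals): one field definition that both emits a JSON-Schema-like description for the LLM and checks arguments at runtime.

```typescript
// A miniature of the "one schema, two jobs" idea (illustrative, not Mastra's code).
type Field = { type: "string" | "number"; description?: string };

function toJsonSchema(fields: Record<string, Field>) {
  // The shape sent to the LLM so it knows how to call the tool.
  return {
    type: "object",
    properties: fields,
    required: Object.keys(fields),
  };
}

function validate(fields: Record<string, Field>, args: Record<string, unknown>): string[] {
  // Runtime check before execute() runs; returns a list of problems.
  const errors: string[] = [];
  for (const [key, field] of Object.entries(fields)) {
    if (!(key in args)) errors.push(`missing ${key}`);
    else if (typeof args[key] !== field.type) errors.push(`${key} should be ${field.type}`);
  }
  return errors;
}

const searchFields: Record<string, Field> = {
  query: { type: "string", description: "The search query" },
  limit: { type: "number", description: "Max results to return" },
};

const schema = toJsonSchema(searchFields);
const ok = validate(searchFields, { query: "reset password", limit: 5 });
const bad = validate(searchFields, { query: 42 });
```

Zod does both jobs with far richer types (defaults, enums, nested objects), but the shape of the contract is the same: the schema object is the single source of truth for what the model is told and what the runtime accepts.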

## Defining an Agent

Agents combine a model, system instructions, and tools into a coherent unit:

```typescript
// src/mastra/agents/index.ts
import { Agent } from "@mastra/core";
import { searchDocsTool, createTicketTool } from "../tools";

export const supportAgent = new Agent({
  name: "Support Agent",
  instructions: `You are a customer support agent for a SaaS platform.
Your primary task is to answer user questions by searching documentation.
If you cannot resolve an issue after searching, create a support ticket.
Always be concise and reference specific documentation links.`,
  model: {
    provider: "OPEN_AI",
    name: "gpt-4o",
    toolChoice: "auto",
  },
  tools: {
    search_docs: searchDocsTool,
    create_support_ticket: createTicketTool,
  },
});
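Note that the keys of the `tools` map (`search_docs`, `create_support_ticket`) are what the model sees as callable function names. A sketch of how such a map typically flattens into OpenAI-style function specs (assumed shape, not Mastra's actual serialization code):

```typescript
// Sketch: turning a tools map into OpenAI-style function specs.
// The map keys (e.g. "search_docs") become the function names the model sees.
type ToolDef = { description: string; parameters: object };

function toFunctionSpecs(tools: Record<string, ToolDef>) {
  return Object.entries(tools).map(([name, tool]) => ({
    type: "function" as const,
    function: { name, description: tool.description, parameters: tool.parameters },
  }));
}

const specs = toFunctionSpecs({
  search_docs: {
    description: "Search the documentation for relevant articles",
    parameters: { type: "object", properties: {} },
  },
  create_support_ticket: {
    description: "Create a support ticket for unresolved issues",
    parameters: { type: "object", properties: {} },
  },
});
```

This is why the map keys should be short, snake_case, and descriptive: the model picks which tool to call based on the name and description alone.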
```

## Registering with the Mastra Instance

The Mastra instance is the central registry for all agents, tools, and workflows:

```typescript
// src/mastra/index.ts
import { Mastra } from "@mastra/core";
import { supportAgent } from "./agents";

export const mastra = new Mastra({
  agents: { supportAgent },
});
```

## Running the Agent

Execute the agent programmatically or through the built-in dev server:

```typescript
import { mastra } from "./mastra";

async function main() {
  const agent = mastra.getAgent("supportAgent");

  const response = await agent.generate(
    "How do I reset my password? I've tried the forgot password link but it's not sending emails."
  );

  console.log(response.text);
}

main();
```
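Under the hood, `generate()` drives the tool-calling loop from the diagram earlier: call the model, execute any requested tool, append the result, and repeat until the model produces plain text. A simplified, self-contained sketch of that loop with a stubbed model (illustrative only, not Mastra's implementation):

```typescript
// Simplified agent loop with a stubbed model (illustrative only).
type Message = { role: "user" | "assistant" | "tool"; content: string };
type ModelReply = { toolCall?: { name: string; args: string }; text?: string };

const tools: Record<string, (args: string) => string> = {
  search_docs: (args) => `Found article about: ${args}`,
};

// Stub model: requests the tool once, then answers using the tool result.
function stubModel(messages: Message[]): ModelReply {
  const lastTool = [...messages].reverse().find((m) => m.role === "tool");
  if (lastTool) return { text: `Answer based on: ${lastTool.content}` };
  return { toolCall: { name: "search_docs", args: "password reset" } };
}

function runAgent(userMessage: string): string {
  const messages: Message[] = [{ role: "user", content: userMessage }];
  for (let step = 0; step < 5; step++) { // cap iterations to avoid infinite loops
    const reply = stubModel(messages);
    if (reply.toolCall) {
      const result = tools[reply.toolCall.name](reply.toolCall.args);
      messages.push({ role: "tool", content: result }); // append tool_result, loop again
    } else {
      return reply.text ?? "";
    }
  }
  return "max steps reached";
}

const answer = runAgent("How do I reset my password?");
```

The real loop adds streaming, schema validation, and guardrails, but the control flow is the same: tool calls feed back into the message list until the model stops asking for tools.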

For development, Mastra provides a playground:

```bash
npx mastra dev
```

This launches a local web interface where you can interact with your agents, inspect tool calls, and debug conversation flows.

## Adding Memory

Mastra supports persistent memory so agents remember context across conversations:

```typescript
import { Agent } from "@mastra/core";
import { PostgresMemory } from "@mastra/memory";

const memory = new PostgresMemory({
  connectionString: process.env.DATABASE_URL!,
});

export const supportAgent = new Agent({
  name: "Support Agent",
  instructions: "...",
  model: { provider: "OPEN_AI", name: "gpt-4o" },
  tools: { /* ... */ },
  memory,
});
```

With memory enabled, calling `agent.generate()` with a `threadId` parameter automatically loads and saves conversation history.
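Conceptually, thread-scoped memory is a message store keyed by thread: each `threadId` maps to its own ordered history, so conversations never bleed into each other. A minimal sketch of that contract (not Mastra's storage layer, which persists to Postgres):

```typescript
// Conceptual sketch of thread-scoped memory (not Mastra's storage layer).
type StoredMessage = { role: "user" | "assistant"; content: string };

class ThreadMemory {
  private threads = new Map<string, StoredMessage[]>();

  // Load the full history for a thread; unknown threads start empty.
  load(threadId: string): StoredMessage[] {
    return this.threads.get(threadId) ?? [];
  }

  // Append a message to a thread's history.
  save(threadId: string, message: StoredMessage): void {
    const history = this.threads.get(threadId) ?? [];
    this.threads.set(threadId, [...history, message]);
  }
}

const memory = new ThreadMemory();
memory.save("thread-1", { role: "user", content: "My emails are not sending" });
memory.save("thread-1", { role: "assistant", content: "Checking the docs..." });
memory.save("thread-2", { role: "user", content: "Unrelated question" });

const history = memory.load("thread-1"); // two messages; thread-2 is isolated
```

Swapping the in-memory `Map` for a database table keyed by `threadId` gives you persistence across process restarts, which is what the Postgres-backed store provides.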

## Workflows for Multi-Step Processes

For complex operations that go beyond a single agent loop, Mastra provides typed workflows:

```typescript
import { Workflow, Step } from "@mastra/core";
import { z } from "zod";

const onboardingWorkflow = new Workflow({
  name: "user-onboarding",
  triggerSchema: z.object({
    userId: z.string(),
    plan: z.enum(["free", "pro", "enterprise"]),
  }),
});

onboardingWorkflow
  .step(new Step({
    id: "create-workspace",
    execute: async ({ context }) => {
      return { workspaceId: await createWorkspace(context.userId) };
    },
  }))
  .then(new Step({
    id: "send-welcome",
    execute: async ({ context }) => {
      await sendWelcomeEmail(context.userId, context.workspaceId);
      return { emailSent: true };
    },
  }));
```
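The key property of step chaining is context accumulation: each step reads the trigger data plus every earlier step's output, and contributes new keys of its own (which is how `send-welcome` can see the `workspaceId` produced by `create-workspace`). A stripped-down sketch of that mechanic (illustrative, not Mastra's workflow engine):

```typescript
// Sketch of sequential step execution with context accumulation (illustrative).
type Ctx = Record<string, unknown>;
type StepFn = (context: Ctx) => Ctx;

function runWorkflow(trigger: Ctx, steps: StepFn[]): Ctx {
  // Each step reads the accumulated context and merges in its own output.
  return steps.reduce((context, step) => ({ ...context, ...step(context) }), trigger);
}

const result = runWorkflow({ userId: "u-123", plan: "pro" }, [
  (ctx) => ({ workspaceId: `ws-${ctx.userId}` }),               // create-workspace
  (ctx) => ({ emailSent: typeof ctx.workspaceId === "string" }), // send-welcome sees workspaceId
]);
```

Mastra's real workflows add typed step schemas, branching, and suspension, but the mental model is the same: a pipeline folding step outputs into a shared, growing context.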

## FAQ

### How does Mastra compare to LangChain.js?

Mastra is more opinionated and TypeScript-native. LangChain.js offers broader integrations and a larger community, but Mastra provides tighter type safety, a built-in dev playground, and a cleaner API surface. Mastra is a good choice if you want a batteries-included framework specifically for agent applications rather than a general-purpose LLM toolkit.

### Can I use Mastra with providers other than OpenAI?

Yes. Mastra supports Anthropic, Google Gemini, and Groq out of the box. Specify the provider in the agent's model configuration. The tool calling interface remains identical regardless of the underlying model provider.

### Is Mastra suitable for production deployments?

Mastra is designed for production use. It supports deployment to Vercel, Cloudflare Workers, and any Node.js server. The framework includes built-in observability hooks, error handling, and structured logging for production monitoring.

