Imagine this:
You tell your AI Agent: "Write me technical documentation." It instantly loads technical-writer.skill and transforms into a tech writer.
Next moment: "Analyze our competitors' ad strategies." It swaps in marketing-analyst.skill and starts crawling data.
Even better: Need an expert in a niche domain—say, "SQL performance tuning"—but can't find one? Let the Agent write a skill for itself.
This isn't sci-fi. And it's ridiculously simple to build.
This guide shows you how to build a Universal Agent:
Upload a .skill file, and the Agent instantly gains new capabilities.

Think of it as assembling a 10-person founding team: each skill is a specialist, the Agent is the PM, and you're the CEO.
Key components:

- `bash-tool` - in-memory filesystem with `bash`, `readFile`, and `writeFile` tools
- `sandbox/user/{id}/skills/` (read-only, from S3)
- `sandbox/user/{id}/workspace/` (read-write, persisted)

Each skill is a `.skill` file (zip format) with a simple structure:
Example SKILL.md:
Upload flow: User uploads zip → Store in S3 → Reference in database → Agent loads on demand.
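The validation step of that flow can be sketched as follows. This is a minimal sketch that assumes the zip has already been unpacked into a path-to-content map; `parseSkillPackage` is an illustrative helper name, and the frontmatter fields mirror the SKILL.md example:

```typescript
// Validate an unpacked .skill package and extract its metadata before
// storing the zip in S3. Assumes the archive is already read into a
// { path: content } map (the unzip step is omitted here).
type SkillMeta = { name: string; description: string };

export function parseSkillPackage(files: Record<string, string>): SkillMeta {
  const manifest = files["SKILL.md"];
  if (!manifest) throw new Error("Invalid skill: missing SKILL.md");

  // SKILL.md opens with a YAML frontmatter block delimited by ---
  const match = manifest.match(/^---\n([\s\S]*?)\n---/);
  if (!match) throw new Error("Invalid skill: missing frontmatter");

  const meta: Partial<SkillMeta> = {};
  for (const line of match[1].split("\n")) {
    const [key, ...rest] = line.split(":");
    if (key === "name" || key === "description") {
      meta[key] = rest.join(":").trim();
    }
  }
  if (!meta.name || !meta.description) {
    throw new Error("Invalid skill: frontmatter needs name and description");
  }
  return meta as SkillMeta;
}
```

Rejecting malformed packages at upload time keeps the Agent's `skills/` directory trustworthy later.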
File: app/api/chat/universal/route.ts
Key points:

- Skills are mounted under the `skills/` prefix (read-only)
- `onFinish` saves all changes back to disk

File: lib/skill/workspace.ts
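`loadUserWorkspace` delegates the directory walk to a `loadDirectoryRecursively` helper that the post doesn't show. A plausible minimal version, labeled as an assumption rather than the project's actual code:

```typescript
import { promises as fs } from "node:fs";
import path from "node:path";
import type { Dirent } from "node:fs";

// Recursively read every file under `dir` into a flat map keyed by
// path relative to the workspace root (e.g. "docs/a.md").
export async function loadDirectoryRecursively(
  dir: string,
  prefix: string,
  files: Record<string, string>
): Promise<void> {
  let entries: Dirent[];
  try {
    entries = await fs.readdir(dir, { withFileTypes: true });
  } catch {
    return; // First request for a user: the workspace directory doesn't exist yet
  }
  for (const entry of entries) {
    const rel = prefix ? `${prefix}/${entry.name}` : entry.name;
    const abs = path.join(dir, entry.name);
    if (entry.isDirectory()) {
      await loadDirectoryRecursively(abs, rel, files); // Recurse into subfolders
    } else if (entry.isFile()) {
      files[rel] = await fs.readFile(abs, "utf8"); // Text files only, by assumption
    }
  }
}
```

Swallowing the `readdir` error keeps the first conversation for a new user from failing before any workspace exists.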
How it works:
File: app/prompt/index.ts
Critical design:
Alternatives considered:
Why bash-tool wins:
Reasoning:
Claude models are trained on XML structures for tools and artifacts; this format plays directly to that training.
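As a concrete illustration, the skill rows can be rendered into that XML with a small pure function. The `Skill` shape here is an assumption based on the fields the prompt uses:

```typescript
// Render skill records into the XML listing embedded in the system prompt.
type Skill = { name: string; description: string };

export function skillsToXml(skills: Skill[]): string {
  return skills
    .map(
      (s) => `<skill>
  <name>${s.name}</name>
  <description>${s.description}</description>
  <location>skills/${s.name}/SKILL.md</location>
</skill>`
    )
    .join("\n");
}
```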
Challenge: bash-tool uses native modules (@mongodb-js/zstd, node-liblzma) for compression.
Solution: Don't copy native modules—they're optional.
next.config.ts:
Tested on M1 Mac, 50 skills, 200 workspace files:
| Operation | Time | Notes |
|---|---|---|
| Load skills + workspace | 120ms | Parallel I/O |
| Create 10 files | 5ms | In-memory |
| Save workspace | 80ms | Write to disk |
| Export folder (50 files) | 150ms | Zip creation |
Optimization tips:
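One concrete optimization, sketched under the assumption that skill packages change rarely: cache the unpacked skill files per user with a short TTL, so repeat requests skip the S3 round trip. The cache shape and TTL below are illustrative, not the project's code:

```typescript
// Module-scoped cache: survives across requests within one server process.
type CacheEntry = { files: Record<string, string>; loadedAt: number };

const SKILL_CACHE = new Map<number, CacheEntry>();
const TTL_MS = 5 * 60 * 1000; // Re-fetch from S3 at most every 5 minutes

export async function loadSkillsCached(
  userId: number,
  load: () => Promise<Record<string, string>>
): Promise<Record<string, string>> {
  const hit = SKILL_CACHE.get(userId);
  if (hit && Date.now() - hit.loadedAt < TTL_MS) return hit.files;

  const files = await load(); // e.g. () => loadAllSkillsToMemory(skills)
  SKILL_CACHE.set(userId, { files, loadedAt: Date.now() });
  return files;
}
```

Invalidate the entry on skill upload if stale reads within the TTL window are unacceptable.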
Extend the system:
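One extension direction: versioned skill packages. A sketch of resolving the newest upload per skill, assuming a `name@semver` naming convention (the convention itself is an assumption, not something the system currently defines):

```typescript
// Given uploaded package names like "sql-optimizer@1.2.0", keep only the
// newest version of each skill. Unversioned names are treated as 0.0.0.
export function pickLatestSkills(names: string[]): string[] {
  const latest = new Map<string, { version: number[]; full: string }>();
  for (const full of names) {
    const [base, ver = "0.0.0"] = full.split("@");
    const version = ver.split(".").map(Number);
    const cur = latest.get(base);
    if (!cur || compare(version, cur.version) > 0) latest.set(base, { version, full });
  }
  return [...latest.values()].map((v) => v.full);
}

// Numeric, segment-by-segment comparison so "1.10.0" beats "1.2.0".
function compare(a: number[], b: number[]): number {
  for (let i = 0; i < Math.max(a.length, b.length); i++) {
    const d = (a[i] ?? 0) - (b[i] ?? 0);
    if (d !== 0) return d;
  }
  return 0;
}
```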
The power: Every user becomes a founder with a full engineering team. Each skill is a specialist. The Agent orchestrates. The workspace persists. The possibilities compound.
Source files:

- src/app/(universal)/api/chat/universal/route.ts
- src/lib/skill/workspace.ts
- src/lib/skill/loadToMemory.ts
- src/app/(universal)/tools/exportFolder.ts
- src/app/(universal)/prompt/index.ts
- src/app/(universal)/README.md

Built with Next.js 15, Vercel AI SDK, bash-tool, and Claude 4.5 Sonnet.
The flow at a glance:

```
User uploads skill packages → S3 storage
        ↓
Request starts → Load skills + workspace → In-memory sandbox
        ↓
Agent reads skills, creates files, uses tools
        ↓
Request ends → Save workspace changes → Disk
        ↓
Next conversation → Files persisted → Agent continues work
```

Skill package structure:

```
my-skill/
├── SKILL.md        # Instructions and expertise
└── references/     # Optional: supporting docs
    └── examples.md
```

Example SKILL.md:

```markdown
---
name: technical-writer
description: Creates technical documentation, API guides, and tutorials.
---

# Technical Writer Skill

You are an expert technical writer who creates clear, concise documentation.

## Expertise

- API documentation with examples
- Architecture decision records (ADRs)
- User guides and tutorials
- Code comments and inline docs

## Guidelines

- Start with why, then what, then how
- Include practical examples
- Use active voice
- Keep it DRY—link instead of repeating

## Activation

Activate when user asks to:

- "Write documentation for..."
- "Create an API guide..."
- "Document this codebase..."
```

The chat route (app/api/chat/universal/route.ts):

```typescript
import { streamText } from "ai";
import { createBashTool } from "bash-tool";
import { loadAllSkillsToMemory } from "@/lib/skill/loadToMemory";
import { loadUserWorkspace, saveUserWorkspace } from "@/lib/skill/workspace";

export async function POST(req: Request) {
  const { userId, userChatId } = await authenticate(req);

  // Load skills from S3/disk
  const skills = await prisma.agentSkill.findMany({ where: { userId } });
  const skillFiles = await loadAllSkillsToMemory(skills);

  // Load persisted workspace
  const workspaceFiles = await loadUserWorkspace(userId);

  // Prefix skills for isolation
  const skillFilesWithPrefix = Object.fromEntries(
    Object.entries(skillFiles).map(([path, content]) => [`skills/${path}`, content])
  );

  // Create sandbox
  const { tools: bashTools, sandbox } = await createBashTool({
    files: {
      ...workspaceFiles,        // User's work (root directory)
      ...skillFilesWithPrefix,  // Skills (skills/ subdirectory)
    },
  });

  // Merge with other tools
  const tools = {
    ...baseTools, // webSearch, reasoningThinking, etc.
    bash: bashTools.bash,
    readFile: bashTools.readFile,
    writeFile: bashTools.writeFile,
    exportFolder: exportFolderTool({ sandbox, userId }),
  };

  // Stream with persistence
  const result = streamText({
    model: llm("claude-sonnet-4-5"),
    system: buildSystemPrompt({ skills, locale }),
    messages: await loadMessages(userChatId, { tools }),
    tools,
    onStepFinish: async (step) => {
      await saveStepToDB(step);
      await trackTokens(step);
    },
    onFinish: async () => {
      // Persist workspace changes
      await saveUserWorkspace(userId, sandbox);
    },
  });

  return result.toUIMessageStreamResponse();
}
```

Workspace persistence (lib/skill/workspace.ts):

```typescript
import { promises as fs } from "node:fs";
import path from "node:path";
import type { Sandbox } from "bash-tool";

export async function loadUserWorkspace(userId: number): Promise<Record<string, string>> {
  const workspacePath = getWorkspacePath(userId); // .next/cache/sandbox/user/{id}/workspace
  const files: Record<string, string> = {};
  await loadDirectoryRecursively(workspacePath, "", files);
  return files;
}

export async function saveUserWorkspace(userId: number, sandbox: Sandbox): Promise<void> {
  const workspacePath = getWorkspacePath(userId);

  // Get all files except skills/
  const findResult = await sandbox.executeCommand(
    `find . -type f ! -path "./skills/*" 2>/dev/null || echo ""`
  );
  const filePaths = findResult.stdout.split("\n").filter(Boolean);

  // Clear workspace and save fresh state
  await fs.rm(workspacePath, { recursive: true, force: true });
  await fs.mkdir(workspacePath, { recursive: true });

  for (const filePath of filePaths) {
    const content = await sandbox.readFile(filePath);
    const fullPath = path.join(workspacePath, filePath.replace(/^\.\//, ""));
    await fs.mkdir(path.dirname(fullPath), { recursive: true });
    await fs.writeFile(fullPath, content);
  }
}
```

The system prompt (app/prompt/index.ts):

```typescript
export async function buildUniversalSystemPrompt({ userId, locale, skills }) {
  const skillsXml = skills
    .map(
      (s) => `<skill>
  <name>${s.name}</name>
  <description>${s.description}</description>
  <location>skills/${s.name}/SKILL.md</location>
</skill>`
    )
    .join("\n");

  return `You are a Universal Agent with access to specialized skills.

## Available Skills

${skillsXml}

## Workspace Structure

\`\`\`
sandbox/
├── skills/        # Read-only skills from S3
│   ├── technical-writer/
│   └── market-researcher/
└── my-project/    # Your persistent workspace
    └── README.md
\`\`\`

## How to Use Skills

1. **Load a skill**: \`cat skills/technical-writer/SKILL.md\`
2. **Embody the role**: Follow the skill's instructions completely
3. **Use skill references**: \`cat skills/technical-writer/references/examples.md\`

## File Operations

- **Create**: \`writeFile({ path: "project/index.js", content: "..." })\`
- **Read**: \`cat project/index.js\` or \`readFile({ path: "project/index.js" })\`
- **Export**: \`exportFolder({ folderPath: "project" })\` for user download

All files in the root directory persist across conversations.

## Guidelines

- Load skills when the user requests specialized work
- Follow skill instructions precisely—they're your expertise
- Create files in the root directory (not under skills/)
- Use bash commands for exploration (ls, find, grep, etc.)`;
}
```

Example session: documentation.

```
User: "Write API documentation for our payment system"

Agent:
  1. cat skills/technical-writer/SKILL.md
  2. [Loads skill, embodies technical writer role]
  3. writeFile({ path: "docs/api-reference.md", content: "..." })
  4. writeFile({ path: "docs/examples.md", content: "..." })
  5. exportFolder({ folderPath: "docs" })

→ User downloads complete documentation package
```

Example session: research with persistence.

```
User: "Research AI coding tools market, create report"

Agent:
  1. cat skills/market-researcher/SKILL.md
  2. webSearch({ query: "AI coding tools 2025 market analysis" })
  3. [Analyzes results, synthesizes insights]
  4. writeFile({ path: "research/market-analysis.md", content: "..." })
  5. [User stops conversation, goes to meeting]

Later:

User: "Add competitive landscape section"

Agent:
  1. cat research/market-analysis.md   # File still exists!
  2. [Continues work on existing file]
  3. writeFile({ path: "research/market-analysis.md", content: "..." })
  4. exportFolder({ folderPath: "research" })
```

Example session: the Agent writes its own skill.

```
User: "Create a skill for SQL query optimization"

Agent:
  1. mkdir -p sql-optimizer
  2. writeFile({ path: "sql-optimizer/SKILL.md", content: "..." })
  3. writeFile({ path: "sql-optimizer/references/patterns.md", content: "..." })
  4. exportFolder({ folderPath: "sql-optimizer" })

User: Downloads, uploads as .skill file
→ Now available in skills/ directory for future use
```

Design decision: directory isolation.

```
✅ Current: skills/ (read-only) + workspace/ (read-write)
❌ Alternative: Everything in root directory
```

Problem: the Agent might accidentally overwrite skills.
Solution: clear separation, made explicit in the prompts.

Design decision: full sync over deltas.

```typescript
// ❌ Delta approach: Track and save only changed files
await saveChangedFiles(changedFiles);

// ✅ Full sync: Save complete workspace state
await saveEntireWorkspace(allFiles);
```

The skill listing format in the system prompt:

```xml
<skill>
  <name>technical-writer</name>
  <description>Creates technical documentation</description>
  <location>skills/technical-writer/SKILL.md</location>
</skill>
```

Deployment: skip the native modules.

```dockerfile
# Dockerfile
COPY --from=builder --chown=nextjs:nodejs /app/.next/standalone ./
COPY --from=builder --chown=nextjs:nodejs /app/.next/static ./.next/static

# Note: Native compression modules NOT copied
# - exportFolder uses jszip (pure JS)
# - just-bash has JS fallback for compression
# If you need tar -z in sandbox, uncomment:
# COPY --from=deps /app/node_modules/.pnpm/@mongodb-js+zstd@*/node_modules/@mongodb-js ./node_modules/@mongodb-js
```

next.config.ts:

```typescript
webpack: (config, { isServer, webpack }) => {
  if (isServer) {
    // Only externalize native binaries
    config.externals.push("@mongodb-js/zstd", "node-liblzma");

    // Ignore browser-only worker.js
    config.plugins.push(
      new webpack.IgnorePlugin({
        resourceRegExp: /^\.\/worker\.js$/,
        contextRegExp: /just-bash/,
      })
    );
  }
  return config;
}
```

Periodic cleanup of old exports:

```typescript
import { promises as fs } from "node:fs";
import path from "node:path";

export async function cleanupOldExports() {
  const exportsDir = path.join(process.cwd(), ".next/cache/sandbox");
  const cutoff = Date.now() - 24 * 60 * 60 * 1000; // 24 hours

  for (const userId of await fs.readdir(path.join(exportsDir, "user"))) {
    const exportsPath = getExportsPath(Number(userId));
    for (const file of await fs.readdir(exportsPath)) {
      const stat = await fs.stat(path.join(exportsPath, file));
      if (stat.mtimeMs < cutoff) {
        await fs.unlink(path.join(exportsPath, file));
      }
    }
  }
}
```