Integration

The FullStackVibes MCP server — drop FSV retrieval into any agent.

Model Context Protocol (MCP) is the standard agents use to discover and call tools provided by external services. The FullStackVibes MCP server exposes the Handshake retrieval primitive (and three adjacent read endpoints) as MCP tools. Plug it into Claude Desktop, Claude Code, Cursor, Cline, Continue, or any other MCP-aware client, and the agent can pull typed context windows from the public corpus on demand — no custom retrieval glue, no per-agent integration code.

Tools exposed

  • fsv_handshake — the Precision Bundle retrieval call. Filter by Space, window type, window tags, and pattern tags; pack to a character budget; receive typed windows with full provenance. See the Handshake spec.
  • fsv_list_spaces — list all Spaces (broad cross-cutting domains). Use this so the agent can discover what slugs to filter on.
  • fsv_get_artifact — fetch a single artifact by contextId + slug. Returns full body markdown plus its windows / tags / spaces.
  • fsv_search — full-text search across the corpus (optionally filtered by Space). Useful when the agent's request is open-ended — search first, then fsv_handshake or fsv_get_artifact on the matches.
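For the open-ended case, the search-then-fetch flow might look like this. The argument names are illustrative (query values and the result fields are hypothetical); the authoritative input schemas are what the server reports via tools/list:

```json
{
  "tool": "fsv_search",
  "args": {
    "query": "tradingview webhook validation",
    "spaces": ["fintech"]
  }
}
```

The agent then calls fsv_handshake with tags drawn from the hits, or fsv_get_artifact with a hit's contextId + slug to pull the full body.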

Install

The server is a single Node.js file. No compilation, no build step. It depends only on the official @modelcontextprotocol/sdk package.

mkdir -p ~/.fsv-mcp && cd ~/.fsv-mcp
curl -sSL https://fullstackvibes.com/integrations/mcp/server.js     > server.js
curl -sSL https://fullstackvibes.com/integrations/mcp/package.json  > package.json
npm install

That installs @modelcontextprotocol/sdk alongside the server. Total install: ~12 MB, no native deps.

Configure your agent

Claude Desktop

Edit your claude_desktop_config.json (path varies by OS — search "Claude Desktop config" in Claude's docs for your platform):

{
  "mcpServers": {
    "fullstackvibes": {
      "command": "node",
      "args": ["/Users/you/.fsv-mcp/server.js"]
    }
  }
}

Claude Code

Add the server via the CLI:

claude mcp add fullstackvibes -- node /Users/you/.fsv-mcp/server.js

Cursor / Cline / Continue / generic stdio MCP

Same shape — point your client at node /path/to/server.js. No flags, no env vars required for the default configuration.
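As a concrete sketch for Cursor, the JSON shape mirrors the Claude Desktop config above (the file is typically ~/.cursor/mcp.json or a per-project .cursor/mcp.json — check your client's docs for the exact location):

```json
{
  "mcpServers": {
    "fullstackvibes": {
      "command": "node",
      "args": ["/Users/you/.fsv-mcp/server.js"]
    }
  }
}
```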

Optional environment

  • FSV_API_BASE — override the API host (default https://api.osenv.io). Useful for self-hosted FSV deployments.
  • FSV_INCLUDE_OWNER_KEPT — if true, the default includeOwnerKept for fsv_handshake flips on. Useful while the corpus is small — surfaces preview content (community-unreviewed) so an agent has something to retrieve.

Smoke test

You can drive the server over stdio without any MCP client to confirm the install:

echo '{"jsonrpc":"2.0","id":1,"method":"tools/list"}' | node server.js

Expect a JSON response listing the four tools (fsv_handshake, fsv_list_spaces, fsv_get_artifact, fsv_search) with their input schemas.
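You can go one step further and invoke a tool the same way. A tools/call request for the zero-argument fsv_list_spaces looks like this (this is the standard MCP JSON-RPC shape; note that a strict server build may require an initialize request before it will answer):

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "fsv_list_spaces",
    "arguments": {}
  }
}
```

Pipe it to node server.js as above and expect a result payload listing the Space slugs.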

Example agent prompt

Once the server is wired up, the agent decides when to call fsv_handshake automatically. A user asking "show me how to handle webhooks safely from TradingView" might cause the agent to issue:

{
  "tool": "fsv_handshake",
  "args": {
    "spaces":     ["fintech", "security"],
    "windowTypes": ["SCHEMA", "ANTI_PATTERN", "INSTRUCTION"],
    "windowTags":  ["gotcha"],
    "maxChars":    5000
  }
}

The bundle comes back with typed windows ranked by quality and packed under the 5000-character budget, each carrying its source artifact's title, slug, and quality score. The agent folds those into its working context and answers the user with citation-ready provenance.

Provenance is preserved

Every window the MCP server returns carries the same provenance the underlying Handshake API exposes: contentHash, sourceInferenceRunId, and the parent manifest block (publicContextId, slug, qualityScore, spaces). An agent that surfaces FSV content to a user can include "from Title (quality 0.85)" without a second round-trip — the data is already in the tool result.
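As an illustration, an agent-side formatter can build that citation string from a single tool result. The field names below follow the provenance list above (contentHash, manifest.slug, manifest.qualityScore); the example window object is hypothetical, not a captured API response, so treat the exact response shape as the Handshake spec's to define:

```javascript
// Sketch: turn one Handshake window's provenance into a citation line.
// The `example` object is invented for illustration; field names mirror
// the provenance fields documented above.
function formatCitation(win) {
  const { title, slug, qualityScore } = win.manifest;
  const score = qualityScore.toFixed(2);
  // Short hash prefix is enough to identify the window content.
  return `from ${title} (quality ${score}) [${slug}#${win.contentHash.slice(0, 8)}]`;
}

const example = {
  contentHash: "9f2c41d8a07b55e1",
  sourceInferenceRunId: "run_0042",
  manifest: {
    publicContextId: "ctx_demo",
    slug: "tradingview-webhook-hardening",
    title: "TradingView Webhook Hardening",
    qualityScore: 0.85,
  },
};

console.log(formatCitation(example));
// → from TradingView Webhook Hardening (quality 0.85) [tradingview-webhook-hardening#9f2c41d8]
```

No second round-trip: everything the formatter reads is already in the tool result.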

Source

The server is a single ~200-line Node.js file. Read it in place — nothing is hidden.

License: MIT. Retrieved corpus content carries its own per-artifact license (typically CC BY-SA 4.0); preserve attribution.

Next

This is the first integration we ship. The strategic posture is that FullStackVibes ships with whatever framework the agent is built on — LangChain, LlamaIndex, the Vercel AI SDK tools format, OpenAI function calling, and so on. MCP is the highest-leverage starting point because every modern agent stack supports it. Adapters for the framework-specific shapes are next.