The Datafye MCP Servers

Datafye provides two complementary MCP (Model Context Protocol) servers that enable AI-assisted development workflows: the Environment MCP Server and the SDK MCP Server.

Overview

Model Context Protocol (MCP) is an open standard developed by Anthropic that allows AI tools and agents to securely connect to external data sources and tools. Datafye leverages MCP to provide AI tools with deep context about your environment and development workflow.
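Under the hood, MCP messages are JSON-RPC 2.0: a client discovers the tools a server offers and invokes them by name. A minimal sketch of what a `tools/call` exchange looks like (illustrative only; the tool name and arguments here are invented, and real MCP traffic carries additional fields and flows over stdio or HTTP):

```python
import json

# Hypothetical tool-call exchange, shaped like the JSON-RPC 2.0
# messages MCP uses. The tool name and arguments are placeholders.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "list_datasets",      # hypothetical tool name
        "arguments": {"limit": 10},
    },
}

# A matching response: tool output comes back as content blocks.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "content": [{"type": "text", "text": "prices, trades, quotes"}],
    },
}

print(json.dumps(request, indent=2))
```

The point is that both Datafye servers speak this same protocol; they differ only in which tools they expose and where they run.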

The Two MCP Servers

1. Environment MCP Server

The Environment MCP Server runs within your provisioned Datafye deployment and connects AI agents to your live environment components.

Where it runs: In your provisioned environment (AWS or local)

Available in: All deployment scenarios; the information it provides reflects what is actually deployed

What it provides:

  • Real-time access to your data cloud datasets and schemas

  • Information about data flowing through your deployment

  • Access to backtest results and scorecards (if Algo Container is deployed)

  • Deployed algo configuration and parameters (if Algo Container is deployed)

  • Broker connectivity status (if Broker Connector is deployed)

  • Live environment status and metrics

Use cases:

  • Exploring what data is available in your deployment

  • Analyzing backtest results conversationally

  • Debugging deployed algos

  • Monitoring environment health

  • Understanding your specific deployment configuration
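To give a feel for these use cases, here is a sketch of the kind of queries an AI agent might route through the Environment MCP Server. Every tool name, dataset, and return value below is hypothetical; the real server's interface is not documented on this page:

```python
# Stand-in for live deployment state the Environment MCP Server
# would read from. All names and values are invented for illustration.
DEPLOYMENT = {
    "datasets": {"equities_daily": ["date", "symbol", "close"]},
    "algo_container": {
        "deployed": True,
        "algo": "ema_crossover",
        "params": {"fast": 12, "slow": 26},
    },
    "broker_connector": {"deployed": False},
}

def describe_dataset(name: str) -> list:
    """Return the schema (column names) of a deployed dataset."""
    return DEPLOYMENT["datasets"][name]

def broker_status() -> str:
    """Broker connectivity only exists if the connector is deployed."""
    bc = DEPLOYMENT["broker_connector"]
    return "connected" if bc["deployed"] else "not deployed"

print(describe_dataset("equities_daily"))
print(broker_status())
```

Notice that the broker and algo answers depend on what is deployed, which mirrors the "if Algo Container is deployed" / "if Broker Connector is deployed" qualifiers above.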

2. SDK MCP Server

The SDK MCP Server runs on your local development machine and provides AI agents with knowledge about how to develop algos using the Datafye SDK.

Where it runs: On your local development machine

Available in: All scenarios (works regardless of deployment type)

What it provides:

  • Datafye SDK API reference and method signatures

  • Code examples and best practices

  • Algo development patterns

  • SDK usage guidance

  • Language-specific SDK knowledge (Python, Java, etc.)

Use cases:

  • Learning how to code with the Datafye SDK

  • Getting code suggestions and examples

  • Understanding SDK best practices

  • Writing algos conversationally with AI assistance

  • Discovering SDK capabilities
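Conceptually, the SDK MCP Server answers "how do I call this?" questions from a local knowledge base of the SDK. A toy sketch of that lookup; the method names and signatures below are invented examples, not the actual Datafye SDK API:

```python
# Invented stand-in for the SDK reference the server would expose.
SDK_REFERENCE = {
    "subscribe": "subscribe(dataset: str, on_bar: Callable) -> Subscription",
    "order": "order(symbol: str, qty: int, side: str) -> OrderId",
}

def lookup_signature(method: str) -> str:
    """Return the signature an AI tool would surface for a method."""
    return SDK_REFERENCE.get(method, "unknown method")

print(lookup_signature("subscribe"))
```

Because this knowledge is about the SDK rather than any deployment, the server works locally in every scenario.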

How They Work Together

These two MCP servers complement each other: the SDK server teaches your AI tool how to write Datafye code, while the Environment server tells it what is actually running.

Example workflow:

  1. Use the SDK MCP Server to learn how to write an EMA crossover strategy

  2. Your AI tool generates SDK code following best practices

  3. Deploy the strategy to your environment

  4. Use the Environment MCP Server to analyze backtest results and explore available data

  5. Iterate on your strategy using insights from both servers
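To make step 1 concrete: the core of an EMA crossover strategy is comparing a fast and a slow exponential moving average and acting when they cross. A pure-Python sketch of that logic (the actual Datafye SDK calls for data access and order placement are not shown, since this page does not document them):

```python
def ema(prices, period):
    """Exponential moving average, seeded with the first price."""
    alpha = 2 / (period + 1)
    out = [prices[0]]
    for p in prices[1:]:
        out.append(alpha * p + (1 - alpha) * out[-1])
    return out

def crossover_signals(prices, fast=3, slow=6):
    """'buy' when the fast EMA crosses above the slow EMA,
    'sell' when it crosses below, 'hold' otherwise."""
    f, s = ema(prices, fast), ema(prices, slow)
    signals = ["hold"]
    for i in range(1, len(prices)):
        if f[i] > s[i] and f[i - 1] <= s[i - 1]:
            signals.append("buy")
        elif f[i] < s[i] and f[i - 1] >= s[i - 1]:
            signals.append("sell")
        else:
            signals.append("hold")
    return signals

prices = [10, 10, 10, 12, 14, 16, 15, 12, 9, 8]
print(crossover_signals(prices))
```

In the workflow above, the SDK MCP Server would help your AI tool wrap logic like this in the appropriate SDK calls, and the Environment MCP Server would then let you interrogate the resulting backtest.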

Getting Started

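Most MCP-capable AI tools are pointed at servers through a small JSON configuration. A hedged sketch of what registering both Datafye servers might look like, written as a Python dict so the structure is easy to see; the key layout mirrors the "mcpServers" convention used by several MCP clients, and every server name, command, and URL below is a placeholder:

```python
import json

# Hypothetical MCP client configuration registering both servers.
# All names, commands, and URLs are placeholders, not real values.
config = {
    "mcpServers": {
        "datafye-sdk": {
            "command": "datafye-sdk-mcp",  # placeholder local command
            "args": [],
        },
        "datafye-environment": {
            "url": "https://example.invalid/mcp",  # placeholder endpoint
        },
    }
}

print(json.dumps(config, indent=2))
```

The SDK server runs as a local process, while the Environment server is reached at your deployment, which is why the two entries take different forms.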

Why "Vibe Coding"?

"Vibe coding" refers to a development style where you work conversationally with AI rather than looking up documentation or writing boilerplate from scratch:

Traditional approach:

  1. Read docs to find the right API

  2. Look up method signatures

  3. Copy examples and adapt them

  4. Debug when things don't work

  5. Repeat

Vibe coding with MCP:

  1. Describe what you want in natural language

  2. Your AI tool (via MCP) understands your context and generates appropriate code

  3. Iterate conversationally until it works

  4. Learn the patterns as you go

The Datafye MCP servers make this possible by giving your AI tool deep context about both your environment and the SDK.
