The Datafye MCP Servers
Datafye provides two complementary MCP (Model Context Protocol) servers that enable AI-assisted development workflows: the Environment MCP Server and the SDK MCP Server.
Overview
Model Context Protocol (MCP) is an open standard developed by Anthropic that allows AI tools and agents to securely connect to external data sources and tools. Datafye leverages MCP to provide AI tools with deep context about your environment and development workflow.
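As a concrete illustration of how MCP works, here is a minimal sketch of an MCP server exposing a single tool, written with the open-source Python MCP SDK (the `mcp` package). It is a generic example, not Datafye's implementation; the tool name and its behavior are placeholders.

```python
# Minimal MCP server exposing one tool, using the open-source Python MCP SDK.
# Generic illustration of the protocol; not Datafye's implementation.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("example-server")

@mcp.tool()
def get_dataset_schema(dataset: str) -> str:
    """Return a placeholder schema description for the named dataset."""
    return f"schema for {dataset}"

if __name__ == "__main__":
    mcp.run()  # serve over stdio so an MCP-capable AI client can discover and call the tool
```

An MCP-capable AI tool can discover get_dataset_schema and call it on your behalf; the Datafye servers expose deployment-aware and SDK-aware tools in the same way.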
The Two MCP Servers
1. Environment MCP Server
The Environment MCP Server runs within your provisioned Datafye deployment and connects AI agents to your live environment components.
Where it runs: In your provisioned environment (AWS or local)
Available in: All deployment scenarios; provides information based on what's actually deployed
What it provides:
Real-time access to your data cloud datasets and schemas
Information about data flowing through your deployment
Access to backtest results and scorecards (if Algo Container is deployed)
Deployed algo configuration and parameters (if Algo Container is deployed)
Broker connectivity status (if Broker Connector is deployed)
Live environment status and metrics
Use cases:
Exploring what data is available in your deployment (see the client sketch at the end of this section)
Analyzing backtest results conversationally
Debugging deployed algos
Monitoring environment health
Understanding your specific deployment configuration
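To make this concrete, the sketch below uses the Python MCP SDK's client API to connect to an MCP server, list its tools, and call one. The launch command, the stdio transport, and the list_datasets tool name are assumptions for illustration only; your Environment MCP Server's actual connection details and tool names come from your deployment.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Assumed launch command for illustration; substitute your deployment's actual
# connection details (the server may well use a different transport).
SERVER = StdioServerParameters(command="datafye-env-mcp")

async def main():
    async with stdio_client(SERVER) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])
            # Hypothetical tool name; use whatever the server actually advertises.
            result = await session.call_tool("list_datasets", arguments={})
            print(result)

asyncio.run(main())
```

In practice your AI tool manages this session for you; the sketch only shows what happens underneath when it explores your deployment.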
2. SDK MCP Server
The SDK MCP Server runs on your local development machine and provides AI agents with knowledge about how to develop algos using the Datafye SDK.
Where it runs: On your local development machine
Available in: All scenarios (works regardless of deployment type)
What it provides:
Datafye SDK API reference and method signatures
Code examples and best practices (see the sketch at the end of this section)
Algo development patterns
SDK usage guidance
Language-specific SDK knowledge (Python, Java, etc.)
Use cases:
Learning how to code with the Datafye SDK
Getting code suggestions and examples
Understanding SDK best practices
Writing algos conversationally with AI assistance
Discovering SDK capabilities
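In the same spirit, the helper below sketches how an AI client might ask the SDK MCP Server for a code example. It assumes an initialized ClientSession (opened as in the earlier Environment server sketch), and the get_sdk_example tool name and its arguments are hypothetical placeholders for whatever tools the SDK server actually exposes.

```python
from mcp import ClientSession

async def fetch_sdk_example(session: ClientSession, topic: str, language: str = "python"):
    """Ask the SDK MCP Server for a code example on a topic.

    The tool name and argument names below are hypothetical placeholders;
    list the server's tools to see what it really provides.
    """
    return await session.call_tool(
        "get_sdk_example",
        arguments={"topic": topic, "language": language},
    )
```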
How They Work Together
These two MCP servers complement each other: the SDK MCP Server helps your AI tool write algo code, while the Environment MCP Server lets it explore and analyze what is actually deployed.
Example workflow:
Use the SDK MCP Server to learn how to write an EMA crossover strategy
Your AI tool generates SDK code following best practices (a hypothetical sketch follows this list)
Deploy the strategy to your environment
Use the Environment MCP Server to analyze backtest results and explore available data
Iterate on your strategy using insights from both servers
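For step 2 of this workflow, the sketch below shows the kind of EMA crossover logic an AI tool might generate. The class shape and method names are illustrative assumptions, not the actual Datafye SDK API; the SDK MCP Server can supply the real signatures and patterns.

```python
# Hypothetical EMA crossover sketch. The on_bar callback structure and the way
# positions are adjusted are assumptions, not the real Datafye SDK API.

class EmaCrossover:
    """Go long when the fast EMA crosses above the slow EMA; flatten when it crosses back."""

    def __init__(self, fast: int = 12, slow: int = 26):
        self.fast_period = fast
        self.slow_period = slow
        self.fast_ema = None
        self.slow_ema = None
        self.position = 0

    @staticmethod
    def _ema(prev, price, period):
        # Standard exponential moving average update; seed with the first price.
        alpha = 2 / (period + 1)
        return price if prev is None else alpha * price + (1 - alpha) * prev

    def on_bar(self, close: float) -> None:
        """Process one bar's closing price and adjust the position."""
        self.fast_ema = self._ema(self.fast_ema, close, self.fast_period)
        self.slow_ema = self._ema(self.slow_ema, close, self.slow_period)
        if self.fast_ema > self.slow_ema and self.position <= 0:
            self.position = 1   # in a real algo, place a buy order via the SDK here
        elif self.fast_ema < self.slow_ema and self.position > 0:
            self.position = 0   # in a real algo, flatten the position via the SDK here
```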
Getting Started
For Concepts:
Environment MCP Server - Learn about deployment-connected AI assistance
SDK MCP Server - Learn about SDK-focused AI assistance
For How-To Guides:
Integrating with AI Agents - Practical guides for using both MCP servers
Exploring Deployments with AI - Using the Environment MCP Server
Vibe Coding Algos with AI - Using the SDK MCP Server
Why "Vibe Coding"?
"Vibe coding" refers to a development style where you work conversationally with AI rather than looking up documentation or writing boilerplate from scratch:
Traditional approach:
Read docs to find the right API
Look up method signatures
Copy examples and adapt them
Debug when things don't work
Repeat
Vibe coding with MCP:
Describe what you want in natural language
Your AI tool (via MCP) understands your context and generates appropriate code
Iterate conversationally until it works
Learn the patterns as you go
The Datafye MCP servers make this possible by giving your AI tool deep context about both your environment and the SDK.