Environment MCP Server

The Datafye Environment MCP Server runs within your provisioned Datafye deployment and connects AI agents to your live environment components, providing real-time access to your data cloud, deployed algos, and backtest results.

What It Does

The Environment MCP Server acts as a bridge between AI tools and your running Datafye deployment, giving your AI tool context-aware access to:

  • Your deployed data cloud - Datasets, schemas, and actual data flowing through your environment (available in all scenarios)

  • Your algo runtime - Deployed algo configurations, parameters, and state (if Algo Container is deployed)

  • Your backtest results - Scorecards, performance metrics, and comparison data (if Backtesting Engine is deployed)

  • Your broker connectivity - Trading status, positions, and execution info (if Broker Connector is deployed)

  • Your environment status - Health checks, component status, and deployment information (available in all scenarios)

Where It Runs

The Environment MCP Server runs within your provisioned Datafye environment, either:

  • In your AWS cloud deployment (EC2 instances)

  • In your local Docker-based deployment

It is available in all deployment scenarios and provides information based on what's actually deployed:

Foundry: Data Cloud Only

  • Environment MCP Server provides access to Data Cloud datasets, schemas, and market data

  • No algo or backtesting capabilities (you're using your own containers)

Foundry: Full Stack

  • Environment MCP Server provides access to Data Cloud, Algo Container runtime, and Backtesting Engine

  • Full access to backtest results, scorecards, and deployed algo configurations

Trading: Data Cloud + Broker

  • Environment MCP Server provides access to Data Cloud and Broker Connector

  • Includes broker connectivity status, positions, and trading information

  • No algo container capabilities (you're using your own containers)

Trading: Full Stack

  • Environment MCP Server provides access to all components: Data Cloud, Algo Container, Broker Connector

  • Complete visibility into data, algos, backtests, and live trading

How It Works

  1. You ask your AI tool a question about your deployment (e.g., "What data do I have for AAPL?")

  2. Your AI tool connects to the Environment MCP Server running in your deployment via the MCP protocol

  3. The MCP Server queries your Datafye components (Data Cloud, Algo Container, etc.) via internal APIs

  4. Results are returned to your AI tool, which synthesizes them into helpful responses

  5. Your data stays private - it never leaves your environment; only the MCP server has access
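Under the hood, MCP messages are JSON-RPC 2.0, and tool invocations in step 3 use the protocol's `tools/call` method. The sketch below shows roughly what such a request looks like on the wire; the tool name `sample_trades` and its arguments are hypothetical illustrations, not documented Datafye tools.

```python
import json


def build_tool_call(tool_name: str, arguments: dict, request_id: int = 1) -> str:
    """Build an MCP tool invocation as a JSON-RPC 2.0 message.

    MCP tool calls use the "tools/call" method; the specific tool
    names exposed by the Environment MCP Server are illustrative here.
    """
    message = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }
    return json.dumps(message)


# Example: a hypothetical request for recent AAPL trades.
request = build_tool_call("sample_trades", {"symbol": "AAPL", "limit": 10})
print(request)
```

In practice your AI tool constructs and sends these messages for you; the sketch is only meant to show that each conversational question maps to one or more structured tool calls.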

What It Provides

Data Exploration Tools

The Environment MCP Server gives your AI tool real-time access to your data cloud:

Dataset Discovery:

You: "What datasets do I have access to?"

AI Assistant (via Environment MCP): "You have access to SIP data from Polygon with trades and quotes schemas for S&P 500 constituents. Your data descriptor includes 1-minute OHLC bars and 20-period EMA aggregates with 180 days of history."

Schema Inspection:

You: "What fields are in the trades schema?"

AI Assistant: "The trades schema includes: timestamp, symbol, price, size, exchange, conditions, and tape. Let me show you the field types and descriptions..."

Data Sampling:

You: "Get me some sample trade data for AAPL from today"

AI Assistant: "Here are the last 10 trades for AAPL: [shows actual data from your deployment]"

Historical Queries:

You: "What's the price range for NVDA over the last month?"

AI Assistant: "Looking at your historical OHLC data, NVDA ranged from $127.34 to $145.89 over the last 30 days, with a current price of $142.50."

Algo Runtime Inspection

The Environment MCP Server understands your deployed algo's configuration:

Configuration Inspection:

You: "What parameters is my momentum algo configured with?"

AI Assistant: "Your momentum algo is configured with: lookback_period=20, entry_threshold=0.02, exit_threshold=0.01, max_position_size=1000. It's currently subscribed to SPY, QQQ, and IWM with 1-minute bars."

State Monitoring:

You: "Is my algo running? What positions does it have?"

AI Assistant: "Your algo is running and healthy. Current positions: Long 500 shares of SPY at $450.25 (entry time: 10:35 AM, current P&L: +$127.50)."

Backtesting Support

The Environment MCP Server helps you interpret backtest results:

Scorecard Analysis:

You: "What does my Sharpe ratio of 1.2 tell me about this strategy?"

AI Assistant: "A Sharpe ratio of 1.2 indicates your strategy generates decent risk-adjusted returns. For context, your backtest shows annualized return of 18% with volatility of 15%. This is above the typical threshold of 1.0 for acceptable strategies, but there's room for improvement..."

Performance Metrics:

You: "Why is my max drawdown 25%? That seems high."

AI Assistant: "Looking at your backtest results, the 25% drawdown occurred during the March 2020 market crash. Your strategy's momentum signals led to late exits. Consider adding volatility filters or circuit breakers for extreme market conditions..."

Comparison:

You: "Compare my last two backtest runs and explain the differences"

AI Assistant: "Run #47 (lookback=20) vs Run #48 (lookback=30):

  • Sharpe improved from 1.2 to 1.4

  • Max drawdown reduced from 25% to 18%

  • Win rate increased from 52% to 57%

  • Total trades decreased from 450 to 320 (longer lookback = fewer signals)

The longer lookback period appears to filter out false signals..."

Security and Privacy

The Environment MCP Server runs within your Datafye private cloud deployment:

  • Your data stays private - The MCP server runs in your environment, not on third-party AI provider servers

  • API credentials are yours - AI tools don't have direct access to your Datafye APIs; the MCP server mediates all access

  • You control permissions - You configure which MCP tools your AI tool can access

  • Audit trail - All MCP interactions are logged in your deployment

Data flow:

  1. Your prompts go to your AI provider's servers (e.g., Anthropic for Claude)

  2. The AI tool requests data through MCP protocol to the server running in your environment

  3. The MCP server queries your Datafye deployment

  4. Results come back to the AI tool, which synthesizes them into helpful responses

  5. Your actual market data and algo logic never leave your private cloud
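The result in step 4 travels back as a JSON-RPC response. In the MCP protocol, a tool result carries a `content` array of typed items; a hedged example of what such a response might look like (the text payload is illustrative, not an actual Datafye response):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "content": [
      { "type": "text", "text": "Last 10 AAPL trades: ..." }
    ],
    "isError": false
  }
}
```

Only this structured result reaches the AI tool; raw datasets and algo internals stay behind the MCP server inside your deployment.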

Integration Options

The Environment MCP Server works with any AI tool, agent, or LLM that integrates with MCP servers:

Common Integrations:

  • Development IDEs - VS Code, Cursor, IntelliJ, Replit (with MCP-compatible AI extensions)

  • Standalone AI Tools - Claude (desktop/web with MCP), ChatGPT (if MCP-enabled), Claude Code

  • Other AI Agents - Any LLM or AI agent that supports the MCP protocol

Setup:

  1. Note the Environment MCP Server endpoint from your deployment output (shown during provisioning)

  2. Configure the endpoint in your AI tool's MCP settings

  3. Authenticate with your Datafye deployment

  4. Start exploring your environment conversationally
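As a concrete illustration of step 2, many MCP-aware clients read server endpoints from a JSON settings file. The exact schema varies by client, and the server name, URL, and token variable below are hypothetical placeholders; substitute the endpoint and credentials from your own provisioning output:

```json
{
  "mcpServers": {
    "datafye-environment": {
      "url": "https://your-deployment.example.com/mcp",
      "headers": {
        "Authorization": "Bearer ${DATAFYE_MCP_TOKEN}"
      }
    }
  }
}
```

Consult your AI tool's own MCP documentation for where this file lives and which fields it supports.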

See the Exploring Deployments with AI guide for detailed setup instructions.

Use Cases

Great for:

  • Exploring unfamiliar datasets in your deployment

  • Understanding what data is available for specific symbols

  • Debugging why backtests aren't producing expected results

  • Analyzing backtest scorecards and performance metrics

  • Comparing multiple backtest runs to find optimal parameters

  • Monitoring deployed algo health and positions

  • Understanding your environment's configuration

Not designed for:

  • Learning how to write SDK code (use the SDK MCP Server instead)

  • Production algo execution (algos run independently of the MCP server)

  • Data ingestion (the data cloud handles this)

  • Trade execution (the broker connector handles this)

The Environment MCP Server is a development and monitoring tool - it helps you understand and debug your deployment, but your algos don't depend on it at runtime.

Relationship with SDK MCP Server

The Environment MCP Server complements the SDK MCP Server:

  • SDK MCP Server teaches you how to write algo code

  • Environment MCP Server helps you understand what's running in your deployment

Use them together for a complete AI-assisted workflow:

  1. Learn SDK patterns from the SDK MCP Server

  2. Write your algo code

  3. Deploy and backtest

  4. Analyze results using the Environment MCP Server

  5. Iterate based on insights

Last updated: 2025-10-23
