Build an MCP Server for 6 Million Rows

Upload a 6-million-record Parquet file, configure an MCP tool with server-side filtering, and connect to Claude Desktop for instant queries.

This guide demonstrates InstantRows' ability to handle massive datasets by uploading a Parquet file with 6 million records, configuring an MCP (Model Context Protocol) tool with server-side filtering, and connecting it to Claude Desktop for instant AI-powered queries.

Prerequisites

  • An InstantRows account (logged in)
  • Claude Desktop installed
  • A large Parquet file (6M+ rows), or use the sample dataset

Step 1 — Upload Your Parquet File

  1. Navigate to the InstantRows home page
  2. Click Upload or drag and drop your Parquet file
  3. The file will be processed and you'll see a record count displayed
  4. You should see confirmation showing 6,000,000 rows (or your file's count)
  5. Click through to view the uploaded dataset

The system ingests the 6 million records in seconds, converting the columnar Parquet data into a form optimized for server-side querying.

Step 2 — Review Your Data

  1. After upload, you'll be redirected to the dataset view page
  2. Note the total record count displayed prominently on the page (6,000,000 rows)
  3. Browse through the available columns in your dataset
  4. The data is ready for MCP configuration

Step 3 — Configure MCP Tool

  1. On the dataset view page, locate the MCP or Configure MCP button
  2. Click to open the MCP configuration panel
  3. Fill in the MCP tool details:
    • Tool Name: Enter a descriptive name (e.g., "Sales Order Lookup")
    • Description: Describe what the tool does (e.g., "Query sales orders from a 6-million-record dataset")
  4. Select Columns: Choose which columns to include in the MCP tool
    • Pick only the columns you want to expose to the LLM
    • This reduces token costs and focuses the tool's purpose
    • Example columns: order_id, customer_name, product, amount, date, status
  5. Click Save or Create MCP Tool

Your MCP server is now configured with server-side filtering, ready to handle queries across all 6 million records.
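Under the hood, an MCP tool is described by a name, a description, and a JSON Schema for its inputs. Here is a hypothetical sketch of what the tool configured above might look like to a client; the field values, parameter names, and schema layout are illustrative assumptions, not InstantRows' actual output:

```python
# Hypothetical MCP tool definition, modeled on the MCP spec's tool shape.
# The column names and filter parameters below are illustrative assumptions.
tool_definition = {
    "name": "sales_order_lookup",
    "description": "Query sales orders from a 6-million-record dataset",
    "inputSchema": {
        "type": "object",
        "properties": {
            # Each exposed column becomes a potential filter parameter.
            "order_id": {"type": "string", "description": "Exact order ID, e.g. ORD-0000000008"},
            "customer_name": {"type": "string", "description": "Customer name to match"},
            "min_amount": {"type": "number", "description": "Lower bound on order amount"},
        },
        "required": [],  # all filters optional; omitted filters match everything
    },
}

# The schema tells the LLM which filters it may pass; only the columns
# selected in Step 3 are exposed, keeping the tool focused.
print(sorted(tool_definition["inputSchema"]["properties"]))
# → ['customer_name', 'min_amount', 'order_id']
```

This is why column selection matters: every property in the schema is surface area the LLM can use, so exposing only the columns you need keeps both the tool's purpose and its token footprint tight.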

Step 4 — Connect to Claude Desktop

  1. Copy the MCP server URL from InstantRows (it will look like: https://instantrows.com/mcp-server/UdA6pVqN)
  2. Open Claude Desktop
  3. Click your profile icon (bottom left) → Settings
  4. Navigate to Connectors in the left sidebar
  5. Click Add custom connector
  6. Fill in the connector details:
    • Name: Enter a descriptive name (e.g., "InstantRows Sales Data")
    • Remote MCP server URL: Paste your InstantRows MCP URL
  7. Click Add
  8. The connector will appear in your list with a "CUSTOM" badge
  9. You can click Configure or the menu to manage it later
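MCP servers speak JSON-RPC 2.0 over HTTP, so before adding the connector you can sanity-check that the URL responds. A minimal sketch of a tools/list probe, using the example URL from step 1 above (the actual network call is commented out so the snippet runs offline):

```python
import json
import urllib.request

MCP_URL = "https://instantrows.com/mcp-server/UdA6pVqN"  # example URL from Step 4

# MCP requests are JSON-RPC 2.0; tools/list asks the server which tools it exposes.
payload = json.dumps({
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}).encode("utf-8")

request = urllib.request.Request(
    MCP_URL,
    data=payload,
    headers={
        "Content-Type": "application/json",
        "Accept": "application/json, text/event-stream",
    },
)

# Uncomment to actually send the probe (requires network access and a live server):
# with urllib.request.urlopen(request, timeout=10) as resp:
#     print(resp.status, resp.read()[:200])

print(request.get_method())  # JSON-RPC requests go over HTTP POST
```

If the server answers with a JSON result listing your tool, the connector in Claude Desktop should work with the same URL.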

Step 5 — Query Your Data with Claude

  1. Open Claude Desktop
  2. Verify the MCP tool is connected (you should see it in the available tools)
  3. Ask Claude to query your data, for example:
    • "Get sales data from instantrows for order ORD-0000000008"
    • "Show me all orders over $10,000"
    • "Find orders for customer John Smith"

Claude will call your MCP tool, which executes server-side filtering across all 6 million records and returns only the matching results. This keeps context windows clean and token costs low.
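The flow above can be sketched in miniature: the filter runs where the data lives, and only matches travel back to the model. This toy illustration of server-side filtering uses an invented in-memory dataset and a hypothetical handler function; real queries run on InstantRows' servers against the full table:

```python
# Toy model of server-side filtering: the tool scans all rows on the
# server and returns only matches, so the LLM context never sees the
# full dataset. Row contents are synthetic stand-ins.
rows = [
    {
        "order_id": f"ORD-{i:010d}",
        "customer_name": "John Smith" if i % 1000 == 0 else "Other",
        "amount": i % 20000,
    }
    for i in range(100_000)  # stand-in for the full 6M-row table
]

def query_tool(min_amount=None, customer_name=None):
    """Hypothetical server-side handler: apply every supplied filter."""
    matches = rows
    if min_amount is not None:
        matches = [r for r in matches if r["amount"] > min_amount]
    if customer_name is not None:
        matches = [r for r in matches if r["customer_name"] == customer_name]
    return matches

# "Find John Smith's orders over $10,000" — the whole table is scanned,
# but only the handful of matching rows is returned to the model.
result = query_tool(min_amount=10_000, customer_name="John Smith")
print(len(rows), "rows scanned ->", len(result), "rows returned")
```

The same principle is what keeps the 6-million-row case cheap: the response Claude receives is proportional to the matches, not to the table.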

What This Demonstrates

  • Scale: InstantRows handles millions of rows without breaking a sweat
  • Parquet support: Native columnar format for 10x faster queries on large datasets
  • MCP integration: Turn any dataset into an LLM-queryable tool
  • Server-side filtering: Filter millions of rows without loading them into the LLM context
  • Token efficiency: Only matching results are returned, keeping costs down
  • No local infrastructure: No Docker, no databases, no servers to manage

Key Benefits

  • Beyond Context Windows: Query 6M rows without cramming data into prompts
  • Instant Setup: Upload → Configure → Query in under 2 minutes
  • Production Ready: Stable MCP interface for AI-powered data access
  • Cost Effective: Server-side filtering means minimal token usage

You've just turned 6 million rows into an AI-accessible data source with zero infrastructure!

Ready to Try It Yourself?

Join our waitlist to get early access. Start transforming your data in seconds.