Build an MCP Server for 6 Million Rows
Upload a 6-million-record Parquet file, configure an MCP tool with server-side filtering, and connect to Claude Desktop for instant queries.
This guide demonstrates InstantRows' ability to handle massive datasets by uploading a Parquet file with 6 million records, configuring an MCP (Model Context Protocol) tool with server-side filtering, and connecting it to Claude Desktop for instant AI-powered queries.
Prerequisites
- Be logged in to your InstantRows account
- Have Claude Desktop installed
- Have a large Parquet file (6M+ rows) on hand, or use the sample dataset
Step 1 — Upload Your Parquet File
- Navigate to the InstantRows home page
- Click Upload or drag and drop your Parquet file
- The file is processed and a record count is displayed
- Confirm the count shows 6,000,000 rows (or your file's total)
- Click through to view the uploaded dataset
The system ingests all 6 million records instantly, converting the columnar Parquet data into a form suited to efficient querying.
Step 2 — Review Your Data
- After upload, you'll be redirected to the dataset view page
- Note the total record count displayed prominently on the page (6,000,000 rows)
- Browse through the available columns in your dataset
- The data is ready for MCP configuration
Step 3 — Configure MCP Tool
- On the dataset view page, locate the MCP or Configure MCP button
- Click to open the MCP configuration panel
- Fill in the MCP tool details:
- Tool Name: Enter a descriptive name (e.g., "Sales Order Lookup")
- Description: Describe what the tool does (e.g., "Query sales orders from a 6-million-record dataset")
- Select Columns: Choose which columns to include in the MCP tool
- Pick only the columns you want to expose to the LLM
- This reduces token costs and focuses the tool's purpose
- Example columns:
order_id,customer_name,product,amount,date,status
- Click Save or Create MCP Tool
Your MCP server is now configured with server-side filtering, ready to handle queries across all 6 million records.
Step 4 — Connect to Claude Desktop
- Copy the MCP server URL from InstantRows (it will look like: https://instantrows.com/mcp-server/UdA6pVqN)
- Open Claude Desktop
- Click your profile icon (bottom left) → Settings
- Navigate to Connectors in the left sidebar
- Click Add custom connector
- Fill in the connector details:
- Name: Enter a descriptive name (e.g., "Instantrows Sales Data")
- Remote MCP server URL: Paste your InstantRows MCP URL
- Click Add
- The connector will appear in your list with a "CUSTOM" badge
- You can click Configure or the ⋯ menu to manage it later
Step 5 — Query Your Data with Claude
- Open Claude Desktop
- Verify the MCP tool is connected (you should see it in the available tools)
- Ask Claude to query your data, for example:
- "Get sales data from instantrows for order ORD-0000000008"
- "Show me all orders over $10,000"
- "Find orders for customer John Smith"
Claude will call your MCP tool, which executes server-side filtering across all 6 million records and returns only the matching results. This keeps context windows clean and token costs low.
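On the wire, an MCP tool call is a JSON-RPC 2.0 tools/call request. The sketch below shows the shape of such a request; the tool name and argument names are assumptions for illustration — the real schema comes from the tool definition your InstantRows server advertises:

```python
import json

# Hypothetical tools/call request for the "Sales Order Lookup" tool from
# Step 3; "sales_order_lookup" and "order_id" are assumed names.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "sales_order_lookup",
        "arguments": {"order_id": "ORD-0000000008"},
    },
}
print(json.dumps(request, indent=2))

# The server applies the filter across all rows and returns only the
# matches in the response, keeping the model's context window small.
```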
What This Demonstrates
- Scale: InstantRows handles millions of rows without breaking a sweat
- Parquet support: Native columnar format for 10x faster queries on large datasets
- MCP integration: Turn any dataset into an LLM-queryable tool
- Server-side filtering: Filter millions of rows without loading them into the LLM context
- Token efficiency: Only matching results are returned, keeping costs down
- No local infrastructure: No Docker, no databases, no servers to manage
Key Benefits
- Beyond Context Windows: Query 6M rows without cramming data into prompts
- Instant Setup: Upload → Configure → Query in under 2 minutes
- Production Ready: Stable MCP interface for AI-powered data access
- Cost Effective: Server-side filtering means minimal token usage
You've just turned 6 million rows into an AI-accessible data source with zero infrastructure!
Ready to Try It Yourself?
Join our waitlist to get early access. Start transforming your data in seconds.