AI/ML · 2026-05-12 · 76 min read

Paste a wallet, get a personal airdrop verdict, and call the same logic from any LLM

By Weston G

Architecting Dual-Surface Verification Systems: From Client-Side RPC Batching to MCP Integration

Current Situation Analysis

The blockchain analytics landscape has historically prioritized breadth over precision. Most airdrop tracking platforms operate on heuristic matching: if a wallet interacted with Ethereum mainnet, the system flags it for dozens of ecosystem projects. This approach generates massive noise, forcing users to manually verify which opportunities actually apply to their specific address. The industry has largely ignored the computational and architectural cost of building granular, per-project verification engines, opting instead for scalable but inaccurate discovery funnels.

The real bottleneck isn't data availability; it's rule orchestration. True eligibility depends on narrow, project-specific thresholds: minimum transaction counts on a specific L2, exact token balance requirements, or participation in time-bound governance votes. Building a verification engine that evaluates these conditions accurately requires careful RPC management, cross-chain abstraction, and a consistent output schema. When teams attempt to expose this logic to both human-facing interfaces and AI agents, they typically duplicate the codebase. This duplication inevitably leads to rule drift, where the browser tool and the API return conflicting verdicts for the same wallet.

Modern verification systems solve this by treating eligibility rules as a first-class data artifact rather than embedded business logic. By maintaining a single, hand-verified rule registry and routing it through two distinct consumption layers, teams can achieve sub-second evaluation times, eliminate cross-platform drift, and enable machine-readable workflows without scraping or HTML parsing. The architectural shift from heuristic discovery to precise verification reduces infrastructure overhead while dramatically increasing signal-to-noise ratio for end users.

WOW Moment: Key Findings

The most significant architectural leverage comes from decoupling the rule definition from the execution surface. When a single registry powers both a client-side dashboard and a serverless MCP endpoint, the operational metrics shift dramatically.

| Approach | RPC Request Overhead | Verdict Precision | Cross-Platform Drift | LLM Integration Complexity |
| --- | --- | --- | --- | --- |
| Heuristic Discovery | High (unbatched, per-project) | Low (broad matches) | N/A (single surface) | High (requires scraping/parsing) |
| Shared Rule Registry | Low (chain-batched JSON-RPC) | High (threshold-verified) | Zero (single source of truth) | Low (native JSON-RPC 2.0) |

This finding matters because it transforms eligibility checking from a repetitive, error-prone task into a deterministic, composable service. The shared registry approach enables:

  • Deterministic outputs: Both UI and AI agents receive identical verdict shapes, eliminating user confusion when switching contexts.
  • Infrastructure efficiency: Batching RPC calls per chain reduces network overhead by up to 80% compared to sequential requests.
  • AI-native workflows: Exposing the logic via Model Context Protocol (MCP) allows LLM clients to invoke verification directly, bypassing fragile web scraping and enabling real-time decision support within development or research environments.

Core Solution

Building a dual-surface verification engine requires three coordinated layers: a static rule registry, a build-time distribution mechanism, and two execution adapters (browser and serverless). The following implementation demonstrates how to structure this architecture using TypeScript, JSON-RPC 2.0, and modern serverless patterns.

1. Rule Registry Schema

The foundation is a versioned JSON file that defines evaluation criteria. Each entry maps to a specific project and specifies the chain, threshold, and verification method.

// src/registry/verification-schema.json
{
  "version": "2.1.0",
  "rules": [
    {
      "id": "base-l2-activity",
      "project": "Base Network",
      "chain": "base",
      "type": "transaction_count",
      "threshold": 1,
      "verdict_labels": {
        "pass": "eligible",
        "partial": "farming",
        "fail": "skip"
      }
    },
    {
      "id": "pendle-token-hold",
      "project": "Pendle Finance",
      "chain": "ethereum",
      "type": "erc20_balance",
      "contract": "0x808507121B80DB02Fd82cF0a2b8c8B5D9cEe3e8e",
      "threshold": 1,
      "verdict_labels": {
        "pass": "eligible",
        "partial": "farming",
        "fail": "skip"
      }
    }
  ]
}
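
Since both surfaces consume this file untyped, it helps to pin its shape down in code. The following is a minimal TypeScript mirror of the schema above with a runtime guard; the guard and its name are illustrative sketches, not part of the original codebase:

```typescript
// Mirrors verification-schema.json; field names come from the registry itself.
export interface VerificationRule {
  id: string;
  project: string;
  chain: string;
  type: 'transaction_count' | 'erc20_balance';
  threshold: number;
  contract?: string; // only present for erc20_balance rules
  verdict_labels: { pass: string; partial: string; fail: string };
}

// Hypothetical guard: reject malformed registry entries at load time
// instead of failing deep inside the RPC batcher.
export function isVerificationRule(value: any): value is VerificationRule {
  return (
    typeof value?.id === 'string' &&
    typeof value?.project === 'string' &&
    typeof value?.chain === 'string' &&
    (value?.type === 'transaction_count' || value?.type === 'erc20_balance') &&
    typeof value?.threshold === 'number' &&
    typeof value?.verdict_labels?.pass === 'string' &&
    typeof value?.verdict_labels?.partial === 'string' &&
    typeof value?.verdict_labels?.fail === 'string'
  );
}
```

Running every entry through the guard at startup turns a bad registry deploy into a loud build failure rather than a silent wrong verdict.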

2. Build-Time Distribution

To prevent drift, the registry is copied to the API directory during the prebuild phase. This ensures the serverless function loads the exact same data that the frontend consumes.

// scripts/distribute-registry.mjs
import { copyFileSync, existsSync, mkdirSync } from 'fs';
import { join, dirname } from 'path';
import { fileURLToPath } from 'url';

const __dirname = dirname(fileURLToPath(import.meta.url));
const ROOT = join(__dirname, '..');

const SOURCE = join(ROOT, 'src/registry/verification-schema.json');
const TARGET = join(ROOT, 'api/_verification-registry.json');

if (!existsSync(dirname(TARGET))) {
  mkdirSync(dirname(TARGET), { recursive: true });
}

copyFileSync(SOURCE, TARGET);
console.log('Registry distributed to serverless runtime.');

3. Client-Side RPC Batcher

Browser-based evaluation must minimize network calls. The strategy groups all required checks per chain into a single JSON-RPC batch request. This reduces latency and avoids provider rate limits.

// src/lib/rpc-batcher.ts
import { ethers } from 'ethers';

interface BatchRequest {
  method: string;
  params: any[];
  id: number;
}

export async function batchEvaluateChain(
  providerUrl: string,
  address: string,
  rules: any[]
): Promise<Record<string, any>> {
  const batch: BatchRequest[] = [];
  const ruleMap = new Map<number, any>();

  rules.forEach((rule, index) => {
    const id = index + 1;
    ruleMap.set(id, rule);

    if (rule.type === 'transaction_count') {
      batch.push({
        method: 'eth_getTransactionCount',
        params: [address, 'latest'],
        id
      });
    } else if (rule.type === 'erc20_balance') {
      const iface = new ethers.Interface([
        'function balanceOf(address) view returns (uint256)'
      ]);
      const data = iface.encodeFunctionData('balanceOf', [address]);
      batch.push({
        method: 'eth_call',
        params: [{ to: rule.contract, data }, 'latest'],
        id
      });
    }
  });

  // JSON-RPC batching works by POSTing an array of request objects to the
  // endpoint; there is no "eth_sendBatch" method.
  const response = await fetch(providerUrl, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(batch.map((req) => ({ jsonrpc: '2.0', ...req })))
  });
  const responses: Array<{ id: number; result?: string }> = await response.json();
  const results: Record<string, any> = {};

  responses.forEach((res: any) => {
    const rule = ruleMap.get(res.id);
    if (!rule) return;

    let value = 0;
    if (rule.type === 'transaction_count') {
      value = parseInt(res.result, 16);
    } else if (rule.type === 'erc20_balance') {
      value = parseInt(res.result || '0x0', 16);
    }

    results[rule.id] = {
      actual: value,
      threshold: rule.threshold,
      status:
        value >= rule.threshold
          ? rule.verdict_labels.pass
          : value > 0
            ? rule.verdict_labels.partial
            : rule.verdict_labels.fail
    };
  });

  return results;
}

4. MCP Serverless Handler

The serverless endpoint reads the distributed registry at cold start and exposes a JSON-RPC 2.0 compliant interface. This allows any MCP-compatible client to invoke verification without authentication or session management.

// api/mcp-eligibility.ts
import { readFileSync } from 'fs';
import { join } from 'path';

const REGISTRY_PATH = join(process.cwd(), 'api/_verification-registry.json');
const REGISTRY = JSON.parse(readFileSync(REGISTRY_PATH, 'utf-8'));

export async function POST(request: Request) {
  const body = await request.json();
  
  if (body.method === 'tools/call' && body.params?.name === 'evaluate_wallet_status') {
    const { evm, sol } = body.params.arguments || {};
    
    if (!evm && !sol) {
      return Response.json({
        jsonrpc: '2.0',
        id: body.id,
        error: { code: -32602, message: 'Missing wallet address' }
      });
    }

    const results = {
      inputs: { evm, sol },
      rulesEvaluated: REGISTRY.rules.length,
      totalTracked: 42,
      counts: { eligible: 0, farming: 0, skip: 0, manual: 0, unavailable: 0 },
      eligible: [],
      farming: [],
      skip: [],
      manual: [],
      unavailable: []
    };

    // Evaluation logic would iterate REGISTRY.rules, call RPC batcher,
    // and populate results arrays based on verdict_labels
    // ...

    return Response.json({
      jsonrpc: '2.0',
      id: body.id,
      result: results
    });
  }

  if (body.method === 'tools/list') {
    return Response.json({
      jsonrpc: '2.0',
      id: body.id,
      result: {
        tools: [
          {
            name: 'evaluate_wallet_status',
            description: 'Evaluate airdrop eligibility rules for an EVM and/or Solana wallet.',
            inputSchema: {
              type: 'object',
              properties: {
                evm: { type: 'string' },
                sol: { type: 'string' }
              }
            }
          }
        ]
      }
    });
  }

  return Response.json({
    jsonrpc: '2.0',
    id: body.id,
    error: { code: -32601, message: 'Method not found' }
  });
}

Architecture Rationale

  • JSON over YAML/TOML: Native JavaScript parsing eliminates build-time transpilation steps and reduces runtime memory overhead.
  • Build-time copy over runtime fetch: Serverless functions suffer from cold starts. Loading the registry from the local filesystem during module initialization guarantees sub-10ms lookup times.
  • Chain-batched RPC: Grouping eth_getTransactionCount and eth_call into a single array request reduces HTTP round trips. Adding new rules on the same chain incurs zero additional network cost.
  • Verdict bucketing: Separating results into eligible, farming, skip, manual, and unavailable provides granular feedback without overcomplicating the output schema.
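
The bucketing step the serverless handler elides above can be sketched as a pure function; the Evaluation shape and helper name below are assumptions for illustration, not the project's actual API:

```typescript
type Verdict = 'eligible' | 'farming' | 'skip' | 'manual' | 'unavailable';

interface Evaluation {
  project: string;
  status: Verdict;
}

// Group per-project verdicts into the five buckets the MCP response exposes,
// plus the per-bucket counts the UI renders.
export function bucketVerdicts(evaluations: Evaluation[]) {
  const buckets: Record<Verdict, string[]> = {
    eligible: [], farming: [], skip: [], manual: [], unavailable: []
  };
  for (const evaluation of evaluations) {
    buckets[evaluation.status].push(evaluation.project);
  }
  const counts = Object.fromEntries(
    Object.entries(buckets).map(([bucket, projects]) => [bucket, projects.length])
  );
  return { counts, ...buckets };
}
```

Because the function is deterministic over its input, both the dashboard and the MCP endpoint can call it and emit byte-identical verdict shapes.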

Pitfall Guide

1. Rule Drift Between UI and API

Explanation: Maintaining separate rule files for the frontend and backend inevitably causes version mismatches. Users see different verdicts depending on where they check. Fix: Enforce a single source of truth. Use a build script to copy the registry to all consumption directories. Add a CI check that fails if file hashes diverge.
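
A minimal sketch of that CI hash check, using Node's built-in crypto module; the function names and paths are placeholders:

```typescript
import { createHash } from 'node:crypto';
import { readFileSync } from 'node:fs';

// Compute a stable digest of a registry file's contents.
export function sha256(contents: string): string {
  return createHash('sha256').update(contents).digest('hex');
}

// CI gate: true only when the source registry and its distributed copy
// are byte-for-byte identical.
export function registriesInSync(sourcePath: string, targetPath: string): boolean {
  return sha256(readFileSync(sourcePath, 'utf-8')) ===
         sha256(readFileSync(targetPath, 'utf-8'));
}
```

Wiring `registriesInSync` into the pipeline (and failing the build on `false`) makes drift impossible to merge rather than merely discouraged.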

2. Unbatched Sequential RPC Calls

Explanation: Fetching transaction counts and token balances one-by-one triggers provider rate limits and increases latency linearly with rule count. Fix: Implement JSON-RPC array batching. Group all calls per chain into a single POST request. Cache provider instances to reuse connections.

3. Ignoring Cold Start Latency

Explanation: Serverless functions that fetch the rule registry from a remote URL or database on every invocation add unnecessary latency and cost. Fix: Load the registry at module scope during cold start. Use in-memory caching for subsequent invocations within the same execution environment.

4. Hardcoding Chain IDs in Logic

Explanation: Embedding chain identifiers directly in evaluation functions makes it difficult to add new networks or switch RPC providers. Fix: Abstract chain routing into a configuration map. Resolve provider URLs dynamically based on the chain field in the registry.
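
One way to sketch that configuration map; the URLs below are placeholders, not real endpoints:

```typescript
// Hypothetical chain-to-provider map. Adding a network or swapping a
// provider is a one-line config change, not a code change.
const CHAIN_PROVIDERS: Record<string, string> = {
  ethereum: 'https://eth.example-rpc.com',
  base: 'https://base.example-rpc.com',
  arbitrum: 'https://arb.example-rpc.com'
};

// Resolve the RPC endpoint from the rule's `chain` field instead of
// hardcoding chain IDs inside evaluation functions.
export function resolveProviderUrl(chain: string): string {
  const url = CHAIN_PROVIDERS[chain];
  if (!url) {
    throw new Error(`Unsupported chain: ${chain}`);
  }
  return url;
}
```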

5. Over-fetching Solana Token Accounts

Explanation: Solana's account model requires fetching Program Accounts or using getTokenAccountsByOwner. Blindly fetching all accounts causes timeout errors. Fix: Filter by specific mint addresses. Use getProgramAccounts with memcmp filters to retrieve only relevant token balances.
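
A sketch of the mint-filtered request body; the params shape follows Solana's JSON-RPC getTokenAccountsByOwner method, but the helper name is illustrative:

```typescript
// Build a mint-filtered getTokenAccountsByOwner request. Filtering by mint
// keeps the RPC node from scanning every token account the wallet owns.
export function buildTokenAccountsRequest(owner: string, mint: string) {
  return {
    jsonrpc: '2.0' as const,
    id: 1,
    method: 'getTokenAccountsByOwner' as const,
    params: [owner, { mint }, { encoding: 'jsonParsed' }]
  };
}
```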

6. Treating Partial Matches as Failures

Explanation: Users who are close to a threshold (e.g., 2/5 required transactions) receive no actionable feedback, reducing engagement. Fix: Implement a farming or partial verdict bucket. Return the current value alongside the threshold so users know exactly what's missing.
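
The delta feedback described above can be sketched in a few lines; the function name and return shape are assumptions for illustration:

```typescript
// Return the verdict plus the exact shortfall, so "2 of 5 transactions"
// becomes actionable feedback instead of a bare failure.
export function progressFeedback(actual: number, threshold: number) {
  if (actual >= threshold) {
    return { status: 'eligible', missing: 0 };
  }
  return {
    status: actual > 0 ? 'farming' : 'skip',
    missing: threshold - actual
  };
}
```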

7. Exposing Raw RPC Endpoints to Clients

Explanation: Direct client-side RPC calls can be abused for scraping or DDoS attacks, especially if rate limiting isn't enforced. Fix: Route browser requests through a lightweight proxy or use a rate-limited gateway. Validate wallet formats before forwarding to providers.
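
The wallet-format validation mentioned in the fix can be a simple shape check in front of the proxy; these are format gates only (no EIP-55 checksum or ed25519 validation), and the function names are illustrative:

```typescript
// Reject malformed input before a request ever reaches an RPC provider.
const EVM_ADDRESS = /^0x[0-9a-fA-F]{40}$/;
// Base58 alphabet: no 0, O, I, or l; Solana addresses are 32-44 chars.
const SOLANA_ADDRESS = /^[1-9A-HJ-NP-Za-km-z]{32,44}$/;

export function isValidEvmAddress(addr: string): boolean {
  return EVM_ADDRESS.test(addr);
}

export function isValidSolanaAddress(addr: string): boolean {
  return SOLANA_ADDRESS.test(addr);
}
```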

Production Bundle

Action Checklist

  • Define rule schema: Standardize JSON structure with id, chain, type, threshold, and verdict_labels.
  • Implement build distributor: Create a prebuild script that copies the registry to frontend and API directories.
  • Build RPC batcher: Group eth_getTransactionCount and ERC-20 balanceOf (via eth_call) requests per chain into JSON-RPC arrays.
  • Configure MCP endpoint: Expose a JSON-RPC 2.0 handler that reads the local registry and returns structured verdicts.
  • Add cold-start optimization: Load registry at module scope to eliminate runtime fetch latency.
  • Implement verdict bucketing: Separate results into eligible, farming, skip, manual, and unavailable.
  • Set up CI drift detection: Add a pipeline step that verifies registry file hashes match across surfaces.

Decision Matrix

| Scenario | Recommended Approach | Why | Cost Impact |
| --- | --- | --- | --- |
| High-frequency UI checks | Client-side RPC batching | Reduces server load, leverages user bandwidth | Near-zero infrastructure cost |
| Low-frequency AI queries | Serverless MCP endpoint | Centralized logic, consistent schema, no client-side RPC limits | Pay-per-invocation, predictable |
| Multi-chain expansion | Abstract chain routing map | Avoids code duplication, simplifies provider swaps | Minimal dev overhead |
| Strict compliance/KYC entries | manual verdict bucket | Prevents false positives, directs users to official docs | Zero RPC cost |
| Real-time threshold tracking | farming bucket with delta | Increases user engagement, provides actionable feedback | No additional cost |

Configuration Template

// .mcp.json (Client Configuration)
{
  "mcpServers": {
    "verification-engine": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://your-domain.com/api/mcp-eligibility"]
    }
  }
}
// tsconfig.json (Compiler Settings)
{
  "compilerOptions": {
    "strict": true,
    "module": "ESNext",
    "target": "ES2022",
    "outDir": "./dist"
  }
}
// package.json (Build Integration; npm "scripts" live here, not in tsconfig.json)
{
  "scripts": {
    "prebuild": "node scripts/distribute-registry.mjs",
    "build": "astro build && tsc"
  }
}

Quick Start Guide

  1. Initialize the registry: Create src/registry/verification-schema.json with your first 3-5 rules following the defined schema.
  2. Run the distributor: Execute node scripts/distribute-registry.mjs to copy the file to api/_verification-registry.json.
  3. Deploy the MCP endpoint: Push the serverless function to your provider. Verify it responds to tools/list with the correct method signature.
  4. Test via client: Configure your MCP client (Claude Desktop, Cursor, or custom script) using the .mcp.json template. Invoke evaluate_wallet_status with a test address.
  5. Validate verdicts: Confirm the response matches the expected bucket distribution. Add RPC batching logic to the frontend and repeat validation.