BigQuery + Claude AI Dashboard Builder

From BigQuery to Interactive Dashboard in a Single Conversation with Claude

Juan Bello

Founder, Porter Metrics

April 2026
14 min read
What makes this guide different: Every other tutorial stops at the query. They show you how to ask Claude about your data. This one shows you how to turn that data into a published, shareable interactive dashboard — without Looker, without Tableau, without a frontend developer.

You have millions of rows living in Google BigQuery. You know the answers are in there. But getting from raw data to a dashboard your team can actually use takes days — a ticket to the data team, a Jira backlog, a Looker developer, a design review. By the time the chart lands in front of you, the decision has already been made.

What if you could skip the entire queue? What if you could open Claude, ask a question in plain English, watch it query your BigQuery warehouse in real time, and then say “now build me an interactive dashboard from this” — and it just… did?

That is exactly what this guide covers. A complete, working pipeline that anyone with a Google Cloud project can set up in under 30 minutes — and that most companies with expensive BI subscriptions have never heard of.

The Architecture: Five Nodes, One Conversation

Before touching a terminal, it helps to understand what we are actually building. The full pipeline has five components that work together as a seamless chain:

Google BigQuery

Your data warehouse. Could be terabytes of transactional data, marketing events, product analytics, or anything else stored in GCP.

MCP Server

A Model Context Protocol server that acts as a secure bridge between Claude and your BigQuery project. It exposes three core tools: list_tables, describe_table, and execute_query.

Claude Desktop

The AI interface that acts as the MCP client. It connects to the MCP server, understands your data schema, translates natural language into SQL, and runs it.

Claude Artifact

When you ask Claude to visualize results, it generates a fully functional React component — charts, filters, KPI cards — rendered live in the Artifact panel.

Published Link

One click on the Publish button gives you a shareable URL you can drop in Slack, open in a client meeting, or embed anywhere.

That is the entire chain. Let us build it.
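Under the hood, each step in the chain is an MCP tool call: a JSON-RPC message from Claude (the client) to the server. The exact tool names vary by server implementation; as an illustrative sketch, a query call might look like this:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "execute_query",
    "arguments": {
      "query": "SELECT status, COUNT(*) AS orders FROM `bigquery-public-data.thelook_ecommerce.orders` GROUP BY status"
    }
  }
}
```

Claude decides when to emit these calls on its own; you never write them by hand.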

Prerequisites: What You Need Before Starting

This setup takes roughly 15–30 minutes if you have the prerequisites in order.

| Requirement | Details | Free? |
|---|---|---|
| Google Cloud Project | With the BigQuery API enabled. Any existing GCP project works. | ✅ BigQuery has a free tier (10 GB/month storage, 1 TB/month queries) |
| Claude Desktop | The macOS or Windows desktop app from Anthropic. Not the web interface. | ✅ Free to download. Requires a Claude account (free or Pro). |
| uvx or npx | Required to run the MCP server. The config in this guide launches it with uvx (part of the Python uv toolchain); npx-based servers need Node.js 18+ instead. Check with uvx --version or node --version. | ✅ Free |
| GCP Service Account | A dedicated account for Claude with read-only BigQuery permissions. We create this in Step 1. | ✅ Free |
| A BigQuery Dataset | Any table with real or sample data. You can use Google's public datasets if you do not have your own. | ✅ Public datasets are free to query |
No dataset yet? BigQuery’s public data library includes datasets like NYC Taxi Trips, GitHub activity, and the Google Analytics 4 sample. You can follow this entire guide using bigquery-public-data.thelook_ecommerce — a realistic e-commerce dataset with orders, users, and inventory tables.
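To get a feel for the dataset before wiring anything up, you can run a query like this directly in the BigQuery console (the created_at column name follows the public dataset's published schema; verify it against the table you actually use):

```sql
-- Weekly order volume in the public e-commerce dataset
SELECT
  DATE_TRUNC(DATE(created_at), WEEK) AS week,
  COUNT(*) AS orders
FROM `bigquery-public-data.thelook_ecommerce.orders`
WHERE created_at >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 90 DAY)
GROUP BY week
ORDER BY week;
```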

Step 1: Create a Secure Service Account for Claude

The first thing we need is a dedicated Google Cloud identity for Claude — a service account with the minimum permissions required and nothing more. This is a critical security step that most tutorials skip entirely.

1.1 Create the Service Account

  1. Go to Google Cloud Console → IAM & Admin → Service Accounts
  2. Click “Create Service Account”
  3. Name it: claude-bigquery-readonly
  4. Add a description: “Read-only access for Claude MCP integration”
  5. Click Continue

1.2 Assign the Minimum Required Roles

Grant exactly these three roles — and no others. Starting read-only is a best practice you should not skip, especially when connecting any AI tool to production data.

| IAM Role | What It Allows | Why It Is Needed |
|---|---|---|
| BigQuery Data Viewer | Read data from tables and views | Required to fetch query results |
| BigQuery Job User | Run query jobs | Required to execute SQL against your project |
| BigQuery Metadata Viewer | List datasets, tables, and schemas | Lets Claude understand your data structure |
Security tip: Do NOT grant BigQuery Admin, BigQuery Data Editor, or BigQuery Data Owner. Claude only needs to read. Giving write access to an AI tool connected to production data is an unnecessary risk.
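If you prefer the terminal, the same service account and roles can be created with the gcloud CLI. A sketch, assuming you substitute your own project ID:

```shell
# Create the service account (substitute your own project ID throughout)
gcloud iam service-accounts create claude-bigquery-readonly \
  --project=your-gcp-project-id \
  --display-name="Claude BigQuery read-only"

# Grant exactly the three read-only roles and nothing more
for role in roles/bigquery.dataViewer roles/bigquery.jobUser roles/bigquery.metadataViewer; do
  gcloud projects add-iam-policy-binding your-gcp-project-id \
    --member="serviceAccount:claude-bigquery-readonly@your-gcp-project-id.iam.gserviceaccount.com" \
    --role="$role"
done
```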

1.3 Download the JSON Key File

  1. Click on your new service account to open it
  2. Go to the Keys tab
  3. Click Add Key → Create new key
  4. Select JSON format and click Create
  5. A file downloads to your machine — keep it safe

Move the key file to a secure location and lock down the permissions:

# Move to a secure directory
mkdir -p ~/.credentials
mv ~/Downloads/your-key-file.json ~/.credentials/claude-bigquery.json

# Restrict file permissions (macOS / Linux)
chmod 600 ~/.credentials/claude-bigquery.json
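Before wiring the key into Claude, a quick sanity check that the downloaded file actually has the shape of a service-account key can save a debugging session later. A minimal Python sketch (the required field names are part of Google's standard service-account key format):

```python
import json
from pathlib import Path

# Fields present in every Google service-account JSON key
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}

def looks_like_service_account_key(data: dict) -> bool:
    """Return True if the parsed JSON has the shape of a service-account key."""
    return data.get("type") == "service_account" and REQUIRED_FIELDS <= data.keys()

def check_key_file(path: str) -> bool:
    """Parse a key file from disk and validate its basic structure."""
    data = json.loads(Path(path).expanduser().read_text())
    return looks_like_service_account_key(data)
```

Running `check_key_file("~/.credentials/claude-bigquery.json")` should return True; if it raises or returns False, re-download the key.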

Step 2: Configure the BigQuery MCP Server

The Model Context Protocol (MCP) is an open standard that lets Claude talk to external systems through a defined set of tools. The BigQuery MCP server exposes three core tools to Claude: list_tables, describe_table, and execute_query.

You have two options: a local MCP server (runs on your machine, more control) or the remote managed MCP server from Google (no installation, OAuth-based, currently in preview). For this guide we use the local approach — it works today without any waitlist.

2.1 Edit the Claude Desktop Configuration File

Open the Claude Desktop configuration file in a text editor:

# macOS
open ~/Library/Application\ Support/Claude/claude_desktop_config.json

# Windows
notepad %APPDATA%\Claude\claude_desktop_config.json

Add the following block inside mcpServers:

{
  "mcpServers": {
    "bigquery": {
      "command": "uvx",
      "args": ["mcp-server-bigquery"],
      "env": {
        "BIGQUERY_PROJECT": "your-gcp-project-id",
        "BIGQUERY_LOCATION": "US",
        "BIGQUERY_KEY_FILE": "/Users/yourname/.credentials/claude-bigquery.json"
      }
    }
  }
}
Replace these values: your-gcp-project-id with your actual GCP Project ID (shown in the Cloud Console top bar); BIGQUERY_LOCATION with the region of your dataset (US, EU, us-central1, etc.); and the BIGQUERY_KEY_FILE value with the absolute path to your JSON key file.

2.2 Restart Claude Desktop

Close Claude Desktop completely (quit, do not just minimize) and reopen it. On the new conversation screen, you should see a small hammer icon in the input area — this confirms that Claude has detected and loaded the MCP server.

If the icon is not there, double-check your JSON config for syntax errors and ensure that uvx (or npx) is accessible in your terminal path.
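A surprisingly common cause is a trailing comma or stray character that makes the config file invalid JSON. A small Python sketch that catches both that and a missing mcpServers block (the path in the comment is the macOS location shown above):

```python
import json
from pathlib import Path

def validate_config(text: str) -> list[str]:
    """Return a list of problems found in a claude_desktop_config.json payload."""
    try:
        config = json.loads(text)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]
    problems = []
    servers = config.get("mcpServers")
    if not isinstance(servers, dict) or not servers:
        problems.append("no mcpServers block found")
    elif "bigquery" not in servers:
        problems.append("no 'bigquery' entry under mcpServers")
    return problems

# Example: point this at your real config file
# path = Path("~/Library/Application Support/Claude/claude_desktop_config.json").expanduser()
# print(validate_config(path.read_text()))
```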

Step 3: Verify the Connection with 3 Test Queries

With the MCP server running, open a new chat in Claude Desktop and run these three verification queries in order. They go from simple to complex and confirm that the full connection is working.

3 Verification Tests

  1. List all available tables — Type: “List all the datasets and tables available in my BigQuery project.” Claude will call the list_tables tool and return a structured list. If it returns results, your credentials and project ID are correct.
  2. Inspect a table schema — Type: “What columns are in the [your_table_name] table? Show me the data types and a short description of what each column likely represents.” This tests schema awareness — Claude’s ability to understand your data structure and enrich it with context.
  3. Run a real analytical query — Type: “Show me the top 10 orders from the last 30 days, grouped by category. Include totals and percentage of the overall total.” If Claude returns a formatted table with accurate numbers pulled live from BigQuery, the pipeline is fully operational.
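For reference, Claude would typically translate the third prompt into SQL along these lines (the table and column names here are hypothetical; yours will differ):

```sql
-- One possible translation of verification query 3
SELECT
  category,
  COUNT(*) AS orders,
  SUM(total) AS revenue,
  ROUND(100 * SUM(total) / SUM(SUM(total)) OVER (), 1) AS pct_of_total
FROM `your-project.your_dataset.orders`
WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
GROUP BY category
ORDER BY revenue DESC
LIMIT 10;
```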

Step 4: The Master Prompt for Dashboard Generation

This is the step that no other tutorial covers. Most guides stop at “Claude can answer questions about your data.” What we are doing here is asking Claude to go further: take the data it just retrieved and build a complete interactive dashboard as a Claude Artifact.

The quality of the dashboard depends almost entirely on the quality of your prompt. Here is the prompt structure that consistently produces the best results:

You have access to my BigQuery data through the MCP connection.

STEP 1 — Data retrieval:
Query the [dataset.table_name] table and pull [metric_1], [metric_2], [metric_3]
for the last [time period], grouped by [dimension].
Also pull [secondary_table] for comparison if relevant.

STEP 2 — Dashboard generation:
Using the data you just retrieved, build a complete interactive dashboard as a React Artifact with:
- A header with the dashboard title and last-updated date
- KPI summary cards at the top showing: [KPI_1], [KPI_2], [KPI_3]
- A [bar/line/area] chart showing [primary metric] over time
- A [pie/donut/table] showing breakdown by [dimension]
- A filter bar that lets me filter by [field_1] and [field_2]
- Clean, professional design with a [light/dark] color scheme

AUDIENCE: [your boss / clients / the product team]
PURPOSE: [weekly review / board presentation / client report]
Real example using the thelook_ecommerce public dataset: “Query the bigquery-public-data.thelook_ecommerce.orders table and pull total orders, total revenue, and average order value for the last 90 days, grouped by week and by country. Also query order_items to get the top 10 product categories by revenue. Then build a complete interactive React dashboard with three KPI cards, a line chart of weekly revenue trend, a bar chart of top 10 product categories, and a filterable table by country. Dark theme, professional. Audience: e-commerce team weekly review.”

Step 5: Publish and Share Your Dashboard

Once your dashboard is rendered in the Artifact panel, publishing it is a single click. In the top-right corner of the Artifact panel, click the Publish button. Claude generates a public URL that opens the exact same interactive dashboard in a clean browser view — no login required for viewers.

You can share this link in:

  • Slack — Drop it in the team channel for async review
  • Meeting presentations — Open it full-screen instead of a static slide
  • Client reports — A live, interactive dashboard looks far more professional than a PDF export
  • Notion or Confluence — Embed the URL as an iframe in your documentation
Need to update the numbers? Just go back to Claude, ask it to re-query BigQuery with a new date range, and say “update the dashboard with the new data.” Claude edits the existing Artifact rather than rebuilding from scratch — iterations take under a minute.

Real-World Use Cases by Team and Industry

The setup is generic — the use cases are anything but. Here are concrete examples of how different teams are using this pipeline today.

SaaS

Product & Growth Metrics

What used to require a growth analyst and two days in Looker now takes three minutes. The same dashboard that cost $3,000/month in a BI platform subscription is generated on demand from a conversation.

Query my subscriptions and events tables. Show me MRR by plan tier for the last 6 months, monthly churn rate, and the top 5 in-app actions correlated with upgrades. Build a dashboard for the weekly product review.
E-Commerce

Revenue Operations

Claude does not just surface the numbers — it can explain the deltas. The narrative context it adds to the dashboard turns a chart into an insight.

Pull revenue, orders, and average order value from my orders table for Q1 2026 vs Q1 2025. Identify the top 3 drivers of the 12% growth. Build a comparison dashboard for the board presentation.
Marketing

Campaign ROI

Marketing teams without a dedicated analyst can now run their own attribution analysis — directly from the same BigQuery tables their data engineers maintain, without ever touching SQL.

Join my ad_spend table with my orders table. Calculate ROAS, cost per acquisition, and revenue attributed per channel for the last 30 days. Build a marketing performance dashboard.
Operations

Real-Time Monitoring

Ops teams can now self-serve their daily monitoring — no analyst required, no data pipeline to maintain separately.

Query my fulfillment_events table. Show me average fulfillment time by warehouse for the last 7 days, flag any warehouse where fulfillment time exceeded 48 hours, and show the trend over the past 30 days.

BigQuery + Claude vs Traditional BI Tools: An Honest Comparison

Let us address the obvious question: why would you use Claude for dashboards when tools like Looker, Tableau, or Power BI already exist? Here is an honest comparison based on real-world usage.

| Dimension | Looker / Tableau | Power BI | BigQuery + Claude |
|---|---|---|---|
| Time to first dashboard | Days to weeks (modeling, LookML, deployment) | Hours to days | Under 30 minutes (including setup) |
| Cost (monthly) | $3,000–$5,000+ (Looker Standard) | $10–$20 per user | ~$0 additional (BigQuery query costs only) |
| SQL knowledge required | High (LookML, custom SQL) | Medium (DAX, M query) | None (natural language) |
| Dashboard customization | High (with developer time) | Medium | High (just describe what you want) |
| Data freshness | Real-time or scheduled | Scheduled refresh | On-demand (queries run live) |
| Governance & access control | Enterprise-grade | Enterprise-grade | Via BigQuery IAM (solid for most teams) |
| Best for | Enterprise, always-on reporting | Microsoft ecosystem users | Startups, ad-hoc analysis, fast iteration |

The honest take: Claude is not replacing Looker for a 500-person enterprise with a dedicated data team and compliance requirements. It is, however, a genuine replacement for small to mid-sized teams that are paying $3,000 a month for a tool they use to answer the same ten questions every week.

The “Live Dashboard” vs the “Dead Dashboard”

There is a concept worth naming explicitly. Traditional BI dashboards are dead by design — they answer the questions the builder anticipated when they built them, and nothing more. Want to slice by a different dimension? That is a new report request.

The Claude pipeline produces something fundamentally different: a live conversation around your data. After Claude generates your initial dashboard, you can immediately continue:

  • “Break the revenue chart down by device type.”
  • “Remove the table and add a geographic map instead.”
  • “Add a benchmark line showing last year’s numbers.”
  • “What does this trend mean? What might be driving the drop in week 8?”

Each of these instructions updates the Artifact in real time. The dashboard evolves with the conversation. That is qualitatively different from anything a static BI tool offers.

Security: What You Must Get Right

Connecting an AI tool to a production data warehouse is not something to do carelessly. Here are the non-negotiable security practices for this setup:

  • Principle of least privilege — always: The service account should only have the three roles listed in Step 1. Use dataset-level IAM to restrict access to only the tables relevant to your use case.
  • Never expose credentials in version control: Your JSON key file should never be committed to a Git repository. Add .credentials/ to your .gitignore immediately. Consider using GCP Workload Identity Federation for production setups.
  • Disable dangerous SQL operations: If your MCP server implementation supports it, explicitly allow only SELECT and block INSERT, UPDATE, DELETE, DROP, and ALTER. Claude should only read — never write.
  • Audit query logs: Every query Claude runs through MCP is logged in BigQuery’s INFORMATION_SCHEMA. Periodically review JOBS_BY_PROJECT to audit what has been queried and catch any unexpected behavior.
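If you run your own wrapper around the MCP server, a read-only guard can start as a simple keyword check. A minimal Python sketch; treat it as a defense-in-depth layer, not a substitute for the read-only IAM roles from Step 1:

```python
import re

# Keywords that indicate a write or DDL statement
BLOCKED = re.compile(
    r"\b(INSERT|UPDATE|DELETE|MERGE|DROP|ALTER|CREATE|TRUNCATE|GRANT)\b",
    re.IGNORECASE,
)

def is_read_only(sql: str) -> bool:
    """Allow only statements that start with SELECT or WITH and contain no write keywords."""
    # Strip line and block comments so keywords cannot hide inside them
    stripped = re.sub(r"(--[^\n]*|/\*.*?\*/)", "", sql, flags=re.DOTALL).strip()
    starts_ok = bool(re.match(r"(SELECT|WITH)\b", stripped, re.IGNORECASE))
    return starts_ok and not BLOCKED.search(stripped)
```

A keyword check like this also rejects stacked statements such as `SELECT 1; DROP TABLE t`, since the blocked keyword appears anywhere in the text.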

Audit query to review Claude’s activity:

-- Audit recent queries made through Claude's service account
SELECT
  job_id,
  creation_time,
  user_email,
  query,
  total_bytes_processed,
  state
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE user_email = 'claude-bigquery-readonly@your-project.iam.gserviceaccount.com'
  AND creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
ORDER BY creation_time DESC
LIMIT 50;

Next Steps: Where to Take This Further

Advanced

Multiple MCP Servers in Parallel

Claude Desktop can connect to multiple MCP servers simultaneously. If your sales data is in BigQuery but your CRM data is in a Postgres database, you can add a second MCP server for Postgres. Claude will then join insights across both systems in a single conversation.
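In claude_desktop_config.json, a multi-server setup is just additional entries under mcpServers. A sketch in which the Postgres entry is illustrative (check that server's own documentation for its exact package name, arguments, and credentials handling):

```json
{
  "mcpServers": {
    "bigquery": {
      "command": "uvx",
      "args": ["mcp-server-bigquery"],
      "env": {
        "BIGQUERY_PROJECT": "your-gcp-project-id",
        "BIGQUERY_LOCATION": "US",
        "BIGQUERY_KEY_FILE": "/Users/yourname/.credentials/claude-bigquery.json"
      }
    },
    "postgres": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-postgres", "postgresql://readonly_user@localhost:5432/crm"]
    }
  }
}
```

As with BigQuery, give the Postgres connection a read-only database user.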

Automation

Claude Code for Automation

Claude Code (the terminal-based CLI from Anthropic) also supports MCP and can run as an automated agent. You can script it to query BigQuery every morning, generate an updated dashboard, publish the Artifact, and send the link to Slack — fully autonomous.

Upcoming

Remote MCP Server (Google’s Managed Option)

Google launched a fully managed remote BigQuery MCP Server in preview in January 2026. Once it reaches general availability, it eliminates the need for a local Node.js server — the MCP endpoint is hosted by Google, authenticated via OAuth, and works from any Claude client.

No-Code

No-Code Alternatives

If the MCP setup feels too technical for your team, tools like Windsor.ai and Coupler.io offer managed connectors that sync BigQuery data to Claude without any configuration. Useful for non-technical stakeholders who want the BigQuery + Claude experience without the setup.

Conclusion: The Analyst That Never Sleeps

The shift happening here is more significant than it might appear. For the last decade, the bottleneck in data-driven organizations has not been the data — it has been the translation layer between the data and the people who need it. SQL, LookML, DAX, data modeling — these are hard enough to learn that they create a structural dependency on a specialized role.

What BigQuery + Claude + MCP does is collapse that translation layer entirely. The person with the question is now the person who gets the answer — directly, in seconds, with a shareable visual artifact that they built themselves in plain English.

That is not a small improvement in tooling. It is a rethinking of who gets to be a data analyst.

Try it now: If you have a Google Cloud account, you can run this entire guide using the free-tier BigQuery public datasets. Start with the thelook_ecommerce dataset and run the exact dashboard prompt from Step 4. The first time it works, you will understand immediately why this matters.

Ready to connect BigQuery to Claude?

Get your data warehouse talking to AI in under 30 minutes — no BI subscription, no SQL required.

Book a free setup call →