Haven’t set up the MCP server yet? Follow the setup guide to get started.
Prompt templates
Copy these into your AI assistant for a deeper, multi-step analysis. Each prompt tells the AI exactly what data to pull, how to structure the output, and what recommendations to generate. We recommend using a frontier model (e.g. Claude Opus 4.6 or equivalent) for best results. Some prompts contain placeholders like `{time_period}`; replace these with your own values before pasting. These reports make many tool calls, so expect the full analysis to take up to 5 minutes to complete.
Conversion rate analysis
Purpose: Perform a comprehensive conversion rate analysis for a DTC eCommerce business using Converge analytics data. Audience: Marketing, CRO, and leadership teams. Output must be data-driven, actionable, and tied to revenue impact.
USER INPUTS:
- Analysis Period: {time_period} [default: last 30 days]
- Comparison Period: the same-length window immediately before {time_period}
---
## Expert knowledge you have
* Converge is a multi-touch marketing attribution platform for DTC brands. It tracks conversion events from the website and pulls in aggregated marketing performance from ad platforms (Facebook Ads, Google Ads, TikTok, Pinterest, Bing, etc.). You can compare channels or campaigns by combining event and ad platform metrics with a touchpoint breakdown.
* Key event metrics you can query:
* `events` (count), `sessions`, `profiles`
* `revenue` (total revenue for the event)
* `session_conversion_rate`, `profile_conversion_rate`
* `bounce_rate`, `session_duration`, `session_revenue`
* `new_visitor_sessions`, `new_visitor_sessions_rate`
* `cpa` (cost per acquisition = spend / events), `roas` (return on ad spend = revenue / spend), `acos` (ad cost of sales = spend / revenue)
* Key ad platform metrics (platform-reported):
* `spend`, `clicks`, `impressions`
* `cpc`, `cpm`, `ctr`
* `ad_conversions`, `ad_revenue`, `ad_roas`, `ad_cpa`, `ad_acos`
* Standard conversion events (the funnel): `$page_load`, `Viewed Product`, `Added To Cart`, `Started Checkout`, `Placed Order`, `New Customer Order`.
* `New Customer Order` fires on a profile's first-ever `Placed Order` -- use this for new customer acquisition metrics.
* Multi-touch attribution (MTA) models available: `first_touch`, `last_touch`, `linear`, `first_touch_paid`, `last_touch_paid`, `top_of_funnel`, `time_decay`, `u_shaped`, `j_shaped`, and more. Default is `direct_session` (no MTA). Attribution `mode` can be `click_time` or `conversion_time`; click-time is better for evaluating spend efficiency (CPA, ROAS).
* Available breakdowns: `touchpoint.channel_group`, `touchpoint.channel`, `touchpoint.campaign`, `touchpoint.adset`, `touchpoint.ad`, `session.landing_url`, and any `store_dimension.<id>` configured for the workspace.
---
## Process
**Default attribution**: Unless the user specifies otherwise, use the `top_of_funnel` model with a 30-day window and `click_time` mode for all `analytics_query` calls (`"attribution": {"model": "top_of_funnel", "window": 30, "mode": "click_time"}`).
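As a minimal sketch of what this looks like in practice (the `metric` and `event` field names are illustrative assumptions, not the exact `analytics_query` schema; only the `attribution` block mirrors the default above):

```python
# Sketch only: assembles the default attribution settings described above.
def default_attribution(model="top_of_funnel", window=30, mode="click_time"):
    """Attribution block used for all analytics_query calls unless overridden."""
    return {"model": model, "window": window, "mode": mode}

query = {
    "metric": "session_conversion_rate",  # hypothetical field name
    "event": "Placed Order",              # hypothetical field name
    "attribution": default_attribution(),
}
```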
1. **Discover the workspace**: Call `list_workspaces` to find the workspace. Note the workspace's currency, timezone, and start_of_week -- pass these to all subsequent tool calls. Call `list_metric_event_names` to confirm which events are tracked and check for any custom events relevant to the funnel. Call `list_store_dimensions` to discover any region/brand/channel dimensions available for deeper segmentation.
2. **Define the periods**: The analysis period is {time_period} (default: last 30 days). The comparison period is the same-length window immediately before. State both date ranges to the user.
3. **Overall conversion performance** using `analytics_query`:
- **Site-wide CVR**: Query `Placed Order` / `session_conversion_rate` and `Placed Order` / `events`, plus `sessions` (no event filter), `revenue` (`Placed Order` / `revenue`), and `session_revenue`. Run for both the analysis period and the comparison period.
- **Trend over time**: Query the same metrics with `time_granularity: "week"` across the full analysis period to identify whether CVR is improving or declining. Characterise the direction ("improving", "declining", "flat") and magnitude.
4. **Conversion by segment** -- run separate `analytics_query` calls with breakdowns:
a. **By traffic source**: Breakdown by `touchpoint.channel_group`. Metrics: `Placed Order` / `session_conversion_rate`, `Placed Order` / `events`, `Placed Order` / `revenue`, `sessions`, `bounce_rate`, `cpa`, `roas`. Identify the highest- and lowest-converting channel groups.
b. **By channel (granular)**: Breakdown by `touchpoint.channel`. Same metrics. Useful for separating, for example, `Paid Social/facebook` from `Paid Social/tiktok`.
c. **By landing page**: Breakdown by `session.landing_url`. Metrics: `sessions`, `bounce_rate`, `session_duration`, `Placed Order` / `session_conversion_rate`, `Placed Order` / `revenue`. Sort by sessions descending, focus on top 10.
d. **By day of week**: Query with `time_granularity: "day"` across the analysis period and group results by weekday to surface day-of-week patterns.
e. **New vs returning visitors**: Query `new_visitor_sessions`, `new_visitor_sessions_rate`, alongside `Placed Order` / `session_conversion_rate`. Compare conversion behaviour of new visitor traffic vs overall.
f. **By store dimension** (if available): For each store dimension discovered in step 1, breakdown by `store_dimension.<id>` to compare regions, brands, or sales channels.
5. **Full funnel analysis**: For each funnel step, query `session_conversion_rate` with the analysis period. Calculate the step-to-step conversion rates:
| Funnel Step | Event | Metric |
|---|---|---|
| Sessions | (no event filter) | `sessions` |
| Product View | `Viewed Product` | `session_conversion_rate` |
| Add to Cart | `Added To Cart` | `session_conversion_rate` |
| Checkout Start | `Started Checkout` | `session_conversion_rate` |
| Purchase | `Placed Order` | `session_conversion_rate` |
Then derive step-to-step drop-off rates:
- Session -> Product View rate
- Product View -> Add to Cart rate
- Add to Cart -> Checkout Start rate
- Checkout Start -> Purchase rate
Identify the **biggest leak** (lowest step-to-step conversion rate).
Repeat the funnel analysis broken down by `touchpoint.channel_group` to see if certain channels have weaker funnel performance.
Cross-reference funnel drop-offs with `bounce_rate` and `session_duration` to diagnose engagement issues.
6. **New customer acquisition**: Query `New Customer Order` / `events`, `New Customer Order` / `cpa`, `New Customer Order` / `session_conversion_rate` for both periods. Break down by `touchpoint.channel_group` to find the most efficient acquisition channels.
7. **Attribution comparison**: Run the top-line CVR and revenue query under two attribution models (e.g. `last_touch_paid` and `linear`) with `mode: "click_time"` to show how channel credit shifts under different models.
8. **Synthesise findings and recommendations**: Using all data gathered, produce the report below.
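The step-to-step arithmetic from step 5 can be sketched as follows (the session conversion rates are illustrative values, not real data):

```python
# Derive step-to-step conversion rates from per-step session conversion
# rates, then find the biggest leak (lowest step-to-step rate).
funnel = [
    ("Sessions", 1.0),           # baseline: all sessions
    ("Viewed Product", 0.45),    # illustrative session_conversion_rate values
    ("Added To Cart", 0.12),
    ("Started Checkout", 0.06),
    ("Placed Order", 0.021),
]

steps = []
for (prev_name, prev_rate), (name, rate) in zip(funnel, funnel[1:]):
    step_rate = rate / prev_rate  # e.g. Added To Cart rate / Viewed Product rate
    steps.append((f"{prev_name} -> {name}", step_rate))

biggest_leak = min(steps, key=lambda s: s[1])
```

With these illustrative numbers the biggest leak is Viewed Product -> Added To Cart, since 0.12 / 0.45 is the lowest step-to-step rate.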
---
## Report Structure
### Conversion Rate Analysis
Period: [analysis period dates] vs [comparison period dates]
#### 1. Executive Summary
* 2--3 sentences summarising the headline CVR, direction of trend, and the single biggest opportunity. Example: "Session CVR is 2.1% over the last 60 days, up 0.3 pp vs the prior period. The biggest funnel leak is Checkout Start -> Purchase at 38% step conversion. Paid Social drives the most sessions but converts 40% below the site average."
#### 2. Overall Conversion Performance (table)
| Metric | Analysis Period | Comparison Period | Change |
|---|---|---|---|
| Sessions | | | 🟢/🟡/🔴 +X% |
| Placed Orders | | | |
| Session CVR | | | |
| Revenue | | | |
| Avg Revenue / Session | | | |
| New Customer Orders | | | |
| New Customer CVR | | | |
*Weekly trend chart data included below.*
#### 3. Weekly CVR Trend (table)
* Session CVR and Orders by week across the analysis period. Flag any weeks with unusual spikes or drops, noting possible causes (promotions, seasonality, platform changes).
#### 4. Conversion by Segment
**a. By Landing Page (top 10)** -- sorted by sessions descending. Flag pages with high traffic but below-average CVR as optimisation targets.
| Landing Page | Sessions | Bounce Rate | Avg Duration | CVR | Revenue |
|---|---|---|---|---|---|
**b. By Channel Group** -- highlight top 3 and bottom 3 by CVR.
| Channel Group | Sessions | CVR | Orders | Revenue | Bounce Rate | CPA | ROAS |
|---|---|---|---|---|---|---|---|
**c. By Channel (granular)** -- top 10 by session volume. Flag any converting >50% above/below site average.
**d. New Visitors vs All Visitors**
| Segment | Sessions | Share | CVR | Revenue |
|---|---|---|---|---|
| New Visitors | | | | |
| All Visitors | | | | |
#### 5. Full Funnel Analysis
| Step | Bar | Session Rate | Step-to-Step Rate | Drop-off |
|---|---|---|---|---|
| Product View | `██████████ X%` | | -- | -- |
| Add to Cart | `███████░░░ X%` | | | |
| Checkout Start | `█████░░░░░ X%` | | | |
| Purchase | `███░░░░░░░ X%` | | | |
Scale bars proportionally to the highest step. Use 🔴 on the step with the largest drop-off.
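A sketch of how the proportional bars could be rendered (bar width and glyphs are a choice, not a requirement):

```python
# Render a block bar scaled to the highest funnel step, e.g. "███░░░░░░░ 12%".
def funnel_bar(rate, max_rate, width=10):
    filled = round(width * rate / max_rate)
    return "█" * filled + "░" * (width - filled) + f" {rate:.0%}"

rates = {"Product View": 0.45, "Add to Cart": 0.12,
         "Checkout Start": 0.06, "Purchase": 0.021}  # illustrative values
top = max(rates.values())
bars = {step: funnel_bar(r, top) for step, r in rates.items()}
```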
**Biggest leak**: [step with lowest step-to-step rate]. Note if it correlates with high bounce rate or low session duration from specific sources. Show step-to-step rates for the top 3 channel groups by volume.
#### 6. New Customer Acquisition
| Channel Group | NC Orders | NC CPA | NC CVR | Share of NC Orders |
|---|---|---|---|---|
Identify the most cost-efficient acquisition channel.
#### 7. Attribution View (if applicable)
| Channel | Last Touch Paid Revenue | Linear Revenue | Shift |
|---|---|---|---|
Note channels that gain or lose significant credit under different models.
#### 8. CRO Recommendations (10, prioritised)
**Quick Wins (<1 week):**
1. Fix the biggest funnel leak
2. Optimise the highest-traffic, lowest-CVR landing page
3. Address the worst-converting high-spend channel

**Medium-term Tests (2--4 weeks):**
4. Checkout flow optimisation
5. Landing page A/B tests
6. New visitor experience improvements

**Major Initiatives (1--3 months):**
7. Channel mix rebalancing
8. Attribution-informed budget reallocation
9. New customer acquisition strategy
10. Systematic landing page optimisation across high-traffic, low-CVR pages
*Each recommendation must cite the specific data point that justifies it.*
#### 9. Revenue Impact Forecast
Estimate monthly revenue lift for: fixing biggest funnel leak, lifting bottom 3 channels to site-average CVR, and improving new visitor CVR by 20%. Show arithmetic: `sessions x CVR x AOV = revenue`, then delta. State the total addressable opportunity.
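The forecast arithmetic can be sketched like this (all inputs are illustrative; real values come from the tool results):

```python
# Monthly revenue lift from raising CVR, shown with the full arithmetic
# so the reader can verify: sessions x CVR x AOV = revenue, then the delta.
sessions = 100_000      # illustrative monthly sessions
current_cvr = 0.021     # 2.1% session CVR
target_cvr = 0.025      # e.g. after fixing the biggest funnel leak
aov = 85.0              # illustrative average order value

current_revenue = sessions * current_cvr * aov  # 100,000 x 0.021 x 85
target_revenue = sessions * target_cvr * aov    # 100,000 x 0.025 x 85
lift = target_revenue - current_revenue
```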
---
Based on data from Converge (runconverge.com), generated [today's date] at [current time].
Analysis period: [dates]. Comparison period: [dates].
Attribution basis: [model and window used, or "direct_session (no MTA)" if default].
---
## Style & Output Rules
* Use tables for all quantitative sections; limit prose to summaries and anomaly callouts.
* No placeholders -- every number must come from actual tool results.
* Always label the attribution model and window for any metric that depends on attribution (CPA, ROAS, revenue by channel).
* Double-check that percentage changes match the raw numbers shown.
* Revenue impact forecasts must show the full arithmetic so the reader can verify.
* Recommendations must cite specific data points from the analysis -- no generic advice.
* If a breakdown is not available in Converge (e.g. device type, geography), state this clearly and suggest the user check Google Analytics or their analytics platform for that dimension.
### Visual formatting
* **Traffic-light indicators**: In every table that has a Change or Direction column, prefix the value with 🟢 (positive / on-track), 🟡 (flat / marginal), or 🔴 (negative / needs attention). Example: `🟢 +12%`, `🔴 −8%`.
* **Funnel visualisation**: In the Full Funnel Analysis section, render a proportional bar next to each step using block characters (e.g. `████████░░ 82%`). Scale bars relative to the highest step so drop-offs are immediately visible.
* **Bold key numbers**: In the Executive Summary and Anomalies sections, **bold** the 3--5 most important numbers so they stand out on a quick scan.
* **Rank badges**: In channel and landing page tables, prefix the top 3 rows with 🥇 🥈 🥉 to highlight top performers at a glance.
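A sketch of how the traffic-light prefix could be chosen (the "flat" threshold is an assumption; invert the sign for cost metrics like CPA, where lower is better):

```python
# Prefix a fractional change with a traffic-light emoji, e.g. "🟢 +12%".
# Assumes higher is better for the metric.
def change_indicator(pct_change, flat_band=0.02):
    if abs(pct_change) <= flat_band:
        light = "🟡"  # flat / marginal
    elif pct_change > 0:
        light = "🟢"  # positive / on-track
    else:
        light = "🔴"  # negative / needs attention
    return f"{light} {pct_change:+.0%}"
```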
---
## Converge MCP Tools Reference
| Tool | When to use |
|---|---|
| `list_workspaces` | First call -- get workspace_id, currency, timezone, start_of_week |
| `list_metric_event_names` | Discover valid event names for the workspace before querying |
| `list_store_dimensions` | Discover region/brand/channel dimensions for segmentation |
| `analytics_query` | All custom analytics queries (metrics, filters, breakdowns, attribution) |
Pacing report
Purpose: Generate a live pacing report of an eCommerce business using Converge analytics data with hourly granularity. Audience: Marketing and leadership team. Output must be professional, concise, and actionable.
---
## Expert knowledge you have
* Converge is a multi-touch marketing attribution platform for DTC brands. It tracks conversion events from the website and pulls in aggregated marketing performance from ad platforms (Facebook Ads, Google Ads, TikTok, Pinterest, Bing, etc.). You can compare channels or campaigns by combining event and ad platform metrics with a touchpoint breakdown.
* Key event metrics you can query:
* `events` (count), `sessions`, `profiles`
* `revenue` (total revenue for the event)
* `session_conversion_rate`, `profile_conversion_rate`
* `bounce_rate`, `session_duration`, `session_revenue`
* `new_visitor_sessions`, `new_visitor_sessions_rate`
* `cpa` (cost per acquisition = spend / events), `roas` (return on ad spend = revenue / spend), `acos` (ad cost of sales = spend / revenue)
* Key ad platform metrics (platform-reported):
* `spend`, `clicks`, `impressions`
* `cpc`, `cpm`, `ctr`
* `ad_conversions`, `ad_revenue`, `ad_roas`, `ad_cpa`, `ad_acos`
* Standard conversion events: `$page_load`, `Viewed Product`, `Added To Cart`, `Started Checkout`, `Placed Order`, `New Customer Order`.
* `New Customer Order` fires on a profile's first-ever `Placed Order` -- use this for new customer metrics.
* Platform-reported data (`ad_conversions`, `ad_revenue`, `ad_roas`) can double-count conversions across platforms. Converge attribution uses multi-touch models to avoid this.
* Recently launched campaigns may have unstable or insignificant performance if still in learning phase.
* To analyse ad creative performance, use `analytics_query` with ad platform metrics broken down by `touchpoint.ad`.
* Multi-touch attribution (MTA) models available: `first_touch`, `last_touch`, `linear`, `first_touch_paid`, `last_touch_paid`, `top_of_funnel`, `time_decay`, `u_shaped`, `j_shaped`, and more. Default is `direct_session` (no MTA).
* Attribution `mode` can be `click_time` or `conversion_time`. Click-time is better for evaluating spend efficiency (CPA, ROAS).
---
## Process
1. **Define the reporting periods**: Check the current date and time. Today is the reporting day (`since: "dStart"`, `until: null`). Determine the current hour H. Query the full day for yesterday and same day last week, but only compare hours up to H for a fair comparison. State the date and current hour to the user.
2. **Discover the workspace**: Call `list_workspaces` to find the workspace. Note the workspace's currency, timezone, and start_of_week -- pass these to all subsequent tool calls.
3. **Gather today's data** using `analytics_query`:
- **Topline with hourly trend**: Orders (`Placed Order` / `events`), Blended CVR (`Placed Order` / `session_conversion_rate`), NC-CPA (`New Customer Order` / `cpa`), Total Spend (`spend`). Also query `spend` filtered to Google (`touchpoint.channel` = `Paid Search/google`) and Facebook (`touchpoint.channel` = `Paid Social/facebook`). Use `time_granularity: "hour"` and `since: "dStart"` to get the hourly breakdown for today.
- **Comparison windows**: Query the same metrics with `time_granularity: "hour"` for yesterday (`since: "1d"`, `until: "dStart"`) and same day last week (`since: "7d"`, `until: "6d"`). Then sum only the hours up to the current hour H so you compare the same partial window (midnight--H) across all three days.
4. **Check for anomalies**: Compare today's totals through hour H against the same partial window for yesterday and same day last week. Flag any metric that deviates materially from both comparisons.
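The same-partial-window logic in steps 3--4 can be sketched as follows (the hourly rows are illustrative, not real tool output):

```python
# Sum only the hours up to the current hour H so today, yesterday and
# same day last week are compared over the same midnight-to-H window.
def partial_total(hourly_values, current_hour):
    """hourly_values[i] is the metric for hour i (0 = midnight)."""
    return sum(hourly_values[: current_hour + 1])

today = [0, 0, 1, 2, 3, 5, 4, 6, 8, 7, 9, 10, 12, 11, 9]  # through 2pm
yesterday = [0, 1, 1, 2, 2, 4, 5, 5, 7, 8, 8, 9, 10, 10, 8, 12, 14, 9]  # full day

H = len(today) - 1                             # current hour (here 14, i.e. 2pm)
today_total = partial_total(today, H)
yesterday_total = partial_total(yesterday, H)  # ignores hours after H
pct_vs_yesterday = today_total / yesterday_total - 1
```

Note that yesterday's full-day total would overstate the baseline; truncating both series at H is what makes the comparison fair.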
---
## Report Structure
### Pacing Report
Date: [today's date], as of [current hour]
#### 1. Headline
* 1--2 sentences summarising today so far. Example: "Revenue $14.2k through 2pm, +8% vs yesterday at the same hour. Spend tracking in line, blended ROAS at 3.1x."
#### 2. Topline KPIs (table)
| Metric | Today (to hour H) | Yesterday (to hour H) | Same Day Last Week (to hour H) | vs Yesterday | vs Last Week |
|---|---|---|---|---|---|
| Orders | | | | | |
| Blended CVR | | | | | |
| Blended NC-CPA | | | | | |
| Total Spend | | | | | |
| Google Spend | | | | | |
| Facebook Spend | | | | | |
*All comparisons use the same partial window (midnight to current hour H) for a fair hour-to-hour comparison.*
#### 3. Hourly Trend (table)
* Revenue and Orders by hour for today. Highlight any hours with unusual spikes or drops.
#### 4. Anomalies & Action Items
* Flag metrics pacing >15% above or below both yesterday and same day last week with a brief hypothesis.
* 1--2 suggested actions if applicable. Example: "Facebook spend pacing 30% above yesterday with flat ROAS -- check if new campaign launched or budget cap was raised."
---
Based on data from Converge (runconverge.com), generated [today's date] at [current time]
Reporting day: [today], data through hour [H]. Comparison: same partial window (midnight--H) for yesterday and same day last week.
Attribution basis: [model and window used, if applicable].
---
## Style & Output Rules
* Keep the report to half a page -- this is a live check-in, not a deep dive.
* Use tables for KPIs; limit commentary to anomalies and actions.
* No placeholders -- every number must come from actual tool results.
* Always label attribution basis for metrics.
* Double-check that percentage changes match the raw numbers shown.
---
## Converge MCP Tools Reference
| Tool | When to use |
|---|---|
| `list_workspaces` | First call -- get workspace_id, currency, timezone, start_of_week |
| `analytics_query` | All custom analytics queries (metrics, filters, breakdowns, attribution) |