Advanced Comparison Report
Compare CX performance across locations or departments with configurable analysis modes, saved views, and Excel exports
The Advanced report lets you compare customer experience metrics side by side across different groups in your organization, either by location or by department. This is essential for multi-location businesses or organizations with distinct service teams that need to benchmark performance internally.
Navigate to Reports and select the Advanced tab.
Analysis Modes
The report operates in two modes:
Location Mode
Compares performance across your business locations. Each row represents one location and shows its CX metrics. Use this to:
- Identify your best and worst-performing locations
- Spot locations that need operational attention
- Benchmark new locations against established ones
- Validate whether a process change at one location should be rolled out company-wide
Department Mode
Compares performance across employee departments. Each row represents one department. Use this to:
- Compare customer satisfaction between service teams (e.g., sales vs. support vs. installation)
- Identify which departments need additional training
- Validate whether department-specific initiatives are working
Metrics Per Group
Each location or department row shows:
| Metric | What It Measures |
|---|---|
| Total Responses | Number of survey responses attributed to this group |
| Total Sent | Number of survey invitations sent to customers of this group |
| Response Rate | Percentage of sent surveys that received a response |
| CSAT | Average Customer Satisfaction score for the group |
| NPS | Net Promoter Score for the group |
| CES | Average Customer Effort Score for the group |
Not all metrics appear for every organization. If you only use NPS surveys, the CSAT and CES columns will be empty or show N/A.
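These columns follow the standard industry definitions. A minimal sketch of the arithmetic, assuming a 0-10 NPS scale and a 1-5 CSAT scale (your surveys may use different scales):

```python
def response_rate(total_responses: int, total_sent: int) -> float:
    """Response Rate: percentage of sent surveys that received a response."""
    return 100.0 * total_responses / total_sent if total_sent else 0.0

def nps(scores: list[int]) -> float:
    """NPS on a 0-10 scale: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

def csat(scores: list[int]) -> float:
    """CSAT: simple average of satisfaction ratings (here assumed 1-5)."""
    return sum(scores) / len(scores)

# One group's hypothetical numbers:
print(response_rate(40, 200))    # 20.0 -> a 20% response rate
print(nps([10, 9, 9, 7, 6, 3]))  # 3 promoters, 2 detractors -> ~16.7
```

The report computes these per group for you; the sketch is only to make clear what each column measures.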
Controls
Period Selection
Choose between:
- All Time - Aggregate performance across your entire history
- Monthly - Select a specific month for period-specific analysis
Monthly views are critical for tracking progress. "All Time" masks recent improvements or declines behind historical averages.
Analysis Mode Toggle
Switch between Location and Department analysis. The table restructures automatically when you change modes.
Saved Configurations
The Advanced report supports saved configurations so you can define a standard view and reload it for recurring reports. This is especially useful for:
- Monthly leadership reports - Save a configuration with the metrics and mode your leadership team expects
- Quarterly reviews - Standardize the view so quarter-over-quarter comparisons are consistent
- Location manager views - Save a location-mode configuration that each manager can load
Configurations are saved via the report configuration API and persist across sessions.
Exporting
Export the comparison data to Excel for sharing with stakeholders who don't have demeterrr access. The export includes all visible metrics in a formatted spreadsheet.
Common export use cases:
- Board or investor reports showing location performance
- Department head presentations with team metrics
- Quarterly business reviews with trend data
Interpreting Comparisons
Volume Adjustments
Compare groups with similar response volumes. A location with 10 responses and an NPS of 90 is not reliably outperforming a location with 200 responses and an NPS of 70. The larger sample gives a far more precise estimate of the true score.
As a rule of thumb, groups need at least 30 responses for their metrics to be reasonably reliable.
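To see why small samples mislead, you can approximate a margin of error for NPS. Each respondent contributes +1 (promoter), 0 (passive), or -1 (detractor), so NPS/100 is a sample mean and its uncertainty shrinks with the square root of the sample size. A sketch, using hypothetical promoter/detractor splits consistent with the example above:

```python
import math

def nps_margin_of_error(p_promoters: float, p_detractors: float,
                        n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for NPS, in NPS points.

    Variance of the per-respondent score (+1/0/-1) is
    (p_promoters + p_detractors) - nps^2, where nps = p_promoters - p_detractors.
    """
    nps = p_promoters - p_detractors
    variance = (p_promoters + p_detractors) - nps ** 2
    return 100.0 * z * math.sqrt(variance / n)

# Hypothetical splits: NPS 90 from 10 responses (9 promoters, 1 passive)
print(nps_margin_of_error(0.90, 0.00, 10))   # ~18.6 points
# NPS 70 from 200 responses (75% promoters, 5% detractors)
print(nps_margin_of_error(0.75, 0.05, 200))  # ~7.7 points
```

Under these assumptions, the small location's true NPS plausibly lies anywhere in roughly 71-109, while the large location's lies in roughly 62-78: the intervals overlap, so the apparent 20-point gap is not reliable.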
Response Rate Differences
If one location has a much lower response rate than others, its scores may be skewed. Low response rates often mean only the most motivated customers (very happy or very unhappy) responded, creating a biased sample.
Before comparing scores, check whether response rates are comparable. If they're not, improving the response rate at the low-response-rate location is a prerequisite to meaningful comparison.
Environmental Factors
Not all locations or departments serve identical customer populations. A location in a high-traffic area may receive more impatient customers. A department handling complaints will naturally have lower satisfaction scores than one handling new sales. Factor these structural differences into your analysis.
Common Use Cases
Post-Rollout Validation
After implementing a new process, training program, or tool at specific locations:
- Run the report for the month before the rollout
- Run it again for the month after
- Compare the affected locations to the control group (locations without the change)
- If the affected locations improved more than the control group, the rollout likely worked
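The comparison in the last two steps is a simple difference-in-differences: the treated group's change minus the control group's change over the same period. A sketch with made-up monthly NPS figures:

```python
def did(treated_before: float, treated_after: float,
        control_before: float, control_after: float) -> float:
    """Difference-in-differences: how much more the treated group
    improved than the control group over the same period (in points)."""
    return (treated_after - treated_before) - (control_after - control_before)

# Hypothetical figures: rollout locations went 40 -> 52,
# untouched control locations went 41 -> 45 in the same months.
print(did(40, 52, 41, 45))  # 8.0 -> rollout locations gained 8 points more
```

Subtracting the control group's change filters out seasonal effects and company-wide trends that would have moved scores with or without the rollout.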
Identifying Best Practices
When one location consistently outperforms others:
- Investigate what they're doing differently (staffing, processes, culture)
- Document their practices
- Test those practices at lower-performing locations
- Monitor the Advanced report over the next 2-3 months to confirm impact
Resource Allocation
Use the report to justify resource allocation decisions. A location with declining scores and increasing volume may need additional staff. A department with consistently low scores may need training investment.
Best Practices
- Use monthly comparisons rather than all-time for actionable insights
- Save your standard configuration so recurring reports are consistent
- Check response volumes before drawing conclusions from score differences
- Export for stakeholders who need the data but don't use demeterrr directly
- Track 3+ months of trends before making major operational decisions
Next Steps
For individual employee performance within departments, see Employee Performance Report.
For the complete reporting overview, see Reports Overview.
For question-level response analysis, see Questionnaire Report.
Related Articles
Reports Overview
Understand what report views are available and how to use them for CX decisions
Employee Performance Report
Track and compare employee satisfaction scores, response volumes, and performance trends to inform coaching and recognition decisions
Questionnaire Report
Analyze survey performance question-by-question with distribution and average breakdowns
Sendings Report
Track sending volume, delivery status, channel distribution, and sequence-level outcomes to optimize your survey delivery strategy