# Performance Indicators

Performance Indicators are quantifiable metrics that evaluate the success, efficiency, or impact of strategic responses. They close the feedback loop in the Strategic Response Model (SRM), enabling organizations to measure whether responses achieved their intended outcomes.
- **Schema Version:** 2.1
- **Schema Location:** /schemas/extensions/performance-indicator.schema.json
- **Specification:** JSON Schema Draft-07
## Overview

### What are Performance Indicators?

In the SRM context, performance indicators are specific, measurable criteria that:

- Define what success looks like for a strategic response
- Provide quantifiable targets against baselines
- Enable ongoing monitoring and adjustment
- Support governance reviews and audits

### The Measurement Loop

Each indicator supports a recurring cycle: establish a baseline, set a target, measure at the defined frequency, compare the current value against thresholds, and adjust the response where needed.
## Indicator Types

### Outcome Indicators

Measure the ultimate result or impact of the response.

| Type | Description | Example |
|---|---|---|
| effectiveness | Degree to which objectives are achieved | Compliance rate = 100% |
| impact | Broader organizational effect | Customer satisfaction +15 pts |
| value | Business value delivered | Revenue increase of $2M |

### Process Indicators

Measure the execution and delivery of the response.

| Type | Description | Example |
|---|---|---|
| efficiency | Resource utilization | Cost per transaction -20% |
| timeliness | Schedule adherence | Milestones on time 90% |
| quality | Output quality | Defect rate <2% |

### Leading Indicators

Predict future performance or outcomes.

| Type | Description | Example |
|---|---|---|
| predictive | Early warning signals | Pipeline coverage 3x |
| adoption | Uptake and usage | User adoption 75% |
| readiness | Preparation state | Training completion 100% |

### Lagging Indicators

Confirm outcomes after the fact.

| Type | Description | Example |
|---|---|---|
| result | Final outcome achieved | Project delivered |
| retrospective | Post-implementation review | Benefits realized 85% |
## Indicator Attributes

### Core Attributes

| Attribute | Type | Required | Description |
|---|---|---|---|
| indicatorID | string (UUID) | Yes | Unique identifier |
| name | string | Yes | Human-readable name |
| description | string | Yes | What the indicator measures |
| indicatorType | enum | Yes | Category of indicator |
| linkedResponse | reference | Yes | Strategic response being measured |

### Measurement Attributes

| Attribute | Type | Required | Description |
|---|---|---|---|
| unit | string | Yes | Unit of measurement |
| targetValue | number/string | Yes | Target to achieve |
| baselineValue | number/string | No | Starting point for comparison |
| currentValue | number/string | No | Most recent measurement |
| direction | enum | No | Desired direction of change |
| threshold | object | No | Warning/critical thresholds |

### Timing Attributes

| Attribute | Type | Required | Description |
|---|---|---|---|
| measurementFrequency | enum | No | How often measured |
| targetDate | date | No | When target should be achieved |
| baselineDate | date | No | When baseline was established |
| lastMeasuredDate | date | No | Most recent measurement date |

### Data Attributes

| Attribute | Type | Required | Description |
|---|---|---|---|
| dataSource | string | No | Where the measurement comes from |
| calculationMethod | string | No | How the metric is calculated |
| dataOwner | string | No | Who is responsible for the data |
## Measurement Frequencies

| Frequency | Use Case | Example |
|---|---|---|
| real-time | Critical operational metrics | System availability |
| daily | High-frequency operational | Transaction volumes |
| weekly | Tactical project metrics | Sprint velocity |
| monthly | Standard business metrics | Revenue, costs |
| quarterly | Strategic review metrics | NPS, market share |
| annually | Long-term strategic | Brand equity |

## Direction Values

| Direction | Meaning | Example |
|---|---|---|
| increase | Higher is better | Revenue, satisfaction |
| decrease | Lower is better | Costs, defects |
| maintain | Stay within range | Compliance rate |
| achieve | Binary target | Certification obtained |
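The direction value tells consumers how to interpret movement between baselineValue, currentValue, and targetValue. A minimal sketch of a direction-aware progress calculation (the function and argument names are illustrative, not schema fields):

```python
def progress_to_target(baseline, current, target, direction):
    """Fraction of the baseline-to-target distance covered so far.

    Signed arithmetic handles "increase" and "decrease" uniformly;
    "achieve" is binary. "maintain" indicators are better judged
    against thresholds than against a progress fraction.
    """
    if direction == "achieve":
        return 1.0 if current == target else 0.0
    span = target - baseline
    if span == 0:
        return 1.0  # baseline already equals the target
    return (current - baseline) / span
```

For example, `progress_to_target(72, 94, 100, "increase")` and `progress_to_target(12.00, 9.25, 8.50, "decrease")` both return roughly 0.79 of the journey completed, with no special-casing per direction.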
## Threshold Configuration

Thresholds enable automated alerting when metrics move outside acceptable ranges:

```json
{
  "threshold": {
    "warning": {
      "operator": "lessThan",
      "value": 95
    },
    "critical": {
      "operator": "lessThan",
      "value": 90
    }
  }
}
```
### Threshold Operators

| Operator | Description |
|---|---|
| equals | Exactly equals value |
| notEquals | Does not equal value |
| greaterThan | Greater than value |
| lessThan | Less than value |
| greaterThanOrEqual | Greater than or equal to value |
| lessThanOrEqual | Less than or equal to value |
| between | Between two values (inclusive) |
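A sketch of how these operators might be evaluated to derive an alert level from currentValue (the helper names are illustrative, not part of the specification):

```python
import operator

# Map the schema's operator names onto comparisons. "between" takes a
# two-element [low, high] array and is inclusive at both ends.
_OPS = {
    "equals": operator.eq,
    "notEquals": operator.ne,
    "greaterThan": operator.gt,
    "lessThan": operator.lt,
    "greaterThanOrEqual": operator.ge,
    "lessThanOrEqual": operator.le,
    "between": lambda v, bounds: bounds[0] <= v <= bounds[1],
}

def alert_level(current, threshold):
    """Return "critical", "warning", or "ok"; critical is checked first."""
    for level in ("critical", "warning"):
        rule = threshold.get(level)
        if rule and _OPS[rule["operator"]](current, rule["value"]):
            return level
    return "ok"

# The threshold object from the configuration example above:
compliance_threshold = {
    "warning": {"operator": "lessThan", "value": 95},
    "critical": {"operator": "lessThan", "value": 90},
}
```

With this configuration, `alert_level(94, compliance_threshold)` returns "warning" and `alert_level(89, compliance_threshold)` returns "critical".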
## JSON Schema Definition

```json
{
"$schema": "http://json-schema.org/draft-07/schema#",
"$id": "https://orthogramic.com/schemas/extensions/performance-indicator.schema.json",
"title": "Performance Indicator",
"description": "Schema for performance indicators measuring strategic response success",
"type": "object",
"required": ["indicatorID", "name", "description", "indicatorType", "linkedResponse", "unit", "targetValue"],
"properties": {
"indicatorID": {
"type": "string",
"format": "uuid",
"description": "Unique identifier for the indicator"
},
"name": {
"type": "string",
"description": "Human-readable name of the indicator"
},
"description": {
"type": "string",
"description": "What the indicator measures and why"
},
"indicatorType": {
"type": "string",
"enum": [
"effectiveness",
"impact",
"value",
"efficiency",
"timeliness",
"quality",
"predictive",
"adoption",
"readiness",
"result",
"retrospective"
],
"description": "Category of indicator"
},
"linkedResponse": {
"type": "object",
"properties": {
"responseID": {
"type": "string",
"format": "uuid"
},
"responseType": {
"type": "string",
"enum": ["initiative", "capability", "policy", "strategy"]
}
},
"required": ["responseID"],
"description": "Strategic response being measured"
},
"unit": {
"type": "string",
"description": "Unit of measurement (%, $, count, days, etc.)"
},
"targetValue": {
"oneOf": [
{"type": "number"},
{"type": "string"}
],
"description": "Target value to achieve"
},
"baselineValue": {
"oneOf": [
{"type": "number"},
{"type": "string"}
],
"description": "Starting point for comparison"
},
"currentValue": {
"oneOf": [
{"type": "number"},
{"type": "string"}
],
"description": "Most recent measurement"
},
"direction": {
"type": "string",
"enum": ["increase", "decrease", "maintain", "achieve"],
"description": "Desired direction of change"
},
"threshold": {
"type": "object",
"properties": {
"warning": {
"type": "object",
"properties": {
"operator": {
"type": "string",
"enum": ["equals", "notEquals", "greaterThan", "lessThan", "greaterThanOrEqual", "lessThanOrEqual", "between"]
},
"value": {
"oneOf": [
{"type": "number"},
{"type": "string"},
{"type": "array", "items": {"type": "number"}, "minItems": 2, "maxItems": 2}
]
}
}
},
"critical": {
"type": "object",
"properties": {
"operator": {"type": "string"},
"value": {"oneOf": [{"type": "number"}, {"type": "string"}, {"type": "array"}]}
}
}
},
"description": "Alert thresholds"
},
"measurementFrequency": {
"type": "string",
"enum": ["real-time", "daily", "weekly", "monthly", "quarterly", "annually"],
"description": "How often the metric is measured"
},
"targetDate": {
"type": "string",
"format": "date",
"description": "When target should be achieved"
},
"baselineDate": {
"type": "string",
"format": "date",
"description": "When baseline was established"
},
"lastMeasuredDate": {
"type": "string",
"format": "date",
"description": "Most recent measurement date"
},
"dataSource": {
"type": "string",
"description": "System or process providing the data"
},
"calculationMethod": {
"type": "string",
"description": "Formula or method for calculating the metric"
},
"dataOwner": {
"type": "string",
"description": "Role or person responsible for data accuracy"
},
"status": {
"type": "string",
"enum": ["on-track", "at-risk", "off-track", "achieved", "not-started"],
"description": "Current status relative to target"
},
"measurementHistory": {
"type": "array",
"items": {
"type": "object",
"properties": {
"date": {"type": "string", "format": "date"},
"value": {"oneOf": [{"type": "number"}, {"type": "string"}]},
"notes": {"type": "string"}
}
},
"description": "Historical measurements"
},
"metadata": {
"type": "object",
"additionalProperties": true
}
}
}
```
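Instances can be checked against this schema with any Draft-07 validator. A sketch using the third-party `jsonschema` package (`pip install jsonschema`); for brevity only a subset of the schema above is inlined here, where in practice you would load the full file from /schemas/extensions/performance-indicator.schema.json:

```python
from jsonschema import Draft7Validator

# Subset of the performance-indicator schema, inlined for illustration.
SCHEMA_SUBSET = {
    "$schema": "http://json-schema.org/draft-07/schema#",
    "type": "object",
    "required": ["indicatorID", "name", "description", "indicatorType",
                 "linkedResponse", "unit", "targetValue"],
    "properties": {
        "indicatorType": {
            "type": "string",
            "enum": ["effectiveness", "impact", "value", "efficiency",
                     "timeliness", "quality", "predictive", "adoption",
                     "readiness", "result", "retrospective"],
        },
        "targetValue": {"oneOf": [{"type": "number"}, {"type": "string"}]},
    },
}

validator = Draft7Validator(SCHEMA_SUBSET)

indicator = {
    "indicatorID": "880e8400-e29b-41d4-a716-446655440001",
    "name": "Privacy Compliance Rate",
    "description": "Share of processing activities compliant with the Privacy Act",
    "indicatorType": "effectiveness",
    "linkedResponse": {"responseID": "srm-privacy-compliance-2024"},
    "unit": "%",
    "targetValue": 100,
}

# Collect human-readable validation errors (empty list means valid).
errors = [e.message for e in validator.iter_errors(indicator)]
```

`iter_errors` reports every violation rather than stopping at the first one, which is useful when surfacing schema problems in a governance review.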
## Examples

### Compliance Rate Indicator

```json
{
"indicatorID": "880e8400-e29b-41d4-a716-446655440001",
"name": "Privacy Compliance Rate",
"description": "Percentage of data processing activities compliant with Privacy Act requirements",
"indicatorType": "effectiveness",
"linkedResponse": {
"responseID": "srm-privacy-compliance-2024",
"responseType": "initiative"
},
"unit": "%",
"targetValue": 100,
"baselineValue": 72,
"currentValue": 94,
"direction": "increase",
"threshold": {
"warning": {
"operator": "lessThan",
"value": 95
},
"critical": {
"operator": "lessThan",
"value": 90
}
},
"measurementFrequency": "monthly",
"targetDate": "2025-06-30",
"baselineDate": "2024-06-15",
"lastMeasuredDate": "2024-12-01",
"dataSource": "Compliance Management System",
"calculationMethod": "(Compliant activities /Total activities) × 100",
"dataOwner": "Chief Compliance Officer",
"status": "at-risk",
"measurementHistory": [
{"date": "2024-09-01", "value": 85, "notes": "Initial remediation complete"},
{"date": "2024-10-01", "value": 89, "notes": "Training program launched"},
{"date": "2024-11-01", "value": 92, "notes": "Technical controls deployed"},
{"date": "2024-12-01", "value": 94, "notes": "Process changes implemented"}
]
}
```
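The measurementHistory above supports a quick run-rate check. A deliberately naive linear projection (variable names are illustrative, not schema fields):

```python
from datetime import date

# (date, value) pairs from the measurementHistory above.
history = [
    (date(2024, 9, 1), 85),
    (date(2024, 10, 1), 89),
    (date(2024, 11, 1), 92),
    (date(2024, 12, 1), 94),
]
target_value, target_date = 100, date(2025, 6, 30)

# Average change per measurement interval (here: per month).
deltas = [later[1] - earlier[1] for earlier, later in zip(history, history[1:])]
avg_delta = sum(deltas) / len(deltas)  # (4 + 3 + 2) / 3 = 3.0

last_date, last_value = history[-1]
months_remaining = ((target_date.year - last_date.year) * 12
                    + target_date.month - last_date.month)  # 6
projected = last_value + avg_delta * months_remaining
```

The run rate projects past the target, yet the indicator's status is at-risk: the current value of 94 still breaches the lessThan-95 warning threshold, and the month-on-month gains are shrinking, which a linear projection ignores.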
### Customer Satisfaction Indicator

```json
{
"indicatorID": "880e8400-e29b-41d4-a716-446655440002",
"name": "AI Service Customer Satisfaction",
"description": "Net Promoter Score for customers using AI-powered service channels",
"indicatorType": "impact",
"linkedResponse": {
"responseID": "srm-ai-enablement-2024",
"responseType": "capability"
},
"unit": "NPS points",
"targetValue": 55,
"baselineValue": 40,
"currentValue": 48,
"direction": "increase",
"threshold": {
"warning": {
"operator": "lessThan",
"value": 45
},
"critical": {
"operator": "lessThan",
"value": 35
}
},
"measurementFrequency": "monthly",
"targetDate": "2025-12-31",
"baselineDate": "2024-01-01",
"lastMeasuredDate": "2024-11-30",
"dataSource": "Customer Feedback Platform",
"calculationMethod": "% Promoters (9-10) minus % Detractors (0-6)",
"dataOwner": "Head of Customer Experience",
"status": "on-track"
}
```
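The calculationMethod above can be expressed directly in code. A small sketch (the function name is illustrative):

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6),
    computed from a list of 0-10 survey responses."""
    if not scores:
        raise ValueError("no survey responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))
```

For instance, six promoters, two passives (7-8), and two detractors out of ten responses yield an NPS of 40, matching the baselineValue above.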
### Cost Efficiency Indicator

```json
{
"indicatorID": "880e8400-e29b-41d4-a716-446655440003",
"name": "Cost per Customer Service Interaction",
"description": "Average cost to resolve a customer service request across all channels",
"indicatorType": "efficiency",
"linkedResponse": {
"responseID": "srm-ai-enablement-2024",
"responseType": "capability"
},
"unit": "AUD",
"targetValue": 8.50,
"baselineValue": 12.00,
"currentValue": 9.25,
"direction": "decrease",
"threshold": {
"warning": {
"operator": "greaterThan",
"value": 10.00
},
"critical": {
"operator": "greaterThan",
"value": 12.00
}
},
"measurementFrequency": "monthly",
"targetDate": "2025-06-30",
"dataSource": "Financial Management System + CRM",
"calculationMethod": "Total CS cost /Total interactions",
"dataOwner": "Finance Business Partner - CX",
"status": "on-track"
}
```
### Adoption Indicator

```json
{
"indicatorID": "880e8400-e29b-41d4-a716-446655440004",
"name": "AI Channel Adoption Rate",
"description": "Percentage of eligible customer interactions handled by AI channels",
"indicatorType": "adoption",
"linkedResponse": {
"responseID": "srm-ai-enablement-2024",
"responseType": "capability"
},
"unit": "%",
"targetValue": 60,
"baselineValue": 0,
"currentValue": 42,
"direction": "increase",
"measurementFrequency": "weekly",
"targetDate": "2025-12-31",
"dataSource": "Channel Analytics Platform",
"calculationMethod": "AI-handled interactions /Total eligible interactions × 100",
"dataOwner": "Digital Channels Manager",
"status": "on-track"
}
```

## Indicator Relationships

Performance indicators can relate to each other: leading indicators such as adoption or readiness typically move before the lagging outcome indicators of the same response, so a dip in a leading indicator is an early warning to review its outcome counterparts.
## Usage Guidelines

### Selecting Effective Indicators
- SMART criteria — Specific, Measurable, Achievable, Relevant, Time-bound
- Balance leading and lagging — Don't wait until the end to measure
- Limit quantity — 3-5 indicators per response are typically sufficient
- Ensure measurability — Data source must exist and be reliable
- Align with response objectives — Indicators must reflect what success means
### Common Mistakes to Avoid
| Mistake | Problem | Solution |
|---|---|---|
| Too many indicators | Dilutes focus, overwhelming | Prioritize 3-5 key metrics |
| Vanity metrics | Look good but don't matter | Focus on actionable metrics |
| No baseline | Can't measure improvement | Establish baseline before starting |
| Unmeasurable targets | "Improve customer experience" | Define specific, quantifiable targets |
| Wrong frequency | Measure too late to act | Match frequency to response urgency |
## Indicator Lifecycle

An indicator's status field tracks it through its lifecycle: from not-started, through active measurement (on-track, at-risk, or off-track), to achieved once the target is met or the response closes.

Performance indicators often map to data quality metrics and pipeline KPIs. Consider creating automated feeds from your data observability tools to populate SRM performance indicators, enabling real-time tracking of how data quality initiatives are progressing.
## Related Documentation
- Strategic Response Model Overview — Full SRM documentation
- Trigger Schema — What initiates responses
- Rationale Schema — Why responses are taken
- Performance Domain — Domain-level performance modeling