We analyzed the strategic plans of 225 local governments. Here's what the typical city tracks, where most cities struggle, and how the top performers pull ahead.
If you manage strategy for a city or county government, you've probably asked yourself some version of this question: "Are we tracking the right things? Is our plan too big? How do other cities do this?"
The honest answer, until now, has been: nobody really knows. Municipal benchmarking data exists in pockets — ICMA's Open Access Benchmarking covers 80 KPIs, the GFOA publishes financial management best practices, and a handful of regional consortiums compare notes on specific service areas. But nobody has looked at how cities structure and execute their strategic plans at scale.
We have. Our dataset includes 31.2 million rows of behavioral data from 20,582 strategic plans across 12 industries — and 225 of those organizations are local governments: cities, counties, towns, and municipalities of all sizes. Another 114 are state and federal government agencies. Combined, that's 339 government organizations, making this the largest behavioral analysis of government strategic planning ever assembled.
This isn't survey data about what cities think they're doing. This is what they're actually doing — every objective set, every measure tracked, every initiative launched, and every milestone completed (or not).
Here's how your city compares.
The Typical City Strategic Plan: By the Numbers
Across 225 local government organizations in our dataset, the average municipal strategic plan contains:
| Element | Average per City | What It Means |
|---|---|---|
| Objectives (Goals) | 154 | Top-level strategic priorities |
| Measures (KPIs) | 820 | Performance indicators tracked |
| Initiatives (Projects) | 416 | Strategic projects and programs |
| Milestones | 390 | Intermediate checkpoints |
| Scorecards | 21 | Organizational units or themes |
That's a lot. The median city in our dataset is tracking over 1,900 total strategic elements. For context, our data shows that plans with fewer than 20 elements succeed 68% of the time, while plans with 60+ elements succeed just 8%. Cities are running plans that are 30-100x larger than the optimal portfolio size.
Before you panic: city government is inherently broader than a private company. A technology startup has one product line. A city has police, fire, public works, parks, libraries, housing, economic development, utilities, and a dozen other departments, each with legitimate performance metrics. The question isn't whether cities should track more than 20 elements — they should. The question is whether 1,900 elements is working for them.
The completion rate suggests it isn't.
How Cities Actually Execute
Here are the execution benchmarks for local government:
**17.2%** — initiative completion rate for local governments, or roughly one in six strategic projects completed
That's actually above the overall dataset average of ~14% and notably better than government agencies at the state and federal level (10.8%). Cities are the best-executing government tier in our data.
But the distribution underneath that average is brutal:
Nearly a quarter of cities in our dataset — 23.5% — have completed zero strategic initiatives. Not "a few." Zero. Another 37.8% have completed fewer than one in ten. Together, that's 61.3% of cities completing less than 10% of what they planned.
At the other end, just 2.5% of cities — three organizations out of 119 with sufficient data — complete more than half their strategic initiatives. Those are your elite performers, and they're outnumbered roughly 9-to-1 by cities finishing nothing at all.
What Cities Track: The KPI Landscape
We analyzed the names of all 97,568 measures tracked by local governments in our dataset to understand what cities actually measure. Excluding generic or unclassifiable measures, the KPI priorities break down as follows:
The largest classifiable category is Parks & Programs at 6.8% — which makes sense given that recreation departments often track dozens of individual program metrics. Financial Metrics (4.9%) and Satisfaction Surveys (4.1%) round out the top three.
The surprise is how little weight traditional "headline" KPIs receive. Crime & Safety accounts for just 1.0% of city measures. Response Times account for 1.9%. These are the metrics that residents care about most and that media covers most — but they represent a tiny fraction of what cities actually track in their strategic plans. The bulk of measurement effort goes toward operational and programmatic metrics that are invisible to the public.
This is an important finding for cities thinking about public dashboards. What you track internally and what you publish externally should be different conversations. Residents want to know about safety, infrastructure, and service quality. Your internal plan should track those things — but also the operational metrics that drive them.
The Department Scorecard Map
When we analyzed how cities organize their strategic plans, a clear pattern emerged. The most common departmental scorecards across 225 local governments:
The most universally tracked department is HR — 57 out of 225 cities have a dedicated HR scorecard in their strategic plan. Police (51) and Fire (47) follow closely. These three departments represent the operational core that nearly every city tracks at the strategic level.
The drop-off after the top five is significant. Only 23 cities have a dedicated IT scorecard, and only 19 track Economic Development as its own strategic unit. This doesn't mean cities aren't doing economic development — it means they're often folding it into a broader category like "Community Development" or tracking it at the city-wide level rather than as a departmental function.
If you're building a new strategic plan and wondering how to structure your scorecards, the data suggests starting with the departments that virtually every city tracks (HR, Police, Fire, Public Works, Finance) and then adding departmental scorecards only for functions that have enough strategic initiatives to warrant their own accountability structure.
Why Larger Plans Don't Mean Better Execution
Intuition says that larger, more mature cities — the ones with bigger budgets, more staff, and more sophisticated planning processes — should execute better. The data says it's more complicated:
The very largest plans (2,000+ elements) actually complete at the highest rate: 19.9%. Medium and large plans both hover around 7.7-7.9%.
This is counterintuitive — shouldn't bloated plans execute worse? Two things explain the pattern. First, the very largest plans belong to the most mature ClearPoint users — cities that have been using the platform for years, have trained staff, and have embedded strategic planning into their operating rhythm. They have more elements because they've been at this longer, not because they planned carelessly. Second, these large plans tend to be more granularly structured, with more milestones per initiative and more scorecards creating departmental accountability.
The medium-sized plans (100-500 elements) are often the most dangerous. They're big enough to create complexity but not structured enough to manage it. A city with 277 elements spread across a handful of scorecards with inconsistent ownership is in a worse position than a city with 5,054 elements organized across 30 departmental scorecards with clear accountability at every level.
Structure matters more than size.
The Strategic Priority Stack: What Cities Care About
We categorized the 18,362 objectives across local government plans by strategic theme:
Economic Development dominates at 16.3% of all city objectives — every sixth strategic goal is about attracting businesses, growing the tax base, or revitalizing a downtown. Workforce and Community Engagement round out the top three.
Two emerging priorities are worth noting. Sustainability & Environment (2.3%) and Equity & Inclusion (0.8%) are relatively small in the current data but growing rapidly among newer plans. Cities that adopted ClearPoint in 2023-2025 are significantly more likely to include sustainability and equity objectives than those that started earlier. If you're building or refreshing your plan, these are categories that peer cities are increasingly prioritizing — and that citizens are increasingly demanding.
Housing & Homelessness at 1.2% is arguably underrepresented given how much it dominates local policy discussions. This suggests that while housing is a top political priority, it often gets managed outside the formal strategic plan — through separate housing departments, federal grant programs, or inter-agency initiatives that don't always flow through the city's central strategy.
How to Benchmark Your City
Here's the honest benchmarking framework. We've organized it into tiers so you can see where your city falls and what to aim for.
Tier 1: Getting Started (61% of cities)
- Completing less than 10% of initiatives
- Probably tracking 100-500 elements without clear structure
- Likely missing ownership on most objectives and measures
- Strategic plan may exist on paper but isn't actively driving decisions
- Priority: Focus before expansion. Cut to 5-9 goals, assign owners, meet quarterly.
Tier 2: Building Momentum (24% of cities)
- Completing 10-25% of initiatives
- Tracking 500+ elements with departmental scorecards
- Some ownership discipline, inconsistent update cadence
- Strategic plan influences budget discussions but isn't the primary decision-making tool
- Priority: Embed the plan in your operating rhythm. Monthly departmental reviews, quarterly council updates.
Tier 3: Executing Consistently (13% of cities)
- Completing 25-50% of initiatives
- Well-structured plans with milestones, clear owners, and regular updates
- Strategic plan directly informs budget allocation and performance reviews
- Public dashboards or council reports showing progress
- Priority: Extend the model to all departments. Add community dashboards for transparency.
Tier 4: Elite Performance (2.5% of cities)
- Completing 50%+ of initiatives
- Deeply embedded strategic culture with cascading accountability
- Public dashboards actively used by citizens and media
- Strategy reviews that drive real-time decisions, not just quarterly updates
- Priority: Share your model. The field needs it.
Case in point: City of Germantown, Tennessee
Germantown exemplifies what Tier 4 looks like in practice. Under De'Kisha Fondon's leadership, the city runs quarterly reviews between the City Administrator and each department director centered on three questions: "How much did it cost? How long did it take? Did it have the impact we wanted?" Their citizen dashboard makes strategic progress visible to residents, and they've won both the Tennessee Center for Performance Excellence Award and the Baldrige Award. Their approach isn't revolutionary — it's disciplined execution of fundamentals that most cities skip.
Case in point: Washington State Department of Licensing
WDOL represents a different kind of excellence: person-centered performance management. Under Janet Zars and Tony Griego, the agency shifted from tracking aggregate metrics to asking "who is underserved?" and "who is un-served altogether?" — reframing KPIs around individual experiences rather than population averages. They use ClearPoint to visualize performance by groups of people and their experiences, quote customers directly in leadership reports, and track measures that focus on the outliers, not just the averages. Their approach earned recognition through the Baldrige Criteria for Performance Excellence and demonstrates that what you measure matters as much as how you measure it.
What To Do This Week
Five steps to benchmark your city and build a more executable plan.
1. Calculate your completion rate. Pull your strategic plan and count: how many initiatives were due in the last 12 months, and how many were actually completed? If you're below 10%, you're in the majority — and the first step is acknowledging the gap rather than hiding it behind green dashboard indicators. Performance management starts with honest measurement.
2. Count your elements and compare. The average city in our dataset runs 154 objectives, 820 measures, 416 initiatives, and 390 milestones. If your plan is significantly above those averages with no better completion rate, you're likely overloaded. The optimal portfolio is 5-9 citywide goals, 9-11 measures per goal, and 5-8 active initiatives. Departmental scorecards can expand that total, but each department should stay within those ratios.
3. Map your departments against the common scorecard structure. If you don't have dedicated scorecards for at least Police, Fire, Public Works, Finance, and HR, you're missing the accountability foundations that nearly every city in our dataset maintains. Each department should have its own scorecard with its own goals, measures, and initiatives that roll up to the citywide plan.
4. Audit your KPI categories for blind spots. Compare your measures against the distribution above. Are you tracking satisfaction surveys? Workforce metrics? Permitting efficiency? Many cities over-index on programmatic output metrics (parks attendance, event counts) and under-invest in the outcome metrics that residents and council members actually care about — response times, infrastructure condition, citizen satisfaction.
5. Join a benchmarking consortium. The ICMA Open Access Benchmarking program provides 80 consistently-defined KPIs across 16 categories at no cost. ClearPoint's own Measure Library includes 250+ measures from 20+ municipalities, with contact information for each measure owner. You don't have to figure out benchmarking alone — but you do have to start.
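If you want to run steps 1 and 2 as a quick self-assessment, the arithmetic is simple enough to script. The sketch below is illustrative, not part of any ClearPoint tooling: the sample counts are hypothetical placeholders you would replace with your own plan's numbers, and the tier thresholds come from the framework above.

```python
# Hypothetical self-benchmark against the dataset averages cited in this
# article. Swap in your own city's counts; nothing here is official tooling.

DATASET_AVERAGES = {  # average elements per city, from the article's table
    "objectives": 154,
    "measures": 820,
    "initiatives": 416,
    "milestones": 390,
}

TIERS = [  # (minimum completion rate, tier name), from the framework above
    (0.50, "Tier 4: Elite Performance"),
    (0.25, "Tier 3: Executing Consistently"),
    (0.10, "Tier 2: Building Momentum"),
    (0.00, "Tier 1: Getting Started"),
]

def completion_rate(completed: int, due: int) -> float:
    """Share of initiatives due in the last 12 months that were completed."""
    return completed / due if due else 0.0

def tier_for(rate: float) -> str:
    """Map a completion rate to the benchmarking tier it falls in."""
    for floor, name in TIERS:
        if rate >= floor:
            return name
    return TIERS[-1][1]

# Step 1 — example: a city that completed 6 of 40 initiatives due this year.
rate = completion_rate(completed=6, due=40)
print(f"Completion rate: {rate:.0%} -> {tier_for(rate)}")

# Step 2 — compare element counts to the dataset averages (sample numbers).
my_plan = {"objectives": 210, "measures": 1100, "initiatives": 500, "milestones": 300}
for element, avg in DATASET_AVERAGES.items():
    flag = "above average" if my_plan[element] > avg else "at or below average"
    print(f"{element:12s}: {my_plan[element]:5d} vs avg {avg:4d} ({flag})")
```

The point of scripting it is honesty: a hard-coded threshold table makes it harder to round a 9% completion rate up into "basically Tier 2."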
Your City Is Better Than You Think (And Worse Than You Think)
Here's the paradox of municipal benchmarking. Most cities are doing more than they give themselves credit for — they're tracking hundreds of KPIs, running dozens of strategic initiatives, publishing dashboards, and reporting to council. The effort is real and significant.
But most cities are also completing far less than they think. When 61.3% of cities finish fewer than one in ten of their strategic initiatives, the gap between planning effort and execution results is enormous. The plan looks comprehensive. The dashboard looks green. And the actual completion rate is in single digits.
The cities that break through — the 2.5% completing more than half their initiatives — aren't doing fundamentally different things. They're doing the same things with more discipline: fewer priorities, clearer owners, shorter initiative timelines, and an honest reckoning with what's actually getting done versus what's being tracked.
Your city is probably somewhere in the middle. The benchmarks in this article give you a starting point for figuring out where. What you do with that information is up to you — but the 143 KPIs you could track matter a lot less than whether you're finishing the initiatives that move those KPIs in the right direction.
Methodology: This analysis is based on 31.2 million rows of anonymized, aggregated data from ClearPoint Strategy accounts spanning 2017–2025. Local government analysis includes 225 organizations tracking 18,362 objectives, 97,568 measures, 49,548 initiatives, and 46,434 milestones. Government agency analysis includes 114 state and federal organizations. Organization names are anonymized; KPI theme and objective orientation analysis is based on keyword classification of element names. Performance tiers are calculated for organizations with 5 or more initiatives. For the full methodology, download the 2026 Strategic Planning Report.
ClearPoint Strategy powers 20,000+ strategic plans and 2 million monthly updates across government, healthcare, education, and enterprise. See how →