Design system interface showing components and documentation

Federal Design System Modernization

Context & Challenge

The U.S. Web Design System (USWDS) was approaching a critical inflection point. Originally launched as a toolkit to help federal agencies build consistent, accessible websites, it had grown organically over several years. While adoption was strong—over 150 federal websites were using it—the system was showing signs of technical debt and organizational strain.

The codebase was difficult to maintain, documentation lagged behind releases, and the governance model wasn't scaling to meet the needs of a growing user base. More fundamentally, the design system was being asked to serve use cases far beyond its original scope: complex forms, authenticated applications, data visualization, and multi-step processes that the component library wasn't designed to support.

The challenge was technical, organizational, and strategic: How could we modernize the system's architecture and expand its capabilities without breaking existing implementations? How could we build a sustainable governance model that balanced centralized standards with agency autonomy? And how could we prioritize work when the system served such diverse agencies—from small teams building static sites to large departments managing mission-critical applications?

Code on screen showing component architecture

Component refactoring prioritized accessibility and maintainability, reducing technical debt while preserving backward compatibility where possible.

Role & Responsibilities

As Design System Lead, I was responsible for the strategic direction, product roadmap, and team coordination for USWDS modernization. I worked at the intersection of design, engineering, policy, and agency relationships, balancing technical decisions with organizational politics and user needs.

Key Responsibilities

  • Product Strategy & Vision: Defined modernization approach, multi-year roadmap, and success metrics in partnership with leadership
  • Technical Leadership: Led architecture decisions for component refactoring, API design, and build system improvements
  • Team Coordination: Managed core team of designers and developers, facilitated decision-making, unblocked work
  • Stakeholder Engagement: Built relationships with agency design leads, gathered requirements, communicated changes
  • Governance Design: Established contribution guidelines, review processes, and community feedback mechanisms
  • Research & Validation: Oversaw usability testing of new components, conducted user interviews with agency teams

I reported to the Director of 18F's Design Practice and collaborated closely with the TTS Platform team, GSA's Office of Government-wide Policy, and the broader civic tech community. This required navigating complex organizational dynamics—balancing 18F's agile delivery culture with GSA's compliance requirements and federal agencies' risk aversion.

Approach & Process

My approach emphasized incremental improvement over big-bang rewrites. Rather than attempt a complete overhaul (which would risk breaking existing implementations), we identified high-impact, high-pain areas and addressed them systematically. This required deep listening to understand where the system was failing users and careful sequencing to manage dependencies.

Discovery & Prioritization

We started with extensive discovery: interviews with 30+ agency teams, analysis of GitHub issues and support requests, and usability testing of existing components. This research revealed patterns in where teams struggled—complex form patterns, accessibility barriers in custom components, and confusion about when to use certain patterns.

I led the team through a prioritization exercise that balanced user impact, technical feasibility, and strategic value. We used a framework that considered: How many users does this affect? How much time will this save them? Does this unlock new use cases or primarily improve existing workflows? What dependencies exist?
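As a rough illustration, that rubric reduces to a weighted score. The criteria names and weights in this sketch are invented for illustration; they are not the actual model we used.

```ts
// Hypothetical scoring model for roadmap candidates. The criteria and
// weights are illustrative only, not the rubric the team actually used.
interface Candidate {
  name: string;
  teamsAffected: number;       // rough count of agency teams impacted
  hoursSavedPerTeam: number;   // estimated hours saved per team per quarter
  unlocksNewUseCases: boolean; // enables work the system could not support before
  blockedBy: string[];         // upstream work that must land first
}

function score(c: Candidate): number {
  const reach = Math.log2(1 + c.teamsAffected);   // diminishing returns on reach
  const impact = c.hoursSavedPerTeam / 10;
  const strategic = c.unlocksNewUseCases ? 2 : 0; // favor scope-expanding work
  const drag = c.blockedBy.length * 0.5;          // penalize heavy dependencies
  return reach + impact + strategic - drag;
}

const backlog: Candidate[] = [
  { name: "Multi-step form pattern", teamsAffected: 40, hoursSavedPerTeam: 30, unlocksNewUseCases: true, blockedBy: [] },
  { name: "Icon set refresh", teamsAffected: 120, hoursSavedPerTeam: 1, unlocksNewUseCases: false, blockedBy: ["build system"] },
];

backlog.sort((a, b) => score(b) - score(a)); // highest-priority candidates first
```

The value of a framework like this is less the numbers than the conversation: it makes tradeoffs explicit and comparable instead of implicit.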

Iterative Delivery

  • Phase 1 - Foundation (Months 1-6): Refactored build system, improved documentation structure, established new contribution guidelines
  • Phase 2 - Core Components (Months 6-12): Redesigned and rebuilt highest-priority components with accessibility improvements
  • Phase 3 - Complex Patterns (Months 12-18): Introduced new patterns for multi-step forms, data tables, and authenticated experiences
  • Phase 4 - Ecosystem & Governance (Months 18-24): Built community contribution pathways, established design system office hours, launched agency showcase

We released updates continuously rather than in large quarterly drops, using semantic versioning to communicate the nature of changes. This approach reduced risk and allowed teams to adopt improvements on their own timelines.
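For readers less familiar with it, the semantic versioning contract is small enough to express in a few lines. This helper is purely illustrative, not project tooling:

```ts
// The semantic-versioning contract (MAJOR.MINOR.PATCH): breaking changes
// bump MAJOR, backward-compatible features bump MINOR, fixes bump PATCH.
// Illustrative helper only, not actual project tooling.
type Change = "breaking" | "feature" | "fix";

function nextVersion(current: string, change: Change): string {
  const [major, minor, patch] = current.split(".").map(Number);
  switch (change) {
    case "breaking": return `${major + 1}.0.0`;        // e.g. 2.11.1 -> 3.0.0
    case "feature":  return `${major}.${minor + 1}.0`; // e.g. 2.11.1 -> 2.12.0
    case "fix":      return `${major}.${minor}.${patch + 1}`;
  }
}

console.log(nextVersion("2.11.1", "feature")); // "2.12.0"
```

In npm terms, a team pinning a caret range such as ^2.11.1 picked up new features and fixes automatically but never crossed a breaking major boundary until it chose to.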

Cross-Functional Collaboration

Team members collaborating at whiteboard

Design systems work is fundamentally collaborative. Success required coordination across designers, engineers, accessibility specialists, content strategists, and agency stakeholders—each bringing different expertise and priorities.

I established regular working practices: designers and developers paired on component development, accessibility audits happened before features were considered "done," and content strategists reviewed component names and documentation for clarity. This embedded quality and diverse perspectives into the work rather than treating them as separate review stages.

I also created "agency advisory group" sessions where teams using the design system could surface challenges, preview upcoming changes, and influence roadmap priorities. This participatory approach built trust and ensured our work stayed grounded in real needs rather than abstract ideals of what a design system "should" be.

Key Design Decisions

Accessibility as a Non-Negotiable

We made accessibility a first-class concern, not an afterthought. Every component was tested with screen readers, keyboard navigation, and high-contrast modes before release. We documented accessibility considerations in component guidance, not just technical implementation notes. When accessibility requirements conflicted with designer preferences, accessibility won.
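A minimal sketch of the kind of automated gate this implies, using axe-core through jest-axe. The Jest/jsdom wiring and the form markup are assumptions for the example, not actual USWDS code:

```ts
// Sketch of an automated accessibility check with jest-axe (assumes a
// Jest test runner with a jsdom environment). The markup below is a
// stand-in; the real suite rendered actual USWDS components.
import { axe, toHaveNoViolations } from "jest-axe";

expect.extend(toHaveNoViolations);

test("date input markup has no detectable WCAG violations", async () => {
  document.body.innerHTML = `
    <label for="appointment-date">Appointment date</label>
    <span id="appointment-date-hint">mm/dd/yyyy</span>
    <input id="appointment-date" name="appointment-date" type="text"
           aria-describedby="appointment-date-hint" />
  `;
  expect(await axe(document.body)).toHaveNoViolations();
});
```

Automated checks like this catch only a subset of problems, which is exactly why manual screen reader, keyboard, and high-contrast testing stayed in the definition of done rather than being replaced by tooling.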

Progressive Enhancement & Browser Support

Federal audiences include users on older browsers and lower-bandwidth connections. We committed to progressive enhancement—core functionality worked everywhere, enhancements were additive. This required discipline and occasional arguments with developers who wanted to use newer JavaScript features, but it ensured the system served all users, not just those with modern devices.
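A sketch of what that discipline looked like in practice, using a hypothetical character-count enhancement (the names and markup are invented, not actual USWDS code). The native maxlength attribute is the baseline that works in every browser; the script only layers a live counter on top:

```ts
// Progressive enhancement sketch (hypothetical, not actual USWDS code).
// Baseline: a plain <textarea maxlength="..."> enforces the limit with no
// JavaScript at all. The enhancement adds a polite live-region counter.
function enhanceCharacterCount(field: HTMLTextAreaElement): void {
  const status = document.createElement("div");
  status.setAttribute("aria-live", "polite"); // announce updates to screen readers
  field.insertAdjacentElement("afterend", status);

  const max = Number(field.getAttribute("maxlength"));
  const update = () => {
    status.textContent = `${max - field.value.length} characters remaining`;
  };
  field.addEventListener("input", update);
  update();
}

// If this script never loads (older browser, failed request, JS disabled),
// users still get a working, limit-enforcing form field.
document
  .querySelectorAll<HTMLTextAreaElement>("textarea[maxlength]")
  .forEach(enhanceCharacterCount);
```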

Composition Over Configuration

Rather than building monolithic components with dozens of configuration options, we created smaller, composable pieces that teams could combine to fit their needs. This reduced complexity in the design system codebase while increasing flexibility for implementers. It required better documentation and more example patterns, but proved more maintainable long-term.
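To make the contrast concrete, here is a hedged sketch; the element names and helpers are invented for illustration, not the actual USWDS API:

```ts
// Composition over configuration, sketched with invented names.

// Configuration-heavy: one component whose option surface grows forever.
interface MonolithicCardOptions {
  header?: string;
  media?: string;
  footer?: string;
  flagLayout?: boolean;
  insetMedia?: boolean; // ...every new use case adds another flag
}

// Composition: small pieces that teams assemble to fit their needs.
const el = (tag: string, className: string, ...children: (string | HTMLElement)[]) => {
  const node = document.createElement(tag);
  node.className = className;
  node.append(...children);
  return node;
};

const cardHeader = (title: string) => el("header", "card__header", title);
const cardBody = (...children: (string | HTMLElement)[]) => el("div", "card__body", ...children);
const card = (...parts: HTMLElement[]) => el("article", "card", ...parts);

// A team needing an unusual layout composes it instead of waiting for
// the design system to ship another configuration flag.
document.body.append(
  card(cardHeader("Passport renewal"), cardBody("Renew online in about 20 minutes."))
);
```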

Deliverables

  • Modernized component library with 45+ redesigned components meeting WCAG 2.1 AA standards
  • Rebuilt documentation site with improved navigation, search, and code examples
  • New design patterns for complex interactions: multi-step forms, data tables, date pickers, file uploads
  • Accessibility testing protocol and documentation guidelines adopted by core team and contributors
  • Open-source contribution process with review guidelines, enabling community contributions
  • Migration guides and codemods to help teams upgrade from legacy versions (a simplified codemod sketch follows this list)
  • Agency showcase featuring 20 case studies demonstrating diverse system implementations
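The codemods were mechanical rewrites of legacy markup. A simplified sketch of the idea, with invented class-name mappings rather than the real migration tables:

```ts
// Simplified migration codemod sketch. The old -> new class mappings are
// invented for illustration; they are not the actual USWDS renames.
import { promises as fs } from "fs";

const RENAMES: Record<string, string> = {
  "usa-old-banner": "usa-banner",               // hypothetical legacy name
  "usa-form-large": "usa-form usa-form--large", // hypothetical variant syntax change
};

async function migrateFile(path: string): Promise<void> {
  let html = await fs.readFile(path, "utf8");
  for (const [oldName, newName] of Object.entries(RENAMES)) {
    // Disallow word characters or hyphens on either side so that a class
    // like "usa-old-banner-inner" is left untouched.
    const token = new RegExp(`(?<![\\w-])${oldName}(?![\\w-])`, "g");
    html = html.replace(token, newName);
  }
  await fs.writeFile(path, html, "utf8");
}

// Usage: ts-node migrate.ts path/to/page.html ...
process.argv.slice(2).forEach((path) => {
  migrateFile(path).then(() => console.log(`migrated ${path}`));
});
```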
"The modernization effort transformed our ability to build accessible, consistent services. The new patterns for complex forms alone saved our team months of development time and countless accessibility bugs. More importantly, the governance model made us feel like partners in the system's evolution, not just consumers."

— Michael Rodriguez, Lead Designer, Department of Veterans Affairs

Outcomes & Impact

Quantitative Results

  • 180+ federal websites adopted the modernized design system (up from 150 pre-modernization)
  • Reduced component accessibility issues by 78% based on automated testing across adopting sites
  • Cut average implementation time for common patterns by 40% according to agency developer surveys
  • Grew community contributors from 5 to 32 active monthly contributors through improved governance
  • Achieved 95% satisfaction rating from agency teams in annual design system survey (up from 71%)
  • Documentation site visits increased 230%, indicating improved discoverability and utility

Qualitative Impact

The modernization made the design system more than a component library—it became a platform for shared learning and standards across federal digital services. Agencies began treating it as a starting point rather than a constraint, contributing back improvements and patterns they developed for specific contexts.

The focus on accessibility had ripple effects beyond the design system itself. By documenting accessibility considerations in plain language and providing working examples, we helped teams build accessibility competency. Several agency teams reported that using the design system improved their overall development practices, even for features not using system components.

Perhaps most significantly, the improved governance model shifted the design system's identity from a GSA product to a community resource. This cultural change ensured sustainability beyond any individual contributor's tenure and created ownership across the federal design community.

Team celebrating successful release

Reflections & Key Learnings

Leading a design system modernization in government taught me lessons about technical leadership, organizational change, and the relationship between tools and culture:

Key Takeaways

  • Adoption is about trust, not features: Teams adopted the system not because of new components but because they trusted our judgment, knew we understood their constraints, and believed we'd support them through changes. Building that trust required consistency, responsiveness, and humility.
  • Document decisions, not just interfaces: Teams valued understanding why we made certain design choices as much as how to implement them. Documenting our decision-making process—especially when we made tradeoffs—helped users make good choices for their contexts.
  • Incremental change beats big rewrites: The temptation to "do it right" from scratch was strong, but incremental improvement kept the system usable throughout the transition and reduced risk. Perfect is the enemy of good, especially in large-scale systems.
  • Governance matters as much as code: The social infrastructure around the design system—how decisions were made, who had input, how contributions were reviewed—mattered as much as the technical infrastructure. Investing in governance early paid dividends.
  • Design systems are products, not projects: Treating the design system as an ongoing product with a roadmap, user feedback loops, and sustained team capacity was essential. The original approach of treating it as a series of discrete projects led to inconsistency and technical debt.

If I were starting this work over, I would invest more heavily upfront in automated testing infrastructure—it would have caught regressions earlier and given the team confidence to move faster. I would also create more structured pathways for agency teams to contribute patterns back to the core system; we had informal processes but they weren't well-documented or consistently followed.

This work reinforced my belief that design systems are fundamentally about enabling others to do good work. The best measure of success wasn't the elegance of our code or the comprehensiveness of our component library, but whether we made it easier for agency teams to build accessible, usable services for the public. That human-centered perspective guided difficult technical decisions and kept the team focused on what mattered.