Design system adoption numbers...just a vanity metric?
Unless you can draw conclusions about why people are using your design system, and the impact it's having, raw adoption numbers could mean anything.
When I took over leadership of the IBM Carbon Design System, it was the first time I’d reported directly to Phil Gilbert, IBM Design GM. I remember one of the first one-on-one meetings I had with Phil. I was talking over some of my ideas and strategies for the further growth and evolution of Carbon. In particular, I highlighted some of the targets I had for adoption numbers. He told me, fairly bluntly, that I needed to stop focusing on adoption. That it wasn’t the right measure of success.
Phil’s the first person who challenged me on the idea of design system adoption numbers as a useful success metric. And, over the subsequent years, I’ve taken that challenge on. Phil’s point was that adoption numbers, without context, are merely a vanity metric. They’re tempting for a design system team to latch onto as an easy measure of growth (they’re always going “up and to the right”). But they have no inherent value without further information.
Obviously, we want people to use the design systems we build. But adoption is interesting, not valuable; on its own, it gives no measure of value. If everyone has adopted the design system, but efficiency hasn’t improved, and end user satisfaction with the products built on the design system hasn’t changed, then what’s the point?
I’d made the mistake of looking at adoption as a leading metric. It’s a trailing metric. If teams are adopting the design system because they’re deriving value from it, then adoption is a recognition of that value. And such adoption is more likely to be sustainable. It’s based on a genuine desire from a team to use the design system because it benefits them.
Adoption might imply that your design system is bringing value. But unless you have other measures, you don’t know that. It could also be a result of a premature mandate for design system compliance. If a design system brings value to the business, then you need to know what that value is. Those are the metrics of real importance. It’s not how many people are using the design system, it’s the quantitative benefits they’re gaining from that adoption.
When you can draw a clear correlation between design system adoption and some other useful metric, then you have worthwhile information. Do teams who adopt the design system have a shorter delivery cycle? Then you have a story about how the design system brings greater efficiency. Do products using the design system have a better conversion rate for users? Then you have a story about how the design system brings business value. The more those stories repeat as adoption grows, the more robust the correlation is, and the more confidence you have that the design system is the cause.
Phil Gilbert challenged me, and created a fundamental shift in how I think about design and design systems. Always be looking to the value that design brings. Once you’ve done the work, you need to be able to demonstrate why the work is good. Not just by how often you see it, or by people telling you it’s great. But by real measures of the positive impact it has on goals it’s intended to achieve.
Phil retired from IBM last week. I’m very privileged to have had direct access to such an exceptional design leader. There are about a thousand tributes on LinkedIn and elsewhere from IBMers who’ve been equally touched, inspired, and engaged by his leadership and vision.