The New Imperative for PMs

Nina Olding, Product Manager, Weights & Biases

Key Quote

If we don’t help users fill the gaps with mental models, they will create their own.

Session Overview

AI features are landing in every product, sometimes as core functionality, sometimes layered on top. Wherever you sit on that spectrum, trust and transparency will determine whether your product sinks or swims in this new reality.

In this talk, Nina Olding will share practical strategies for embedding trust systems into AI products, from intuitive controls to clear explainability and transparent data practices. You’ll leave with concrete examples and design approaches to make AI feel trustworthy and natural in your products, building user confidence and satisfaction without sacrificing “AI magic”, ambition, or innovation.

Notes

The Trust Gap

Only 1 in 3 Americans trust AI, down from 50% in 2019.

The Trust Gap: adoption of AI is skyrocketing while trust in AI is plummeting.

The biggest consumer concerns are:

These concerns impact adoption, retention, and regulation.

Plus, nobody knows how AI works.

Trust is like the air we breathe. When it’s present, nobody really notices. But when it’s absent, everybody notices.

Warren Buffett

Trust is hard to come by

Eminem