User tolerance for AI tools is high. They're so obviously useful that early adopters will put up with the tools punching them in the face every so often.
...ficant differentiation from alternatives… 4) …with a hyper-motivated segment of early adopters.
Don't get captured by your early adopters. There are a number of groups that are likely to be early adopters… and are also very unlike the mass market. If you aren't careful, you could get "...
A product whose early adopters all share a trait the general public lacks likely has a low ceiling. For example, imagine a product with a privacy angle where 60% of its ear...
My friend Dimitri has a classic essay about the composition of early adopters. A few riffs: The Doers are the most populous. They just want to use the thing to achieve a task. They give the overall momentum to the group. The T...
...ling out the feature by default for the mass market vs rolling it out for savvy early adopters who can better understand the risk is a way bigger risk. Chrome hasn't talked about any novel mitigations they've come up with. Either they're being...
...erland, you don't have to structure any part too closely. See what people, even early adopters, choose to build on from what others created: those are the load-bearing things. The use cases can emerge, like a folksonomy.
... have an open-ended system that will change the world, 70% of the perception of early adopters should be the simple starting part. 20% of people should see the obvious extensions into direct adjacencies. Only 10% get the whole abstract vision....
...ee whiz" temporary flash-in-the-pan bump of how well it demos, powered by every early adopter trying it once. 2) the "this is useful" compounding curve powered by word of mouth. The two curves are different and distinct. Things that demo well ...
...doesn't tell you what to think. You have to have your own hypothesis. What your early adopters (or users via UXR) ask you to do is great signal. But don't just follow it blindly.
...eason it keeps popping up as a pitch is because everyone wants it so much. Tech early adopters have become more skeptical of the pitch just because they've heard it so much and it's never worked, so they assume the next one pitching it also wo...
...you want the product to go, so you don't blindly follow the "weird" requests of early adopters and iterate into a dead end. You want to surf the energy in front of you, not along the steepest gradient but along the one that best aligns with where you want to...
Be careful about blindly following your early adopters. They can pull you in weird, random directions. A mental model for the underlying dynamic: Imagine an underlying distribution of po...
...what they want, they will give you interesting dimensions to develop along. If the early adopters have to go through a gauntlet, the less-engaged bounce and only the most-engaged remain. Sometimes the gauntlet is too intense, and a critical mass...
...the ladder. "Use it for X, it will work for you! You can do it!" People who are early adopters, who follow the thought influencers on Twitter, can curate and pass on that knowledge to other people.
...iability until demonstrating network effects strong enough to break through the early adopter ceiling. If you have network effects but a low ceiling, it can't change the whole world. To change the whole world, you have to have gravity-well sty...
...on't know what it's for yet! Typically with new technologies, there's a wave of early adopters who experiment and sense-make about the new technology, and share those best practices with new users as they join. ChatGPT is weird in that we have...
Early adopters are more engaged. There's an inherent self-selection bias: they are by definition more engaged, more willing to try new things than the rest of the ...