If you’re not aware, Strava is a social media platform disguised as a fitness tracking application. For years they’ve carved out a nice little niche for themselves in the space, first by offering some truly unique, value-added features like their heat map and route planner, which use real user data to find the routes where people actually cycle, run, and hike, and second, by incorporating a number of social features that let athletes interact, whether that’s liking each other’s activities, commenting, or organizing group events. Notably, quite a few professional athletes use the platform, including legendary riders like Jonas Vingegaard and Tadej Pogačar, which allows fans to follow their training.

Strava opts for a freemium model, with a number of key features reserved for a paid tier, and as I’ve become more serious about cycling, I’ve found those paid features to be worth the price. The paid tier includes things like additional analytics that provide deeper statistics about activities, a route planner with automated route suggestions based on their heatmap data, and leaderboards which, I have to admit, I kinda enjoy, as they gamify community-defined segments for sprints, climbs, and the like.

All of these features do one of two things: connect athletes to each other, or give athletes access to data, either about themselves or supplied by the community, that they can use to enhance their activities.

Unfortunately, recently it seems Strava has caught a nasty cold that is infecting companies the world over: AI. And in doing so I’m afraid Strava is demonstrating why tech companies ultimately fail to understand basic concepts like trust and empathy.

By the way, just a brief interlude: if you want to disable this mis-feature (for now), in the app you have to pick “Show More” on one of the AI-generated callouts, select “Give Feedback”, and from there you can leave the beta. Given how difficult that option is to find, I’m comfortable calling this a dark pattern.

Alright, let’s get back to it, shall we?


So, what does this new feature do? Well, let’s start with Strava’s marketing. Called “Athlete Intelligence”, here’s how they describe the feature:

As soon as you upload a run, ride, walk or hike, AI takes your activity stats and delivers a clear summary of your workout. No more staring at charts—just smart insights that help you get more out of your next session with simple, actionable tips. Plus you get the push you need with encouraging feedback that keeps you focused and moving forward.

Sounds pretty nice, right?

Well, let’s see what that looks like in action. Here are some of the “insights” from a ride I completed recently.

[Screenshot: A very generic description of this activity noting it showed a 'solid performance with consistent speed and heart rate'.]

First, the AI supplies an overall summary of the activity. This first example is certainly encouraging, with a nice tone. Sure, it’s a bit generic, but eh, that’s fine.

But look just a little closer and you’ll notice it really doesn’t say much, does it? In fact, all it does is reiterate a bunch of statistics that are already visible and easily available right on that very same screen. It just does so in narrative form, which, from a sheer technological standpoint, is pretty nifty, but which, as an athlete, I find pretty useless unless the goal is to increase the accessibility of the app. And if that were the goal, I would absolutely be all for it! But this ain’t being sold as an aid for users of screen readers[1]. This is supposed to be “intelligence”, and so far I’d call it underwhelming.

Okay, maybe that’s just a bad start. Let’s look at a second example.

[Screenshot: More AI 'intelligence' noting 'You spent a good portion of your time in your higher heart rate zones, showing you were working hard during this activity.' Pure genius.]

Look at that, just more of the same. If this is the kind of intelligence we can expect from this feature, it’s not terribly impressive. But wait, there’s more!

[Screenshot: The AI notes 'Your average heart rate was 138 bpm, with a max of 159 bpm, indicating you pushed yourself during this ride.' Amazing.]

This third example is quite literally just reiterating the heart rate statistics in my ride. Again, as a tool for accessibility, amazing! As a source of insight? Not so much.
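To drive the point home, here’s a minimal sketch, entirely my own illustration and assuming nothing about how Strava actually built this: a plain format string over two already-displayed ride statistics reproduces that third “insight” word for word, no model inference required. The RideStats type and its field names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class RideStats:
    """Two stats already shown on the activity screen (hypothetical type)."""
    avg_hr: int  # average heart rate, in bpm
    max_hr: int  # maximum heart rate, in bpm

def summarize(stats: RideStats) -> str:
    # Restate the numbers in narrative form; that is the entirety of the "insight".
    return (
        f"Your average heart rate was {stats.avg_hr} bpm, with a max of "
        f"{stats.max_hr} bpm, indicating you pushed yourself during this ride."
    )

print(summarize(RideStats(avg_hr=138, max_hr=159)))
# Your average heart rate was 138 bpm, with a max of 159 bpm,
# indicating you pushed yourself during this ride.
```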

And just think of the vast amount of computing power and associated energy use required to arrive at these “insights”.


Consider for a moment what a more interesting version of this feature might look like. My best guess is that Strava is trying to build something akin to a digital trainer, in which case you’d expect suggestions for future training, maybe a critique of the activity, pointing out areas for improvement, that sort of thing.

But Strava has a few problems:

  1. They don’t have the full context. They only have a limited window into the athlete’s activities, which makes it difficult to provide meaningful, holistic advice.
  2. They invite liability if the advice they give is bad. Imagine if their AI trainer told someone to train beyond what is safe, or to engage in activities that might threaten their health.
  3. Any attempt to provide the user with more nuanced, possibly uncomfortable feedback would likely be extremely poorly received, coming as it does from an unthinking machine.

And it’s this last point that I want to dwell on.

When a person sees a doctor or a physiotherapist or a personal trainer, one of the first things that must happen is the building of a trust relationship between the person and the professional. Why? Well, consider the kind of advice a trainer might need to provide. Not only do they need to provide support and encouragement, and supply a tailored training plan that meets the needs of the individual, they also need to be able to tell the person things they might not want to hear in a way that’s gentle, kind, relatable, and actionable.

And if there’s one thing I think it’s safe to say about modern technology, it’s that people do not trust it, as they understand these technologies are flawed, unreliable, and, frequently, inhumane.[2]

I’ve seen similar claims that AI might substitute for doctors, mental health professionals, or other healthcare practitioners, and in every case I have to wonder if the people entertaining these ideas have ever had to deal with a complex health situation or mental health difficulty.

This isn’t to say that human healthcare professionals are flawless. Goodness knows I’ve encountered my fair share of inhumane treatment at the hands of human healthcare providers. Nor do I believe AI can play no part in the healthcare system. Certainly AI, in the hands of trained professionals, has the potential to be very powerful.

But the idea that we should further excise humanity from the ultimate human activity, caring for one another, isn’t just alarming, it’s flat-out dystopian.


Now look, it’s entirely possible that Strava really meant this to be as boring a feature as it appears to be: a simple textual summary of activity data.

But I struggle to believe that. No one adopts generative AI technology and brands it “Intelligence” if that’s the limit of their ambitions.

So assuming they are shooting for something more substantial, Strava either fails to understand these issues of trust and empathy, which explains why they’d roll out this feature in the first place, or they understand them, decided to proceed anyway, and had to neuter the feature to make it as inoffensive as possible, thus rendering it useless.

In either case, Strava’s incorporation of “Athlete Intelligence” is, to me, a perfect microcosm of the failures of big tech, both in understanding their very human users and in understanding how best to leverage (genuinely impressive) LLM technology to enhance human experience rather than degrade it.[3]

I can only hope Strava changes course and recognizes this misstep for what it is.

  1. Unfortunately, building products for those living with disabilities is not the kind of thing that attracts a lot of investor capital… 

  2. Yes, I recognize that some people do manage to build parasocial relationships with AI-powered chatbots, but I’d argue those individuals represent the exception rather than the rule. 

  3. Again, they could have branded this an accessibility feature and celebrated the value in supporting athletes living with disabilities, but that kind of thing just isn’t sexy enough.