Who’s Behind the Orders Your Smartwatch Is Giving You?

Kara Hanson
4 min read · Jan 15, 2020

Recently I upgraded from a fitness tracker band to a smartwatch. I figured using it would be intuitive, so I bypassed the tutorial. I just strapped it on my wrist and went to work, which for me involves sitting at a computer typing.

Soon I was startled when the watch beeped and shook. “Time to stand!” the watch said, instructing me to get up and move around for a minute. Hmm, okay, not a bad idea; I had been sitting for some time. So I took a bathroom break, and that seemed to satisfy it. It told me I was a success.

A few minutes later, the watch alerted again. This time: “Breathe!”

I was confused. I was breathing, wasn’t I? Obviously. Then, I felt slightly offended. I need instructions on when and how to breathe? Uh, no.

Suddenly the health and fitness features of the smartwatch seemed less about reminders and more about regulating my body. It felt intrusive.

Devices like smartwatches, fitness trackers, and other wearable technologies may be good tools for recording exercise goals and prompting us to get moving. But increasingly, these gadgets are reaching into more intimate parts of our lives, and they aren’t waiting for us to indicate our preferences. They come preprogrammed to decide what we should do, and they proactively direct our actions.

My smartwatch, an Apple Watch Series 5, not only prompts me to stand and breathe but also automatically records my heart rate. You can set it up to take an ECG (a heart-rhythm reading) or track your menstrual cycle. It’s the same with other brands of wearable tech. The Fitbit Versa 2 records your sleep stages and rates their quality. The latest Garmin smartwatch can test your pulse ox (the amount of oxygen in your blood) and track your sweat loss.

I’m not saying these features aren’t useful, especially in the context of a fitness program. And we do have some level of control in using the apps and muting alerts. But information about your body and its various functions is about as personal as it gets. It’s intimate. It’s private.

These smartwatches don’t just track, record, transmit, and compare — they judge. They’re programmed to congratulate you and give awards and badges (in the form of digital images). My smartwatch jiggles, dings, and displays fireworks when I reach my activity goals. That’s not so bad. But it also admonishes me when I fall short. I didn’t ask for either of these features; they were already in the watch when I bought it.

Smartwatches allow us choices, but they’re also designed to guide us in those choices. As these watches continue to develop more apps and reach more intimately into our bodies and our behaviors, we need to consider: Who gets to decide what those choices are? The programmers? Fitness experts? Or someone else?

These are important questions because technologies do influence our behavior. In his book Moralizing Technology, Peter-Paul Verbeek, a professor of philosophy of technology, demonstrates that some technologies tend to “nudge” us toward making certain moral and ethical choices. One simple example he gives is the shopping cart corral that requires you to insert a quarter to take a cart. This design encourages you to return the cart to the corral when you’re finished so you can get your quarter back. You are “nudged” by the design to be responsible with the cart.

At times, technologies are intentionally designed to elicit or prohibit certain behaviors. For instance, some benches at bus stops are made narrow or tilted to prevent people from sleeping on them. This design essentially constitutes a political act since it deliberately targets the homeless.

More often, the ethical qualities of a technology are unintentional and may reflect the unconscious biases of engineers, programmers, designers, and so on. This is why we often see baby changing tables in women’s restrooms and not in men’s. At the design level, assumptions were made, perhaps based on stereotypes, that it would be the moms caring for the baby. Parents know this isn’t always true.

And this brings us back to the smartwatch and the directions it gives the user. Its features are intentionally designed to influence users’ behavior regarding their bodies. In fact, we buy these devices precisely because of that: we want them to encourage us to be more active and form healthy habits. But are there also features shaped by the unconscious biases of the people who made them? Is this a question anyone is asking?

Who gets to decide which choices our smartwatches offer, which features we can disable, and how much control we have? Employment statistics at tech firms suggest it’s unlikely that the decision-makers are women, people of color, or members of other underrepresented groups. Although Apple states it is making progress on diversity, its workforce is still overwhelmingly white and male. The situation is similar at Fitbit and most other technology companies.

Smartwatches are opening the way for more technologies that give us instructions about our bodies, our behavior, and other intimate matters. We need to pay attention to who is making them.

References

Verbeek, Peter-Paul. Moralizing Technology: Understanding and Designing the Morality of Things. Chicago: University of Chicago Press, 2011.


Kara Hanson

I study the interrelationship of technology, media, culture, and philosophy. PhD Humanities, concentration in philosophy of technology. Journalist. SF fan.