The Device You Trust

It started with the weather. Then a timer for the pasta. Then music while you cooked dinner. Before long, you were using it for everything — shopping lists, news briefings, checking your calendar, calling your mom hands-free while you folded laundry.

Alexa learned your habits. She turns on the lights when you get home. She locks the front door at 10 p.m. She knows which brand of paper towels you buy, what shows you watch, and what time you set your alarm. She plays white noise for the baby and audiobooks when you can’t sleep.

It’s genuinely useful. Voice assistants have made life easier for millions of people — especially older adults, people with mobility challenges, people juggling kids and work and everything in between. The convenience is real. Nobody’s pretending otherwise.

But there’s a second conversation happening in your living room. One you didn’t start. One you can’t hear. And it never stops.

What It’s Actually Doing

Every voice command you give Alexa is recorded, transmitted, and stored on Amazon’s servers. Not just the command itself — the audio. Your actual voice. Amazon employees and contractors have reviewed these recordings. The company admitted it in 2019 after Bloomberg broke the story: thousands of workers around the world were listening to Alexa recordings from people’s homes, including conversations that users believed were private.

But the voice recordings are only the beginning. Amazon has filed patents for analyzing voice data to detect emotional states, health conditions, and physical characteristics. The pitch of your voice. The pace of your speech. A cough pattern that might indicate illness. A tremor that might suggest a neurological condition. Your voice is biometric data, and Alexa is capturing it every time you speak.

Then there’s your Roomba. In 2022, Amazon tried to acquire iRobot — the company that makes Roomba — for $1.7 billion. EU antitrust regulators signaled they would veto the merger, and Amazon abandoned the deal in January 2024. But the attempted acquisition revealed what Big Tech actually wants: home mapping data. Think about what a Roomba does. It maps your home. It knows the dimensions of every room, where your furniture sits, how your space is laid out. It knows which rooms are large and which are small. It knows if you have a nursery. And the story didn’t end there. In December 2025, iRobot filed for Chapter 11 bankruptcy and agreed to be acquired by Shenzhen Picea Robotics — the Chinese manufacturer that had been building the Roombas all along. American home mapping data, potentially headed to a Chinese company. The surveillance didn’t stop when Amazon walked away. It just found a new buyer.

And then there’s fall detection. Newer Alexa devices can use ultrasonic sensors and microphone arrays to detect when someone falls. For elderly users living alone, this could be lifesaving. The device listens for the sound of impact, monitors for movement afterward, and can call emergency services automatically. It’s a genuinely important safety feature.

But here’s the paradox: the same always-on microphone that detects a fall also detects everything else. The sensor array that monitors whether grandma is moving around the house is the same sensor array that builds a behavioral profile of everyone in the home. The life-saving feature and the surveillance feature are the same feature. You can’t separate them — not the way the system is currently built.
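The coupling is easier to see in code than in prose. Here is a toy sketch (every name is invented for illustration, not Amazon's actual architecture) of why the safety feature and the surveillance feature are the same feature: a single always-on event stream feeds both consumers, so removing the profiling means rearchitecting the device, not toggling a setting.

```python
from dataclasses import dataclass, field

@dataclass
class AudioEvent:
    kind: str        # e.g. "speech", "impact", "footsteps", "tv"
    timestamp: float

@dataclass
class HomeListener:
    """Toy model: one sensor stream, two consumers wired to it."""
    behavior_log: list = field(default_factory=list)
    alerts: list = field(default_factory=list)

    def on_event(self, event: AudioEvent) -> None:
        # Safety path: react to a possible fall.
        if event.kind == "impact":
            self.alerts.append(("possible_fall", event.timestamp))
        # Surveillance path: the SAME event also feeds a profile.
        self.behavior_log.append((event.kind, event.timestamp))

listener = HomeListener()
listener.on_event(AudioEvent("tv", 21.5))
listener.on_event(AudioEvent("impact", 22.0))

print(listener.alerts)             # the life-saving feature
print(len(listener.behavior_log))  # the profile built from the same stream
```

A decoupled design would delete the `behavior_log` line entirely and keep only the alert path in memory, but nothing in today's commercial devices forces that choice.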

Amazon knows when you wake up, what you eat, what you watch, who you call, and what your floor plan looks like. And they didn’t break in. You invited them.

Who’s Profiting

Amazon didn’t build Alexa out of generosity. The Echo was sold at cost — sometimes below cost — because the device itself was never the product. You were. Every interaction with Alexa feeds Amazon’s advertising and retail ecosystem. Ask Alexa to order batteries and she defaults to Amazon’s store. Ask for a recipe and she surfaces sponsored content. The assistant isn’t neutral. It’s a salesperson that lives in your kitchen.

The voice data feeds Amazon’s machine learning models, which power its ad targeting across the web. The shopping data informs product recommendations. The smart-home data — when you turn lights on and off, when you lock the door, when you’re home and when you’re not — builds a behavioral profile that has value far beyond selling you paper towels.

Amazon’s attempt to buy iRobot failed, but the playbook is clear: combine spatial data with behavioral data. Ring doorbell footage from outside, Alexa data from inside — Amazon has already assembled a surveillance infrastructure that covers the approach to your house, the interior of your house, and the daily patterns of everyone who lives there. And now iRobot’s mapping data — the layout of millions of American homes — is on its way to Chinese ownership through bankruptcy court.

No single government agency has that kind of access. No law enforcement body could get a warrant that broad. But Amazon built it one convenience feature at a time, and millions of people opted in because each piece — the doorbell, the speaker, the vacuum — seemed harmless on its own.

In 2023, the FTC fined Amazon $25 million for violating children’s privacy through Alexa — retaining kids’ voice recordings and geolocation data even after parents requested deletion. A $25 million fine against a company worth $1.5 trillion. That’s not a penalty. That’s a rounding error.

The mic that saves your life is the mic that sells you wheelchairs. That’s not a conspiracy — it’s a business model. And until we separate the service from the surveillance, every new convenience comes with an invisible cost.

I’m not saying fall detection is bad. I’m saying it shouldn’t require handing Amazon a complete behavioral map of your home and your health. We can build devices that detect falls without monetizing the data. We can build voice assistants that answer your questions without recording the conversation. The technology exists. What’s missing is the business incentive to use it — because right now, surveillance pays better than service.

What Real Privacy Looks Like

The answer isn’t unplugging everything and going back to light switches and paper maps. The answer is building technology that works for you instead of on you.

  • Local-processing voice assistants — projects like Home Assistant and Open Voice OS (OVOS) have proven that voice control can work entirely on your own hardware. Your voice command gets processed on a chip in your house, not on a server in Virginia. No recording leaves your home. No corporation stores your audio. The lights still turn on. The timer still goes off. The only difference is that nobody else is listening.
  • Right to disconnect — every smart device sold in America should be required to function in an offline mode. If you buy a speaker, it should play music without phoning home. If you buy a vacuum, it should clean your floors without uploading your floor plan. The baseline functionality you paid for should never depend on surrendering your data.
  • Devices that work offline by default — cloud connectivity should be opt-in, not opt-out. The device works out of the box with zero data transmission. If you want cloud features — remote access, cross-device sync, voice history — you can enable them with a clear, plain-English explanation of what data will be collected and where it will go. And “no thanks” doesn’t brick the device.
  • Separation of safety and surveillance — fall detection, smoke alarms, medical alerts — these are critical services. They should be built on architectures that keep the safety function without the data harvesting. A device that calls 911 when you fall doesn’t need to know what you watched on TV last night. These functions can and should be decoupled.
  • Meaningful enforcement — when a company worth $1.5 trillion gets fined $25 million for violating children’s privacy, the incentive structure is broken. Penalties need to be proportional to revenue. Executives need personal liability. And consumers need a private right of action — the ability to sue when their data is misused, without waiting for a federal agency to act on their behalf.
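The first bullet is less exotic than it sounds. Below is a minimal, illustrative sketch (not Home Assistant's or OVOS's actual API; all names and patterns are invented) of an intent matcher that maps an already-transcribed voice command to an action entirely in local code. There is no network call anywhere: the matching is a handful of regular expressions on your own hardware.

```python
import re

# Hypothetical on-device intents: pattern -> action name.
INTENTS = [
    (re.compile(r"turn (on|off) the (\w+)"), "switch"),
    (re.compile(r"set a timer for (\d+) minutes?"), "timer"),
]

def handle_command(text: str) -> str:
    """Match a transcribed command locally; nothing leaves the device."""
    for pattern, action in INTENTS:
        m = pattern.search(text.lower())
        if m:
            if action == "switch":
                return f"switch_{m.group(1)}:{m.group(2)}"
            if action == "timer":
                return f"timer:{m.group(1)}min"
    return "unknown"

print(handle_command("Turn on the lights"))         # switch_on:lights
print(handle_command("Set a timer for 10 minutes")) # timer:10min
```

Real local assistants add on-device speech-to-text in front of this step, but the shape is the same: the lights still turn on, the timer still goes off, and no audio is uploaded, stored, or reviewed by anyone.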

Your living room should be the most private space you have. The place where you talk to your family, relax after work, live your life without an audience. Technology should make that space more comfortable — not more surveilled. We have the tools to build it right. We just need the rules to demand it.