Signing Away Our Health Data

Even if companies are transparent with their data policies, most people don’t bother reading them, especially when they’re in the middle of a mental health crisis.

It is not enough to declare your intended use of personal data in a EULA, even if you've translated your terms and conditions into human-readable language.

When we use a product, we think of it primarily as a transaction: I give you money or information, and you give me something useful. Increasingly, though, we're entering into a relationship, agreeing to keep providing money or information over time.

When it’s money, the continuation and scale of cost is generally obvious because our bank statements will remind us we’re paying, and there tend to be laws in place about ongoing transparency.

But when we agree to exchange access to data about ourselves, the relationship can become ambient, subtly slurping information from us with no outward sign. The cost can increase without our noticing, as long as the technology allows a provider to silently gather more from us.

Laws and public opinion are slower than technology, so our obligations as product providers are not always obvious. But, ethically, we clearly must be actively transparent about what we will learn from our customers, and remain in dialog with them about how we use those learnings.

I posted this in September 2019 during week 2377.
