The data economy is a dystopian nightmare

Smart speakers have become part of our daily lives. With a single word, we are able to command a device to answer our inquiries and shopping desires. The simple act of ordering diapers or asking for a weather report is now banished to the corner of the brain that houses what you had for breakfast that morning; you can recall it, sure, but not without considerable effort. Our devices, though, don’t forget. And neither do the companies that make them and own all the data collected through our interactions.

Data — as the analogy goes — is the new oil. It’s a commodity we can’t see or touch; we can’t process it to make food, nor can we use it to fuel the engine under the hood of our car. But it exists in abundance; it’s renewable, and consumers continue to feed this machine through daily interactions with the digital world.

How did we get here?

We can thank the ubiquity of smart devices, following the example set by social media platforms, coupled with the relatively low cost of building the networks that run them. It’s the perfect system — one where collecting data gets cheaper over time while the value of owning it rises exponentially.

Collecting and storing this data isn’t an inherently nefarious practice. It’s used to power smart cities, train artificial intelligence, and even drive changes in policy based on public sentiment. Calling the practice evil is short-sighted, and it’s a black-and-white solution to a problem bathed in shades of gray.

Part of the problem is governance.

There are important questions to be asked, and few regulators are willing to answer them. Fewer still understand the extent of these collection methods, or are aware of the massive behind-the-scenes marketplaces where data is bought and sold like cattle at auction. And aside from targeted advertising, little is known about how this data is being used, or what dangers it poses to our way of life, both now and in the future.

Data presents a nightmarish scenario for misuse

Take your medical history. Medical professionals and insurance companies are bound by law — and professional oath — to keep this information private. Google, Apple and Amazon, however, are not. Even without direct, first-hand information from your doctor or pharmacist, the knowledge these companies acquire is sufficient to paint a detailed picture of your health. These are companies, remember, that have access to your email, your search history, your location data, your shopping habits and often your photos. Google can read your online spreadsheets with prescription dosage instructions or find the list of depressive episodes you’ve been logging in a Google Doc to share with your mental health professional. Even that PDF on post-surgical care is being used to train AI as we speak.

This is only the tip of the iceberg. Amazon and Google are currently filling your house with devices that are always on and listening. Smart TVs collect data, which is sold to just about anyone willing to buy it, and sometimes even record you with built-in cameras and microphones.

And then there’s Facebook. Throughout its history, Facebook has not only shown little regard for protecting its users from data misuse, it has also run experiments to actively manipulate them into behaving in very specific ways.

But even if none of this worries you, and you’re willing to pay the price of convenience in order to keep using your favorite free services, then you ought to start thinking about the future.

Facebook, Google, Amazon and others will argue that they don’t sell this data, a talking point meant to comfort uneasy privacy advocates. And while this may or may not be true — each has been caught in instances of saying one thing and doing another — imagine the potential for future data breaches or misuse. Imagine trusting for-profit companies with securing information that rivals that of our most advanced three-letter government agencies. And imagine the general apathy of most people in continuing to feed an insatiable machine of our own creation.

What do we do about it?

As with most shifts in consumer behavior, it starts with education: teaching the public that free isn’t free, and that if they value privacy, they’re better off paying for services, or choosing those that operate on a business model they can stomach.

Ask yourself what you’re willing to give up to share political posts and memes on Facebook. Are you willing to let Google follow you around, both online and off, for search results that are marginally better than its competitors? Do you know that item is cheaper on Amazon, or did you just give up comparison shopping altogether? Informed consumers can, and should, be looking for alternatives to mainstream services.

From customer service to business opportunities, the conveniences of the internet come at a cost to trust and privacy that can hardly be denied. This might eventually drive the enthusiastic adoption of decentralized digital identity protocols as a way to provide much-needed security.

But we’re not there yet. So, that leaves each of us a choice to make. And if you’re looking for privacy-friendly alternatives to major platforms and applications, they’re in no short supply.

The decision is yours: continue using the internet as you always have, or take the steps needed to protect yourself from the companies that collect your data and may later weaponize it against you.

If you’re looking for legislative change, don’t count on it; few politicians understand the scope of what we’re dealing with on a meaningful level. It’s up to us to adapt, embracing new products and technologies that mesh with our ideologies. You are the catalyst for change.

Dominik Schiener is a co-founder of the IOTA Foundation — a nonprofit foundation based in Berlin. He oversees partnerships and the overall realization of the project’s vision. Additionally, he won the largest blockchain hackathon in Shanghai. For the past two years, he has been focused on enabling the machine economy through IOTA.
