Ideas as Monetizable Assets: Welcome to the Age of Surveillance Capitalism

In the late 1990s, Wall Street helped musician David Bowie realize a payday of more than $55 million by monetizing the artist’s catalog of recorded music with “Bowie Bonds,” which gave bondholders interest payments tied to revenues from that music. The concept of monetizing assets was not new at the time, but the scope and success of Bowie Bonds led to the monetization of other assets, including the sub-prime mortgages that were at the foundation of the 2008 financial meltdown.

Wall Street has largely recovered from that meltdown, but its efforts to monetize assets did not go unnoticed by power brokers and governments looking for ways to control and squeeze value out of their constituents. Whereas bankers jealously protect their control over investment capital, governments thrive when they control the thoughts and ideas of the governed masses.

In his seminal novel, 1984, George Orwell framed the motivation of the ruling authorities as an interest in pure power. Wall Street tears assets apart and rebundles their pieces into monetized investment vehicles. In contrast, Orwell noted that, in the hands of a nefarious entity that is interested in control, “power is in tearing human minds to pieces and putting them together again in new shapes of your choosing”.

A casual observer needs only to skim the surface of the ubiquitous social media platforms, with their flame wars, trolling, cyber-bullying, and other forms of negative interaction, to understand how the users of those platforms feed the interests of both the companies that operate them and the governments that exert increasing control and influence over their constituents.

Where government restrictions are concerned, at least 40 countries exert some form of control or limitation on civilian access to social and political media platforms. China stands out among those countries with its social credit system, slated to become fully operational in 2020. That system ranks, rewards, and punishes people as a function of their behavior. Conduct that is deemed to be “bad”, including posting fake news or playing too many videogames, can lead to restrictions on the use of public transportation, limitations on internet access, exclusion from better schools, and lockouts from better jobs. Other totalitarian regimes, including North Korea and Iran, use similar digital mechanisms to tamp down democracy at home and to monitor and restrict criticism from abroad. None of that control would be possible, however, without some complicity from the tech giants that create the platforms that collect and store the data, which is effectively monetized into a tool of greater control.

Shoshana Zuboff, a distinguished Harvard Business School professor, sums up this new form of control under the heading “Surveillance Capitalism”. In pertinent part, she notes that surveillance capitalism “claims human experience as free raw material for translation into behavioural data that is fed into advanced manufacturing processes known as ‘machine intelligence’, and fabricated into prediction products that anticipate what you will do now, soon, and later”. She goes on to explain that companies monetize these prediction products in behavioural futures markets, where companies take risks and commit capital in a form of wagering that anticipates a person’s future conduct.

Even when ordinary citizens attempt to opt out of systems that collect personal data about their behavior, many of the devices that they use, including interactive speakers and televisions, smartphones, doorbells, fitness watches, and other internet-of-things devices, gather metadata that companies can readily associate with individuals at a granular level. New internet-connected devices and systems such as driverless cars, in-home appliances, and wearable technology, as well as smart cities and free wifi networks, reach into the private lives of individuals to collect even more data. Every online interaction with Amazon and other e-tailers adds further data into the system of surveillance capitalism that companies use to push goods and services and that governments use to monitor and control their citizens.

Consider the data and information that can be derived from a single posting or electronic transaction. A vacation photograph posted on social media might disclose a person’s travel interests, hobbies, family status, and political leanings. Links in a social media account disclose friends and acquaintances, and artificial intelligence can weight those relationships to delve deeper into a person’s psyche.

Nest thermostats and doorbells, Amazon’s Alexa and Google’s passive smart speakers, and even internet-connected baby monitors can make a household more efficient, but those devices can also capture snippets of conversation, adding further foundation to a person’s broader data profile. The artificial intelligence built into those devices can predict a person’s daily patterns and routines.

Defenders of these systems and devices emphasize the efforts of companies like Apple to offer greater privacy protections to users of their devices. Yet even Apple’s public posturing on data privacy rings hollow when the cover is removed from the company’s claims. For example, Apple revoked Facebook’s enterprise developer certificate because the social media company ostensibly breached the terms of its agreement with Apple, and not, as Apple seemed to imply in public announcements, because it used a paid research program to extract data from users. Moreover, Apple’s Safari browser routes web searches through Google, and each Google search extracts billions of bits of data about users. Apple should not be shamed for these practices, but the reality of its conduct with respect to user data is more complex than the image that the company projects.

Governments and the companies that collect data are not the only entities that monetize it for capital or control purposes. Other entities that are in the line of data collection are learning to benefit from surveillance capitalism. Auto insurers, for example, ask their clients to install driving monitors in their vehicles with the promise of lower rates for safe driving. The data that those devices collect gives insurers far greater insights than just driving habits. Pharmaceutical manufacturers use patient comment boards to glean drug use habits. Grocery stores and other retailers use customer loyalty programs to build profiles that enable predictions of future purchases and other spending habits.

At the opposite end of the spectrum from national government control, local cities and municipalities are collecting and analyzing data from smart city grids that track both individuals and groups. The big tech companies are willing partners in these ventures, both enabling data collection by parties down the line from the tech sources and gathering data themselves. IBM and Cisco lead the smart city charge, with Microsoft, Siemens, and Huawei following close behind.

Many of these municipal efforts have come under sharp criticism from privacy advocates who express concerns that facial recognition technology and other biometric scanners will lead to deeper authoritarian inroads by government into private society. A line likely exists between data collection and use that provides genuine benefits, and data spying that serves the interests of a Big Brother-like sovereign state. It remains to be seen where that line will be drawn.

Until then, consumers need to remain aware that their data is being collected, monitored, analyzed, and monetized every day. The companies and governments that collect data prefer to stay below the radar and to keep people in the dark about how their daily interactions are being used for profit and control. A fully automated future can be a remarkable boon for all mankind, but the question of who controls that future demands considerable further insight and analysis. If, as George Orwell predicted, power will be a function of tearing apart and rebuilding a human mind, it is not unreasonable to give some choice in the shape of that rebuilding to the people whose minds are the subject of it.