
Big Data Is Not Our Master

Humans create technology. Humans can control it.

We’ve known for a long time that big companies can stalk our every digital move and customize our every Web interaction. Our movements are tracked by credit cards, Gmail, and tollbooths, and we haven’t seemed to care all that much. 

That is, until this week’s news of government eavesdropping, with the help of these very same big companies—Verizon, Facebook, and Google, among others. For the first time, America is waking up to the realities of what all this information—known in the business as “big data”—enables governments and corporations to do. 

Whether it requires a subpoena or a warrant, if the government truly wants to identify your physical location, learn which friends you talk to, or see what you have recently purchased, the revealing data is readily available. The NSA’s overly broad interpretation of the Patriot Act suggests that government agencies will read the law as aggressively as necessary to get their job done.

We are suddenly wondering, Can the rise of enormous data systems that enable this surveillance be stopped or controlled? Is it possible to turn back the clock?

Technologists see the rise of big data as the inevitable march of history, impossible to prevent or alter. Viktor Mayer-Schönberger and Kenneth Cukier’s recent book Big Data is emblematic of this argument: They say that we must cope with the consequences of these changes, but they never really consider the role we play in creating and supporting these technologies themselves.

Just last week, Larry Page shrugged off concerns about Google Glass—a set of eyeglasses that can run facial recognition or silently photograph everyone around you—as inevitable. "Obviously, there are cameras everywhere," Page said at Google's shareholder meeting, implying that because iPhone cameras already exist, a technology that enables silent, surreptitious photography is simply the next inevitable step.

But these well-meaning technological advocates have forgotten that as a society, we determine our own future and set our own standards, norms, and policy. Talking about technological advancements as if they are pre-ordained science erases the role of human autonomy and decision-making in inventing our own future. Big data is not a Leviathan that must be coped with, but a technological trend that we have made possible and support through social and political policy.

Unfortunately, this teleological approach to technology isn’t just a recent phenomenon. The clearest example of Silicon Valley mistaking a sociological trend for scientific law dates from the 1965 paper that gave us Moore’s “law.” According to its author Gordon Moore, co-founder of Intel, the number of transistors on an integrated circuit doubles approximately every 18 months.

Or is it every two years? Moore revised his initial prediction in 1975, claiming a doubling every two years instead. Or is it even longer? A 2010 update to the law predicts the pace will slow to a doubling every three years beginning at the end of 2013.
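The choice of period is not a pedantic detail: compounded over a decade, an 18-month doubling implies roughly a hundredfold increase, a two-year doubling about thirty-twofold, and a three-year doubling only about tenfold. Here is a minimal sketch of that arithmetic—the growth-factor formula 2^(t/T) and the ten-year horizon are my own illustration, not figures from Moore or from the texts cited above:

```python
# Rough illustration (my numbers, not the article's): how much the assumed
# doubling period T changes the projected growth factor 2^(t/T) over a
# fixed horizon t.

def growth_factor(years: float, doubling_period_years: float) -> float:
    """Projected multiplication in transistor count after `years`,
    assuming one doubling every `doubling_period_years` years."""
    return 2 ** (years / doubling_period_years)

HORIZON = 10  # years
for label, period in [("18 months", 1.5), ("two years", 2.0), ("three years", 3.0)]:
    print(f"Doubling every {label}: ~{growth_factor(HORIZON, period):.0f}x in {HORIZON} years")

# Doubling every 18 months: ~102x in 10 years
# Doubling every two years: ~32x in 10 years
# Doubling every three years: ~10x in 10 years
```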

Among academics, there is consensus that the word “law” is a misnomer, since it isn’t clear whether the advancement is driven by economics, corporate policy, or some other factor. To posit that the growth is driven by the nature of technology itself would ignore the economic, social, and political backdrop against which the innovation occurs. Craig Barrett, the former CEO of Intel, said in a talk earlier this year that Moore’s law was more of a strategic plan for Intel than a scientific law.

The “laws” of Silicon Valley are, in fact, not laws at all, but breakthroughs that we make possible. Europe is in the midst of debating its own approach to privacy and is considering a fundamental “right to be forgotten” law. Technology may continue to grow and become more complex, but that need not preclude debate—and potentially legislation—about how it can and should be used.

The security and privacy crises that have unfolded over the past week offer the perfect moment to ask ourselves what public policy we should adopt to limit not only the government’s ability to mine data, but also the ability of technological systems to store and process that data in the first place.

We do not need to live in a society where photos can be silently taken by a pair of eyeglasses or conversations can be overheard by a stranger across the room. We don’t need to live in a world where government or commercial satellites might peer into our homes at any moment, uninvited. These are still hypothetical advancements, but for how long?

Disclosure: The New Republic's Publisher and Editor-in-Chief Chris Hughes was a co-founder of Facebook and worked at the company through 2007.