
Early in October, Europe’s highest court invalidated the 15-year-old “safe harbor” – an international agreement that the European Union had negotiated with the United States to loosen the EU’s Data Protection Directive of 1995[1] so that it would allow companies to transfer personal information in digital form from the European Union to the United States.[2]  Is the European Court’s judgment a fundamental change in networking policy – a full stop – or merely a comma?

This is actually a longstanding structural conflict that has reignited. Its beginnings go back nearly half a century – when transborder flows of computer data (TDF) threatened to become a point of sharp conflict between the US, Europe, and the often newly independent countries of the then-Third World.
By the mid-1970s, TDF was simultaneously controversial and – for U.S. big business and military agencies – irreplaceable. As Herbert I. Schiller showed in 1981, a few thousand large corporations possessing foreign direct investments outside the United States – two-thirds of them headquartered within the U.S. – relied on “a continuously swelling volume of data flows circulating inside [their] corporate business structures across national boundaries.”[3] Based in all economic sectors, these companies used early computer communications networks to transmit data concerning such things as “raw material stocks, production schedules, quality control, personnel records, tax and legal information, currency transactions, profit repatriation, and investment decisions.” As Schiller underlined, TDF helped to enable the largest corporations both “to transact their global business and further integrate the internationalization of capital.”[4]

A second source of TDF was the U.S. military and its allies. “The ability of American companies to operate on a global scale and enjoy the benefits of worldwide resource and market exploitation,” Schiller explained, “would be unimaginable without the full backup of a concentrated military power, ready for instantaneous deployment and intervention.”[5] Military and intelligence agencies depended on networked TDF to operate bases around the world; to implement attacks; and to conduct increasingly widespread surveillance.

There existed no definitive inventory of TDF; even partial views were highly inexact, for the data that streamed across jurisdictions remained shielded. How much data was being sent over the private telecommunications circuits that carried most of it? What portion of TDF was made up of operational and administrative business data? What part of the total consisted of personally identifiable information? What were the companies doing with all of “their” data? States did not deign to find out. The absence of meaningful public documentation bespoke an underlying power imbalance. Big companies successfully insisted that policymakers should not peer too closely at TDF, out of concern that such investigation might lead to calls for greater accountability – which in turn might constrain the operations of their profit-projects.

TDF not only conferred power on corporate capital but also, paradoxically, established a new point of vulnerability for it. Interruptions and oversight requirements both endangered the political economy that was being reconstructed around computer-communications. Physical threats emerged when an earthquake or even the drag of a ship’s anchor caused a break in the submarine cables over which data coursed; however, far more menacing for big business were political threats, emerging in initiatives that aimed to restrict the content of TDF, or charge according to the volume of data sent, or oversee TDF in the interests of self-government.

Cubans repeatedly rebelled against the mono-culture of sugar that an empire of capital forced on both land and people[1]; only the Cuban Revolution of 1959 finally succeeded in overcoming this bondage.[2] However, even before attending to a new agrarian law, needed to put an end to the plantation system and to redistribute foreign landholdings, Cuba gave an immediate demonstration of its newly won sovereignty. Just two months after Fidel Castro marched into Havana, in March 1959, telephone workers tore down a telephone advertising billboard, placed it in a coffin, and marched it down a boulevard before tossing it into the sea.[3] The ad was an icon of foreign domination. The Cuban Telephone Company, owned by the International Telephone and Telegraph Corporation, controlled and profited from the country’s thoroughly inadequate telecommunications. Cuba’s revolutionary government now took over management of this company; to the cheers of the Cuban people, formal expropriation followed.[4] In the ensuing years, what had been an unbalanced, Havana-centric telecommunications system was extended substantially into Cuba’s countryside. Meanwhile, other companies, based not only in the US but also in Western European countries, were also nationalized.[5] Every government apart from that of the United States duly accepted the legality of nationalization under existing international law, and negotiated financial settlements with the Cuban state.

The U.S. Government neither forgave nor forgot. It imposed a punishing economic embargo, which has lasted for more than half a century. Successive U.S. Administrations made repeated attempts, overt and covert, to overthrow the Cuban Government; since the 1980s, the US government has doled out more than $1 billion (under the pretense of “democracy” and the “free flow of information”[6]) to stir unrest against Cuba’s government. The 1992 Torricelli Act and the 1996 Helms-Burton Act turned the embargo into an even more devastating blockade, by adding further extraterritorial sanctions. Helms-Burton enabled the original owners of nationalized Cuban assets who afterward became U.S. citizens to use US courts to sue foreign companies that took over these properties.[7] Such provisions violated international law; but they were still deployed against a Mexican telecommunications corporation for making use of IT&T’s onetime Cuban telephone property.[8] Year after year, for twenty years, the United Nations General Assembly has resolved by overwhelming majorities that the U.S. embargo should be ended. The blockade continues; but real changes are afoot.

If Cuba’s entanglement in the circuits of capitalism had long revolved around sugar, information and communication have now become pivots of the country’s reintegration into a newly digital capitalism. In the run-up to President Obama’s 2014 announcement that the US was negotiating with Cuba to restore diplomatic relations, the U.S. Agency for International Development (USAID) was funding a Cuban version of Twitter – “ZunZuneo” – through the Cuban-American youth group called Roots of Hope,[9] and was infiltrating the underground Hip Hop scene, sending a Serbian music producer to recruit Cuban rappers in the hope of provoking their fans into a youth movement against the Cuban state.[10] As the U.S. shifted its foreign policy strategy – the two countries re-established formal diplomatic relations in July 2015 – it also moved network systems and applications into the foreground.

The previous June, Google Chairman and former CEO Eric Schmidt had visited Cuba with a team of Google executives, including former State Department advisor Jared Cohen, accompanied by the usual noise about a “free and open internet.”[11] Expressly criticizing the embargo, Schmidt geared his visit to scoping out future business opportunities. Soon after, Google released its Chrome browser and free versions of Google Play and Analytics in Cuba.[12] This was possible because, although the US trade embargo remains intact to this day and can be lifted only by an act of the US Congress, Google tactically offered free services – which fell between the cracks of the embargo – to test the waters in Cuba. As Google anticipated, the Obama Administration eased regulations in a few strategic fields, including telecommunications.[13] To “initiate new efforts to increase Cubans’ access to communications and their ability to communicate freely,” the U.S. relaxed its controls to allow U.S. companies to sell telecommunications equipment and services, software, and Internet services in Cuba.[14]


Over the last few years, the idea has taken hold that “big data” is driving far-reaching, and typically positive, change. “How Big Data Changes the Banking Industry,” “Big Data Is Transforming Medicine,” and “How Big Data Can Improve Manufacturing,” are characteristic headlines. “Big data” has become ubiquitous, powering everything from models of climate change to the advertisements sent to Web searchers.

Even in a society in which acronyms and sound-bites pass for knowledge, this familiar formulation stands out as vacuous. It offers us a reified name rather than an explanation of what the name means. What is the phenomenon denoted by “big data”? Why and when did it emerge? How is “it” changing things? Which things, in particular, are being changed – as opposed to merely being hyped? And last, but hardly least, are these changes desirable and, if so, for whom?

Big data is usually defined as data sets – both structured and unstructured – so large and complex that they challenge existing forms of statistical analysis. For instance, Google alone processes more than 40,000 search queries every second, which equates to 3.5 billion per day and 1.2 trillion searches per year;[1] every minute, Facebook users post 31.25 million messages and view 2.77 million videos, while 347,222 tweets are generated; and by the year 2020, 1.8 megabytes of new information are expected to be created every second for every person on the planet.[2]
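As a rough sanity check on these rates, the per-day and per-year totals follow from simple multiplication (a minimal Python sketch; the 40,000-queries-per-second rate is simply the figure cited above, not an independent measurement):

```python
# Rough check of the query-volume figures cited above,
# assuming a constant rate of 40,000 searches per second.
QUERIES_PER_SECOND = 40_000

per_day = QUERIES_PER_SECOND * 60 * 60 * 24   # seconds in a day
per_year = per_day * 365                      # ignoring leap years

print(f"per day:  {per_day:,}")    # 3,456,000,000      (~3.5 billion)
print(f"per year: {per_year:,}")   # 1,261,440,000,000  (~1.2 trillion)
```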

The compounding production of data – “datafication,” in one account [3] – is tied to proliferating arrays of digital sensors and probes, embedded in diverse arcs of practice. New means of storing, processing, and analyzing these data are the needed complement.

A quick etymological search finds that the term “big data” began to circulate during the years just before and after 2000.[4] Its deployments then quickened; but this seemingly sharp-edged transition into what Andrejevic and Burdon call a “sensor society”[5] actually possesses a deeper-rooted history.

The uses of statistics in prediction and control have long been entrenched, and have increased rapidly throughout the last century[6] – as is pointed out by a working group on “Historicizing Big Data” established at the Max Planck Institute for the History of Science. The group emphasizes that big data must not be stripped out of “a Cold War political economy,” in that “many of the precursors to 21st century data sciences began as national security or military projects in the Big Science era of the 1950s and 1960s.”[7]


After 11 years of battle between Google — now a unit of the just-named Alphabet conglomerate — and the Authors Guild,[1] a professional organization of published writers, the US Court of Appeals for the Second Circuit has ruled that Google’s book project is protected under fair use [paper trail]. Responding to this judgment, the American Library Association (ALA) announced that “Libraries laud appeals court affirmation that mass book digitization by Google is ‘fair use’.”[2] Larry Alford, president of the Association of Research Libraries (ARL), concurred, stating that “ARL applauds this victory for fair use regarding the Google Books project, which involved partnerships with many of our libraries.”[3]

Is this outcome truly a cause for celebration? For whom is this a victory? What this verdict actually signifies can be understood only by clarifying the nature of the underlying conflict; and the analysis needs to go beyond “access,” as if “access” constituted an absolute virtue – the be-all and end-all.

Google’s search engine has long been the premier gateway to the Web, granting Google a dominant market position online (more than 60 percent of all computer Web searches). This has enabled Google to seize a disproportionate share of Web advertising. However, Google’s dominance is not guaranteed. Competition to control the Internet has intensified between Google and other Web giants — Amazon, Facebook, and Apple. From Google’s perspective, the question of how to maintain its market position could hardly be more vital.

Its superb search technology — its secret algorithm — isn’t enough, in itself, to ensure supremacy. An additional element is required. Only by protecting and, if possible, expanding its user-base, to feed streams of data into the company’s means of production — its search algorithm — will Google’s dominance in Web advertising be protected. Expanding its user-base in turn may be accomplished only by introducing, or taking over, services and content with which to draw additional users, and with which to target ads at what the industry calls “most-needed audiences.”