TS: Nobody taught them, but they just emerged with this; so in Japanese expression, these are a new species … totally different from us.

MTK: There’s a Japanese phrase for this?

TS: mm-hmm. Shin Jin Rui. Jin Rui is the ‘species’ and then Shin is new. So, a totally different group of people as far as this computer or technology.

Takako Smith, 51, on her children and nephews, Narita Airport, June 2005
one of 11 interviews in The First Contact Project:

www.revolocien.com/zounds/firstcontact.php

30 years ago, in the spring of my thirteenth year, a handful of seventh grade classmates and I were introduced to the first computer any of us had ever seen. It was a big, grey, heavy thing that took up an entire desk.

It cost about a thousand dollars, which in 1980 was several hundred dollars more than a high-end color television, and it came with a 5¼-inch floppy disk drive in front and a multi-pin serial port in the back with which to connect a wide, flat, gray plastic cable to a dot-matrix printer, the sole consumer peripheral of the era. The printers were loud.

The TRS-80 – for T[andy] R[adio] S[hack] and the Z-80 chip inside – didn’t last long, maybe two years, in the stampede towards obsolescence that has become the trademark of personal devices, but this bulky, ugly, gunmetal-grey personal computer lasted long enough to garner a nickname. We called it the Trash-80.

We were thirteen and it represented suburban popular society’s introduction to computing – you could buy one at the mall. It had 16k of memory, a dull black monitor screen, and a little white, rectangular blinking cursor.

20 years later, in the year 2000, the desktop computer in my office at The New School in Manhattan came with an 80-gigabyte disk – 80 million k – and 512 megabytes of RAM. Laptops with as much, weighing less than twelve pounds, were available for around two thousand dollars. Most machines had numerous data ports, including modem, USB and/or FireWire, and all provided access to the nascent virtual human extension we call the World Wide Web – which didn’t exist when I graduated from the University of Texas, Pre-Internet Baccalaureate, a slow-dying breed.

10 years later, here in Oakland in 2010, I steer a 750-dollar Dell laptop with a 500-gigabyte drive, 4 gigs of RAM and a 2.2 GHz Intel Core 2 Duo to write this essay, and if I don’t use the net to source it with fresh material it will bore most of my contemporaries.

plug/unplug is a vernacular history of my use of technology and comes with a compact disc called The First Contact Project, which consists of interviews with people of various ages remembering their first interactions with a computer and the Internet.

I have participated in and then withdrawn from high tech, for years at a time and in various contexts, over the past three decades, and here I attempt to advance one principal criticism: that the quality of this immense leap in personal computing technology in such a short amount of time has been over-valued by the high-tech industry – and its corollary, that we must, at least occasionally, unplug from the ever-spiraling fantasy projection of ourselves that we have begun in the wake of the digital era, take pause, reflect and perhaps even reboot our system, at a personal and national level. We must be more judicious about our ongoing relationship with machines.

To understand this criticism, consider first how many personal electronic devices existed in our home when the TRS-80 arrived. There was television: color and black-and-white sets with four channels (ABC, NBC, CBS, PBS) and perhaps a local UHF station. We had VHS by 1985, and most of my classmates were part of a fast-growing cable television market (HBO, Cinemax, Showtime, ESPN, MTV and the Discovery Channel all took off in the eighties). Games were 2-D and catchy as hell: Atari, Pac-Man, Donkey Kong.

There were landlines and princess phones and fax machines, and a “mobile phone” was rumored to exist. There was audio gear that had evolved to an analog specificity of high order: pre-amps, amplifiers, receivers, equalizers, turntables, cassette and 8-track players, and speakers. There were devices for the kitchen: microwaves were the latest, but blenders, mixers, grinders, coffee makers, juicers and toaster ovens had all appeared in the three decades after WWII. In the garage, we had gas-driven mowers, blowers and perhaps electric gardening and power tools. Certainly the 1980s “middle class” USA was the most technologically advanced consumer culture anywhere – except perhaps Japan.

The personal computer entered the home and went into a totally separate room – Dad’s study – where it was dedicated to educating me about computing. We had to make a relationship with the personal computer and, early on, the machine was pretty brutish. Initially, it wasn’t even as useful as the machine it shared that room with, the typewriter. I remember trying to get that Trash-80 to do a moronic do-loop while hearing, beside me at his desk, the soft, powerful clicking of my father on his dominant IBM electric at the very end of the typewriter’s hundred-year reign over writing.

My Dad put the computer table in a walk-in closet in his study, and so I was alone in there with it all the time. I remember the closet’s dusty smell of the old papers he had archived on the shelves above me. I wasn’t scared of it, but it was daunting. I had classmates – prodigies really – who had already gained local notoriety for their use of computers, and my father, like many, wanted me to have access to the new tech. Often, I was in there only because Dad expected me to be. I just sat with this big, ugly gray thing blinking at me, unable to program it. I remember feeling utterly uninspired.

I learned some BASIC at school, through a magazine and from some friends, and wrote some really simple code. I designed a Dungeons & Dragons-style text-based exploration game and wrote a calculating program, but I never really got into it. Because I was interested even then in publishing, I was printing multiples with typewriters and carbon paper, with the AB Dick mimeographing machine, and finally through the wonder of copy machines. I didn’t consider the computer a tool for publishing. The computer was the territory of science and mathematics. It required programming in mathematical terms. We cracked the case and opened it up in Physics class. Though some of us may not have grasped the technical aspects of computing as quickly as others, we all understood it was the beginning of the digital era. This is evident when listening to The First Contact Project.
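Here, for readers who never typed one in, is a minimal sketch of that kind of text-exploration loop – rendered in modern Python rather than the Trash-80’s BASIC, with a map and messages invented purely for illustration, not recovered from memory:

    # A tiny text-exploration loop in the spirit of the simple BASIC games
    # described above; the rooms and messages are illustrative only.
    rooms = {
        "cave": {"text": "A damp cave. Torchlight flickers.", "exits": {"east": "hall"}},
        "hall": {"text": "A long stone hall.", "exits": {"west": "cave", "north": "vault"}},
        "vault": {"text": "The vault. Treasure glitters here.", "exits": {"south": "hall"}},
    }

    here = "cave"
    while here != "vault":
        print(rooms[here]["text"])
        move = input("Which way? ").strip().lower()
        if move in rooms[here]["exits"]:
            here = rooms[here]["exits"][move]  # follow the chosen exit
        else:
            print("You can't go that way.")

    print(rooms[here]["text"])
    print("You found the treasure. The end.")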

In 1983, a few years into my experience with the TRS-80 at home and with various IBM 8088s and Apples introduced to us through science classes in junior high and high school, I ran into a Macintosh. My friend Randy’s dad was an engineer at Datapoint and bought one of the first. We were handling the little console months before the Chiat/Day television ad for the Mac, which debuted during the Super Bowl in 1984. The ad featured a woman in running shorts and a tee shirt, the only figure in color in an ominous black-and-white future-world of faceless grey drones. She ran carrying a hammer, which she hurled into a massive television screen to smash the projection of an enormous Big Brotherly face monotonously intoning propaganda.#

The first Apple Macintoshes were actually editioned, with an engraved steel plate with a unique number soldered to the back. It was clear to us as teenagers in Randy’s room out behind his parents’ place that we were looking at something radical. The interface of the Mac was stunningly more user-friendly than anything we had previously experienced. A child could use it.

Windows, the operating system that commanded more than 90% of the world’s desktops for two decades, did not yet exist. Bill Gates was just a Harvard dropout, but Apple was on the map. The Mac popularized the mouse, fonts, pull-down menus and, yes, windows. The Macintosh would define how Windows would look and, eventually, how tens of millions would interface – through machines – with each other around the world.

By my senior year of high school a lot of us were writing papers with word processors and printing them on dot-matrix printers to take to our teachers. The movement started with stand-alone word processors – typewriter-like machines with single-line or paragraph-wide displays above the keyboard that let writers read what they were typing before printing it, for the first time ever. My Dad loved his.

Looking back, it seems both obvious and amazing how quickly we made the transition to writing with the word processor, and eventually with software on a PC. It was a natural step that changed writing forever. Cursive and the typewriter languish. But though the computer was on the verge of changing writing, publishing, and expression in text and image forever, the single most dominant force of mass media technology wasn’t yet the computer.

THE TELEVISION PRESIDENCY 1945 – 2008
As the Super Bowl ad for the Macintosh reminded us, it was 1984, and the United States was described by most as a free society, totally unlike the one in George Orwell’s prophetic novel named for that year. That image – of totalitarian fascism that produced false flags and created an enslaved society – was projected by the U.S. President onto the Soviet Union, a country he called “the Evil Empire.” It was a term taken directly from popular movies and, wielded by a movie actor through the ubiquity of the medium of television, it became successful political propaganda.

The Television Presidency, born when Truman told the world the U.S. had used the A-bomb, instantly made the Office of the President of the United States different from what it had been before TV, and television dominated until the Internet and the digital age, a period of twelve presidents.

In his most important moment on TV, President General Eisenhower warned against the Military-Industrial Complex and went unheeded; perhaps it would have worked in color – we’ll never know. The relationship between color television and the Presidency began with Kennedy’s handsomeness and, as is typical of all things videoed, was taken to the other extreme: the visual abuse of his savage assassination and that of his brother. TV then exposed LBJ, Nixon and Kissinger’s dirty wars and the ugly side of the USA: repression, corruption, racism. TV was the king of the failure that was the Vietnam War.

Predictably, it was Ronald Reagan, an actor, who synthesized the power of the “small screen” for political propaganda. He overcame the tool’s power to reveal and its potency withered with the mic in his hands. Many fought against it and lost as TV smothered President Carter and buoyed Reagan to a full eight-year script, designed just like a Hollywood movie, with a brilliant new dawn at the front and a cowboy riding into the sunset at the end.

Reagan and TV media convinced most Americans that people in Russia lived in a dreary, black-and-white reality, trudging when they walked, standing in interminable lines as black-booted officers of the Kremlin marched past with truncheons to beat them if they acted out. And he promoted our freedom to shop and drive and declared the vast empty spaces of our plains – devoid of the genocided natives and buffalo – to be ours to tame. Trained and experienced for fifty years in delivering lines written by others, Reagan used the words “freedom,” “liberty” and “greatest country in the world” on TV a lot. But Reagan’s “New Dawn” should be revisited by historians and revealed for what it was: a veil.

During his terms, millions were jailed for victimless crimes. Millions of other unfortunates, unable to care for themselves, were cast out of care centers and into the streets. Hundreds of thousands suffered because the President refused to utter the word AIDS – on TV or anywhere else. Secret wars were conducted that tortured, raped and murdered tens of thousands of civilians, including women and children – in Central America, in West Asia, in Africa. Trickle-down economics and Reagan’s massive military budgets set us on a path from which we have yet to fully recover.

One of the best assessments of the Reagan Era – one that reads as prescient in the wake of the Reagan Doctrine and captures Reagan, the man – is Ronald Reagan: An Autopsy, written by Murray N. Rothbard in March of 1989, an autopsy well before Reagan’s death in 2004, which chillingly predicts that the digital age would, if it could, “mummify” a carefully crafted public perception of the 40th President well into the future.
“In this High Tech Age, I’m sure his mere physical death could easily have been overcome by his handlers and media mavens. Ronald Reagan will be suitably mummified, trotted out in front of a giant American flag and some puppet master would have gotten him to give his winsome headshake and some ventriloquist would have imitated the golden tones, “We -e-ell …” (Why not? After all, the living reality of the last four years has not been a helluva lot different).”

Consumer technology, on the cusp of elevating us with the Internet, was in those days represented in its farthest reach by television. And that medium was manipulated on the most epic scale by Ronald Reagan. In those days, to be broadcast all over the world on US television was as close to “global communication in realtime” as existed and, on the evening of my sixteenth birthday, the actor-president went on television and gravely told us it was imperative to invest our tax dollars in a Strategic Defense Initiative to protect us from nuclear war. SDI was quickly dubbed “Star Wars” technology, in the vernacular of the pop-movie phenomenon.

Every legitimate scientist in the world knew SDI was a ploy of language, a technical and political impossibility to deliver, and indeed, it was later revealed that Reagan’s own speechwriters had advised against including it in the public presentation – he’d made the decision on his own that day to do it. Generals, scientists, politicians and writers protested and others were put on the spot, but somehow the language was never exposed. A naïve public wowed by Reagan, Star Wars, computers and technology in general – and without the Internet to look up the reaction of scientists and writers to such drivel – ate it up. Conservatives have used the phrase to justify defense spending for offensive weapons for decades – even now in Europe. Years later, we live with such TV-generated myths, like the “dirty bomb.”

Reagan used his charisma on the small screen to push private, and even illegal, agendas until the veneer finally broke in the Iran/Contra hearings, but even then his “I can’t remembers,” delivered pitch-perfect on national television, got him off the hook. Years later, I asked the scholar Dr. Cornel West how it could have come to that:

“In some ways it’s like after the Congress of Vienna in 1815 where you had thick waves of counter-revolution, thick waves of conservative politics and the emergence of reactionary elites and nation states. And since the 1980’s we have had thick waves of conservatism, thick waves of reactionary elites, Thatcher, Reagan, you can go right across the board. … We’re dealing right now with an ice age, and by ice age I mean deeply conservative and reactionary elites shaping the world in their own image.”#

BIRTH OF THE INTERNET MEETS THE TELEVISION PRESIDENCY
As the global capitalists and reactionary elites seized back control through their mouthpieces in the White House – Reagan, and then G.H.W. Bush, the Agency man – out west, intellectuals were absorbed in the privatization, commercialization and diversification of high technology. In Silicon Valley, California, a cultural renaissance of international significance was taking place: biotech giants like Genentech were revolutionizing the privatization of research labs, RDBMS giants like Oracle and Informix were radicalizing data collection and analysis, and computing was blossoming. Since the early 1970s, U.S. scientists had been working – from a concept laid out in a series of memos written in the ’60s – toward the creation of the network we now call the Internet.

Working at Stanford, MIT, Champaign-Urbana and elsewhere, these scientists realized international networking even as my classmates and I were first being exposed to the TRS-80, DOS and the Apple II.# My generation’s history with computers parallels the history of personal computing itself. We were the first generation to send e-mail; the first to use what has become a principal tool for communication on earth, the Internet. It provides never-before-realized transparency and sharing capability between independent thinkers. It is the culmination of the greatest successes of the last century, bringing together the progress of telegraph, telephone, radio, television and computer to realize, in synthesis, one of the greatest human tools ever designed and implemented.

The TCP/IP protocol that is the basis for the modern Internet was adopted on January 1st of 1983, when I was a sophomore in high school. Then the National Science Foundation funded and supported networks – and dialogues which led directly to networking – for students and professors around the country in the ’80s and, by the ’90s, around the globe. The world became hyper-computerized before our eyes over the next twenty years. There was software for everything, and if there wasn’t yet, there soon would be – a progression culminating in the contemporary question: Is there an app for that?

The engineers of the microchip age have tried to make machines that fit seamlessly into our lives. Have we taught them to think more like us or have they taught us to adapt to them? Of course the answer is both. But I don’t believe the capitalist model has prioritized, nor is likely to prioritize, producing new technologies in a humanistic or socially altruistic manner before producing whatever will sell most and fastest. I have grown mistrustful not of the technology, but of the market, which has been abusive to us as consumers, debasing our desires while pushing gadgets at us.

I have always felt a conscious need to withdraw from the gear – to unplug – for fear of being drawn into a deluded state. I have, from the earliest days of computing, resisted giving myself over wholly to needing the machines I use. Today, for example, I cling to my clamshell phone – three years now – convinced I want phone and net separate, watching as everybody I know goes to the iPhone, BlackBerry, Android and 4G devices. I didn’t want to become a slave to new technologies as they revealed themselves. I prefer to lie back and let the tech that’s worth having sift its way to the top. Often, as in the cases of my mobile phone or gaming, my resistance has been against the mass commercial media blitz aimed at my generation, forcing upgrades.

BIRTH OF THE INTERNATIONAL GAMING INDUSTRY
In the 1980s, arcade games went digital in a big way and pinball slipped into the archive bin. I had played pinball when I began as a 12-year-old with Pong, but home gaming systems changed all that, and playing Atari and Nintendo and the arcade games – Space Invaders, Asteroids, Pac-Man, Donkey Kong, Galaga, Defender, Frogger, Centipede, Grand Prix, Tempest – became a national obsession for my generation. It was fun, but more often it felt like a huge waste of time and quarters. Derivative versions arrived to sustain interest – Donkey Kong II and Ms. Pac-Man – and that’s when I dropped out. I’d spend hours playing a game for days in a row, tremendous energy obsessed with taking games to their final levels. It was great for killing time, but draining when it became an obsession. Perhaps because I’ve always been a reader and a person who wants to be active, gaming feels like a net energy loss – or maybe I just matured out of it – but today I don’t game.

The same cannot be said for my generation. Electronic and Internet gaming is now a much bigger business than the movie industry. The din of the clamor for games reaches a global fever pitch in advance of new releases. I’ve observed those who are wholly absorbed in it for twenty years now. I play from time to time, to measure both the advances in gaming and the seamlessness with which gamers are engaged. I played Doom in NYC in 1997 – late to it, but I appreciated the range of motion.

I am happy for the mental freedom of not being hooked on games, but I have often felt outside of huge social groups, unwilling to play a given game long enough to join them. Leaving gaming has been an unplug with complicated dimensions. As I drifted away from my friends who continued playing games over the years, and from the generations that have followed us, I joined groups of people who, unlike me, had no access to or experience with the technology. Among them, I felt like an agent, a member of a tech class milling amongst the unconnected – by far the vast majority of the world. Watching gaming grow into the largest cottage entertainment industry in history, I have wondered whether or not my withdrawal was cultural.

I had the unique opportunity to conduct personal research into this over the next two decades, as I watched the introduction of games to teenagers in California, New York, Japan, Taiwan and India, at arcades and Internet cafés, throughout the 1990s and the Aughts.

Between 1999 and 2007, I used Internet cafés in New York, LA, Paris, Tokyo, Hong Kong, Amsterdam, Lisboa, London and Gothenburg, Sweden, and in big cities and smaller towns all over Japan, India and Taiwan. I used Internet cafés in East Jerusalem and the occupied Palestinian territories to send news radio to LA, and on Madeira Island and even the tiny Azorean island of Ilha Terceira to work out details for an art installation.

I’ve Skyped from Madras to NYC and do it commonly now from anywhere I please. It has been an incredible period to be traveling and observing the birth of the digital age, globally, firsthand, after having been at the nascence of the age in its birthplace, the United States. One of the most interesting things I’ve noticed concerns gaming and teenagers.

In the early 1990s, Taiwan and Japan were hotbeds of US corporate activity; the ubiquity of MADE IN JAPAN came before MADE IN TAIWAN, and only ten years later MADE IN CHINA followed. Japan had become linked with the US – like the 51st state, as one critic put it to me – or, in the harsher words of another Japanese observer, Okinawa was no longer a Japanese island but an aircraft carrier for the U.S.

In Taiwan, the ruling Kuomintang [KMT] party disallowed democracy and opposition parties, but was still backed by the USA, with whom, until Clinton, it held firm to the ludicrous one-China policy – the relationship across the Strait was tense.

The USA wanted free-market franchising as close to “Red China” as possible and gave the major chains access to Japan, Taiwan, Korea and the Philippines. Coca-Cola and McDonald’s, of course, came first. But Pepsi, KFC, Shakey’s, Pizza Hut, Hardee’s, Wendy’s, Burger King and others began popping up all over East Asia. These became hangouts for youth enamored with US pop culture who, when the Internet cafés arrived, were ready to transition right into the latest US fad.

For the last two decades, traveling across Asia, I would be walking down a street in some busy metropolis, or even some small town, and come across a small, glowing storefront with frosted glass, or a bamboo hut with power running through it, from which emanated an immense din – the screaming volume of video games. These places stay open late into the night, usually running 24/7. The fascination with US gaming has spread like wildfire through these countries. While I would use these cafés to send data, read e-mail and transmit information, I was most often surrounded by packs of young teenaged boys – and sometimes girls – huddled around a monitor, playing or advising a gamer. By the turn of the millennium, these cafés in Taiwan and Japan included private booths, and I was confident that the massive Internet porn industry was finding its way to Asia as well.

Having witnessed and participated in the beginning of the gaming era as a teenager in the US, and having given it up, I had then witnessed in my 20s the spread of the phenomenon through teenagers in East Asia. When, in my 30s, I landed in Europe in the late ’90s, I found the Internet cafés had recently arrived and the teenaged gamers were there, too – in Gothenburg, Lisboa, Paris and London. So when I landed in India in 2006, amidst the boom time for that Asian economy, I expected to see the same effect at the Internet cafés in my home country – but I was caught by surprise.

In India, the situation was totally different: for every one café filled with screaming machines and teenaged boys, there were twenty in which adults – men and women – and children of almost all ages were engaged in Internet research and connecting with others throughout the world. I found teenaged students and middle-aged thinkers trying to expand their consciousness with information from the net far more often than playing games. Pridefully, I attributed this to a cultural sophistication of the Indian mindset, but soon I began to realize it was something else entirely: English.

ENGLISH, GLOBISH AND NEW MEDIA SPEAK TK TK

In India, English had reigned blah blah

In the August 9, 2010, issue of the New Yorker, Nicholson Baker wrote the best recent story about the industry’s top-selling games, reporting in a straight news style on playing each of the biggest sellers against his teenager, in a piece called My Son is Killing Me. He talks about it online at: http://www.newyorker.com/online/2010/08/09/100809on_audio_baker

But perhaps more illuminating is the reply by blogger Greg Costikyan, who criticizes that:

“Baker has done the equivalent of watching the top ten Hollywood blockbusters of the year; doing so will not develop a particularly acute appreciation for the virtues of cinema as an artform. I would suggest that something of the same applies to games; the most interesting work is rarely done in the most commercial venues,” and noting, “Of the games Baker plays, only Heavy Rain is, from a game designer’s perspective, remotely interesting. Better he should experience Braid, Flow, Passage, Dwarf Fortress, The Baron.”

Gaming, of course, has left the territory of being solely for teenaged boys and is now designed in full for adults and families. Guitar Hero and the Wii are as common in homes today as Monopoly and chess. Because I’ve plugged in and unplugged in calculated ways, I’m often out of the loop with regard to this culture. I will have missed a trend or fad in gaming, or a popular television show, in my ‘absence.’ I have come to realize that in this I’m not alone, and that as the Digital Age proceeds, our concepts of time and truth grow increasingly stretched.

Pluralism of media has diluted information and the concepts of time and truth. The fluidity of the new pluralized media – the timeless interconnectedness of the digital era that puts old TV, movies and games out together with new content in the huge, mostly corporate library we call the spectrum – makes it possible to skip across generations of consumers in a moment and to verify claims of memory in an instant, and it has, in a very short time, created vast groupings of consumers arrayed in competing technology cliques on the basis of their media consumption. People rarely agree on what’s best or true anymore – there are too many options across generations to compare and, at any rate, it’s like comparing apples to oranges.

We can now reconsider the fascination of the original viewers of David Lynch’s TV epic Twin Peaks by screening it episode by episode in a university classroom over a semester, considering it in relation to the nation’s social and political context at the time of its original broadcast. I have myself, as an exercise in understanding culture, watched long-running programs that viewers once consumed slowly over seasons in a matter of days – over a weekend, compressed, without ads.

It is now common for our social and cultural institutions to include videographic data at every public venue. Most academics are connected, and thus no one can disconnect. But I think unplugging for the short term is still possible. Over the last 30 years, I’ve done it, and I’ve felt, in the act, the immense separation from the plugged-in world. What exactly is it that I am outside of, then? Can the plugged-in world be said to exist independent of the unplugged? Or is it just a rationalization, a fantasy projection of our marketplace?

The scramble to commercialize the Internet became a powerful act of authority that began an assault by global capitalism upon my generation – a process that has resulted in broad but superficial interconnectedness, the breakdown of privacy, the consolidation of mass media, the creation of commercial and political propaganda and, ultimately, the sponsorship of unilateral wars for corporate interests and a kind of enslavement to consumer technology.

A profound frustration for many millions of people was that inter-connective technology existed in 2001, 2002 and 2003, and yet – despite many organized actions and the worldwide distribution of educational materials about Afghanistan, Iraq, and the imperial-corporate interests of the USA/UK and the Soviets over the decades – it failed to author a peaceful response to the attacks of 9/11/2001.

We were unable to resist the juggernaut of manipulation of these same tools by Rove, Bush, Cheney, et al. (who perpetuated outright lies with the new information tools: that Iraq had WMDs and could bomb its neighbors, the 16-word lie during the State of the Union concerning yellowcake uranium from Niger, and, worst of all, the utter ridiculousness that Iraq was somehow involved in 9/11).

If anything, Gulf Wars I and II cemented control of the press by the masters of war. At last, they invented the embedded journalist and consolidated the mass media into a handful of hands. Global media capitalists use the web, like television and other mass media before it, to redefine our world in terms of their ownership. And while we all gain by the amazing traffic of information the Internet has brought, the privatization of knowledge and mass-scale manipulation through the medium are now apparent – and only slowly yielding to the medium’s power to organize and create social change. That power has been witnessed in uprisings in Iran, Afghanistan, Gaza, Honduras and, most recently, Tunisia and Egypt.

What Julian Assange and WikiLeaks are demanding is that Information be put in a Commons – and the current U.S. government can’t stand what the idea exposes. An Information Commons threatens corporations and governments. Just fifteen days after the War on Iraq began with the bombing of Baghdad in 2003, biophysicist Dr. Vandana Shiva explained the contemporary redefinition of resources to me:
“The empire imperative arising out of oil is the same imperative that arises out of turning water into a tradable commodity and turning life into a tradable commodity; made tradable by first redefining The Commons – either the biological or intellectual Commons, related to biodiversity, or the Water Commons – as private property. The two go hand in hand: you redefine the Commons as private property, then [since] private property is tradable, [and] Commons are not tradable, you can put it into the marketplace and out of that comes the control.

“The metaphor of oil is being applied on every renewable resource. It used to be that oil was nonrenewable and fossil fuels were nonrenewable, [while] water used to be renewable and biodiversity was – precisely! ‘life-forms that reproduce themselves’ – that was the very definition of [biodiversity].

“But biodiversity, genetic resources, water … [these] are all being redefined as oil. So water is the Blue Gold of the future and biodiversity and genetic resources – whether they be cells in genes in human bodies or animals, or the genes in plants, or the traditional knowledge of societies like India where the neem and the basmati and the turmeric and the pepper – everything – is up for grabs, [these are] being called the Green Gold of the future. It is basically turning everything renewable into a non-renewable resource to be then controlled and owned by a handful of giants and sold back to the very people from whom the water was taken, from whom the genes were taken, from whom the basmati was taken and the turmeric was taken.

“Sustenance resources – like water, like biodiversity, like our forests – need to be maintained in the Commons, that’s our big battle. You can be anywhere in the world, but defending these Commons from corporate takeover is now a global struggle.”#

I argue, extending Dr. Shiva’s teachings, that Information has now become a sustenance resource. An Information Commons must be built and protected first. It’s an overdue act that has grown into a social imperative. Plugging in must now carry a social responsibility – for the welfare of others who cannot plug in, and for a transglobal consciousness that exhibits tolerance for the many millions of others who are plugging in as well.

THE GOOGLIZATION OF EVERYTHING
Truly novel work in this area is that of Dr. Siva Vaidhyanathan, Associate Professor of Media Studies and Law at the University of Virginia and author of the new book The Googlization of Everything (University of California Press, 2011), who proposes a total revisioning of how we think about what is in the hands – or rather, on the servers – of the private corporation, Google.

It is immense territory for the mind. One has to consider the idea of privacy for the self in relation to the machine, in relation to the corporate trust, in relation to the state, and in relation to our relationship, under each of these, to the rest of the world. Siva, a long-time scholar of U.S. history, technology and culture, has tackled it head-on throughout the turn of the millennium. Remarkably – since Google has only been around for thirteen years, and because so many academics now are financed by Google or use Google tools – Siva’s is the most extensive work yet done by an academic on what the company controls, written from the realm of what it does not control. A critical perspective of remarkable scale.

Rather than demonizing the corporation, however, Dr. Vaidhyanathan’s work leads to a much more original and scalar envisioning of Information Science. It puts first, and entrusts, one of the oldest social institutions we have – the library – and flowers into a remarkable thesis about Information.

In May of 2010, the intern and I caught Siva’s talk at the TK TK

For twenty-five years, technology has outpaced our language, and small factions of corporate and political interests have taken advantage of it – most viciously and recently, the neo-conservatives in response to 9/11. We are beginning to witness, however, the birth of incredibly nuanced discussions about our technology, from the highest work in the academy, like Dr. Vaidhyanathan’s, to the shortest burst of a video that goes viral in literally seconds, achieves global fascination, peaks into wild, startled awareness and then drifts into a pool of most-viewed videos where generations slowly link to its data over a year or more by word of mouth.

Through pressure, force and will (and indeed the collective will of masses via democracy), Capitalism has become the defining social, labor and management order in every nation-state in the world. It has redefined the means of labor and production, even in the former Soviet Bloc, in such a complete way that the term “anti-Capitalist” now seems anachronistic.

Technology, high technology, information science and computing all folded easily into the model. It was an inevitability of the form leaving military control and entering the USA’s industrial power sector. Silicon Valley and the many scientists in Massachusetts, Illinois and elsewhere should be revered for their work, but we must remember they were financed and fueled by immense corporate interests that had grown ever more tied to the universities during the 1990s. Google, Inc. was born at Stanford.edu. We must observe and acknowledge the moment of all this. It is U.S. ingenuity, creativity and willingness to experiment at their greatest: in biotech, the human genome team that beat Venter, and in computing, the authors of the Internet.

The Social Networking generation, a generation later, is an import to the valley and content-based, not software-based. It is, fundamentally, derivative work. That is what is onerous about The Social Network being nominated for an Oscar, and Zuckerberg rather than Assange being Time’s Person of the Year in 2010 … the sheer descent into nothingness.

Global capitalists, who have used tech and grown bloated by using it, have succeeded in creating workers and an international market of profit for the accumulation of wealth among a minority of private owning interests. They have suckered the majority of workers into accepting this state of alienation, numbing the masses with superficial compensation and preventing resistance through endless repetition of propaganda via commercial mass media. Now much consumer technology is soma, masking the powerlessness of the individual with fantasy power.

Struggling against a true minority – the clique of power elites who have ruled through Thatcher and Reagan, Major and Bush, and the Globalist Clinton/Blair and Imperialist Bush/Blair regimes – an exceptionally hardy current of anti-capitalist thought has survived the last 30 years of radical transformation of our world by technology in the hands of neo-liberal capitalists, Globalists and, in the 21st Century, U.S. and Israeli neo-conservatives. Globalism is now an inevitability.

It will either be built as a staging area for a unified human future or shoved down the throat of the world through multinationals and Globalist structures like the IMF and World Bank, discarding, enslaving and killing millions … or something in between. But through our interconnectivity, another globalism [with a small g] has already, and inevitably, been born – a globalized movement brought an end to Apartheid in South Africa. A globalized movement marched millions against Bush, Blair and Aznar’s impending War on Iraq on February 15, 2003, dumping Aznar in its wake. We are becoming globalized in our shared concern for Chilean miners and the situation in Gaza and Jerusalem, and after earthquakes, hurricanes and tsunamis in the world’s poor countries.

“Anti-globalism” is passé. The term compromises new and tender worldwide connections being born from pure intellectual discourse and social concern. We ought to speak directly to the problems Global capitalism brings to the world – massive inequity and excessive competition for control of common resources – while acknowledging that transformation must happen within Globalist structures because of their ubiquity.

In fact, capitalist-produced technologies like the Internet have allowed the other, humanist globalism to flourish. The Internet is the result of the ingenuity and creativity of scientific labor working in the U.S. system, but it only works if inter-linked. We all use it to organize and to distribute information. Its invention is the blessing; its capitalization and politicization, the issue. The election of 2008, which some referred to as The YouTube Election, cemented the position of the Internet, rather than television, at the forefront of information delivery for news and elections coverage; from Obama Girl to McCain’s admission that he didn’t use e-mail, the net played an important role in all campaigns.

I’ve used the Internet with artists and cultural institutions to connect across four continents to make, transport and install large-scale, cross-cultural art pieces. I have been able to realize these works because of technology, interconnectivity, the net – and indeed simply by having been born when I was. It is time to accept both the power and range of the tools to make major leaps in human consciousness on a global scale.
This work, plug/unplug, is dedicated to my son, Ocean Mandela Milan.

M.T. Karthik
Oakland 2011