Full Moon Fever

Sublime Reflections

The Practice of Everyday (Media) Life

By Lev Manovich

<version: March 10, 2008>

From Mass Consumption to Mass <Cultural> Production

The explosion of user-created media content on the web (2005-) has unleashed a new media universe. On a practical level, this universe was made possible by free web platforms and inexpensive software tools that enable people to share their media and easily access media produced by others; by the rapidly falling cost of professional-quality media capture devices such as HD video cameras; and by the addition of cameras and video capture to mobile phones. What is important, however, is that this new universe is not simply a scaled-up version of 20th century media culture. Instead, we moved from media to social media. (Accordingly, we can also say that we graduated from 20th century video/film to early 21st century social video.) What does this shift mean for how media functions and for the terms we use to talk about media? These are the questions this essay will engage with.

Today “social media” is often discussed in relation to another term, “Web 2.0” (coined by Tim O’Reilly in 2004). While Web 2.0 refers to a number of different technical, economic, and social developments, most of them are directly relevant to our question: besides social media, other important concepts are user-generated content, long tail, network as platform, folksonomy, syndication, and mass collaboration. I will not summarize all these concepts here: Wikipedia, which is itself a great example of Web 2.0, does it better. My goal here is not to provide a detailed analysis of the social and cultural effects of Web 2.0; rather, I would like to put forward a few questions and make a few points that I have not seen expressed by others and that directly relate to video and moving image cultures on the web.

To get the discussion started, let us simply state two of the important Web 2.0 themes. First, in the 2000s we see a gradual shift from the majority of Internet users accessing content produced by a much smaller number of professional producers to users increasingly accessing content produced by other non-professional users. Second, if the web of the 1990s was mostly a publishing medium, in the 2000s it increasingly became a communication medium. (Communication between users, including conversations around user-generated content, takes place through a variety of forms besides email: posts, comments, reviews, ratings, gestures and tokens, votes, links, badges, photos, and videos.)

What do these trends mean for culture in general and for professional art in particular? First of all, they do not mean that every user has become a producer. According to 2007 statistics, only between 0.5 and 1.5 percent of users of the most popular social media sites (Flickr, YouTube, Wikipedia) contributed their own content. Others remained consumers of the content produced by this 0.5–1.5 percent. Does this mean that professionally produced content continues to dominate in terms of where people get their news and media? If by “content” we mean typical twentieth century mass media – news, TV shows, narrative films and videos, computer games, literature, and music – then the answer is often yes. For instance, in 2007 only two blogs made it into the list of the 100 most read news sources. At the same time, we see the emergence of the “long tail” phenomenon on the net: not only the “top 40” but most of the content available online – including content produced by individuals – finds some audience. These audiences can be tiny, but they are not zero. This is best illustrated by the following statistic: in the mid-2000s, every track out of the million or so available through iTunes sold at least once a quarter. In other words, every track, no matter how obscure, found at least one listener. This translates into a new economics of media: as researchers who have studied the long tail phenomenon have demonstrated, in many industries the total volume of sales generated by such low-popularity items exceeds the volume generated by the “top forty.”

Let us now consider another set of statistics which show that people increasingly get their information and media from social media sites. In January 2008, Wikipedia ranked as the number 9 most visited web site; MySpace was at number 6, Facebook at 5, and YouTube at 3. (According to the company that collects these statistics, it is more than likely that these numbers are U.S.-biased, and that the rankings in other countries are different. However, the general trend towards increasing use of social media sites – global, localized, or local – can be observed in most countries.)

The numbers of people participating in these social networks, sharing media, and creating “user-generated content” are astonishing – at least from the perspective of early 2008. (It is likely that in 2012 or 2018 they will look trivial in comparison to what will be happening then.) MySpace: 300,000,000 users. Cyworld, a Korean site similar to MySpace: 90 percent of South Koreans in their 20s, or 25 percent of the total population of South Korea. hi5, a leading social media site in Central America: 100,000,000 users. Facebook: 14,000,000 photo uploads daily. The number of new videos uploaded to YouTube every 24 hours (as of July 2006): 65,000.

If these numbers are already amazing, consider a relatively new platform for media production and consumption: the mobile phone. In early 2007, 2.2 billion people had mobile phones; by the end of the year this number was expected to reach 3 billion. Obviously, the people in an Indian village all sharing one mobile phone do not today make video blogs for global consumption – but that is only today. Consider the following trend: in the middle of 2007, Flickr contained approximately 600 million images. By early 2008, this number had already doubled.

These statistics are impressive. The more difficult question is: how do we interpret them? First of all, they do not tell us about the actual media diets of users (obviously these diets vary between places and demographics). For instance, we do not have exact numbers (at least, they are not freely available) on what exactly people watch on sites such as YouTube – the percentage of user-generated content versus commercial content such as music videos, anime, game trailers, movie clips, etc. Secondly, we also do not have exact numbers on what percentage of people’s daily media/information intake comes from big news organizations, TV, and commercially released films and music versus non-professional sources.

These numbers are difficult to establish because today commercial information and media arrive not only via their traditional channels such as newspapers, TV stations, and movie theatres but also via the same channels which carry user-generated content: blogs, RSS feeds, Facebook’s posted items and notes, YouTube videos, etc. Therefore, simply counting how many people follow a particular communication channel no longer tells you what they are watching.

But even if we knew precise statistics, it still would not be clear what the relative roles of commercial sources and user-produced content are in forming people’s understanding of the world, themselves, and others. Or, more precisely: what are the relative weights of the ideas expressed in large-circulation media and the alternative ideas available elsewhere? If one person gets all her news via blogs, does this automatically mean that her understanding of the world and of important issues is different from that of a person who only reads mainstream newspapers?

The Practice of Everyday <Media> Life: Tactics as Strategies

For different reasons, the media, businesses, the consumer electronics and web industries, and academics converge in celebrating content created and exchanged by users. Academic discussions, in particular, give disproportionate attention to certain genres such as “youth media,” “activist media,” and “political mash-ups” – which are indeed important but do not represent the more typical usage of hundreds of millions of people.

In celebrating user-generated content and implicitly equating “user-generated” with “alternative” and “progressive,” academic discussions often stay away from asking certain basic critical questions. For instance: To what extent is the phenomenon of user-generated content driven by the consumer electronics industry – the producers of digital cameras, video cameras, music players, laptops, and so on? Or: To what extent is the phenomenon of user-generated content also driven by the social media companies themselves – who, after all, are in the business of getting as much traffic to their sites as possible so they can make money by selling advertising and their usage data?

Here is another question: Given that a significant percentage of user-generated content either follows the templates and conventions set up by the professional entertainment industry or directly re-uses professionally produced content (for instance, anime music videos), does this mean that people’s identities and imagination are now even more firmly colonized by commercial media than in the twentieth century? In other words: Is the replacement of mass consumption of commercial culture in the 20th century by mass production of cultural objects by users in the early 21st century a progressive development? Or does it constitute a further stage in the development of the “culture industry” as analyzed by Theodor Adorno and Max Horkheimer in “The Culture Industry: Enlightenment as Mass Deception,” a chapter of their 1944 book Dialectic of Enlightenment? Indeed, if twentieth-century subjects were simply consuming the products of the culture industry, 21st-century prosumers and “pro-ams” are passionately imitating it. That is, they now make their own cultural products that follow the templates established by the professionals and/or rely on professional content.

A case in point is anime music videos (often abbreviated as AMV). My search for “anime music videos” on YouTube on February 7, 2008 returned 250,000 videos. Animemusicvideos.org, the main web portal for anime music video makers (before the action moved to YouTube), contained 130,510 AMVs as of February 9, 2008. AMVs are made by fans who edit together clips from one or more anime series and set them to music taken from a different source, such as professional music videos. Sometimes AMVs also use cut-scene footage from video games. In the last few years, AMV makers have also started to increasingly add visual effects available in software such as After Effects. But regardless of the particular sources used and their combination, in the majority of AMVs all video and music comes from commercial media products. AMV makers see themselves as editors who re-edit the original material, rather than as filmmakers or animators who create from scratch.

To help us analyze AMV culture, let us put to work the categories set up by Michel de Certeau in his 1980 book The Practice of Everyday Life. De Certeau makes a distinction between “strategies” used by institutions and power structures and “tactics” used by modern subjects in their everyday life. Tactics are the ways in which individuals negotiate the strategies that were set for them. For instance, to take one example discussed by de Certeau, a city’s layout, signage, driving and parking rules, and official maps are strategies created by the government and companies. The ways an individual moves through the city – taking shortcuts, wandering aimlessly, navigating through favorite routes and adopting others – are tactics. In other words, an individual cannot physically reorganize the city, but she can adapt it to her needs by choosing how she moves through it. A tactic “expects to have to work on things in order to make them its own, or to make them ‘habitable’.”

As de Certeau points out, in modern societies most of the objects which people use in their everyday life are mass-produced goods; these goods are the expressions of the strategies of designers, producers, and marketers. People build their worlds and identities out of these readily available objects by using different tactics: bricolage, assembly, customization, and – to use a term which was not part of de Certeau’s vocabulary but which has become important today – remix. For instance, people rarely wear every piece from one designer as it appears in fashion shows: they usually mix and match different pieces from different sources. They also wear clothing pieces in different ways than they were intended, and they customize the clothes themselves with buttons, belts, and other accessories. The same goes for the ways in which people decorate their living spaces, prepare meals, and in general construct their lifestyles.

While the general ideas of The Practice of Everyday Life still provide an excellent intellectual paradigm for thinking about vernacular culture, many things have also changed in important ways since the book was published in 1980. These changes are less drastic in the area of governance, although even there we see moves towards more transparency and visibility. But in the area of the consumer economy, the changes have been quite substantial. Strategies and tactics are now often closely linked in an interactive relationship, and often their features are reversed. This is particularly true for “born digital” industries and media such as software, computer games, web sites, and social networks. Their products are explicitly designed to be customized by the users. Think, for instance, of the original Graphical User Interface (popularized by Apple’s Macintosh in 1984), which allows the user to customize the appearance and functions of the computer and its applications to her liking. The same applies to recent web interfaces – for instance, iGoogle, which allows the user to set up a custom home page by selecting from many applications and information sources. Facebook, Flickr, Google, and other social media companies encourage others to write applications which mash up their data and add new services (as of early 2008, Facebook hosted over 15,000 applications written by outside developers). The explicit design for customization is not limited to the web: for instance, many computer games ship with a level editor that allows users to create their own levels.

Although the industries dealing with the physical world are moving much more slowly, they are on the same trajectory. In 2003 Toyota introduced Scion cars; Scion marketing was centered on the idea of extensive customization. Nike, Adidas, and Puma all experimented with allowing consumers to design and order their own shoes by choosing from a broad range of shoe parts. (In the case of Puma’s Mongolian Shoe BBQ concept, a few thousand unique shoes can be constructed.) In early 2008 Bug Labs introduced what they called “the Lego of gadgets”: an open-source consumer electronics platform consisting of a minicomputer and modules such as a digital camera or an LCD screen. The recent celebration of DIY practice in various consumer industries is another example of this growing trend.

In short: in the time since the publication of The Practice of Everyday Life, companies have developed new kinds of strategies. These strategies mimic people’s tactics of bricolage, re-assembly, and remix. In other words: the logic of tactics has now become the logic of strategies.

The Web 2.0 paradigm represents the most dramatic reconfiguration of the strategies/tactics relationship to date. According to de Certeau’s original analysis from 1980, tactics do not necessarily result in objects or in anything stable or permanent: “Unlike the strategy, it <tactic> lacks the centralized structure and permanence that would enable it to set itself up as a competitor to some other entity… it renders its own activities an ‘unmappable’ form of subversion.” Since the 1980s, however, consumer and culture industries have started to systematically turn every subculture (particularly every youth subculture) into products. In short, the cultural tactics evolved by people were turned into strategies now sold to them. If you want to “oppose the mainstream,” you now have plenty of lifestyles available – with every aspect of a subculture, from music and visual styles to clothes and slang – available for purchase.

These adaptations, however, still focused on distinct subcultures: bohemians, hip hop and rap, Lolita fashion, rock, punk, skinhead, Goth, etc. In the 2000s, however, the transformation of people’s tactics into business strategies went in a new direction. The developments of the previous decade – the Web platform, the dramatically decreased costs of consumer electronics devices for media capture and playback, increased global travel, and the growing consumer economies of the many countries which joined the “global world” after 1990 – led to an explosion of user-generated “content” available in digital form: web sites, blogs, forum discussions, short messages, digital photos, video, music, maps, etc. Responding to this explosion, Web 2.0 companies created powerful platforms designed to host this content. MySpace, Facebook, Livejournal, Blogger, Flickr, YouTube, hi5 (Central America), Cyworld (Korea), Wretch (Taiwan), Orkut (Brazil), Baidu (China), and thousands of other social media sites make this content instantly available worldwide (except, of course, in the countries which block or filter these sites). Thus, not just particular features of particular subcultures but the details of the everyday lives of hundreds of millions of people who make and upload their media or write blogs became public.

What before was ephemeral, transient, unmappable, and invisible has become permanent, mappable, and viewable. Social media platforms give users unlimited space for storage and plenty of tools to organize, promote, and broadcast their thoughts, opinions, behavior, and media to others. You can already stream video directly using your laptop or mobile phone, and it is only a matter of time before constant broadcasting of one’s life becomes as common as email. If you follow the evolution from the MyLifeBits project (2001-) to Slife software (2007-) and the Yahoo! Live personal broadcasting service (2008-), the trajectory towards constant capture and broadcasting of one’s everyday life is clear.

According to de Certeau’s 1980 analysis, a strategy “is engaged in the work of systematizing, of imposing order… its ways are set. It cannot be expected to be capable of breaking up and regrouping easily, something which a tactical model does naturally.” The strategies used by social media companies today, however, are the exact opposite: they are focused on flexibility and constant change. (Of course, all businesses in the age of globalization have had to become adaptable, mobile, flexible, and ready to break up and regroup – but they rarely achieve the flexibility of web companies and developers.) According to Tim O’Reilly, who originally defined the term Web 2.0 in 2004, one important feature of Web 2.0 applications is “design for ‘hackability’ and remixability.” Thus, most major Web 2.0 companies – Amazon, eBay, Flickr, Google, Microsoft, Yahoo and YouTube – make their programming interfaces and some of their data available to encourage others to create new applications using this data.
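To make the idea of “design for ‘hackability’ and remixability” concrete, here is a minimal sketch, in Python, of the kind of mash-up a third-party developer could build on top of such public programming interfaces: it pulls JSON photo feeds from two services and merges them into a single chronological list. The feed URLs and the “items”/“date”/“title” field names are hypothetical placeholders for illustration, not the actual Flickr, YouTube, or Amazon APIs.

import json
import urllib.request

# Hypothetical public JSON feeds exposed by two photo-sharing services.
# These URLs and the response schema are placeholders, not real endpoints.
FEEDS = [
    "https://photos.example-a.com/api/recent.json",
    "https://media.example-b.com/feeds/public.json",
]

def fetch_items(url):
    """Download one JSON feed and return its list of items.

    Assumed schema: {"items": [{"title": ..., "link": ..., "date": ...}, ...]}
    """
    with urllib.request.urlopen(url) as response:
        data = json.load(response)
    return data.get("items", [])

def mash_up(feed_urls):
    """Merge several feeds into one list, newest items first."""
    items = []
    for url in feed_urls:
        items.extend(fetch_items(url))
    return sorted(items, key=lambda item: item.get("date", ""), reverse=True)

if __name__ == "__main__":
    # Print the ten most recent items drawn from all feeds combined.
    for item in mash_up(FEEDS)[:10]:
        print(item.get("date", "?"), item.get("title", "(untitled)"))

The point of the sketch is structural: the platform owner supplies the data and the interface, while the remix itself, deciding which feeds to combine and how to present them, is left to the outside developer.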

In summary, today the strategies used by social media companies often look more like tactics in de Certeau’s original formulation – while tactics look like strategies. Since the companies which create social media platforms make money from having as many users as possible visit them (they do so by serving ads, by selling usage data to other companies, by selling add-on services, etc.), they have a direct interest in having users pour as much of their lives into these platforms as possible. Consequently, they give users unlimited storage space for all their media, the ability to customize their “online lives” (for instance, by controlling what is seen by whom), and the means to expand the functionality of the platforms themselves.

This, however, does not mean that strategies and tactics have completely exchanged places. If we look at the actual media content produced by users, the strategies/tactics relationship here is different. As I already mentioned, for many decades companies have been systematically turning the elements of various subcultures developed by people into commercial products. But these subcultures themselves rarely develop completely from scratch – rather, they are the result of cultural appropriation and/or remix of earlier commercial culture by people. The AMV subculture is a case in point. On the one hand, it exemplifies the new “strategies as tactics” phenomenon: AMVs are hosted on mainstream social media sites such as YouTube, so they are not exactly “transient” or “unmappable” (since you can use search to find them, see how other users rated them, and so on). On the other hand, on the level of content it remains a “practice of everyday life”: the great majority of AMVs consist of segments lifted from commercial anime shows and commercial music. This does not mean that the best AMVs are not creative or original – only that their creativity is different from the romantic/modernist model of “making it new.” To use de Certeau’s terms, we can describe it as a tactical creativity which “expects to have to work on things in order to make them its own, or to make them ‘habitable.’”

Conversations through Media

So far I have discussed social media using old, familiar terms. However, the very terms I have been invoking – content, cultural object, cultural production, and cultural consumption – are being redefined by Web 2.0 practices.

We see new kinds of communication where content, opinion, and conversation often cannot be clearly separated. Blogs are a good example of this: lots of blog entries are comments by a blog writer on an item that s/he copied from another source. Or think about forums, or the comments below a web site entry, where an original post may generate a long discussion which then goes off in new and original directions, with the original item long forgotten.

Often “content,” “news,” or “media” become tokens used to initiate or maintain a conversation. Their original meaning is less important than their function as such tokens. I am thinking here of people posting pictures on each other’s pages on MySpace, or exchanging gifts on Facebook. What kind of gift you get is less important than the act of getting a gift, or posting a comment or a picture. Although it may appear that such conversations simply foreground the emotive and/or phatic communication functions Roman Jakobson described back in 1960, it is also possible that a detailed analysis will show them to be a genuinely new phenomenon.

The beginnings of such an analysis can be found in the work of Adrian Chan. As he points out, “All cultures practice the exchange of tokens that bear and carry meanings, communicate interest and count as personal and social transactions.” Token gestures “cue, signal, indicate users’ interests in one another.” While the use of tokens is not unique to networked social media, some of the features pointed out by Chan do appear to be new. For instance, as Chan notes, the use of tokens is often “accompanied by ambiguity of intent and motive (the token’s meaning may be codified while the user’s motive for using it may not). This can double up the meaning of interaction and communication, allowing the recipients of tokens to respond to the token or to the user behind its use.”

Consider another very interesting new communication situation: a conversation around a piece of media – for instance, comments added by users below somebody’s Flickr photo or YouTube video which respond not only to the media object but also to each other. (The same is often true of comments, reviews, and discussions on the web in general – the object in question can be software, a film, a previous post, etc.) Of course, such conversation structures are also common in real life: think of a typical discussion in a graduate film studies class, for instance. However, web infrastructure and software allow such conversations to become distributed in space and time – people can respond to each other regardless of their location, and the conversation can in theory go on forever. (The web is millions of such conversations taking place at the same time.) These conversations are quite common: according to a report by the Pew Internet & American Life Project (12/19/2007), among U.S. teens who post photos online, 89% reported that people comment on these photos at least some of the time.

Equally interesting are conversations which take place through images or video – for instance, responding to a video with a new video. This, in fact, is a standard feature of the YouTube interface. (Note that all examples of interfaces, features, and common uses of social media sites refer to early 2008; obviously details may change by the time you read this.) While social media sites contain huge numbers of such conversations through media, for me the most interesting case so far is a five-minute theoretical video, Web 2.0 … The Machine is Us/ing Us, posted by the cultural anthropologist Michael Wesch on January 31, 2007. A year later this video had been watched 4,638,265 times. It has also generated 28 video responses that range from short 30-second comments to equally theoretical and carefully crafted long videos.

As is the case with any other feature of contemporary digital culture, it is always possible to find precedents for any of these communication situations. For instance, modern art can be understood as a conversation between different artists or artistic schools. That is, one artist/movement responds to the work produced earlier by another artist/movement. Thus, modernists in general react against classical nineteenth-century culture; Jasper Johns and other pop artists react to abstract expressionism; Godard reacts to Hollywood-style narrative cinema; and so on. To use the terms of YouTube, we can say that Godard posts his video response to one huge clip called “classical narrative cinema.” But the Hollywood studios do not respond – at least not for another 30 years.

As can be seen from these examples, these conversations between artists and artistic schools were typically not full conversations. One artist/school produced something, another artist/school later responded with their own productions, and that was all: the first artist/school usually did not respond in turn. But beginning in the 1980s, professional media practices began to respond to each other more quickly, and the conversations no longer go only one way. Music videos affect the editing strategies of feature films and television; similarly, today the aesthetics of motion graphics is slipping into narrative features. Cinematography, which before existed only in films, is taken up in video games, and so on. But these conversations are still different from the communication between individuals through media in a networked environment. In the case of Web 2.0, it is individuals directly talking to each other using media, rather than only professional producers.

Is Art After Web 2.0 still possible?

Have professional artists (including video and media artists) benefited from the explosion of media content online produced by regular users and the easy availability of media publishing platforms? Does the fact that we now have platforms where anybody can publish their videos and charge for the downloads mean that artists have a new distribution channel for their works? Or does the world of social media – hundreds of millions of people daily uploading and downloading video, audio, and photographs; media objects produced by unknown authors getting millions of downloads; media objects fluently and rapidly moving between users, devices, contexts, and networks – make professional art irrelevant? In short, while modern artists have so far successfully met the challenges of each generation of media technologies, can professional art survive the extreme democratization of media production and access?

On one level, this question is meaningless. Surely, never in the history of modern art has it been doing so well commercially. No longer a pursuit for the few, contemporary art has become another form of mass culture. Its popularity is often equal to that of other mass media. Most importantly, contemporary art has become a legitimate investment category, and with all the money invested in it, it is unlikely that this market will ever collapse. (Of course, history has repeatedly shown that even the most stable political regimes do eventually collapse.)

In a certain sense, since the beginnings of globalization in the early 1990s, the number of participants in the institution called “contemporary art” has experienced a growth which parallels the rise of social media in the 2000s. Since the early 1990s, many new countries have entered the “global world” and adopted western values in their cultural politics – which includes supporting, collecting, and promoting “contemporary art.” Thus, today Shanghai already has not just one but three museums of contemporary art, plus more large-size spaces that show contemporary art than New York or London. A number of starchitects such as Frank Gehry and Zaha Hadid are now building museums and cultural centers on Saadiyat Island in Abu Dhabi. Rem Koolhaas is building a new museum of contemporary art in Riga. I could continue this list, but you get the idea.

In the case of social media, the unprecedented growth of the number of people who upload and view each other’s media has led to lots of innovation. While the typical diary video or anime on YouTube may not be that special, enough are. In fact, in all media where the technologies of production have been democratized (video, music, animation, graphic design, etc.), I have come across many projects which not only rival those produced by the best-known commercial companies and the best-known artists but also often explore new areas not yet touched by those with lots of symbolic capital.

Who is doing these projects? In my observation, while some of these projects do come from prototypical “amateurs,” “prosumers,” and “pro-ams,” most are done by young professionals, or by professionals in training. The emergence of the Web as the new standard communication medium in the 1990s means that today, in most cultural fields, every professional or company, regardless of its size and geographic location, has a web presence and posts new works online. Perhaps most importantly, young design students can now put their work before a global audience, see what others are doing, and develop new tools together (for instance, the processing.org community).

Note that we are not talking about “classical” social media or “classical” user-generated content here, since, at least at present, many such portfolios, sample projects, and demo reels are uploaded to companies’ own web sites and to specialized aggregation sites known to people in the field. Here are some examples of such sites that I consult regularly: xplsv.tv (motion graphics, animation), coroflot.com (design portfolios from around the world), archinect.com (architecture student projects), infosthetics.com (information visualization). In my view, a significant percentage of the works you find on these web sites represents the most innovative cultural production done today. Or, at the very least, they make it clear that the world of professional art has no special license on creativity and innovation.

But perhaps the most important conceptual innovation has been happening in the development of the Web 2.0 medium itself. I am thinking of all the new creative software tools – web mash-ups, Firefox plug-ins, Facebook applications, etc. – coming both from large companies such as Google and from individual developers.

Therefore, the true challenge posed to art by social media may not be all the excellent cultural works produced by students and non-professionals which are now easily available online – although I do think this is also important. The real challenge may lie in the dynamics of Web 2.0 culture – its constant innovation, its energy, and its unpredictability.


See Adrian Chan, Social Media: Paradigm Shift? http://www.gravity7.com/paradigm_shift_1.html, accessed February 11, 2008.

Ibid.

“The Long Tail” was coined by Chris Anderson in 2004. See Chris Anderson, “The Long Tail,” Wired 12.10 (October 2004) <http://www.wired.com/wired/archive/12.10/tail.html>, accessed February 11, 2008.

More “long tail” statistics can be found in Tom Michael, “The Long Tail of Search,” September 17, 2007 < http://www.zoekmachine-marketing-blog.com/artikels/white-paper-the-long-tail-of-search/>, accessed February 11, 2008.

http://en.wikipedia.org/wiki/Myspace, accessed February 7, 2008.

http://en.wikipedia.org/wiki/Cyworld, accessed February 7, 2008.

http://en.wikipedia.org/wiki/Facebook, accessed February 7, 2008.

http://en.wikipedia.org/wiki/Youtube, accessed February 7, 2008.

According to research conducted by Michael Wesch, in early 2007 YouTube contained approximately 14% commercially produced videos. Michael Wesch, presentation at panel 1, DIY Video Summit, University of Southern California, February 28 <http://www.video24-7.org/panels>.

http://www.youtube.com, accessed February 7, 2008.

Conversation with Tim Park from animemusicvideos.org, February 9, 2008.

Michel de Certeau. L’Invention du Quotidien. Vol. 1, Arts de Faire. Union générale d’éditions 10-18. 1980. Translated into English as The Practice of Everyday Life. Translated by Steven Rendall. University of California Press. 1984.

https://www.puma.com/secure/mbbq/, accessed February 8, 2008.

http://buglabs.net/, accessed February 8, 2008.

Here is a typical statement coming from the business community: “Competition is changing overnight, and product lifecycles often last for just a few months. Permanence has been torn asunder. We are in a time that demands a new agility and flexibility: and everyone must have the skill and insight to prepare for a future that is rushing at them faster than ever before.” Jim Carroll, “The Masters of Business Imagination Manifesto (aka The Masters of Business Innovation)” <http://www.jimcarroll.com/10s/10MBI.htm>, accessed February 11, 2008.

See the very interesting feature in Wired which describes the creative relationship between commercial manga publishers and fans in Japan. The Wired story quotes Keiji Takeda, one of the main organizers of fan conventions in Japan, as saying: “This is where [the convention floor] we’re finding the next generation of authors. The publishers understand the value of not destroying that.” Qtd. in Daniel H. Pink, “Japan, Ink: Inside the Manga Industrial Complex,” Wired 15.11, 10.22.2007 <http://www.wired.com/techbiz/media/magazine/15-11/ff_manga?currentPage=3>.

See http://www.signosemio.com/jakobson/a_fonctions.asp, accessed February 7, 2008.

http://www.gravity7.com/paradigm_shift_1.html, accessed February 11, 2008.

According to a survey conducted in 2007, 13% of internet users who watch video also post comments about the videos. This number, however, does not tell us how many of these comments are responses to other comments. See Pew Internet & American Life Project, Technology and Media Use Report, 7/25/2007 <http://www.pewinternet.org/PPF/r/219/report_display.asp>, accessed February 11, 2008.

The phenomenon of “conversation through media” was first pointed to by Derek Lomas in 2006 in relation to comments on MySpace pages.

< http://youtube.com/watch?v=6gmP4nk0EOE>, accessed February 8, 2008.

Ibid.

September 21, 2008 | Cyber culture, Facebook, Lev Manovich, MySpace, New Media

Buying In: The Secret Dialogue Between What We Buy and Who We Are

By Emily Wilson*, AlterNet, September 18, 2008

Conventional wisdom says that today’s savvy consumers are immune to marketing and unaffected by advertising. Rob Walker, the “Consumed” columnist for the New York Times Magazine, disputes that and says there is an important shift going on, which he calls “murketing” — a blurring of the lines between marketing and everyday life. Rather than disappearing, he says, marketing is just harder to detect, and many consumers, rather than rejecting brands, are giving their own meaning to them and embracing them as part of their identity. In his new book, Buying In: The Secret Dialogue Between What We Buy and Who We Are, Walker writes about the intersection of identity and consumer culture, how marketers want us to think we’re beyond advertising, and just how Pabst Blue Ribbon got so popular. AlterNet’s Emily Wilson spoke to him by phone at his home in Savannah, Ga.

Emily Wilson: You say that a lot of people don’t think of themselves as consumers and they reject corporate culture, so they think advertising doesn’t affect them. You call that dangerous. Why?

Rob Walker: Well, I think it lulls you into a false security. Some people associate branding with just a logo. And they say “Well, I would never wear a logo on a T-shirt,” and that’s fine, but branding is more complicated than just a logo or a slogan; it’s the process of attaching an idea to something. Often people who say they don’t buy into corporate culture are hyper-aware of the brands they’re buying — it might be Tom’s of Maine or whatever — but they often have very specific opinions. Sometimes those choices are based on rational thinking, but sometimes they’re based on assumptions or emotions, and it’s hard to see that.

I talk in the book about my own experience with this with Nike and Converse. I was the kind of person much like the kind of person we’re talking about. I thought, “Oh Nike, the swoosh, I would never do that.” It wasn’t until Nike bought Converse that I thought, “Oh, I’ve always worn Converse, what am I going to do?” There had never been a moment that I woke up and thought, “Oh, I am an outsider nonconformist.” You don’t think about those things consciously, but then suddenly something happens and you realize it’s there, and supposedly I don’t care about brands yet I’m having this big existential dilemma about what kind of shoes I’m going to wear because the meaning of them has changed.

EW: But you write about ethics being a factor in our consumer decisions. Wouldn’t some people say that’s about ethics because they don’t want to support Nike?

RW: In some cases it is. But often it’s a little bit selective. And to stick to my own hypocrisy: I tend to wear Levi’s jeans, and what really is the difference between the production process of Levi’s and Nike, and can I really defend myself on that? Not really. I run into that a lot.

People will kind of get their ethical hit from doing one type of consumer behavior and one brand they’re really loyal to, something like fair trade coffee for example. And then they don’t apply that in other (cases), and they don’t really stop and ask any questions at all.

So I think this sort of attitude of “I’m above it all, and all my decisions are right” is the mind-set marketers want you to be in. They want to push your buttons, whether it’s about ethics or whatever.

EW: You say there is a tension we have between wanting to be an individual and wanting to belong to something. How does that play out in the marketplace?

RW: I use the iPod as an example of something that serves different roles for different people. For some people, that is a very individualistic device with their personal soundtrack on it. And most analysis nowadays really focuses on how, as a culture, we’re all into personalization and individualization and customization, and we all want to be different, but that is sort of overlooking this equally powerful urge, I think, which is to be part of something bigger than ourselves. So with a product, it’s getting the one everyone has because it’s the one to get. … You can’t really make a straight-faced case any more for the iPod as individualistic. I said in a column recently that owning an iPod is about as individual as the gray flannel suit.

EW: It was when you wrote about the Red Bull campaign that you coined the term “murketing.” What was unique about that campaign?

RW: It was not known in the United States but was a big company overseas, and the traditional way for something like that to roll out through much of the 20th century has been to make a big noise. They weren’t a start-up; they were a big company with a lot of resources coming into a new market with a new idea. So, usually, what you do is you take out ads and sell it and say, “Here’s this new idea, and here’s why you should buy it” and sort of explain yourself. As loudly as possible. They took a totally different approach. They did a lot of really small events, and they never really explained themselves. The full extent of the sell is it said “with taurine” on the can, and no one knew what taurine was, and that was sort of the mystique, as they would put it, of this stuff. And what you hear is to approach the influencers first, but they didn’t really do that. They were approaching all kinds of different groups of people. Extreme sports people, club kids, college kids, people leaving gyms, people taking a break outside their office. That’s kind of the opposite of traditional marketing, which is supposed to make clear what this is and who it’s for. But by doing this small-scale thing, everyone thought it was for them, and I think they did it by being kind of vague and letting consumers fill in the blanks.

EW: You say that now the consumers are giving meaning to brands. Could you give an example of that?

RW: Well, in some ways, the Red Bull example. Another one is the Pabst Blue Ribbon example. That was one where that brand started to make a comeback after many decades of declining sales, and the company itself didn’t really know what was going on or why. It was the cheap beer, and then people were embracing this idea that it was an underdog and wasn’t insulting you with advertising. It was kind of anti. Embracing that was supposedly a statement against the mass beers. It gave it a kind of brand meaning that the company had nothing to do with. It was invented completely by consumers. There was never any kind of outsiderness or rebelliousness or nonconformity associated with Pabst. It was a poor example of a lot of what they were talking about because they had long since liquidated the factory and laid off the workers and subcontracted the brewing to someone else.

EW: You write that we think that, because of TiVo, we are in control and about how being able to click makes us feel that we have choices and that we’re immune to marketing. You give examples of how people have been saying that the consumer is in control for years and years. So what sort of shift has technology made?

RW: Ever since there’s been advertising, there have been people complaining about advertising, seeing through advertising, and mocking advertising. And advertisers get kind of upset about that, saying, “These consumers are such a pain in the ass; why can’t they just do what we want them to do?” What’s a little bit different in this go-round is that I think it’s been embraced by the consumers themselves. That click gesture gives us a lot of feeling of control. And it does give us some control. There is some truth in that.

The thing is that marketers aren’t dopes, and they didn’t react to TiVo by saying, “Well, we’re out of business.” They reacted by saying, “We have to, once again, come up with new tactics and new modes,” and as it happens, every single new piece of technology that comes along offers opportunities to them as well. With the possible exception of TiVo, I guess, but TiVo has been responded to. Just one example is the incredible spike of branded entertainment itself, of the brands moving into the shows.

The upshot is you see a campaign like the one that has built Axe deodorant, where they built Web sites and Web games that people interact with and forward to their friends. They created a fake girl group called the Bom Chicka Wah Wah Girls, which gets millions of views on YouTube and is just basically a long-form ad for Axe.

They dream up a concept for a television show called “Game Killers” that comes right out of a creative brief, and it gets picked up and becomes a show on MTV. So all of this technology presents interesting opportunities for marketers as well because, you know, they’re doing their job and they’re not fools, and in fact, they’re very smart and creative people who are well paid to come up with solutions to these problems.

EW: How do you think this idea that we’re in control plays out in other areas like politics?

RW: People will point to things like Facebook and text messaging and so on as grassroots — empowering ways to spread ideas. It’s hard for me to say. I heard this guy talking who was from a climate change group, and he was talking about how, in the ’80s, the South African divestment movement came out of campuses all over the country, and that actually was pretty effective pressuring universities to divest from South Africa. It was kind of a social movement, and it had a real effect, and this guy was saying he was trying to recreate this with the issue of climate control.

There is something equivalent to that in joining a Facebook group that says “Save Darfur,” and you put that in your Facebook profile. What do you actually do? Is that activism? Does it have an impact? I don’t know. And if you’ve done it, do you feel like you’ve done a good deed or that you’ve participated in activism?

I think people are a little glib about a lot of this stuff. And they’ll say, “Isn’t it amazing that a political candidate or a social movement can connect with all these people.” And it is amazing if it results in something changing. But it’s not so amazing if it just means people feel like they’ve done something and not much has happened.

I’m not condemning Facebook, but I think the bottom line is it’s still a little early to know if it’s going to lead to things happening in the world that we haven’t seen happen, or if it’s going to lead to us sitting around clicking and feeling good about ourselves. I don’t think it’s known yet.

EW: You have a chapter in Buying In about the DIY/craft movement. What do you think is significant about that?

RW: Well, it’s a rather large subculture of younger people kind of responding to their problems or questions or alienation from mass production culture with more material culture. They’re sort of saying, “We’ll make things ourselves,” and then that leads to selling things ourselves. I think what’s interesting about it is that there is something that ties into real behavior, wanting to know how it’s made and what it’s made of, and brings in some of these environmental or labor or ethical concerns. And I think what’s interesting with the DIY world is they’re selling things on Etsy for example, and people are drawn into Etsy for reasons that have nothing to do with those ethical concerns, but then once they get there, it becomes something they can maybe get engaged in.

I wouldn’t call it a consumer movement. There’s actually a history of very powerful movements that have led to really important things like food labeling and safety standards and so on. Those have tended to be led by what I would call more traditional terms of activism that are aimed at making change, not on an individual level, but one that benefits the greater good. The DIY world is much more capitalist, but it was the most hopeful movement I could see out there, where perhaps these marketing mechanisms can lead to at least different ways of thinking about consumer culture.

EW: In Buying In you write about the secret dialogue between marketers and consumers. How do you hope that dialogue might change after reading the book?

RW: Well, what I’m trying to do is pull back the curtain and say, Here’s how two things work: one, the marketing industry, and two, your mind. I call it a secret dialogue because there’s a lot going on there that we sort of overlook. We think we understand, but we really don’t.

And I believe we do, by and large, care about the impacts of our consumer decisions on our own lives and on the planet. In survey after survey, people will say they care about those things, but we don’t really behave that way, so I hope this will let people be more equipped to make the decisions that are more satisfying to them.

*Emily Wilson is a freelance writer and teaches basic skills at City College of San Francisco.

September 19, 2008 | Advertisement, Consumerism, Consumption, Conversation, Interview

Insecure at last: the age of surveillance

By Rahnuma Ahmed*

‘I am worried about this word, this notion — security. I see this word, hear this word, feel this word everywhere. Security check. Security watch. Security clearance. Why has all this focus on security made me feel so much more insecure?’

— Eve Ensler, ‘Insecure at Last: A Political Memoir.’

Tailor-made, to suit your needs

Surveillance often works innocuously. Consider this: billboards equipped with small cameras that gather details about passers-by — gender, a rough estimation of age, and how long she or he looks at the billboard. The cameras, it is said, use software to establish that the person is a billboard-viewer; the software then analyses her or his facial features, like cheekbone height and the distance between nose and chin, to judge the person’s gender and age. Race is not used as a parameter. Not yet, but the companies say that they can, that they will. These details are transmitted to a central database. The purpose is to ‘tailor’ the digital display to the viewer, ‘to show one advertisement to a middle-aged white woman,’ and another to ‘a teenage Asian boy.’ To sell products more efficiently. More rationally. It does not intrude on privacy, so the argument goes, since actual images of billboard viewers are not stored.

These billboards are similar to websites such as Amazon, described as the largest (virtual) bookstore in the world, tailor-made to assist the customer, her needs and interests. I visit the website to look up books on feminist theory, I am shown bell hooks’ Feminist Theory: From Margin to Centre, along with, Ain’t I a Woman: Black Women and Feminism, also written by her, one that is, so I am told, ‘Frequently Bought Together.’ Simultaneously, five other products are displayed, that Customers Who Bought This Item Also Bought. Down below are menus which, at a click, will display my Recent History, books recently purchased, or viewed by me.

The Surveillance Society

Surveillance, as a growing number of Western writers, journalists, artists, academics and human rights activists keep reminding us, is no longer ‘the future’. In the words of Henry Porter, London editor of Vanity Fair, ‘we are already at the gates of the surveillance society.’ According to a group of academics, the writers of A Report on the Surveillance Society (September 2006), it exists ‘not merely from dawn to dusk,’ but twenty-four hours a day, seven days a week. It is systemic, expressed not only through supermarket check-out clerks who want to see loyalty cards, or the coded access card that allows one to enter the office, or CCTV (closed-circuit TV) cameras, which in Britain are ‘everywhere.’ A CCTV consulting firm puts the number deployed at more than 4 million, nearly as many as in the rest of the world combined, minus the United States. The report’s authors write, ‘these systems represent a basic, complex infrastructure which assumes that gathering and processing personal data is vital to contemporary living.’ Surveillance is, in their words, a ‘part of the fabric of daily life.’

They write, it would be a mistake to think of surveillance as ‘something sinister, smacking of dictators and totalitarianism,’ or as ‘a covert conspiracy.’ Instead, it is the outcome of modern organisational practices, business, government and the military. It is better viewed as the progress towards efficient administration, as a benefit for the development of Western capitalism and the modern nation-state. Four hundred years ago, rational methods began to be applied to organisational practices, to ensure that the new organisations ran smoothly. It made informal social controls on business and governing, and people’s ordinary social ties ‘irrelevant.’ The growth of new computer systems after World War II reduced labour intensity, it increased the reliability and the volume of work that could be accomplished. Subsequent growth of the new communications system, now known together as ‘information technology’ (IT), is related to modern desires for efficiency, speed, control and coordination, and is global.

Capitalism’s push to cut down on costs and to increase profits has accelerated and reinforced surveillance. This, accompanied by the 20th century’s growth of military and police departments, and the development of new technologies, has improved techniques of intelligence-gathering, identification and tracking. Surveillance thus, has become part of being modern.

It is undoubtedly two-sided. It has its benefits: it helps deter traffic violations, tracks down criminals, and medical surveillance programmes provide necessary information to public health authorities, etc. But, the authors warn us, there are things that are ‘seriously wrong’ with a surveillance society. Large-scale technological infrastructures suffer from problems equally large in scale, especially computer systems, where a mistaken or imprudent keystroke can cause havoc. For instance, twenty million ordinary people’s online search queries from AOL were released for ‘research’ purposes in August 2006. The names of the identifiers were not tagged, but connecting search records with names took only a couple of minutes. Corruptions and skewed visions of power, not those of tyrants but of leaders justifying extraordinary tactics in exceptional circumstances, such as the endless ‘war on terror,’ can be disastrous. Many Muslim Americans have been branded as unfit for travel, or subjected to racial profiling. Surveillance systems are wrong on three other counts: first, they are ‘meant to discriminate between one group and another’, and, as recent trends show, distinctions of class, race, gender, geography and citizenship are being exacerbated and institutionalised. Second, surveillance undermines trust, something necessary to social relationships, breeding suspicion in its place. When parents start to use webcams and GPS systems to check on teenage children’s activities, or spouses check each other’s suspected infidelities, it speaks of a ‘slow social suicide.’ And third, surveillance systems associated with high technology and anti-terrorism distract us from pursuing ‘alternatives,’ from paying attention to larger and more urgent questions.

Fear internalised

Caroline Osella, a contributor to the ASA (Association of Social Anthropologists) blog discussion on recruiting anthropologists in the ‘war on terror’ (through the Human Terrain System programme), wrote of a personal experience that illustrates the ‘state of paranoid anxiety’ that grips people. As the mother of an 11-year-old, she had gone to a school meeting for parents to discuss a planned residential adventure school trip. She was astounded, she writes, to see parents not asking questions about the activities planned, or practicalities like food, or other stuff to take along. Instead, questions revolved exclusively around security. School authorities were asked: ‘will an adult stay awake all night to monitor that kids are safe and not wandering?,’ ‘can the kids escape to the outside?,’ ‘can strangers get in?’ And she writes, incredible as it may sound, one father finally asked, ‘what guarantee can the school provide that paedophiles will not be able to break the perimeter fence and get into the site, where the kids will be sleeping unchaperoned in tents?’

It was surreal, Osella writes, to sit and listen to ‘reasoned and careful discussions’ of a totally fantastic scenario. It would be great, she says, to embrace some insecurity and uncertainty, and to accept the absence of ‘total control’ over our lives.

How does surveillance get naturalised? Mark Andrejevic, author of Reality TV: The Work of Being Watched, believes that reality TV has played a part in transforming American attitudes toward surveillance. Producers of early reality programs such as MTV’s The Real World (1992) had a hard time finding people willing to have their lives taped nearly 24 hours a day for several months. Now, thousands of young people form audition lines in college towns, with ‘more people applying to The Real World each year than to Harvard.’ New generations, Andrejevic says, are growing up viewing television shows that let anyone see the lives of others recorded voluntarily. There are other reality shows too, like COPS, where police chases of criminals are filmed. Increasingly, he says, the results of surveillance are seen as ‘entertainment,’ as being within the realm of the public’s right to know.

The mass collection of DNA data, and ‘policy laundering’

The introduction of the Serious Organised Crime and Police Act 2005 in the UK means that anyone can be arrested on ‘suspicion’ of committing even the slightest offence. After arrest, the police take a DNA sample, which stays on the police database even if the person is never charged. Growing by 40,000 samples per month, the database now holds more than 3 million DNA samples, a fifth of which belong to people of African-Caribbean origin. Who owns these DNA samples? ‘Once a database like this is established, the authority concerned tends to regard the information as being in its ownership, to be exchanged without reference to the subjects,’ writes Potter. The British government admitted that it had passed more than 500 DNA samples (I wonder whose, Arabs? Muslims?) to foreign agencies. But when asked to which countries, ‘no one seemed to know.’ The chairman of the Nuffield Bioethics Committee, Sir Bob Hepple, anxiously commented, ‘We didn’t have any legislation to establish the DNA database and it has not been debated in parliament.’

Western governments, it seems, are devising new strategies to circumvent traditional ideals of civil liberty, based on notions of freedom and privacy (mind you, not in their colonies). Dr Gus Hosein, senior fellow with Privacy International, says ‘illiberal policies’ are pushed through international treaty organisations. The British government brought into effect communications surveillance policies through the European Union, and ID cards through the United Nations. ‘The government returns home to Parliament, holding their hands up saying “We are obliged to act because of international obligations,” and gets what it wants with little debate.’ It is a strategy that has led to the coinage of a new term: ‘policy laundering.’

Having originated in the West, these surveillance systems are gradually extending outside it, to control, regulate and limit the lives of people in non-Western countries.

*Rahnuma Ahmed is an anthropologist based in Dhaka. Contact: rahnuma@drik.net

September 16, 2008 | Rahnuma Ahmed, Surveillance

Meetings, Purchases, Pleasures

By William Deresiewicz, The Nation, August 27, 2008

Like a peddler just arrived in town, or a traveler come from foreign shores, Salman Rushdie spreads before us his magic carpet of stories. Rushdie has been many things–political novelist, national epicist, probing essayist, free-speech icon out of force of circumstance–but he has always been, first and last, a storyteller. As Conrad sought to return to fiction the immediacy of the sailor’s tale–one man entertaining his mates over claret and cigars–so Rushdie seeks to reanimate the printed page with the exuberance and exoticism of legend and fable, fairy tale and myth: the province of the wanderer, the yarn spinner, the bard. More than Ulysses or The Tin Drum, his most persistent models have been the Thousand and One Nights and the Hindu epics, The Wizard of Oz and Bollywood. He doesn’t want to be Joyce; he wants to be Scheherazade. His greatest works engage the tragedies of modern history through the most audaciously archaic of narrative devices. Midnight’s Children hinges on the switching of two babies in the cradle; The Satanic Verses features flying carpets and Ovidian metamorphoses.

Barring his children’s book, Haroun and the Sea of Stories, Rushdie’s new novel, The Enchantress of Florence, may be the purest expression yet of his fabulating impulse. Set in a faraway time, the 1500s, and dividing its pages between two storied lands, the Mughal Empire and Renaissance Florence, it is replete with princesses and pirates, mysterious strangers and long-lost cousins, enchanted waters and magic cloaks. But what it does not contain is as telling as what it does. The Enchantress of Florence exhibits none of the complex allegorical structures, dense systems of allusion or broad political implications–in short, none of the satanic ambition–that both weigh down his major works and give them weight.

The result, if relatively slight, is probably Rushdie’s most coherent and readable novel. The 500-plus-pagers tend to sprawl as they spread, bogging down in their proliferating mass of characters and plotlines. Their language, while often playful, is also sometimes labored, sweating to keep the narrative machinery aloft. Here the story is clean and compact, and the ever-so-slightly archaic style goes down like ice cream:

The path sloped upward past the tower of the teeth toward a stone gate upon which two elephants in bas-relief stood facing each other. Through this gate, which was open, came the noises of human beings at play, eating, drinking, carousing. There were soldiers on duty at the Hatyapul gate but their stances were relaxed. The real barriers lay ahead. This was a public place, a place for meetings, purchases, and pleasure. Men hurried past the traveler, driven by hungers and thirsts. On both sides of the flagstoned road between the outer gate and the inner were hostelries, saloons, food stalls, and hawkers of all kinds. Here was the eternal business of buying and being bought. Cloths, utensils, baubles, weapons, rum.

The novel, on its fourth page, is finding its subject, and its subject is storytelling itself. The men are driven by hungers and thirsts, and so is the writing. In its greedy piling up of nouns–“hostelries, saloons, food stalls, and hawkers”; “Cloths, utensils, baubles, weapons, rum”–we feel the force of storytelling’s appetite for the world, its sheer sensual relish for the thingness of things. It is no surprise that the great compendiums of stories tend to swell virtually without limit: the Mahabharata and the Ramayana, the Decameron and the Thousand and One Nights, Don Quixote and Gargantua and Pantagruel. This is the same impulse, of course–under stricter regulation in The Enchantress of Florence–that gives Rushdie’s greatest novels their girth.

Storytelling, in Rushdie, is also typically aligned with two other human things, as the passage above suggests. The first–and this is true of storytelling in general–is commerce, “the eternal business of buying and being bought.” It is trade that brings the people who bring the stories, and it is the marketplace, above all, where stories are told. Indeed, storytelling is a kind of trade, an exchange of goods for the satisfaction of appetites, a busy engagement with the world; and stories, like markets, are public places, places for “meetings, purchases, and pleasure.” Rushdie’s characters are usually performers–storytellers themselves–or businesspeople: merchants, hucksters, speculators, a class of people in whom he clearly delights. Not for him the Modernist disdain of the bourgeoisie, nor the passive, reflective souls of modern fiction–Proust’s Marcel, Mann’s Hans Castorp, Woolf’s Clarissa Dalloway. His medium is will, not introspection, and the change in tone and character type in Rushdie, García Márquez and others marks the postmodern rediscovery of story after the exhaustion of Modernist experimentation.

Trade’s supreme locus–one might say, its supreme creation–is the city, and for Rushdie, the city is storytelling’s supreme subject. Delhi, Karachi, Cochin, New York, above all Bombay, the city of his childhood (“Back to Bom!” is Saleem’s happiest thought in Midnight’s Children), and London, the city of his maturity (“Ellowen Deeowen,” The Satanic Verses calls it, yoking Semitic and Indo-European divinities in a numinous pun on the spelling of the city’s name). The city, for Rushdie, is the place of variety, mystery, fortuity, possibility, conflict–all the elements that most make for good stories. It is the place where strange people live next door and unimaginable worlds are waiting to be discovered on the next block, a place that invites you, as the title of his latest essay collection urges, to “step across this line.” Of the city the traveler approaches as The Enchantress of Florence opens, we read this:

Its neighborhoods were determined by race as well as trade. Here was the silversmiths’ street, there the hot-gated, clanging armories, and there, down that third gully, the place of bangles and clothes. To the east was the Hindu colony and beyond that, curling around the city walls, the Persian quarter, and beyond that the region of the Turanis and beyond that, in the vicinity of the giant gate of the Friday Mosque, the homes of those Muslims who were Indian born.

A world in miniature, and like the great world, a seemingly endless series of “beyonds.”

The city in question here is Fatehpur Sikri, built by Akbar, third and greatest of the Mughal rulers, as his imperial capital. The traveler in question is a pale-haired European, a trickster, adventurer and thief who calls himself Uccello di Firenze and Mogor dell’Amore (Mughal of Love) but whose given name is Niccolò Vespucci. The conjunction of two famous Florentine names is no coincidence, for the story the stranger has borne across the world for Akbar’s ears alone–his story and, in a sense, Akbar’s as well–begins with the friendship of his grandfather Ago, Amerigo Vespucci’s cousin, with Niccolò Machiavelli. There was a third friend, too, Nino Argalia, who ran away from Florence to become a great warrior in the service of the Turkish sultan. He also became the lover of Akbar’s great aunt–expelled from family memory for choosing love over home–who had been making her own journey west as she passed from conqueror to conqueror. It is she–Angelica, Qara Köz, Lady Black Eyes–who is the enchantress of the title, a woman of such surpassing beauty that she bewitches not only the citizens of Florence when Argalia returns in triumph to his native town but also the inhabitants of Fatehpur Sikri, two generations later, when the mere story of her gets abroad.

Rushdie is working here with the twinned powers of erotic charm and artistic imagination. Men enslave women and are enslaved by them in turn. Or by their ideas of them. The painter Akbar commissions to envision Qara Köz’s life–“Paint her into the world,” Akbar exhorts him, “for there is such magic in your brushes that she may even come to life”–becomes so enamored of his vision that he disappears to join her inside the painted world instead. In this book of mirrorings and doublings and opposites, Akbar does the reverse. His favorite wife, Jodha, is a woman he has imagined into being, taking bits and pieces from his other wives to form the ideal consort, sustaining her existence through a “suspension of disbelief” in its possibility.

The reader may not be so ready to share Akbar’s conviction. The question of Jodha’s status is one about which the novel maintains a strategic vagueness, preventing her from coming into focus as either a character or a thematic idea. Does she or does she not achieve independent existence, “come to life” as she is imagined “into the world”? We read her thoughts as if she were a real character, but she fades away, displaced in Akbar’s imagination, when Lady Black Eyes comes along.

The uncertainty goes to the heart of that much-handled critical concept, magic realism, or at least to Rushdie’s deployment of it. Unlike García Márquez, the mode’s other most famous exponent, Rushdie never fully commits to the magic-realist premise, a hesitation that makes his practice more sophisticated and less satisfying. García Márquez proffers his levitations and memory plagues with a completely straight face; they are as natural a part of the world–and, to its inhabitants, as normal–as anything else. But Rushdie is always hedging his bets. Can Saleem really communicate telepathically with the hundreds of other children born at the hour of Indian independence, or is that merely the fantasy of a lonely little boy? Can Shalimar the Clown really walk on air, or is that just a conjuror’s trick? In both instances and many others, Rushdie equivocates between the two possibilities, awkwardly straddling the domains of realism and magic.

Why should this be? Magic realism is, among other things, an attempt to re-enchant the world in the wake of scientific rationalism and global exploration, to recover the premodern mindset in which giants and witches and magic hats were real possibilities. That is why it has flourished in regions that were the object rather than the agent of capitalist and colonial expansion. That is also why the magical effects in One Hundred Years of Solitude tend to fade as the story approaches the present, washed out in the glare of modernity. But like a colonial subject stubbornly maintaining his traditional practices in an imperial space that stigmatizes them as primitive–a young Indian writer transplanted to London, say–Rushdie has consistently sought to insert magical elements into narratives of the present, flourishing the marvelous in the face of modernity. It is no wonder that, like the gestures of the colonial subject, the act is fraught with hesitation, uncertainty and self-doubt, that it reveals a mind divided between old allegiances and the ineluctable logic of rationality.

In asserting the rights of magic in the present, Rushdie is also testing the power of the imagination to affect reality. This is his highest theme, his persistent obsession. If so much of what seems magic at first turns out to be the result of art or artifice, that is exactly the point. Imagination does have the power to affect reality–personal, social, political. Argalia imagines a fantastic life and then goes out and lives it. The story of Lady Black Eyes drives a whole city mad. Lines are drawn on a map, and a nation conjectures itself into being. Magic in Rushdie often approaches a kind of lucid dreaming, where the boundary between imagination and reality is breached and desire is given direct power in the world. But by the same token, he often runs his effects right up against the border of plausibility, challenging us to discern how much is real, how much a trick–how much, in other words, imagination can really do. There may be no other major novelist whose imagination is so steeped in the movies; his first literary influence, he has said, was The Wizard of Oz. Magic, for Rushdie, is another name for special effects, and it is part of his purpose to give us a glimpse of the wires every now and then. Sometimes he shows us the Wizard, sometimes he lets us see the man behind the curtain.

The problem comes when he can’t seem to decide for himself what is magic, what is art and what is simply the form of delusion we call “imagining things.” The pressure of skepticism is actually lesser in The Enchantress of Florence than in his other works, precisely because the novel is set in a remote time and decorated with the language and properties of legend. We accept and even expect a certain quantum of the marvelous here, so Niccolò’s magic cloak, for example, passes without trouble. But Jodha is a different matter. She is central to Rushdie’s thematic conception–that men create women to fall in love with–but he leaves her stranded between imagination and reality. She is more than an idea for Akbar but remains less than a full person. She has interiority, but she has no agency, no force in the world. As a result, she has little force in the novel, little hold on our imaginations, remaining nothing more than a nice idea that never fully comes to life.

There are other problems. The novel proposes too facile an equivalence between East and West. “This may be the curse of the human race,” we hear more than once, “not that we are so different from one another, but that we are so alike.” Florence and Fatehpur Sikri, Italy and India, are set up as mirrors: in each place a besotted painter, in each a pair of prostitutes fat and thin, in each enchanting beauties and wicked young princes. Akbar muses in terms that make him sound suspiciously like a Renaissance humanist. The historical Akbar promulgated a divine right of kings; Rushdie’s doubts the existence of God. It’s all a little too comfortable, a kind of full-bellied, avuncular globalism that conjures away difference altogether. Rushdie has always been a humanist, has always believed that our similarities go deeper than our differences, but the younger writer was also a courageous defender of difference, of human variety and multiplicity, against the totalitarian impulse to impose uniformity. Midnight’s Children restages the classic rivalry of poet and king as the storyteller Saleem, Rushdie’s alter ego, speaking truth to Indira Gandhi in the wake of Emergency Rule, when India’s magic tumult of voices was reduced to a grim silence. In The Satanic Verses, Rushdie’s surrogates–the satirist Baal and Salman the Persian–mock and subvert the certainties of the Prophet, speaking for pleasure over purity, fluidity over fixity, the many against the One.

But here he unstrings the tension between truth and power by merging poet and king in the figure of Akbar, the emperor-artist. So too with East and West, though perhaps for understandable reasons. The younger Rushdie was an insurgent colonial fighting for legitimacy within the West and the culture of the West. Whether in India or England, he was undoubtedly never allowed to forget the difference between the place he came from and the place he wanted to get to. While it is true that he has always been in the business of bridging that distance by writing what Midnight’s Children calls “eastern Westerns,” meeting is not the same as merging, hybridity not the same as homogeneity. But Rushdie the international literary superstar is very far from the young man he once was. As he comes and goes on his magic carpet of fame and money (which one does not begrudge him), East and West must feel like one big world. In the figure of Niccolò, Western descendant of an Eastern princess come back, after the lapse of years, to reclaim his ancestral connections, we can read Rushdie’s triumphant, nostalgic return to his place of origin.

The gesture points to the deepest sources of Rushdie’s art. More than his familiar–and, by now, shopworn–postcolonial themes, more even than the erotic love that is the book’s ostensible concern, it is family that is his most profoundly felt subject, here and throughout his work. The charge against Rushdie has always been that amid the whirlwind of ideas and allusions and allegory and wordplay, his characters never take shape as full people about whom the reader can feel real emotion. But the one exception has always been the figures and feelings of childhood. The most vital relationships in his fiction are family relationships. Midnight’s Children exerts its strongest pull in the chapters devoted to Saleem’s early years, when he is surrounded by parents and grandparents, uncles and aunts. The Satanic Verses doesn’t come alive emotionally until the very end, when Saladin reconciles with his dying father. Rushdie, a famous ladies’ man–he has been married four times, the last to a supermodel–may think of The Enchantress of Florence as his tribute to erotic love, but the romance here feels pretty secondhand, a collage of Petrarch and grand opera. In many ways, for better and worse, Rushdie is still the 10-year-old who sat spellbound watching The Wizard of Oz. His work is sometimes childish, but it is more often childlike. As a portraitist of women, he has always done much better with matriarchs than with love objects; his mothers and aunts are the solidest characters in his work.

So it is here. The novel’s best scene takes place right after Niccolò has made his claim of kinship at the Mughal court. Akbar summons the queen mother and her sister-in-law Gulbadan to see if they retain any memory of a long-lost aunt:

“Allow me to remind you, O all-knowing king, that there were various princesses born to various wives and other consorts,” Gulbadan said. The emperor sighed a little; when Gulbadan started climbing the family tree like an agitated parrot there was no telling how many branches she would need to settle on briefly before she decided to rest.

The passage suggests the underlying unity of Rushdie’s two great commitments: storytelling and family. Storytelling is the place where families begin. Families know themselves through the stories they tell themselves about themselves. Family trees are storybooks in graphic form. Like Lady Black Eyes, long-lost relatives come back all the time, in the stories we tell about them. Like Niccolò, we are defined by the family stories we carry within us. But at the same time, families are the place where storytelling begins. The first stories we know are the ones we hear from our family, about our family. Childhood is the time of stories, the time when everything is still possible and every story is still true. If Rushdie’s magic realism is meant to re-enchant the world in the wake of modernity, it is also meant to re-enchant it in the wake of adolescence and adulthood. But again, with a bittersweet ambivalence, he seeks to incite two simultaneous and contradictory reactions, and perhaps 10 years old is exactly the age he wishes to make us. On the one hand, the childhood sense of open-mouthed wonder. On the other, the dawning skepticism that looks up from the page and asks, “But is it really true?”

September 12, 2008 | Book Review, Fiction and Novel, New Book, Salman Rushdie

Meeting Slavoj Žižek: Coffee, Chocolate, Coke, and Colonialism

By Azfar Hussain, NewAge, September 2006


He can make a lot of scratch out of one itch. And he can bring together Plato and Patanjali. And—even more characteristically—he can yoke together Hegel and Hitchcock, offering a Hegelian reading of Hitchcock and a Hitchcockian reading of Hegel, for instance. And he can go on and on telling you what Charlie Chaplin meant when he quipped: ‘The Method is a lot of nonsense.’
   

And he would probably prove Jean-Luc Godard wrong. And Godard told us in Rolling Stone quite some time ago: ‘If you go out of a James Bond film and I ask you if you can tell me what you’ve seen, you can’t. There are 20,000 things in James Bond. You can’t describe a mixed salad. Too many things in it.’
   

And he not only prepares a mixed salad, but also describes it. And 20,000 things in a film? He seems to be at least noticing all of them, characteristically ready as he is to catch the devil in the detail.
   

And his interests range widely. And they include not only film but also fashion, fantasies, perversions, lifestyle, media, multiculturalism, ideology, psychoanalysis, theology, political economy, political ontology, revolution, Kant, Hegel, Lenin, Lacan, Christianity, cyberspace, coffee, Coke, and colonialism. And the list itself—which already resembles a stubborn procession—can certainly be way longer, simply because I am far from exhausting the field of his possible interests.
   

And he could act like—to use a Salman-Rushdiean word—a ‘marxleninfreak,’ with varying effects of course, with his dazzling verbal pyrotechnics.
And he has meanwhile attracted different labels and descriptions from different quarters. Google him—he loves to google himself, of course, in his attempts to theorize what I wish to call today’s ‘dotcom culture’—and you will find all kinds of stuff, including his colourful biographies, short and long. And you will find that Chronicle of Higher Education has already described him as the ‘Elvis of cultural theory,’ while Village Voice Literary Supplement eagerly takes him as a ‘one person culture mulcher... a fast-forward philosopher of culture for the post-cold war period.’ And there is by now a well-publicised feature documentary about him—about his ‘eccentric personality and his esoteric work,’ as a film reviewer once put it. And the documentary is called Zizek.
   

And, of course, he is none other than Slavoj Žižek (pronounced SLA-voy ZHEE-zhek)—a philosopher and an intellectual maverick and a cultural theorist and a filmfreak and a media analyst and a post-Lacanian psychoanalytic critic and a public intellectual and a profound joker who fashions an entire hermeneutic out of all those dirty jokes he keeps cracking in headlong succession and certainly a performative speaker who simply loves to talk.
   

II
   

So it was fun hearing Slavoj Žižek talk at the plenary session of Rethinking Marxism’s fifth international gala conference held at the University of Massachusetts-Amherst on November 6-9, 2003. And it was fun conversing with him for almost forty-five minutes in the lobby of the very hotel we happened to put up at. Although I am not a Žižekian, and although I cannot help noticing in his work at least sporadic traces of Eurocentrism accompanied by his relative inattention to the important theoretical and political questions of how racism and capitalism continue to affect and valence each other at the global level, I thought I should tabulate at least a few high points of Žižek’s own talk as well as narrate certain parts of my own conversation with him so as to be able to share them with those interested, or those who might be interested, in things and theories Žižekian.
   

There was a time when—admittedly—I used to have ‘what-the-hell-you’re-up-to-Dude?’-kinds of responses to Žižek’s elaborate psychoanalytic detours and pleasure-trips to the filmdom. But I found Žižek’s relatively recent ‘Leninism’ intriguing. Also, I found his highly charged, confrontational, and choreographic delivery against the American cultural theorist Michael Hardt simply superb and spectacular. (Parenthetically, I should drop at least this bit of information that Hardt—one of those Jameson-leaning Duke lefties—co-authored with Antonio Negri quite a sensational and controversial work called Empire, and that Hardt was another plenary speaker at the conference, who was sitting face-to-face with Žižek himself).
   

Žižek cracked us up with his jokes and his anecdotes, his bounce and his play, while I found his verbal energy simply phenomenal. In fact, in some ways, he reminded me of the verbal zest of Jacques Derrida, the French theorist who is regarded as the big daddy of the critical movement called ‘Deconstruction.’ I had heard Derrida speak—at the University of California-Davis—for four long hours, uninterrupted, on the damn topic of the typewriter ribbon. Four long hours, yes! And the Prince of Deconstruction did not care to sip even a drop of water as he kept talking. Žižek certainly evinced that kind of astonishing energy. But I must say that while Derrida contributed profusely to my headache, Žižek almost continuously made me laugh.
   

It is my impression that humour is probably the only way for Žižek to be serious about his world and his work; that Žižek tends to make you laugh and nervous at the same time; and that, if you are not nervous enough, you can make Žižek at least stammer in some funny ways. At one point, when he was full of quotes from Spinoza, Kant, and Hegel—the stuff that Žižek-the-lecturer sometimes seems made of—I could not resist the temptation of chanting a couple of Sanskrit slokas, while reproducing the cadences of the classical Anushtup meter. Žižek’s nervous response was: ‘Ah, cool sounds!’
   

Now, before I get to Žižek’s plenary talk, I think I would do well to say something at least briefly about his background and his work, although my purpose here is not to write an introductory essay on Žižek, or write something like ‘Žižek Made Easy.’ Born in 1949 in Ljubljana, Slovenia, Žižek received his formal academic training in sociology, philosophy, and psychoanalysis. In fact, he obtained two doctorates—mark his stamina exemplified even in the machinic academic underworld!—his first doctorate being in philosophy, and the second one in psychoanalysis. (‘How about calling you Dr. Dr. Žižek then?’ I asked him at one point, while his response was an outburst of laughter. And, together, we made fun of those Dr.-loving folks who get mad at you if you forget to use the damn ‘Dr.’ before their names. Žižek continued to laugh. And laughter, rather loud laughter, surely characterizes his personality, his style of oral presentations, and his writing style all at once).
   

Compulsive writer as he is, Slavoj Žižek has by now produced more than 50 books—either written or edited by him—while his work has been translated into nearly 25 languages. Although he is usually—and arguably narrowly—called a ‘Lacanian’ (after the once-influential French psychoanalytic theorist and thinker Jacques Lacan [1901-81]), Žižek characteristically amplifies, even politicizes, and thus transforms some of the ideas and insights, including even certain tools, tropes, and analytical apparatuses, offered by Lacan. In other words, by no means can Žižek be reckoned as a slavish disciple of Lacan. In fact, Žižek’s creative attempts to politicize Western psychoanalysis itself; his sustained interest in, and fresh re-readings of, the works of Marx and Lenin; his unmistakably theoretical and activist accent falling sharply on the questions of political commitment and hard politics; and his predilection for even Revolution together come to distinguish Žižek from those run-of-the-mill Lacanians who are mostly politically pacifist, high as well on their own discourse fetishism.
   

Although Žižek’s work—given its range and rigor and richness—simply resists a quick summary or a reductionist account, one can safely note that Žižek’s work is predominantly, if by no means exclusively, concerned with examining, interrogating, and theorizing how ideologies function and re-function, either visibly or invisibly, at numerous interconnected levels of human activities, perceptions, and lived practices such that creative ways to get out of all sorts of fetishes, traps, and illusions—produced and reproduced, say, by global capitalism itself and its concomitant discursive practices—begin to open themselves up. To this end, then, Žižek sometimes blends psychoanalysis, Marxist dialectical materialism, and critiques of pop culture, but not in a liberal-eclectic manner.
   

And Žižek—even for one who is not apparently a Žižek aficionado—is a provocateur with a vengeance. Even his titles are not only catchy but also tellingly provocative. Mark, then, a few titles of his pieces, some of which are of course Freud-Lacan-inflected: ‘There is No Sexual Relationship’ or ‘Schelling-in-itself: The Orgasm of Forces’ or ‘A Hair of the Dog that Bit You.’ Some of Žižek’s major works in English include, to begin with, The Sublime Object of Ideology (1989); by far one of the best and most stimulating introductory books on Lacan called Looking Awry: An Introduction to Jacques Lacan through Popular Culture (1991); Enjoy Your Symptom! Jacques Lacan in Hollywood and Out (1992); Tarrying with the Negative (1993); The Plague of Fantasies (1997); The Ticklish Subject: A Treatise in Political Ontology (1998), and his sensational book on Iraq called Iraq: The Borrowed Kettle (2004)—one that re-mobilizes Freudian jokes to track down the ‘weapons of mass destruction’ and to ascertain the connections between Saddam Hussein and Al-Qaeda which are themselves absences incarnate.
   

Žižek’s latest book is called The Parallax View, published in February this year—a book that Žižek himself reckons as his magnum opus. Again, this book is an example of Žižekianism itself, characterized as it is by his at once funny and serious insistence on the political at a conjuncture contaminated with all sorts of ‘diseases’; his own kind of verbal tricksterism and even tropesterism, exemplified in the ways in which he expands the interpretive horizon of the very metaphor of parallax, for instance; his restless erudition attesting to his unmistakable polymathic range; and, no less significantly, his never-ending search for the new.
   

III
   

Now let me get back to Žižek’s plenary talk. His topic was ‘Manufacturing Empire.’ Actually, the journal Rethinking Marxism came up with this topic for him and Michael Hardt.
   The assertion, then, with which Žižek makes his point of departure in his lecture is that the empire is not a matter of just manufacture or machinofacture as such, but that imperialism is real. He seems to be mobilizing some of the key Leninist analytics in order to underline the contemporary stage of imperialism—US imperialism, to be more specific. Thus, Žižek takes a firm stance against the Toni Negri and the Michael Hardt of Empire—Hardt himself was present when Žižek was speaking, as I have already indicated—vis-à-vis their book’s very proposition that the old term ‘imperialism’ should now be replaced by ‘Empire.’
   

And, for me, Žižek seems close to be saying to the Negris and Hardts of the world: How could you possibly play your damn flute when your ass is burning, eh? And I feel tempted to paraphrase Žižek this way: It ain’t just a faceless and nameless Empire, stupid! It’s imperialism—US imperialism itself.
   

As for the question of US imperialism, Žižek draws our attention to what he calls ‘the Trotsky-mentality in Bush’s cabinet.’ Re-citing and ridiculing what George W. Bush keeps saying—‘Freedom is God’s Gift to Humanity’— Žižek makes the point: ‘the US is hell-bent on naturalizing the idea that the US is the retailer of this gift and that if you fight the US you fight God’s gift to humanity.’ For Žižek, the contemporary conjuncture of US imperialism—which itself is the latest stage of US capitalism—is intimately implicated in the theologization of a gangster logic that finds almost relentless expression in ‘holy’ war.
   

Žižek also makes the point that the processes of imperialization, theologization, and financialization are all profoundly intertwined: Just mark how the US had already succeeded in inaugurating a fantastic but a real ‘dollar theology,’ if you will—one that continuously re-writes the Name-of-the-Father as the Name-of-God into those dollar-notes folks exchange and circulate. Invoking Eduardo Galeano, I feel like saying: yes, it is the US that can prove that its God, Gold, and Gun can be everywhere. Imperialism is ‘pan-US-theism.’ Funny? Yes. But it’s also dangerous, as Žižek suggests.
   

Then Žižek turns to certain mechanics of US imperialism: ‘the US acts globally but thinks locally.’ In other words, as Žižek maintains, the US acts like an empire but always thinks like a nation. I immediately find my own contention somewhat reinforced here, the contention being that US nationalism has already turned out to be the opium of the masses in many cases. Žižek then moves on to a particularly significant and signifying irony: ‘it’s ironic that the US also wants to be a secular nation and that’s exactly like what Saddam Hussein wants his nation to be.’ Žižek then offers a choice: now you decide if Saddam is the Real Enemy of Bush or an ideological mirror-image of the US ‘secular’ crusader.
   

In a quick but related move, Žižek then famously dwells on Donald Rumsfeld’s brand of epistemology. In fact, Žižek—funnily enough—contours Rumsfeldian epistemology by advancing three categorical enunciations: ‘1) There are things we know that we know; 2) there are things we know that we don’t know, and 3) then there are things we don’t know that we don’t know.’ According to Žižek, the ground assault against Iraq is itself the proof that the US knew Saddam didn’t have weapons of mass destruction (if Saddam did, the US wouldn’t have had a ground war). Then Žižek poses the question—‘What about the unknown knowns?’—suggesting that this question constitutes a philosophical debate within the US now, while also indicating with his characteristic laugh that without Rumsfeldian epistemology there’s indeed no imperial war—no imperialism.
   

As for anti-imperial resistances in and outside the US, Žižek does not say much, although he emphasizes the role of the US left to an extent. At every chance he gets, however, Žižek seems to be ridiculing at least part of the US left. He asserts that the problem with the US left sometimes resides in fighting false battles, and that ‘the left accepts too much commonsense middle-ground with the argument that Iraq wasn’t so bad off, for instance.’
   

‘Really?’ asks Žižek. Then he ends his talk by arguing that Negri’s and Hardt’s proposition that the apparently old-fashioned word ‘masses’ be replaced by the ‘multitude’ does not wash with him at all, not only because—in Žižek’s reckoning—Negri and Hardt end up misreading Spinoza’s formulation of the ‘multitude,’ but also because the term ‘multitude’ runs the risk of being co-opted by neoliberal capitalism today.
   

IV
   

Now, in my conversation with Žižek, he said a number of things that I found intriguing. One of the first things he told me was: ‘Earlier I wasn’t Marxist enough. I was in fact more in psychoanalysis than in Marxism. But it’s different now. I think I’m moving in the direction of even Leninism.’ But why Lenin? ‘Oh yes, his acute analysis of imperialism!’ replied Žižek, while indicating that many of Lenin’s works have remained thoroughly unexplored even for those waxing lyrical on the need for things and theories Leninist.
   

And Žižek, in the conversation in question, advanced his by-now-familiar notion of ‘repeating Lenin,’ whereby he continues to mean our re-invention of—not just a nostalgia for—Lenin: seizing the opportunities that Lenin himself missed, filling in the gaps that he left behind, and pursuing the possibilities he suggested. Žižek strongly emphasized the need for returning to Lenin, more than ever, while suggesting, à la Lenin, that, to quote Žižek, ‘economy is the key domain, the battle will be decided there, one has to break the spell of global capitalism. But the intervention should be properly political, not economic.’
   

So those who have already celebrated the so-called ‘death of politics’ find Žižek not only on the other side of the conceptual and theoretical spectrum, but also amply politically threatening. No, politics is not dead.
   

What I also found intriguing was the way in which he was re-reading (in every conversation—as many stories have it—Žižek appears as a stubborn re-reader!) Lenin to show how he was already developing a theory of the particularly strategic role of the World Wide Web. What Žižek said that day in our conversation can also be found in his kick-ass essay called ‘Repeating Lenin.’
   

Then Žižek expressed his discontent with Verso, saying that it didn’t accept his edited collection of essays on Lenin. He added, ‘Do you think Verso is on the left? Left, my foot! But, you know, Duke has accepted my manuscript.’
   

Apart from Lenin, the issue of war came up in our conversation. Žižek spoke of ‘decaf war’—‘war without casualties on our side’—while at the same time he underlined today’s hedonism in the US thus: ‘drink as much coffee as you want, because it’s already decaf!’ Then he spoke of ‘chocolate laxatives,’ products that, according to him, coincide with their own opposite, and of marijuana as ‘decaf opium.’ For Žižek, then, Rumsfeldian epistemology together with ‘decaf war’ or ‘chocolate laxative war’—among other things—come to characterize today’s US imperialism.
   See, Žižek is so capable of rambling about coffee, chocolate, Coke, colonialism, and Columbus—and, trust me, conundrums.
   

Dr Azfar Hussain taught English, cultural studies, and comparative ethnic studies at Washington State University and Bowling Green State University in the US before his recent move to North South University, Dhaka, where he teaches English.

September 12, 2008 | Azfar Hussain, Conversation, Slavoj Zizek