Archive for February 2007
Carrie Young writing for BlogTo:
Last night I was at Gallery 1313 as one of six guest panellists in The Role of the Art Critic, the first roundtable discussion (well, rectangular-table discussion) in Gallery 1313’s ArtSPEAK series, along with Toronto art writers and critics David Balzer, Peter Goddard, Claudia McKoy, the Editor-In-Chief of MIX, Stephanie Rogerson, Timothy Comeau, and last but not least, our moderator Nick Brown. Gallery 1313’s Director, Phil Anderson, did a fabulous job organizing the event, which drew a full house despite the inclement weather of yesterday’s “perpetual snow”, and everyone involved was simply mahvelous to meet (though some were simply more mahvelous than others — saucy Timothy Comeau for instance, or Stephanie Rogerson, with whom I share a fondness for Claude Cahun — but I may be biased, as they were both on my end of the table).
Source
Over the past three weeks I’ve been reading Hardt & Negri’s Multitude, which I learned of by way of Darren O’Donnell’s ‘darren-in-pakistan‘ blog (specifically this posting), and which I was actually inclined to read. With regard to Empire, I’ve had the same attitude that Noam Chomsky expressed about it here:
QUESTION: It seems to me, with a certain degree of difference, that the concept of a virtual senate is similar to Negri’s and Hardt’s concept of Empire. [Michael Hardt, Antonio Negri, Empire (Harvard University Press, 2000)]
CHOMSKY: Empire, yes, but I have to say I found it hard to read. I understood only parts, and what I understood seemed to me pretty well known and expressible much more simply. However, maybe I missed something important.
QUESTION: Yes, and the book arrives to the same conclusion as yours but through a more complicated, less readable way…
CHOMSKY: If people get something out of it, it’s okay. What I understand seems to be pretty simple, and this is not a criticism. I don’t see any need to say in a complicated way what you can say in an easier way. You can make things look complicated, that’s part of the game that intellectuals play; things must look complicated. You might not be conscious about that, but it’s a way of gaining prestige, power and influence.
I’ve thus carried a prejudice toward Hardt & Negri’s work over the past few years, but something in the way Darren introduced it intrigued me; further web-research led me to believe it was the type of thing I should read, since it elaborated on what I’ve been personally calling the politics of variety.
I can’t say I totally agree with all of it – but I was open to its analysis. Getting through the first section was difficult – I don’t feel like I’m living in a Global State of War as much as I feel I’m living in a culture of propaganda continually devoted to telling me some asshole blew up some other assholes somewhere I have no plans of visiting.
My own understanding of American Empire is largely metaphorical – I agree with the premise of Empire, but my feeling is that the administrations of the Western world’s nation-states don’t know what they’re doing in that regard – they are too incompetent. In other words, while academics can diagnose an Imperial condition, the reality is that the Western governments are in denial.
The ending, focusing on the democratic project of multitude, seems to elevate multitude to the sovereignty they wish to deconstruct. It also seeks to abolish all authoritarian structures with the aim of having an anarchist state ruled by the common interest and affective (‘niceness work’ one could call it) labour of the multitude – the network of individual interests.
What seems lacking is a recognition of the capitalist market as it exists as a form of multitude already self-organizing – but I know that’s such a faux pas to even mention. As well, there seems to be a lack of acknowledgment of the substantial work that would need to be undertaken to ‘phase in’ the current mainstream mass of corporate workers and other people who don’t read, don’t think, and have no real capacity to function as they are within such a democratic society. In other words, how does one prevent the democratic multitude from being merely an autocracy of cliques? Such cliques would be identified by knowing who Hardt & Negri even are, as opposed to the mainstream, which is more preoccupied this weekend with Britney Spears’ new haircut.
I. But first, let’s imagine how we might be thought of in the future.
The Modern, Nodern, Oddern, Podern, and Qodern Periods
The predominance of using the prefix ‘post’ to name a period (almost always the one in which people found themselves at the time) flourished in the first decade of the 21st Century, and as one writer noted, ‘everything is posts … I need a saw to cut them down, to see the horizon’. Recognizing the Modern period as being the one which encompassed most of the 20th Century, one that was clearly defined, it followed that one should simply use the letters of the alphabet to replace the ‘post’ fashion, and hence, post-modernism was renamed the Nodern, and post-post-modernism was renamed the Oddern (although it is notable that post-post-modernism was never as popular, and many people were confused by this point, not knowing exactly what time they were living in).
The first decade of the 21st Century, according to the historical records, referred to itself variously as:
‘post-nine-eleven’
‘post-national’
‘post-modern’ (or shortened to ‘pomo’)
‘post-post-modern’
‘post-industrial’
‘post-agrarian’
The Nodern Period
The Nodern was once known as the ‘post-modern’ and characterized the time between, roughly, 1975-1995. Since it saw itself as a movement that put Modernism behind it, it is perhaps explained best by looking at how it saw Modernism. As the overall paradigm of the 20th Century, Modernism defined how human beings in the West saw both themselves and their creative works. It created neat categories to enable definition, but in the language of the late 20th Century’s marketing, ‘post-modernism’ was about ‘thinking outside the box’.
Its language emphasized the prefix ‘meta’, meaning ‘overarching’, and so Post-Modernist/Nodernist talk refers to ‘metanarratives’. The most famous definition of what it meant to be beyond Modernism was to see ‘metanarratives’ as unbelievable.
The privileging of one story, as had been the case under Modernism, came into question, and the Nodern began to look into the as yet untold stories. It is also necessary to point out, though, the Western-centric dominance of this vision, as the so-called untold stories were simply untold by Western thinkers to a Western audience. The Nodernist thinkers, while claiming to be on the side of the ‘non-West’, really saw themselves as deeply involved in the Western tradition that goes back to Latin Classicism.
The Nodern period was also characterized by a dominant political ideology that attempted to recast human life in simple economic terms. The politics of the time were characterized by an overall concern with ‘lowering trade barriers’, with the belief that such action would improve human life across the globe; while misguided in the extreme, this represents the first stirrings of a globalized mono-culture, one that began to develop with the increased capacity of telecommunication technology and the ease of global travel in the late 20th Century. The unbelief in One-Story (metanarrative) was fostered by the evidence that there were an extraordinary variety of stories that people could pick and choose from in order to lead richer lives.
Noism
Cultural historians often joke that the Nodern refers to ‘nothing good came out of it’. The clash between the tradition dating back to Ancient Rome in the West and the confrontation with a global political and cultural tradition, reminiscent of empires (especially those of 19th Century Europe), caused much confusion, and is one of the reasons the joke came about. It is best seen as a highly concentrated period of upheaval and transition, and it is sometimes popularly called Nomo.
This has prompted some contemporary culturalists to claim they are Noists, with the Toronto group The No-no Things being perhaps the best known. In this way, they claim to be the fulfillment of the 20th Century avant-garde project (which the Nodern claimed to be the height of at the time), as the 20th Century Dadaists were named after the Russian term for ‘yes’ – ‘da’. Hence, answering the Dadaists’ nonsensical Yes Yes with their highly contrived and intellectual No No. It is notable to point out that Noism is highly esoteric and therefore culturally irrelevant within our larger Qodern context.
The Oddern and Podern Period
The six-year period between 1995 and 2001 is referred to as the Oddern period by historians who use this terminology, who smirk when they say it was characteristically ‘odd’. The beginning of the end of the globalist ideology manifested itself in a rise of popular protests – first in 1997 against the Multilateral Agreement on Investment (MAI) and most famously, ‘The Battle in Seattle’ in 1999, which was followed by popular protests throughout 2000, culminating in April 2001 in Quebec City. Protests were planned for events that autumn, but the terrorist attacks of 11 September suddenly altered the political dialogue and ended the care-free callousness that had been popular in the developed world since the end of the Cold War ten years before.
The Oddern was also seen to be odd due to the rather sudden blossoming of the internet, which transformed everyone’s lives – henceforth, email and websurfing and stories of ‘dot-com millionaires’ became ordinary, while the politics of the United States focused on the sex-life of the President culminating in an attempted impeachment.
The Podern was so called because P followed O, which followed N, which followed M; so wrote the historian who coined the term. But a rival school of thought argues that the Podern is specific to the autumn of 2001, when Apple Incorporated introduced the iPod, which became the defining artifact of the time. As the iPod allowed for the assembly and playback of a vast number of files (which hadn’t been possible before, the iPod’s storage capacity at the time being unique), it is seen to be an appropriate term for this period, since its culture consisted to a large extent of reassembly and recontextualization.
People living during the Podern Period sometimes called this ‘the deejay culture’. The term deejay comes from the acronym DJ (disc jockey), for those who remixed and assembled playlists of music at nightclubs or on radio. The term jockey goes back to horse racers, hence the sense that the deejay was the one in control. The first radio broadcasters would play a variety of music singles (which at that time consisted of vinyl discs), and with the development of the music-movie (known as ‘music videos’) the term was modified for those who introduced them on television. They were called veejays (‘video jockeys’).
The deejay began to overtake the rockstar in the early 1990s as the appreciable peak of music performance, with the rave dance parties that began in the UK in the 1980s. By the early 90s, the rave had been imported to North America, and by 1995 it was known in the mainstream. Hence, the late 90s, while known as the Oddern, are also referred to by cultural historians as The Early Podern Period. The school which promotes the presence of the Apple iPod device as the defining characteristic of the Podern (rather than merely going with the alphabetical arrangement) agrees with this assessment, noting that the iPod arrived in 2001, which was also the first proper year of the 21st Century and the year in which the terrorist attacks on the United States occurred, the defining political event of that era.
It was also during this time that ‘classicism’ was re-defined to encompass more than it had previously. During the 18th, 19th, and 20th Centuries, Western-centric cultural observers always referred to ‘classical’ as being the culture of Ancient Greece and Rome – defined simply, perhaps, as ‘the architectural column’. Neo-classicism developed in the late 18th and 19th Centuries, and at that time referenced the fashion of imitating the culture of Ancient Greece and Rome, but neo-classicism was followed by Modernism. By the early 21st Century, Modernism had developed its own canon of ‘classics’, which were then copied, referenced, and imitated in such a way that the sampling and re-assembly of this canon by the Podernists represented a Modernist neo-classicism. Historians now speak of ‘Latin Classicism’ when referring to the culture of Ancient Greece and Rome, and even that of the European Renaissance to the 19th Century Neo-classical period, since what these cultures all have in common (except for the original Greek) is the presence of the Latin language in the culture (either as the vernacular in the earliest, or as the language of European scholarship later on).
The Old Master Painters, who had become unfashionable during Modernism, began to be imitated by a generation of painters in the late 20th Century, who were then called ‘New Old Masters’ and eventually New Masters, until that died out and the term ‘Master Painter’ returned to the vocabulary as someone who excels in the craft of image-making by hand.
Self-reflection in the Podern Period
The Podern is marked by the terrorist attacks on the United States of America on 11 September 2001, which sparked a flourishing of American militarism, and subsequent wars against Afghanistan during 2001-2002, Iraq in 2003, and Iran in 2007. The international dialogue shifted from one of ‘globalized trade’ (popular during the Nodern) to one of renewed nationalities and cultural identities.
Many at the time were dismayed to see the dialogue revert from the late 20th Century’s secular humanism to one that seemed to pit the United States’ version of fundamentalist Christianity against the Middle East’s version of fundamentalist Islam. Centering the dialogue on cultural identity seemed to be nothing more than the mainstream catching up to much of the cultural elite’s preoccupation with what was called ‘identity politics’ during the Nodern’s 1990s, and it encouraged self-reflection at all levels during the decade of the 2000s.
In television, Dr. Phil was a popular therapy show (although never have Foucault’s warnings about conformity and madness found a greater example); the show had been an offshoot of the ever-popular eponymous Oprah Winfrey show, which emphasized self-improvement through the ‘tales of personal triumph’ of common people and celebrities, with handy shopping tips thrown in for good measure and promotion. In movies (the dominant art form of the time) self-reflection is emphasized in the films written by Charlie Kaufman: Being John Malkovich (1999) is about seeing the world through someone else’s eyes (John Malkovich was a popular actor who himself starred in the film). In Adaptation (2002), Kaufman caricatured himself by making a confused scriptwriter part of the story, inventing an identical twin brother to balance his neuroses. In Chad Schmidt (2008), the eponymous character is an actor who has a hard time finding work because he too closely resembles Brad Pitt, the most famous actor of the decade, played by Brad Pitt himself.
In theatre, Darren O’Donnell exemplified this intensive self-reflection with his play A Suicide-Site Guide to the City (2003-2006).
The Qodern Period
Some historians, uncomfortable with the easy explanation of using the alphabet to name the eras, look to the 20th Century’s use of Q to explain the Qodern. The letter Q became very popular in the second-half of the 20th Century, being used as pen-names and as the title of books; in the James Bond film series, Q was the alias of the UK’s Secret Service engineer; in the Star Trek Sagas Q was a mischievous god. The Dutch author Harry Mulisch named the main character in his 1997 novel, The Discovery of Heaven Quinten Quist, whose initials of QQ hinted at his supernatural characteristics. As Mulisch wrote:
‘His initials are Q.Q.’ ‘Qualitate qua,’ nodded Onno. ‘That is rare. The Q is the most mysterious of letters, that circle with that line,’ he said, while he formed a slightly obscene gesture a circle with the manicured thumb and index finger of one hand and the line with the index finger of the other, ‘the ovum being penetrated by a sperm. And twice at that. Very nice. My compliments.’ (p.361)
The contemporary historian Wu Zhenguo identifies the Qodern with the Star Trek Sagas‘ character. Noting that Q was seemingly omnipotent and omnitemporal, he argues that our present society’s capacity for ‘all-awareness’ via the net is an adequate metaphor for our capacities. We may not be able to have things materialize out of thin air with a snap of the fingers, as could Q, but the idea Wu advances is that our capacities through nanotechnology and intercommunication most resemble our historical ideas of what only gods were capable of.
In addition, through our Representative technology, we can indeed speak with historical characters, in ways that Q flaunted and that which we were not capable of doing during the Podern.
The Qodern is a time period of psychological health, intensive communication and dialogue. The banes of existence throughout history: poverty and disease, have been eliminated for the most part: all diseases are at least treatable but no longer death sentences, and the lifespan has been extended so that one can afford a greater amount of time reading, thinking, or playing, barring the unfortunate accident of course.
The development of the Podern period began to show people how unprofitable it was to continue to dehumanize any segment of society, and how much better off everyone could be by extending benefits and encapsulating dialogue in a system of rights. While the globalist economic arguments which petered out by the Podern created for a time a sense of confusion, ultimately we look back and see how this time allowed for our new view of society as an engine for creativity and education to emerge. The view that all of us enjoy and benefit from today.
I’ve been so impressed with what WordPress allows me to do with this blog that I applied it to Goodreads, and over the past week worked toward launching a new interface, which I did tonight. I still need to go through and complete the categorization, but that won’t take too much longer. I plan on beginning the podcast feature in March, which won’t be anything too special: simply another way to access the audio content. There are some aspects of the current design I’m unsure of, and they could be modified in the days or weeks ahead.
Last Sunday saw this year’s Superbowl, when the marketing agencies try to wow us into another enthusiastic year of American consumerism. I was in no mood for any of it; in fact, I was rather grumpy last weekend. So when I found Theodore Dalrymple’s intolerant text entitled Freedom and its Discontents in which he expresses thanks for not having to voice on radio his thoughts on the 12 year old Austrian boy who recently had a sex change, I was annoyed and grumpified even more, although I appreciated his perspective. He wrote:
If I had spoken my mind, without let or hindrance, I should have said what I suspect a very large majority of people think: that there is something grotesque, and even repugnant, about the whole idea of sex-changes, let alone of sex-changes for twelve year-olds.
I don’t find the issue repugnant, nor do I find it very interesting. Dalrymple goes on to write about how the freedom of expression has been curtailed, not by onerous censorship laws, but by the intolerance of the politically correct. He concludes by writing: ‘Please don’t reply to any part of this article. I won’t read it: I know I’m right.’ Those who know they are right are the most exasperating people one ever has to deal with: stubborn-minded fools so set in their ways that they don’t even care about appearing ignorant, deluded and hateful. Dalrymple’s work nevertheless tends to be a good read, because we can learn and gain something from his perspective. He isn’t constrained by an idealism, nor is he constrained by the specialized knowledge that cuts ‘those in the know’ off from the common.
Over my time doing this list, I’ve occasionally received letters taking to task something I wrote in an introduction, or questioning my link selection. I thought I would need a defense of Dalrymple’s article saying, basically: don’t shoot the messenger, and began one anticipating this edition. But over the past week, I saw more than one article appear which basically underlined a theme of intolerance. That is one of the things I’ve enjoyed doing with Goodreads: attempting to document, through the link selection, the occasional popular meme – an idea which seems to be expressed in more than one article appearing simultaneously from different sites.
The greatest example of intolerance in current public/web discussion has to do with the Holocaust, and seems focused on the latent assumption that the next war will be with Iran. There seems to be a lack of appetite in the United States for another invasion, which is a good thing, but churning along underneath the popular sentiment is the attempt by right-wing blowhards to demonize Iran’s president Ahmadinejad, who made the cover of yesterday’s (Feb 10) Globe & Mail. We have been told for months that Ahmadinejad is a Holocaust denier, because he has said in the past that it was a myth. Out of an extreme generosity and a skepticism of North American propaganda, I’ve questioned whether he didn’t mean the anthropological sense of the word, until I remembered referring in recent conversations to consumerism as a myth (meaning it as an inaccurate oversimplification of our economic activity) and realized I was using the popular form of the word.
To clarify: anthropologically, a myth is a story of meaning, one that punches above its weight of accumulated incidents. To say that the Holocaust is a myth in this context is, I think, accurate. It has found a high, and defining, place in the Jewish story, and in a world of secularism, it seems that while not all contemporary Jews may believe in their God, they certainly all believe in their near genocide. As a gentile I find the overwhelming presence of the story sometimes noxious, as it has seemed to breed an unhealthy and unproductive paranoia that generates more hatred and anger than peace. And as a gentile I have to be very careful about what I say regarding this historical incident, since there is an element within Judaism ready to condemn anyone who questions this reality in any way, who seem to think that all gentiles are closeted anti-Semites ready to light up the ovens again if given the chance. The taboo and reverence now tied to the Holocaust story is surely mythic in this regard, making condemnable heretics of those who deny.
But popularly, a myth is a fairy-tale, a fiction, and I don’t question the veracity, or the horror of the Shoah. The reality of Holocaust denial fits in perfectly with the stupidity of the age which questions even the Moon landings; such is a healthy skepticism toward the stories of authority taken to an extreme and absurd level. We live at a time when some believe in the literalness of the Bible, that people lived with dinosaurs, and that perhaps Jesus only lived a thousand years ago. It is doubtful that Ahmadinejad is sophisticated enough to mean the anthropological sense of mythology when referring to those events.
But my problem is essentially based on the fact that I have no reason to believe anything I’m ever told by Western governments in general with regard to foreign policy. Since childhood I’ve been told that political leaders on the other side of the planet are generally untrustworthy and/or crazy. And because everything nowadays seems to be about the other side of the planet, I was left with cognitive dissonance when I heard Mike Wallace interview the President of Iran, as he did last August (and available in the two mp3s below). Because Mr. Ahmadinejad sounds saner than my own political leaders.
Wha? I mean, listen closely to the interviews: at one point Ahmadinejad says to Wallace (who prompted him to be more sound-bitey) that all of his questions require book length answers. What North American politician would say such a thing? ‘The problem that President Bush has is that in his mind he wants to solve everything with bombs. The time of The Bomb is in the past, it’s behind us. Today is the era of thoughts, dialogue, and cultural exchanges’. Who the fuck said that!?
Now, with props to my culture’s conditioning, who knows if he was just putting on a show of reasonableness for the Western cameras. We are told continually that these foreign leaders are like that: crafty propagandists who seduce our liberal left-wingers with their talk of international justice and wanting to do good things for their people. But we know The Truth, because our warmongering political elite have deigned to tell us The Real Story in between all of the secrets they keep. These leaders in the next hemisphere want to nuke us, they hate our freedom, they’re insane and hateful, unenlightened and ignorant, and they regularly flout international laws. They are also undemocratic and barbaric, because their elections are either rigged or the wrong people (Hamas) win. Further, when they execute their past tyrants they don’t do it tastefully.
Worst of all, they’re all anti-Semitic and want to destroy Israel, which is another way of saying they are Latter Day Nazis and thus we’re in another Just War against genocidal fascists. In the midst of this snake pit there is Israel, and the Israeli Cabinet, we need to remember, is, along with the Pope and the American President, infallible; all graced by God with the ability to never be wrong about anything.
On Freedom of Expression
As I’ve said, I’m being extremely generous in assuming that Mr. Ahmadinejad could be more intelligent than he is portrayed. But such an example, based on an uncommon view, removes my argument from the realm of shared experience from which we should be debating ideas about free expression. The controversial issues of our time are discussed based on common understanding and misunderstandings, and it’s important that we debate within those limits, rather than resort to extreme examples which make everything hypothetical fast.
Abortion is the example that comes readily to mind – growing up in the 1980s and hearing about Henry Morgentaler in the news, and even once participating in a junior high school debate on the subject, the pro-choice contingent regularly argued for cases of rape, incest, and maternal health concerns as deserving abortions. I haven’t checked out the stats, but I’ll hazard a guess that over 90% of abortions performed in North America have nothing to do with those examples. Common knowledge – which may be ignorant and flawed granted – suggests that most abortions are a form of birth control. To hedge around that by arguing the extremes keeps the debate from really being held in the first place, and thus the camps can remain unconvinced by the other’s position.
American commentators see free speech as a sacrosanct right, and as a result have one of the most intolerant and ignorant cultures on the planet. But that is their self-described right. The United States’ gift to the world seems to have been the enlarged definition of rights to include the right to degrade, discredit and humiliate oneself to a state of unreserved indignity. Anna Nicole Smith had the good fortune to die this past week to provide me with her example. The idealists of the U.S. make it a point to defend the offensive and vulgar as a part of this right, and perhaps here I shouldn’t remind you that vulgar came from the Latin word for common, as I want to try to elevate the common to think of our shared capacity for intelligence and compassion rather than our current and common psychopathologies. It is to this end that we need free expression defended: so that we are able to judge things for ourselves.
Our position in Canada is a more intolerant view on intolerance. We accept limits to free speech, which include anti-hate-speech laws. This is meant to prevent harm, and as I understand it, our Supreme Court allowed this by stating that some forms of speech are not worth defending.
A case in point is Holocaust denial: questioning the interpretation of the evidence is one thing, but what is the motivation behind it? The Jews have a right to mythologize (anthropologically) the story, and why should any of the rest of us care? When did the phrase ‘mind your own business’ fall out of favour? I think I know the answer to my rhetorical question, and it’s basically the one favored by Ahmadinejad and his fellow skeptics, one that prefers to dehumanize Jews with the word ‘Zionist’. I don’t think I need to get into it. I think the point raised by the Supreme Court’s decision is essentially it isn’t worth the debate, and that in fact it could be perceived as harmful to engage in it.
Somehow (and I think this has remained largely unexplained and unexplored) we can enjoy a freedom of expression without regularly crossing the line into hate speech. Seldom is anyone investigated or charged: you really have to make an effort to be that offensive. Or one has to be basically poking a bee’s nest: posting calls for Bush to be assassinated online, creating cartoons of Muhammed as a terrorist, and the like. As free expression, those examples are a waste of the freedom, since they contribute nothing to a discussion and are really only retrogressively ignorant.
How do we manage to use our freedom of expression productively when and if we do? I think it comes from our appreciation for those who offend in ways that increase our capacity for all expression by showing us a new idea, a new way of life, and a new way of thinking. But we are wary and even intolerant of those who want to limit our expression, or limit our innate sense of progress toward a better world, through the expression of their retrogressive views. In other words: blowing away a stale old convention and offending conservatives by doing so rocks; bringing about the downfall of civilization with a medieval attitude and mindset does not. Somehow we understand what constitutes this through a language of behavior rooted in our common experience. This is what makes conservatives so defensive: they know when they’ve been beaten by a new expression. It used to be rock n’ roll: now it’s their teenagers using abbreviations, emoticons, and chatting online with strangers.
While we are united by a common grammar of speech, so too we are united by a common grammar of behaviour. This has been in the past referred to as bourgeois values and considered worth rebelling against, and thus movements created a type of poetry of misbehavior which expanded our own vocabularies of affect. But within these values is a core set of ideas about how we should treat one another, a common value set which sees the benefit to the whole at the individual’s expense.
Consider littering. Offhand, I’m sure we all agree that littering isn’t really a good thing. We’ll define it as the introduction of garbage into a public space meant to be shared by all. We’ll further define garbage as something unwanted by someone. Thus, our definition of littering here is the introduction of something unwanted into a public space.
But what if this unwelcome introduction of something unwanted is called art by the litterer? Then it’s an intervention. Then, that cigarette cellophane you just dropped on the sidewalk is a performance. According to the art-rules I should shut up now, because the recontextualization destroys it as litter and makes it a human expression that should be nurtured, encouraged, and supported by art council grants. But here I really want to link littering to graffiti and say that because some people consider it unwelcome it is also a form of littering, but it’s one that I personally support as a human attempt at the beautification of plain (plane?) architecture.
While we all understand why we shouldn’t litter as part of our common knowledge, we also understand the deal with most abortions and why hate speech could be criminal. We don’t need freedom of expression – or whatever other freedoms we enjoy – to be defended by extreme examples, because all laws, all social agreements, all freedoms exist first as social conventions in common knowledge, and it is from this basis that the state feels it has the authority to police them. The fragmentation of our society into specialized interest groups is perhaps where we began to disagree about what should be legal and what shouldn’t. Our common knowledge – our vulgarity – has been reduced to extreme forms of behaviour and diminished to something less than our potential, making us more undignified than some animals.
The challenge has always been to incorporate the deviant into the conventional: this pattern has always seemed to be about the dominant sanctioning another – a minority’s – convention as harmless, rather than a sudden revaluation of the dominant’s morals. The argument raised by Christopher Hitchens in his defense of the ‘freedom of denial’ is in essence a call to allow that process to continue: for the dominant not to become so self-satisfied that they refuse to consider the other’s point of view. But it also seems that we have reached examples of extreme perspectives that the dominant decided long ago were not sanctionable. Holocaust denial is one, as is sex with children and animals. The recent Sundance film festival featured a film in which a 12-year-old girl is raped, and another was a documentary on bestiality. My thoughts are essentially: do we really need to have that discussion? Are we so intellectually and emotionally bankrupt that we have to resort to those expressions for stimulation? It turns out that no distributor wants to buy the Dakota Fanning movie Hounddog, and all I can think is thank god.
Ultimately, this is all about the strangeness of language: how a set of sounds, strung together a certain way, can have such intense psychological and intellectual effects. Words uttered or read can make the heart leap or fall, can be emotionally devastating or immensely uplifting, and it’s all just a bunch of sounds or a bunch of shapes on a surface. Through this, one mind interacts with another, and our sense of what’s going on in our world – that intersection of imagination and environment – grows until we are eventually changed people: more sophisticated, more learned, more conversant. We have a bigger bag of tricks and a fuller experience of life. The freedom of speech is also the freedom to be exposed to ideas we don’t agree with, so that we aren’t held back from the mysteriously transformative power of hearing or reading words. But a case can be made that some of this has the potential to be retrogressive and counterproductive, making us more stupid. Inasmuch as the state tries to do this for us, it should have better things to do; but I think it is also true that it doesn’t need to control what we think about things, because that’s already done by a televised culture of idiocy. – Timothy
Yesterday was my birthday. I turned 32.
I felt
because
- I was stressed at work
- I was indecisive about going to see the Matthew Barney movie, because
- I didn’t want to have to kill the time between getting off work and seeing the movie, which would have entailed
- the claustrophobia of being stuck in a room, in a seat, in a sold out theatre for two hours, with hipsters
Today I bought
- a new bottle of after shave balm which came with a promotional stick of lip balm
- a new shaving brush
- a new bottle of eye drops
I bought these things on my morning break. For my afternoon break I read
- Lee Siegel’s review/historical essay on Norman Mailer’s new book
Norman Mailer and I share a birthday with Justin Timberlake. Justin Timberlake turned 26 in Montreal, where he was singing after performing in Toronto on Tuesday. Today is the third anniversary of the infamous wardrobe malfunction.
Justin Timberlake was born three years earlier than the year you get by adding Mailer’s age to 1900; that is, in 1981. Norman Mailer turned 84, because he was born in 1923. If you add two to four you get Justin’s 6, and if you take six away from the eight you get the 2. If you take the two and three from Mailer’s birth year and flip them around, you get back to the beginning, which is 32.
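For what it’s worth, the number game above checks out. A few lines of Python (assuming the post dates to early 2007, just after the shared January 31 birthday) verify it:

```python
# Verify the birthday arithmetic (assumes the post was written in early 2007).
year = 2007

timberlake_age = year - 1981  # Justin Timberlake, born 1981
mailer_age = year - 1923      # Norman Mailer, born 1923

assert timberlake_age == 26
assert mailer_age == 84

# The digit game on Mailer's 84: 8 - 6 = 2 and 2 + 4 = 6, giving Justin's 26.
assert (8 - 6) * 10 + (2 + 4) == timberlake_age

# Flip the "23" of Mailer's birth year to get back to the beginning: 32.
assert int(str(1923 % 100)[::-1]) == 32
```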