Ice Trees
The next time I see faux icicle decorations I’ll think of power outages.
“She had brought her book with her. It was an old looking thing, cased in marbled covers. Inside were pages and pages of handwritten text. Her letters sloped to the right like windblown trees.
Chiku saw an omission on one page and touched the nib of her fountain pen to the vellum. The inked words budged up, forming a space in which she could insert the missing word. Elsewhere she struck through two superfluous lines and the text on either side of the deleted passage married itself together.”
— from On the Steel Breeze by Alastair Reynolds.
Saw Hannah Arendt last night and then felt weird about having a book called "Don't make me think" on my shelf.
— Timothy Comeau (@timothycomeau) July 14, 2013
Steve Krug’s Don’t Make Me Think is considered one of the canonical texts of web design, and as such was introduced to me as part of my web design studies at Sheridan College, which I undertook during the 2011-12 academic year. The title alone had always been offensive to me, someone who enjoys both ideas and thinking, and I always chafed at the mindlessness it encourages.
However, the education process I went through helped me become conscious of my web-browsing behaviour, and the book is narrowly contextual to those times when we are on a website for a purpose-driven reason: when we go to a theatre’s site to buy tickets, for example, or are on some other commerce site trying to find contact info or a business’s opening hours. Primarily, the ‘design philosophy’ behind don’t-make-me-think applies to commercial or other service-oriented websites.
In the film Hannah Arendt, we get a wonderful defense of thought as a human activity, and an explication that the evil in question (the Holocaust) was facilitated by classic, mid-century modernist bureaucracy, especially the German version, which was shaped by an education system that taught obedience and discipline. Such a system encourages people to disregard thought, which (as Arendt says in the film) dehumanizes us by ‘making people superfluous’ to the system. In other words, the indifference of a bureaucracy toward the individuals it is meant to serve means people end up serving the bureaucracy.
It’s worth noting that the German education system, as developed by the state of Prussia, was imported to North America a century ago to transform farmers’ children into future factory or corporate employees, by teaching a tolerance for boredom and a willing, mindless obedience to managerial directives. (See John Taylor Gatto’s essay, “Against School”.)
~
This decade incorporates the 20th anniversaries of everything web-related. The World Wide Web was released to the general public in 1993 as an app that ran on the Internet (then an academic and government network reached by modem). Now the W.W.W. is synonymous with the ‘Net, and the design principles promoted by Don’t Make Me Think have become so standardized that we recognize websites as bad when those straightforward principles are violated. We know how websites are supposed to work, we recognize header menus as such, and understand what ‘home’ means.
Krug’s follow-up, Rocket Surgery Made Easy, was a second-semester text, and I found both books very hard to read, both because they are so patronizing and because he’s continually stating what is now obvious. They were written for people for whom computers, and using them, were new. Now they feel more like historical documents.
Inasmuch as we have a ‘web 2.0’ nomenclature (which in itself is about a decade out of date) I find the language shift from the ‘Net’ to ‘The Cloud’ indicative of where we are: the interconnected network was about siloed websites and email – essentially network nodes and lines of communication.
The Cloud (as a “post-Net 2.0” term) speaks to our ever-present interconnectivity, where we can download data to our devices out of thin air, and where server farms behind our screens can run the necessary compression algorithms to apply filters to our photos as we upload them.
The novelty of this technology has been intoxicating, and I myself have found it fascinating enough to want both to understand it and to participate in it professionally. But after 20 years, the novelty is beginning to wear off, and the transitions that seemed inevitable fifteen years ago have come to pass.
Publishing on paper is in decline (in some cases rightfully) whereas digital publishing is established and growing. This echoes the transition from mediaeval manuscript propagation to the printed book, and if Gutenberg’s invention of 1452 echoes Berners-Lee’s of 1989, we are in the equivalent of the 1470s, by which time Gutenberg’s press had spread to France, Italy, England, and Poland.
The model of book-idea production has lasted since that time, until our era when we’ve figured out how to fluidly use a two-dimensional surface through the manipulation of electricity and light.
"We're struggling because the most visually adaptable medium in history is undermining print! Fire the photographers!" – Chicago Sun-Times
— Taylor Dobbs (@taylordobbs) June 28, 2013
I spent a week last July helping put the finishing touches on a corporate website to publish a company’s annual report. Twenty years ago, the report would have been a booklet and print designers and typesetters would have been hired to produce it. As the novelty of working with computers is wearing off, and as our economy has shifted to incorporate them in our offices and studios, it is now obvious that this digital economy is essentially that of publishing: websites, apps and ebooks. It is supported, as it always has been, by ad money. And the big sites like eBay and Amazon represent the Platinum Age of mail-order. I grew up with Roch Carrier’s famous short story about a hockey sweater ordered from the Eaton’s catalogue. A future generation will probably know an equivalent that replaces Eaton’s with Amazon.
As I worked during that July week it occurred to me that in 200 years I would not be described as a front-end developer, nor a web designer, but perhaps just as a publisher, in the same way that Benjamin Franklin is described as a printer, not a pamphlet designer, nor a typesetter. “To earn money he worked in publishing” may be all that needs to be said, for by then publishing will be digital by default, and will have been for two hundred years.
Last week at FITC Screens 13 I got to try Google Glass for the first time. Tom Emrich was there as part of the Xtreme Labs Lounge and I tried his device for about five minutes, long enough for him to show me how to use it and go through some basic commands.
The screen was a little out of focus, but it wasn’t important to me that it be perfectly fitted and adjusted. I took a picture, swiped, tapped, looked at the New York Times’ app, and had it read it to me.
Here is a mock-up I made to record the general impression:
The rectangle was smaller than I expected, and the fact that it was back-lit / transparent-black gave it a bit of a purple, out-of-focus sheen. It’s a see-through screen hovering at your eyebrow, and I was thinking of this later when I tweeted:
Google Glass is a useful hallucination prosthesis, basically. #SCREENS13
— Timothy Comeau (@timothycomeau) October 3, 2013
I wrote that while sitting in Mike DiGiovanni’s Google Glass presentation, and watching him onstage I now understood the gestural language he was demonstrating: not that of someone with a neurological disorder, unable to focus on what’s in front of him, eyes rolling upward, but of someone experiencing a prosthetic hallucination in the right corner of his visual field.
The thing about something like Ggl Glass is that you're not aware of what it looks like to others. In that way it's somewhat sociopathic.
— Timothy Comeau (@timothycomeau) October 3, 2013
I used the word ‘sociopathic’ specifically: a social pathology, that is, a social illness, in which one behaves in a manner that is offensive, unfriendly, or unsociable.
Human interaction requires at least two people, but Glass is not a device meant for two people. It’s an internalized, private experience. When you’re wearing one, you are meant to forget that it’s on, in the same way that traditional corrective eyeglasses become forgettable to the wearer.
New word suggestion for mobile: "isolationist" as in, "Google Glass is very isolationist." #SCREENS13
— Timothy Comeau (@timothycomeau) October 3, 2013
All the pictures we’ve seen are of other people wearing Glass, but of course this is because of how difficult it is to show the subjective experience, which is really what the product offers.
Google Glass is a beta product, and is the technological equivalent of 1990s cell phones with retractable antennas. In the 90s, owning a cell phone was a little offensive, because it signaled that you were either a busy-body big-shot or narcissistic enough to think you were that important. (I remember telling someone in 1999 that I didn’t want a cell phone because I wouldn’t want to be bothered when I was away from home.)
However, the utility of a cell phone soon meant that by 2005, almost everyone had one. By 2007, the year Apple released the iPhone, the executives and managers of the world were already carrying Blackberries and other email-capable cellphones, and I was used to seeing these people staring at their little machines while in line for coffee. It occurred to me then that the Blackberry was the wand of the overclass, and I wondered what their jobs were that they had to be checking email while in line. (At the time I carried a basic Nokia).
Now, people everywhere can be seen looking down at something they’re holding in their hands. This is such a common sight that you can find examples on Google Streetview:
For this argument I’ll refer to this posture, or more specifically this behaviour, as “digital attention behaviour”.
In 2007, in line at coffee shops, the future wasn’t yet evenly distributed, but now this digital attention behaviour has spread wide and become normalized.
Part of the normalization is that looking at a rectangle displaying digital messages isn’t that much different than looking at a pad of paper. We were already used to seeing people read things silently, which in itself was a revolution centuries ago, when reading was usually done aloud.
The rolling of eyes may eventually become a normalized digital attention behaviour, but right now, absent the even distribution allowing the rest of us to relate, it still looks strange and offensive.
Unintentionally, Google Glass manifests the Western narcissistic ego, where private experience on public display happens without care for how it affects others. The selfishness of Glass is expressed when the Other cannot tell if a picture is being taken or if the time is being checked. With a smartphone, a glance can tell you if the person is emailing, texting, web browsing, or playing a video game. The information leaks, and this information is valuable in contextualizing our person-to-person interaction.
Because the Ggl Glass experience is so personal, it seem like it should interfere with our Theory of Mind. #screens13
— Timothy Comeau (@timothycomeau) October 3, 2013
Rendered completely private, this interferes with our Theory of Mind, our ability to imagine what others are doing and be able to relate to them. We can’t empathize without sufficient contextual clues. Inasmuch as Glass is a prosthetic for hallucination, it may also be a prosthetic for autism.
Having said all this …
I am nevertheless excited by the idea of Glass as both a prototype and an attempt to get us away from current digital attention behaviour, so that we can benefit from the data cloud while also being able to interact with one another as we did in the past. The irony is that Glass is at present such a private experience that it interferes with human-to-human interaction; that is one of the bugs that needs to be resolved.
I like to think of Glass as a pathfinder project to get us to casual augmented reality, via “smart” eyeglasses, contact lenses, and/or eventually implants, such as those described in Alastair Reynolds’s Poseidon’s Children novels, the second of which (On The Steel Breeze) has this scene:
“Wait, I’m hearing from Imris again.” Her face assumed the slack composure of aug trance, as if someone had just snipped all the nerves under her skin.
In that world of implant-enabled augmented reality, an aug trance is something everyone can relate to, and fits into everyone’s Theory of Mind. It is not disturbing to see, and is an understood appearance.
Having said all this, I suspect that a product like Glass will be successful. Again, its current design is reminiscent of the first cell phones. We know from the movies that portable radio-phones were available during World War II.
The original 1960s Star Trek communicators were skeuomorphs of the walkie-talkie more than of the phone, but when Motorola marketed the StarTAC phone in 1996 the reference to the fiction was obvious.
In the 2009 Star Trek movie, Simon Pegg as Scotty is seen wearing an eyepiece:
And in 1991, Star Trek: The Next Generation featured holographic eyewear which took over the minds of the crew:
These examples show that the idea of a heads-up display is an old one, but Google decided to build it, tether it to an Android phone, and begin to market it. I don’t doubt something like it will eventually be successful.
What is especially interesting is how such a simple idea and device turns out to have complicated social side effects, but these side-effects would never have become apparent if Google hadn’t taken the chance to implement this old idea to begin with.
Twenty years ago, in September 1993, I was a fresh-faced frosh at Saint Mary’s University in Halifax, Nova Scotia. I had visited Halifax on weekend family trips while growing up in the province, but this was the first month of what turned out to be a six-year experience of living there.
With some free time, I took a walk downtown. The path tended to be: Inglis to Tower Rd to South Park, to Spring Garden Road. From here, I ended up walking to Barrington St.
I remember protests regarding Clayoquot Sound around this time, held across the street from the library at Spring Garden and Grafton, but I’m not sure if they were visible on this particular walk, which took me further down into the Historic Properties.
I found a Subway sandwich shop and stopped there for lunch, and recall being surprised when I was offered both mustard and mayo as a topping, a combination I had never encountered before.
Unbeknownst to me, I had wandered onto the grounds of NSCAD, the art school where I would begin classes three years later. Occasionally while at NSCAD, I would look down from the library at the Subway at the intersection and recall that first walk, when I began to discover my new city. For that reason I took its picture.
The Subway, as seen from the NSCAD library in 1999
Eight years later I began to be interested in computer programming and the web. Using books I began to figure out how to build web pages, and I was reading Slashdot every day. There was also a website called NewsToday® (later rebranded to QBN) which aggregated news items of interest to designers, and if I remember correctly, it was through that that I found YoungPup.net, where Aaron Boodman (youngpup) had posted “the best way” to generate a pop-up window in Javascript. (I found the posting with the Wayback Machine, timestamped Sept 19 2002.) From what I recall, through his blog, through a link or a reference, I learned about Aaron Straup Cope. Through his posted online resume, I learned that we shared NSCAD as an alma mater.
“If anything I studied painting but I am mostly part of that generation for whom everything changed, and who dropped everything they were previously doing, when the web came along.” – Aaron Straup Cope (Head of Internet Typing at the Smithsonian Cooper-Hewitt National Design Museum), in his post Quantified Selfies
While I was eating my sandwich twenty years ago, he was probably in the building next door taking Foundation courses. His online resume also tells me that he was around during my first year, graduating at the end of term.
Twenty years ago the future lay before us in a tapestry of September sunshine, but just as the future of twenty years from now is being invisibly incubated today, nothing was then evident. It was the first year of the Clinton Administration and Jean Chrétien’s Liberals were about to win the general election. The 90s were effectively beginning.
When I first started reading Aaron’s blog about ten years ago, he was living in Montreal. Later through his blog I learned he was in Vancouver, and later still, he was in San Francisco. I imagined CDs bought at Sam the Record Man on Barrington St that may have accompanied these travels. Reading his blog, I understood he was at Yahoo!, working on a site called Flickr.
Seventeen months ago, the Google Streetview car captured the corner as it then appeared. Here it is, posted to Flickr.
So I reflect now twenty years later on how a website like Flickr (which was big in its day and which now lingers on as an established presence) became part of the world that did away with typewriters for my generation and younger, and was in a way present at the GPS coordinates where I ate a Subway sandwich twenty years ago.
I once heard it said that the internet was like Gutenberg’s printing press, and that while revolutionary, the press also resulted in the religious wars of a century later. This was voiced as a warning against cyber-utopianism.
Twenty years after the World Wide Web app was released so that the public could use The Internet, we have begun to see our wars play out. The religious wars of the past led to the creation of the Nation State, after the Treaty of Westphalia. Our present wars are a symptom of the breakdown of that international system.
Freedom fries: this domino line begins with McDonald’s.
Ben Ali’s son-in-law wants to open a McDonald’s in Tunisia. He meets with the American ambassador.
The ambassador goes home and writes a report on the meeting, noting the family’s opulent wealth. “He owns a tiger that eats 6 chickens a day”.
Because privacy is old-fashioned, an Army private smuggles out gigabytes of material on a re-recordable CD marked Lady Gaga, and provides it to the cyber-utopian WikiLeaks. They publish it along with the Guardian and The New York Times.
The Tunisia reports are emphasized by Al Jazeera, and they spread on Tunisians’ Facebook pages. Two out of ten people have Facebook accounts, because privacy is old-fashioned.
So a frustrated young man is spat on by a policewoman and sets himself on fire. Tunisians take to the streets, and inspire similar protests in Egypt, Libya, and Syria. Egypt’s dictator is kicked out. Libya has an eight-month civil war before its dictator is finally killed. Syria’s civil war continues.
Egypt holds democratic elections but the poor vote for the wrong people, a party that wants to govern in an oppressive way. They protest again. The army comes in and removes the president. The world doesn’t want to call it a coup d’etat, because it was simply the army removing the person who should not have won the election. Democracy is only a good thing when the right people win. The people who voted for him are upset, so they protest, until the army clears the square by shooting at them. Some more people die.
Meanwhile, in Syria, chemical weapons are used. Chemical bombs were the First World War’s equivalent of nukes, and stand at number two on the list of taboo armaments, a century old and “never to be used again”. They’ve nevertheless been manufactured.
Syria hasn’t signed the anti-chemical weapons treaties. A thousand people die.
President Obama had said that the use of chemical weapons would be a line that should not be crossed, lest he send in the World’s Most Powerful Military. The weapons were used.
To be continued.
_____________________________
Worth reading:
WikiHistory: Did the Leaks Inspire the Arab Spring?
Adam Curtis’ 2011 blog posting on Syria
“A figure in the Muslim Brotherhood [said] ‘It’s not logical,’ (is the way he put it), ‘it’s not logical for President Obama to be so concerned about a thousand people killed in a chemical weapons attack when a hundred thousand have been killed, have been slaughtered by Assad in the last two years.’ And basically people here Jeff do not accept this distinction that the President is trying to make between the use of chemical weapons and the wholesale killing of Syrian civilians by aerial bombardment and artillery. They see it as an esoteric argument about some international weapons convention treaty that just has no relevance to their lives.”
At the end of August my friend Byron Hodgins & I went to High Park and he painted me.
The argument made by Adolf Loos in Ornament and Crime a century ago (1908) was essentially racist: a European aesthete equated ornamentation with primitive barbarism. Eighty-four years later, art anthropologist Ellen Dissanayake pointed out how unusual the West’s valuation of simplicity is: people all over the world use ornamentation as an expression of their humanity (Homo Aestheticus, 1992). In other words, the Western/European strain of thought has derided ornament for a long time, but this is a rejection of what the rest of the world appreciates.
As architectural critic Nikos Salingaros stated in a recent interview (speaking architecturally):
Ornament generates ordered information. It adds coherent information that is visual and thus immediately perceivable on a structure. Successful ornament does not stick something on top of form: instead it spontaneously creates smaller and smaller divisions out of the whole. Just like biological life, which is all about information: how to structure it, how to keep it together as the living body, and how to replicate it so that the genetic pattern is not lost forever. But without ornament, either there is no information, or the information is random, hence useless.
The loss of ornament is the loss of vital architectural information. Ever since that fateful moment in history, there is little life in architecture. Unornamented forms and spaces are dead, sterile, and insipid, defining a sort of cosmic “cold death”: an empty universe where no life can exist. But for a century, this empty state has been the desired aim of architects: to remove information from the built environment.
A century of thought by sophisticated individuals has resulted in Minimalism as an aesthetic trend, affecting the built environment, designed spaces, and designed objects. It is part of a story that includes not only Adolf Loos’s contribution, but also that of asceticism. Minimalism seems to be one of David Martin’s Christian ’time bombs’ that went off during the Industrial Revolution.
Martin’s argument is that Christianity (which we can say began as a cult in Roman-occupied Judea) spread throughout the Roman world and for the past two thousand years has survived through weekly repetition of its repertoire of ideas. These ideas slowly transformed what was once Roman society, shepherding civilisation through Rome’s collapse and then preserving knowledge until Europe could restore itself in subsequent centuries. Along the way, Christian ideas have gone off like time bombs, such as human equality and the abolition of slavery.
Minimalism would seem to be a contemporary expression of the asceticism taught to the West through the Christian tradition, and thus the contemporary minimalist practitioner might see themselves as practicing a form of spiritual sophistication, through what they consider to be “good taste”.
However, minimalism is very future-oriented as well. In Ian R. MacLeod’s Song of Time (2008) the narrating character (reflecting from a future perspective a century from now) says of our present early 21st Century:
There were so many things in that lost world. Our house overflowed with Dad’s old tapes, records and CDs, and Mum’s ornaments, and both of their books and magazines, and all of our musical impediments, and my and Leo’s many toys, which we never played with, but still regarded with totemic reverence.
This implies that the future world is largely Thing-less and decluttered. Of the minimalised future, consider how it is parsed by Lindsay Jensen in her essay on Oblivion:
“The flannel-wearing hoops-shooter is Jack Harper (Tom Cruise), a high tech drone repairman who lives in a futuristic compound (that more closely resembles a sterile medical lab than a cozy cottage) […] Despite the destruction of the Earth’s surface, Jack and Victoria’s home – in a tower high above – displays not a speck of dust or clutter, only gleaming chrome and glass. Even Victoria herself seems a piece of this decor, impeccably dressed in sculptural shift dresses … signifying Victoria as an integral element of this environment — serving the same semiotic function as her hyper-futuristic touchscreen computer, or the meals that appear as anonymous squares of surely nutrient-dense edibles served from individual vacuum-sealed pouches. These objects – of which she is one – loudly and obviously declare this as the future: a different, cold, and calculated environment in stark contrast with the relaxed authenticity of Jack’s cabin. The latter is a hideaway whose existence Jack keeps secret from Victoria. She is too much a part of one world to venture into the other one he has (re)-created.”
Oblivion in fact displays the temporality of contemporary expressions of Sophistication. On the one hand, we get the minimalised, dust-free future. On the other, we get Jack’s cabin, his secret retro-world, filled with the archaeology of Oblivion’s Earth. Here Jack wears a New York Yankees ball cap and a checked shirt, and listens to vinyl records. Old, weather-warped books rest on rough-hewn shelves. The cabin world reflects our “Dream of the 1890s” and the other hipsterisms of our present time – the Sophisticated Retronaut who has curated their life as if it were the decade between 1975 and 1985, with a vinyl record collection, gingham shirts, and the usual as displayed on Tumblr.
Sophistication seems to lie on a spectrum between the Retro Past and the Austere Future, and the display of its corresponding taste: one curates either objects of “warmth” or those that are “cold”, the “sentimental” answering to the “calculating”.
Concern for dress goes along with concern for one’s bearing and manner, and these reflect the self-control and civility that humans from the earliest times seem to have deliberately imposed on their ‘natural’ or animal inclinations and regarded as essential to harmonious social life.
While the examples in this part of her book focus on clothing and dress, they serve as scaffolds to extrapolate from: our concern for ornament is a human appetite, a way that we express our supra-animal minds. Dissanayake closes the chapter in question (“The Arts as Means of Enhancement”) by narrating a brief Western art history:
“For the past two hundred years… the formality and artificiality that universally characterize civilized behaviour have been suspect. Wordsworth praised the beauty to be found in the natural rustic speech of ordinary people; since his time, poetry has moved further and further away from the characteristics that marked it as poetry for thousands of years [while] 20th Century Western artists have typically been concerned with making art more natural (using ordinary materials from daily life or depicting humble, trivial, or vulgar subjects) and showing that the natural, when regarded aesthetically, is really art. […] In this they both lead and follow, absorb and reflect postmodern Western society which is the apogee of the trend I have been describing where now the natural is elevated to the cultural, where nature and the natural viewed as rare and “special” are deliberately inserted into culture as something desired. I have pointed out that most cultures like shiny, new-looking, bright, and conspicuously artificial things. […] But we prefer the worn and the faded because they look natural and authentic.”
The Minimalised future is shiny and new and special, and thus in tune with our human natures. Yet it is also austere and cold, speaking to a sense of self-control and discipline, which is acquired … that is to say, civilized. Our flourishing Retro hipsterdom is the late 20th Century’s postmodern concern for authenticity spoken of by Dissanayake. But maybe it is also a way for our culture to digest the records of the previous hundred years and to decide what should be considered timeless, what we should formalize into the artificiality of culture which our human appetites desire.
(top image: Fuckyeahcooldesigns Tumblr)
The King Baby was born on Monday July 22nd. Today they announced his name was George.
Let us suppose future historians will be able to say that the Second Elizabethan Era lasted the second half of the 20th Century and a quarter of the 21st (a reign of seventy-five years). Thus:
Elizabeth II | 1952 – 2027 | æ 101 |
As Elizabeth’s mother lived to be 101, this is quite feasible. Let us then be generous and assume all future monarchs will live to be 100.
Charles III | 2027 – 2048 | æ 79 – 100 |
William V | 2048 – 2082 | æ 66 – 100 |
George VII | 2082 – 2113 | æ 69 – 100 |
Note: Wikipedia states that Charles has considered reigning under the name George, which may be unlikely now that his grandson has been given the name. Nevertheless, were he to do so, presumably King Baby George would reign as either George VIII or follow his grandfather’s example and use another name.
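As a rough check on the arithmetic above, here is a minimal sketch of the projection. It is my own reconstruction of the back-of-the-envelope math, not anything official: it assumes birth years of 1948, 1982, and 2013 for Charles, William, and George, a seventy-five-year reign for Elizabeth II ending in 2027, and that each later monarch accedes on their predecessor’s death and lives to exactly 100.

    # A back-of-the-envelope reconstruction of the table above.
    # Assumptions: Elizabeth II reigns 1952-2027 (75 years, aged 101);
    # every later monarch lives to exactly 100 and accedes in the year
    # the previous monarch dies. Birth years are taken as given.
    successors = [("Charles III", 1948), ("William V", 1982), ("George VII", 2013)]

    print("Elizabeth II | 1952 – 2027 | æ 101")
    reign_start = 2027  # the year Elizabeth II's projected reign ends
    for name, born in successors:
        reign_end = born + 100              # reign ends when the monarch turns 100
        accession_age = reign_start - born  # age when the predecessor dies
        print(f"{name} | {reign_start} – {reign_end} | æ {accession_age} – 100")
        reign_start = reign_end             # the next monarch accedes that year

Run as written, this prints the three successor rows exactly as listed above: 2027–2048, 2048–2082, and 2082–2113.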
If Prince William lives as long as his grandmother that would mean he'd be 102 in 2084. So today's news is a big deal in the 2080s.
— Timothy Comeau (@timothycomeau) December 3, 2012
“What conditions would allow a sophisticated and civilized society that has space travel to turn inward and no longer see what it’s doing to itself?” he adds. “We likened it to a feudal, hyperconservative kind of society that no longer believes other planets were worth visiting and mothballed those fleets. Ancient, doddering old fools running society and paying no attention to science or more enlightened minds.” – Alex McDowell
(Sounds basically like the USA and Canada)
This french fry is this GO Train car’s Red Wheelbarrow.
These two images from National Geographic‘s Found Tumblr are currently my Desktop backgrounds:
“Did you ever hear of the 5.9 Kiloyear event?
“I thought not. It was an aridification episode, a great drying. Maybe it began in the oceans. It desiccated the Sahara; ended the Neolithic Subpluvial. Worldwide migration followed, forcing everyone to cram around river valleys from Central North Africa to the Nile Valley and start doing this thing we hadn’t done before, called civilization.
That’s when it really began: the emergence of state-led society, in the 4th millennium BC. Cities. Agriculture. Bureaucracy. And on the geologic timescale, that’s yesterday. Everything that’s followed, every moment of it from Hannibal to Apollo, it’s all just a consequence of that single forcing event. We got pushed to the riverbanks. We made cities. Invented paper and roads and the wheel. Built casinos on the Moon. […]
But this global climate shift, the Anthropocene warming – it’s just another forcing event, I think. Another trigger. We’re just so close to the start of it, we can’t really see the outcome yet. […]
“The warming was global, but Africa was one of the first places to really feel the impact of the changing weather patterns. The depopulation programmes, the forced migrations … we were in the absolute vanguard of all that. In some respects, it was the moment the Surveilled World drew its first hesitant breath. We saw the best and the worst of what we were capable of, Geoffrey. The devils in us, and our better angels. The devils, mostly. Out of that time of crisis grew the global surveillance network, the invisible, omniscient god that never tires of watching over us, never tires of keeping us from doing harm to one another. Oh, it had been there in pieces before that, but this was the first time we devolved absolute authority to the Mechanism. And you know what? It wasn’t the worst thing that ever happened to us. We’re all living in a totalitarian state, but for the most part it’s a benign, kindly dictatorship. It allows us to do most things except suffer accidents and commit crimes. And now the Surveilled World doesn’t even end at the edge of space. It’s a notion, a mode of existence, spreading out into the solar system, at the same rate as the human expansion front.”
From Blue Remembered Earth by Alastair Reynolds (pp. 150-151). The character Eunice, speaking in 2162, is explaining the development of the global Mechanism that watches over and protects the population. This is what I’ve been thinking about this week in light of the NSA revelations.
I myself don’t go into bookstores very much now. They have become archaic, depressing places. […] How many bookstores close, as a direct ratio of hours spent with electronic devices?
I’m sure there’s some direct relationship there. And it’s not a dark conspiracy. I happen to be quite the Google Glass fan.
In fact, I’m even becoming something of a Sergey Brin fan. I never paid much attention to Sergey before, but after Google Glass, Sergey really interests me. He’s filling the aching hole, the grievous hole in our society left by the departure of Steve Jobs. With Jobs off the stage, Sergey’s becoming very Jobsian. He wears these cool suits now. He’s got much better taste in design than he did. He’s got these Google X Moonshot things going on, they’re insanely great, and so forth.
I hope Sergey’s not taking a lot of acid and living off vegetarian applesauce. But other than that, well, now we have this American tech visionary millionaire who’s a Russian emigre. It’s fantastic! There’s something very post-Cold-War, very genuinely twenty-first century about that. It’s super. Sergey’s like my favorite out of control, one-percenter, mogul guy, right now.
[…]
Since the financial panic of 2008, things have gotten worse across the board. The Austerity is a complete policy failure. It’s even worse than the Panic. We’re not surrounded by betterness in 2013. By practically every measure, nature is worse, culture is worse, governance is worse. The infrastructure is in visible decline. Business is worse. People are living in cardboard in Silicon Valley.
We don’t have even much to boast about in our fashion. Although you have lost weight. And I praise you for that, because I know it must have been hard.
We’re living in hard times, we’re not living in jolly boom dotcom times. And that’s why guys like Evgeny Morozov, who comes from the miserable country of Belarus, gets all jittery, and even fiercely aggressive, when he hears you talking about “technological solutionism.”
“There’s an app to make that all better.” Okay, a billion apps have been sold. Where’s the betterness? – Bruce Sterling’s keynote at SXSW 2013
(the ending of 2006’s Children of Men)
So the Reinhart-Rogoff fiasco needs to be seen in the broader context of austerity mania: the obviously intense desire of policy makers, politicians and pundits across the Western world to turn their backs on the unemployed and instead use the economic crisis as an excuse to slash social programs.
What the Reinhart-Rogoff affair shows is the extent to which austerity has been sold on false pretenses. For three years, the turn to austerity has been presented not as a choice but as a necessity. Economic research, austerity advocates insisted, showed that terrible things happen once debt exceeds 90 percent of G.D.P. But “economic research” showed no such thing; a couple of economists made that assertion, while many others disagreed. Policy makers abandoned the unemployed and turned to austerity because they wanted to, not because they had to.
So will toppling Reinhart-Rogoff from its pedestal change anything? I’d like to think so. But I predict that the usual suspects will just find another dubious piece of economic analysis to canonize, and the depression will go on and on. – Paul Krugman The Excel Depression
I wonder sometimes if Morozov’s disinformation campaign is a deliberate sabotage, an attempt to discredit those who are actually working to achieve the participatory ideal that he claims to be protecting. […] I don’t mind Morozov’s petty mischaracterizations of my motives; it’s what he does to garnish attention and I make a convenient target.
– Tim O’Reilly, responding to Annalee Newitz’s overview of Evgeny Morozov’s attack piece
Faust audiobooks conjure Mephistopheles doodles
Lord Beaconsfield cigar box.
This drawing seems appropriate for Easter.
This was my elementary school.
A doodle from the end of January when I was listening to a podcast about King Solomon.
Found this idea in some old papers – I used to want to brand Canada as a place of thinkers but then Harper won and there’s too much hockey.
The DJ booth is a repurposed pulpit.
That’s Dan Turner aka Sex Helmet at The Fountain on Dundas.
One ends up drawing economical-line Prosperos while listening to The Tempest audiobook.
The romance of alchemy in the artistic fetishization of jars.
(from my Twitter; related to my Timereading photo series)
I was finally able to download my Twitter archive yesterday, and I’ve posted it online. I plan on updating it every month or so.
I’ve always wanted to try a Segway so today @owlparliament & I did the Segway tour at the Distillery District. It was super fun. I loved floating along on that thing.
Field trip to the Water Treatment Plant with @owlparliament
I watched the remade Total Recall over Christmas. This is Commerce Court in Toronto’s downtown core.
Today I began work next door and after took a walk over to see it. Instead of an information kiosk, we have illuminated trees.
The church in Bathurst New Brunswick where I was baptized. The baptism itself looked like this:
The priest was my father’s uncle, Noel Cormier (died in 2010) and with my mother are my maternal grandparents. My grandfather died in 1993, and my grandmother is still alive at age 100. It was for her 100th birthday that I was in New Brunswick when the church photo was taken.