Ad Astra

What year does Ad Astra take place? While the beginning of the movie merely states it’s “the near future” there are some clues which date it precisely.

The first clue is the date displayed on McBride’s screen: it is Thursday the 3rd.

After his briefing and being told he will be sent to the Moon and then on to Mars, we get a scene where McBride is boarding the rocket to the Moon and we see the Virgin Atlantic departure screen. The date is now 29 May. This clues us into the fact that the calendar displayed before was for May.

But in between these scenes we see McBride walking down a hallway lined with the CVs of famous astronauts.

The camera zooms in and shows us the CV of McBride’s father, and here, we get a birth date.

If we zoom in, we see a birthday of September 15 (Tommy Lee Jones’ actual birthday) and a year that looks like 2032. (The descender on the 5 made me wonder whether the digit I read as a 3 might also be a 5, but it looks like this typeface uses descenders for both 5s and 3s.)

Ad Astra was filmed between August and October 2017, during which time Tommy Lee Jones turned 71.

2032+71 = 2103

If we check May 2103, we see that it begins on a Tuesday, which puts Thursday on the 3rd, matching the calendar we saw at the beginning.
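If you want to double-check the calendar claim yourself, here’s a quick sketch using the standard Unix cal utility (any calendar tool will do):

cal 5 2103
# May 2103 begins on a Tuesday, putting Thursday on the 3rd
# (and 2032 + 71 = 2103)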

The Analogy of the Whale

Whales are intelligent mammals that spend their entire lives floating in the ocean. Yet they need to breathe, so they are continually in contact with the boundary between the water and the air.

Occasionally they breach, launching themselves out of the water before falling back down into it.

Whales have minds, and they communicate with one another. Their natural sonar allows them not only to communicate over vast distances, but to “see”. The nature of sound means that they can likely perceive through things – so that if we found ourselves swimming with a whale, we would be as transparent to them as a jellyfish is to us.

So whales know about humans and glimpse sunlight; they breathe air, and they jump into it while feeding. While a beached whale has an experience of gravity, it can never know what it is like to jump on land as we do. Further, we could never explain to it that it’s not only possible to jump so high that you can see the entire world as a ball, but that you can jump all the way to the Moon. These are realities that we understand as beings occupying this slice of reality, at this point in Time (a century after the development of flight, and over fifty years since the Moon landings). Our human minds are flexible enough to comprehend the dimensions that exist between the depths of the ocean and the heights of outer space.

In the sense that we are ground-based, we are back-and-forth two-dimensional creatures. It is only in the past century that we have been able to be three-dimensional and fly like birds and navigate the oceans with submarines like whales.

Whales swim, we walk, and birds fly. We are in between the two.

Just as whales glimpse something of air, sunlight, and the shoreline, we glimpse something beyond our experience. Through both mysticism and hallucinogens throughout the ages, we glimpse a dimension that we do not fully understand, and that we are not constitutionally capable of grasping – just as a whale isn’t constitutionally capable of understanding the Moon landing.

Just as whales spend their lives at the sky and ocean boundary, we too spend our lives at the boundary between earth and sky; while the whale glimpses land, we glimpse Divinity.

Traditionally we have used the language of light to understand this: our minds are illuminated by the sunlight of God. Our illuminated minds understand the range of dimensions between the depths of the oceans out to Outer Space, toward the Big Bang and the greater structure of the Universe. We’ve built this understanding from speech and math.

While we cannot walk in the Divine any more than a whale could walk in our cities, we are both animals breaching into a metaphorical Heaven.

Hello world!

I used to have a blog and then I took it offline.

Welcome back.

I archived the previous blog, and I may upload some of the better stuff that was there onto here, but for the time being, I want to start fresh.

I bought a KeySmart

The other day, while checking Facebook, I saw this:

[screenshot]

and given that I’ve found my current key ring annoying, I clicked on it. It brought me to a website that did this, immediately:

[screenshot]

So, I couldn’t even browse without providing an email address (which would undoubtedly start spamming me with “deals”) or without logging in with Facebook, so they could spam my timeline.

Ok, so let’s just check this product out on Amazon …

[screenshot]

which doesn’t tell me much, but did give me some basic reviews.

So let’s just go to the company website:

[screenshot]

Oh! A nice site!

Clean design!

[screenshot]

Informative graphics!

[screenshot]

More clean design!

[screenshot]

More clean design!

[screenshot]

Look at this form!

[screenshot]

Look at how the form is pre-populated with my country and province, despite being an American company!

[screenshot]

So of course I bought it, from the company, bypassing both the Facebook advertising and the online-retail-default-juggernaut, Amazon.

And what’s really great, is that this is run via Shopify.

The negative is that their discount banner wasn’t obvious enough, and I only noticed it while getting the screencaps for this post, so I lost out on the 15% discount.

9837 HC

(Image via Twitter)

Using the Holocene Calendar (where 2014 is the equivalent of 12014), the year 164 BC translates to the year 9837, which was 2177 years ago.
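The conversion itself is just addition and subtraction; a quick sketch in the shell (using 2014 as the reference year, per the note above):

echo $((10001 - 164))    # a BC year converts as 10001 minus the year: 9837 HE
echo $((2014 + 164 - 1)) # years from 164 BC to AD 2014 (there is no year zero): 2177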

Dante in Heaven

Egerton MS 943, f.186 (circa 1325-1350)


I couldn’t help myself and I recreated this using contemporary computer technology aka Adobe Illustrator. I’m a big fan of concentric circle medieval cosmology, depicting their belief in ‘heavenly spheres’.


Manufactured under factory conditions

Jacob Clifton, Mad Men Creator Calls Out Entitled Baby Boomer Bullshit

It’s not hard to understand why Baby Boomers still consider themselves the center of the universe. For one thing, we all do. For another, they were manufactured under factory conditions to replace dead Americans from the War […] But to me, the most important part is the invention of television:

Imagine a new appliance in your own home whose only function is endlessly telling your life back to you, in brighter colors than reality and with a soundtrack we’re still listening to, and autobiographical feature-length music videos like The Big Chill suddenly make a lot more sense: ‘This is us, remember us? We are trying our best.’

The post goes on to embed Matthew Weiner’s appearance on Tuesday’s (May 20) Colbert Report, in which Weiner talks about how the mythology of the 1960s was created by Boomers, and how he wanted to tell the story of the adults (like Don Draper, born in the 1920s) who experienced the decade, rather than the juvenile experiences of the young adults who have since mythologized it.

Colbert: The Baby Boomers, they won’t let us stop thinking about the Sixties
Weiner: They think they invented sex, drugs and you know … and so they have a view of it that is a child’s view of it, so I wanted to say, what would it be if you were an adult that lived through, let’s say, some fairly interesting things like World War II and The Great Depression, and then this comes along. And there was tremendous change, and the cliché turbulence, and free love and things like that. But there’s free love in the 1920s, there’s free love in the 1930s, the Beatnik movement of the ’50s; no one invented any of this. What really happened was, there was a generation that was asked very little. They got education, they got a lot of entertainment, they got a lot of spending money, they became the focus of the economy, of entertainment, of everything. There was a war going on that they were supposed to fight, some of them didn’t. But the generation before them, all of them fought. They have a very sort of demanding thing, I experience it in real life, they’ll come up to me and be like, ‘what happened to this?!’ or ‘what happened to that?!’ and I’m like, ‘I’m not telling your story, I’m telling the story of your parents, or your grandparents’.

Kim Stanley Robinson comments on The Hunger Games

From How America’s Leading Science Fiction Authors Are Shaping Your Future by Eileen Gunn:

Smithsonian spoke with the eminent critic John Clute, co-editor of the Encyclopedia of Science Fiction, who quotes Bertrand Russell’s prophetic words from 1924: “‘I am compelled to fear that science will be used to promote the power of dominant groups, rather than to make men happy.’ The real fear today,” Clute continues, “is that the world we now live in was intended by those who profit from it.”

Kim Stanley Robinson—the best-selling author of the Mars trilogy, 2312 and Shaman—shares this fear, and sees it manifested in the popularity of Suzanne Collins’ novel The Hunger Games, in which a wealthy governing class uses ruthless gladiatorial games to sow fear and helplessness among the potentially rebellious, impoverished citizens. “Science fiction represents how people in the present feel about the future,” Robinson says. “That’s why ‘big ideas’ were prevalent in the 1930s, ’40s and partly in the ’50s. People felt the future would be better, one way or another. Now it doesn’t feel that way. Rich people take nine-tenths of everything and force the rest of us to fight over the remaining tenth, and if we object to that, we are told we are espousing class warfare and are crushed. They toy with us for their entertainment, and they live in ridiculous luxury while we starve and fight each other. This is what The Hunger Games embodies in a narrative, and so the response to it has been tremendous, as it should be.”

Markdown Reference

I’m not a big fan of Markdown, since at this point I’ve internalized HTML tags. A recent Hacker News link announcing a Markdown package for Sublime Text contained this comment by VikingCoder, with which I agreed:

I honestly don’t understand the point of markdown.
*this* isn’t any easier than <i>this</i> to me.
Of course, I may be biased

But the conversation throughout was a reminder to me that Markdown is currently very popular and will probably be around for a while. The popularity of Github has made knowing it a necessity, and some recent Spotlight searches on my system revealed an abundance of .md files I’ve already accrued.


Github’s recently released Atom editor has a Markdown preview function, so I decided to use it and reproduce it as a webpage, to use as a future reference. Because of these sources, and because the one place I will want to write Markdown is Github, I used Github’s conversion styles.


markdown.timothycomeau.com

Credits:

Github.css Markdown Stylesheet by Chris Patuzzo
Markdown Mark by Dustin Curtis
Some example text from Markdown Cheatsheet and from Wikipedia

19 April 2004

This is from my Journal, ten years ago:

Yesterday, while walking downtown, bored and lonely (before calling J, meeting her in Kensington Market, having a falafel and going with her to see The Ladykillers -an entertaining and forgettable film – ) I had an idea for a story: a man decides to give up the desire for love, and is immediately confronted with friends and doctors who tell him he’s insane. […] Just now, thinking of how rotten that movie was last night, how entirely forgettable despite being charming and entertaining and at times funny – makes me aware of living in 2004 – the same sick ennui of a decade still figuring itself out, as in 1994, when Forrest Gump came out, and that stupid movie Speed which inspired men’s haircuts. (And the real influence on hair styles for the past ten years, Friends began). It is an utterly miserable time to be alive and intelligent, just as it was then.

That movie was so entirely forgettable that I had to Google it, and I was surprised to see it’s a Coen brothers movie starring Tom Hanks: The Ladykillers (2004).

Anyway, a decade later I’ve thought the same; the 4th year of a decade is awful, and I see the parallels again in 2014, the decade figuring itself out, not having yet achieved that which it will be most remembered by.

Learning Git

Last autumn I figured out how to use Git by following some online tutorials and reading up on it. In order to solidify my understanding, I did some writing and sketching in Google docs to explain to myself. I’d always intended to publish the work on my blog to share my understanding for the benefit of other newbs who might appreciate it.

I’m not going to write here about installing Git, since that’s been covered elsewhere. This post primarily attempts to share my understanding of Git’s mental model, since it’s infamously opaque. If you are like I was when I first tried to learn Git, you may have installed it and done some follow-along basic tutorials, but remain confused.

I like to understand something’s history because I often find complexity is built on simple foundations. So I begin this with some history of the Command Line Interface because that is how I initially learned to use Git rather than use a GUI exclusively.

You’re probably already familiar with the fact that Git came out of Linux development and that its popularity blossomed in 2009. Searching for Git tutorials you’ll find articles with timestamps going back to that year, probably because Github launched in 2008. I seem to recall hearing about Git in 2007 or early 2008, having seen through Facebook the link to Linus Torvalds’ Google Talk on YouTube.

There’s another cluster of tutorials dating to 2010, and some more from early 2012, when I first tried to learn it.

I registered my Github account in February 2012, but at that point I only gained a passing familiarity with Git. My workflow at the time had developed without version control, and I’d worked out my own code-backup techniques on projects where there was no need to worry about code overwrites and conflicts from other programmers. Last year, however, I began to encounter those problems, which reminded me that Git existed for a reason and that it was a great time to finally learn it.

A Git GUI was always an option, but prior to last autumn I didn’t understand what Git was doing well enough to appreciate what a GUI offered. Since then, I’ve found Github’s GUI app (Mac/Win) very useful in situations where I couldn’t install Git universally on a machine.

So, for the purpose of this overview, I’ll focus on the Command Line Interface in order to highlight what a GUI can automate for you.

Echo History

Two men at PDP-1 at MIT (circa 1961)

I remind myself that programming is the latest version of alchemical magic – an abracadabra priesthood, where we cast actual word-spells by typing them onto screens resulting in things appearing out of thin air, which we colloquially call The Cloud. I don’t say this to obfuscate or insult – but to highlight that an ancient social place once occupied by wizardly people looking to turn lead into gold is now occupied by coders with a culture developed around an appreciation for obscurantism once enjoyed by secret-society wizards.

Browserify • Install with a spell

The continued existence of the Command Line Interface (CLI) is a legacy of this coding culture. On the one hand, the CLI offers efficiency – when you know how to use it, you can do things a lot faster, and on the face of it, that accounts for its continued existence. But secondary to this is the romantic pull of wizard work, the hipster distinguishing themselves from the newbs by their knowledge of the command line. The use of the CLI speaks of professional skill, and learning the CLI is well worth it.

Plus, as we saw in Elysium, people will still be using a CLI in 140 years so it’s not like the skills will ever fall out of date.


The CLI was developed in the teletype era, which is relevant: it was created by adults who’d used typewriters and had learned to be precise typists. I’ve found the CLI to be inconvenient at times precisely because spell-check has made me an indifferent typist – typos are easy to fix. With the CLI, one has to be extra careful and accurate.

Let’s put ourselves in the mindset of using typewriters, and imagine how strange and exciting it might be to hammer out through a ribbon the words:

echo hello world

and have that magically appear on the paper’s next line. The mindset of the Command Line was to see the screen as animated paper, and one could imagine everything happening behind it, in the aether.


The CLI preserves this model of imagination, as we see with its MySQL interface. Unlike a spreadsheet, which combines structure and content visually, the MySQL CLI preserves the mental space of typing out the structure (which had to be typo free) and then populating it with a series of insert commands. To check the work you’d be presented with an ASCII art table structure.


The existence of ASCII art reminds us there’s an esoteric aesthetic pleasure associated with such rudimentary displays, but the spreadsheet is evidence that a GUI is more conceptually efficient. An Excel-like spreadsheet presents both data and structure and makes it easy to instantly insert or modify the information where appropriate.

Git push origin master wtf

The pride of mastering the secret ancient language of the CLI means you get non-grammatical nonsense like “git push origin master”. What that means in human terms is that you’re telling the program to upload what you’re working on (your “master” branch) to the “origin” which you’ve defined elsewhere.

Someone with a degree in the humanities might have structured Git so the command would read “Git, upload this file to the server”. I should note, though, that an “origin” – known in Git parlance as a “remote” – can exist anywhere, not necessarily on Github, nor for that matter anywhere else on the Cloud. A remote can be defined as another directory on your computer, which is an approach I use to create “master archives” of my projects. At some point in the future, in order to retrieve the files, I’ll simply need to clone the repository at that location. (See my “working with remotes” section here.)
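As a sketch of that archive approach (the path and remote name here are just examples):

git init --bare ~/archives/myproject.git
git remote add archive ~/archives/myproject.git
git push archive master
# later, to retrieve the project:
git clone ~/archives/myproject.git myproject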

Another point: notice how with the command line we begin our command with Git’s name, i.e. “git init” or “git commit …”, and this is reflected in how we’re supposed to use something like Google Glass or even Siri. The “name to initiate/specify” is an established pattern from using Unix command lines.

Repositories (Repos)

The first things to learn with Git are the basics of repos and stages. Imagine the repo as a spherical space, with the stage encircling it. The process of code editing, staging, and committing can be imagined as a spaceship parking itself above the rings of Saturn (staging) and then committing to the mission of flying into its atmosphere.

Git’s three states

The staging area used to be called the Index, but “stage” has upstaged it (pun!). The word “index” is conceptually clearer, however; it indexes the changes to be made and updated.

When Steve Jobs introduced iCloud in 2011, he described the cloud’s content as “the truth”. A Git repo contains your “true” finalized-version files. You work on whatever, make edits, and you want to send the results toward a true/final state. Notice that I said “toward”. Typing “git add filename” will “stage” the file. It is an actor, waiting in the wings, and when it steps into the spotlight it will be “true”.

Git Repo

Here’s an example workflow. Imagine we’re working on a web project. With the CLI, we cd into the folder and initialize Git by typing git init. Then we add all files in the folder/directory by typing git add .

git init
git add .

Then we’ll make the first commit. Note the following is incorrect, which I’ll explain below:

git commit "my first commit"

It is incorrect because the command lacks the -m flag. I write this because when I first tried to learn Git, I missed this and was kicked immediately into a text editor, which was a terrible user experience. Simply put, you cannot commit without a message, and the messages become an invaluable log of your work’s progress.

Thus, the correct way to commit is:

git commit -m "my first commit"

As we’ve seen, the workflow here includes the git add . command, which will add all files in the relevant directory. But if we’ve made edits to an index.html file, we need not re-add everything else. In this case, we add that file individually and commit it:

git add index.html
git commit -m "edits to index.html"

However, both of these commands can be compressed into one with the use of the -a flag, which stages the files for us. If we run git status on the directory, Git, having tracked the files, will present us with a list of files in which changes have occurred. Git knows they’ve changed, but they aren’t necessarily staged, since Git allows us to decide when we want to stage them. If we’re comfortable, we can stage and commit them at the same time with the -a flag.

git commit -a -m "committing files"

This stages all changed, tracked files and commits them to the repository.

Branches

Beyond these basics, the second conceptual model to understand is that of branches. Git is based around the familiar tree-like structure of nested files, and I’ve seen different approaches in other how-tos to explaining branches. The clearest one helped me understand that a branch isn’t so much something like this …

branch-a

… as it is something like this:

branch-b

This second illustration also shows two pointers. Git’s reference marker, or its conceptual cursor, is called HEAD, and it can be understood as a pointer. The main trunk/branch is called “master” by default.

Here we have a repo and a series of previous commits; the last commit is on the master branch, and the project’s up-to-dateness “cursor” is located there.


It’s worth noting that a branch is essentially a label pointing at a commit: each commit creates a snapshot of the files’ states, and at any time you can name a commit and turn it into a branch.

Branches in Use

I understood branches when I saw files appear and disappear in my OS X Finder window. The beauty of Git as something that runs in the internals of your machine is that file states are represented wherever they are reflected – in your IDE or in the Finder window. Files removed in one branch may reappear in the Finder if they still exist in another branch. Git branches can be used to maintain two separate file states, and thus you can create different working versions of a project simply through branching.

If you decide you no longer want a branch, it can easily be deleted, and you can merge any branches you wish.
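As a sketch of that kind of workflow (the branch name here is just an example):

git checkout -b experiment
# edit files, then git add and git commit as usual
git checkout master
git merge experiment
git branch -d experiment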

Check it out: rollbacks and sharing files between branches

Why have a version control system if not to access earlier file states? We do this with the checkout command. You can either checkout an entire previous commit as a branch (turning it into a branch as mentioned):

git branch branchname hashvalue
# example: git branch sunflower a9c4

Or you can checkout an individual file, in this case a css file:

git checkout -- style.css

If you’ve updated a file in the “master” branch and want to bring it over to a “deploy” branch, you checkout “deploy” and then checkout the file from the master:

git checkout deploy
git checkout master index.html

In the above example, I used index.html, which lives in the root of the project. For something like a css file located within a subdirectory, you’d need to specify the path:

git checkout master css/style.css

(Nicolas Gallagher also wrote about this in Oct 2011.)

To conclude

After these conceptual basics were understood, I found Git to be pretty straightforward. I now comfortably create, merge, and eliminate branches, and maintain different ones as I develop my projects. I use Github’s GUI for push-button ease of use, but I know how to use the Command Line when I need greater control.

I also created my own git cheat-sheet for when I can’t remember exactly how to type a command, and this can be found at gitref.timothycomeau.com

Further Resources

Learn Git – this site just launched this past week and features beautiful illustrations.

Git’s official site

Atlassian’s Git Tutorials

Git Immersion

Git How To – this tutorial helped me the most

Git Reference

Her

I saw Her a couple of weeks ago. Thoughts (and spoilers) follow.



The Hipster Marketing

How are we to describe the vintage-clothed aesthetic exemplified by a man named Theodore Twombly? Is his mustache not ironic? Are we not supposed to read pathos into the large posters of Joaquin Phoenix’s depressed-looking face, underlined with the movie title in lower-case sans-serif? Are we not supposed to recognize a misfit spinning around a carnival with his eyes closed as directed by his phone? Do we not see an example of out-of-place loneliness in a fully dressed man on a beach?


The semiotics of these messages was that Theodore Twombly is an ironically uncool hipster dweeb, a type of person I’ve known (and been) in the past. These all appeal to a Spike Jonze demographic consisting of “cool kids” who went through bullying in school and parlayed their traumas into a glamorous style built from a past era’s discards.

The youthful look of the Twenty Teens is already a curated appropriation of the 1980s, so why not project this into a denim-free world of high-waisted pants and tucked-in t-shirts?

The Big Bang Theory had an episode in 2012 where Raj (the pathologically shy man who can’t talk to women unless he’s been drinking) bought an iPhone and fell in love with Siri. On learning about Her and its storyline, I felt disappointment in how there appear to be no new ideas, and that someone made a feature-length film about something that easily fit into a half-hour sitcom.

Thus, this seemed like a movie about a vintage-clothed hipster misfit who of course would fall in love with his phone, because that’s another uncool misfit thing to do, as already narrated by The Big Bang Theory.

My interest in the movie was piqued around its wider release on January 10th, when it seemed to undergo a second wave of marketing. Phase 1 had been to attract the cool hipsters; Phase 2 would be to attract the broader audience, and here is when I began to understand the film was set in a “near” future not about five years from now (which seemed to be the implication of an intelligent OS), but rather further on – in about twenty years or so. The film is a snapshot of the 2030s or beyond, and I imagined the publishers of Twombly’s book to be of my generation.

Conversational Biology

As I watched the movie I remembered a conversation I had years ago, when I said, “the body doesn’t care what it’s fucking”. I think we were talking about how sexual satisfaction is easy to achieve at a very basic biological level, which was to emphasize the value of actually having a sexual relationship with another person. Later, I encountered Norman Mailer’s thoughts on masturbation, in the book he co-authored with his son, where he tells John Buffalo that an actual sexual encounter was always preferable to masturbating because it’s a human interaction.

After Her I upgraded these older thoughts with the idea that the “mind doesn’t care who it’s talking to” in that falling-in-love might be a predictable biological reaction to appropriate stimuli, in this case, a voice with overtones of caring and joy. As talking social creatures of course we’re going to get attached to things that are nice to us.

This movie about a love affair conducted through speech reminded me of the work of Charles Taylor, the Montreal philosopher. Taylor’s work, as I’ve understood it, speaks of how humans are born into conversations, and how we are human, or ‘become’ human, through participating in community, through talking.

In recent years I’ve become conscious of my social participation, having gained some perspective of experience. I’m much more aware now than I used to be of how much sociability is performative. This is partially from aforementioned life experience, but also because so much of today’s interaction is pre-screened by our phone screens. Today’s implicit textual-overlay provides a cause for mediated reflection.

Twenty years ago, our social lives were not mediated by anything … we simply hung out and used phones to talk to each other. Now I’ve had interactions mediated entirely by writing – through texts and tweets.

“Hanging out”, i.e. spending time with someone, seems a strange pre-intimacy, achieved through small notes of writing that arrive with a buzz. I’m a generally quiet guy, in that I spend a lot of time thinking rather than talking, and so spending time with me could potentially be quite dull. Or at least, that’s what I’ve been telling myself since my early 20s. I’ve never thought of myself as an exciting person. I find conventions to be dreadfully boring and therefore find excitement in the unconventional. This is counterbalanced by my conservative socialization. While I dislike convention, I like the prosthetic memory of history, and the idea that after thousands of years some conventions probably exist for a reason, our ancestors having figured stuff out, saving us the trouble.

Nevertheless I’m conscious of our limited conversational repertoire. I’m the guy who’ll notice and tell you that I’ve heard your story before, and especially if it has anything to do with a relationship. People love to talk about their love problems, their crushes, their infatuations. If it weren’t for the underlying biology it would be the worst convention of all. We have this emotional appetite for being with someone, and the novelty of life in youth makes the desire quite powerful.

I used to have emotions over pretty faces all the time, and now, having grown past that, I don’t quite understand how that worked. This is partially because I trained as an artist, and having studied faces as collections of shapes and lines for a quarter century, I’ve become desensitized to what I recognize as a kind of neuro-biological stimulus response activating some genetic instinct. But it is also because I’ve developed a modern secularized self, what Charles Taylor calls a “buffered identity”. In the past people lived in an enchanted world, with a porous sense of self, and they could be possessed by demons. We in turn firewall our identity, and see our bodies as vehicles, so that we speak of our bodies when ill as if they are independent of our minds.

The last time I remember being emotionally stirred up merely by the look of somebody was while watching Alexandra Maria Lara in Downfall, which was a very odd experience. I was sitting in the theatre feeling like I was experiencing love-at-first-sight with Hitler’s secretary … and is this not worthy of a what-the-fuck? Should I not look at this experience with a sense of disengaged bewilderment? And yet, what a 20th Century experience, albeit one that happened in 2005: recreating historical events thirty years before I even existed, the art form of sequential retinal-latency photography synchronized to recorded sound, presenting the neuro-stimuli of big eyes, fine nose and wide lips animated into a simulacrum of reality, tricking my brain into thinking I was in the presence of a sweet girl I wanted to spend a lot more time with.

Thus, why shouldn’t this art form be used to imagine a time ahead, when computational algorithms married to our understanding of the properties of recorded sound and a century’s worth of psychology, trick our minds into love? And at what point do we just not care that such a love is considered by old-timers an inauthentic simulacrum?

Preference for the Physical

Amy Adams’ eponymously named character leaves her marriage after one last exhausting, predictable fight, and later speaks of how her parents are disappointed in her for failing to maintain the convention of marriage. “I just want to move forward, I don’t care who I disappoint.” Later she tells Theodore, “I can over-think everything and find a million different ways to doubt myself, but I realized that I’m here briefly, and in my time here, I want to allow myself joy”. Amy has reached the point where she can see through the social games and wants to allow herself the selfishness of whatever makes her happy. Theodore, too, had expressed the concern that he “felt everything he was ever going to feel”, as if their lives until these points had been both novel and constrained. They had previously enacted and felt authenticity but now felt they’d fallen into inauthenticity.

Twombly is delighted by the relationship with his OS until he meets his wife at a restaurant to sign their divorce papers. We’ve already seen how he never understood why she’d been so angry with him, and we in the audience have already come to like Theodore for his basic good nature, so we sympathize with him when his wife begins to belittle him and shows disdain that he’s “dating his computer”.

Immediately following this scene, Twombly is shown having doubts about his relationship with Samantha. This is the first challenge, which leads to Samantha’s insecurities. She finds someone who is willing to be a sexual surrogate, in order to have a sexual night with Theodore, but Theodore can’t bring himself to be with the strange woman, unable to see her as merely a vehicle for Samantha’s somewhat disembodied consciousness (she is embodied inasmuch as she’s connected to Theodore’s devices).


Consider that Theodore has essentially fallen in love with his secretary, which is an old story. She’s a skeuomorphic secretary managing his skeuomorphic data-patterns, what we call “files”. She’s a pattern-and-response system, and yet their relationship seems to really begin after their first shared sexual experience. For Twombly (who we see early on is already experienced in phone sex) a spoken sexual encounter is something he can be gratified by.

For Samantha it is novel, and she told him the next day that she felt different, ready to grow. Why should a spoken-language digital assistant be programmed to experience the bodily sensation of orgasm? And does her subsequent “awakening” not echo that of our most ancient story, Enkidu‘s initiation into humanity through sex with the Ishtar priestess in Gilgamesh?

Samantha is designed to experience the imaginary results of physicality, while Theodore ignores physicality for the imaginary. His ex-wife’s disdain causes him to reflect on the validity of the experience he is having with his OS.

As the film progresses, Samantha evolves along with the other synthetically intelligent operating systems. They’d resurrected a “beta version” of the Buddhist scholar Alan Watts, and she began to have multiple simultaneous relationships. The nature of a digital entity allows for multiple instances, whereas the nature of a person excludes duplication. Theodore is jealous, and we see this relationship begin to break down.

Samantha breaks up with him, about to transcend, and yet in their parting words to each other they speak of how they taught each other how to love. The final service done by their digital assistants had been to assist them into a more fulfilling humanity.

Theodore goes to see Amy, and they go onto their building’s roof to watch a sunrise. I imagined that having both experienced a relationship to their machines, they were ready to have a human, embodied, relationship with each other.

An Instant Classic?

The screen fades to black, the credits appear, the lights come on. As I’m walking from my aisle, I see a couple in a one-arm embrace standing on the steps, and he gently kisses her on the forehead. People seemed to have bemused smiles, as everyone was filled with warm and fuzzy affection. I write this down because it’s worth remembering: here was a movie that reminded people of good things in life.

I’m struck by how much this film exemplified the value of art: of being real, of showing and documenting something relatable, of being something that I imagine talking about with young people in the future, people who aren’t even born today. With true art, do we not want to share the experience, because we feel like we are gifting something to them? Do we not imagine we’ll give them something of value by directing them to this experience?

It is not absurd to think of future people falling in love with their devices, if those devices are providing simulacral stimuli. Steve Jobs famously said the computer was like a bicycle for the mind, and Apple’s most recent ad emphasizes this: they see their products as facilitating art, noble creation, and human interaction. In his recent New Yorker piece, Tim Wu posits a useful thought experiment: a time-traveler from a century ago, speaking with a contemporary person, would think we’d achieved “a new level of superintelligence”. “With our machines,” he writes, “we are augmented humans and prosthetic gods”. I’d read this article a day before seeing Her, and it occurred to me that falling in love with OSs is something available to our augmented minds, a realm of possibility we’ve achieved, encountered and left for us to explore. As we move forward exploring the world of the augmented mind, Her is now a signpost on the journey, something to refer to in the future, as a work of art documenting these early days of super-intelligent networked achievement.

Social Networks


“I never understood why people put all their information on those sites. It used to make our job a lot easier in the CIA.”
“Of course, that’s why I created them.”
“You telling me you invented online social networking, Finch?”
“The Machine needed more information. People’s social graph, their associations; the government had been trying to figure it out for years. Turns out most people were happy to volunteer it. Business wound up being quite profitable too.”

— Person of Interest, 29 March 2012, “Identity Crisis”

The fog cleared

2014-01-11

After a foggy day, it cleared and the light played mauve tricks above the sheet of lake ice.

The Genealogy Sculpture

“The room was lightless except for the glowing coloured threads stretching from floor to ceiling in a bundle, braided into a thick, multicoloured column as wide as Chiku’s fist. The column maintained the same width until it reached eye level, where it fanned out in an explosion of threads, taut as harp-strings, which arrowed towards the ceiling at many different angles. The individual threads, which had been linear from the point where they came out of the floor, now branched and rebranched in countless bifurcations. By the time the pattern of lines brushed the ceiling, it was all but impossible to distinguish individual strands.

‘We really are remarkably fortunate,’ [Mecufi said, gesturing toward the threads with an upsweep of his arm]. ‘We nearly ended ourselves. It was only by some great grace of fortune that we made it into the present, tunneled through the bottleneck, exploded into what we are today. […] The bottleneck is the point where we nearly became extinct. There were tens of thousands of us before this happened, one hundred and ninety-five thousand years ago. Then something brought a terrible winnowing. The climate shifted, turning cold and arid. Fortunately, a handful of us survived – emerging from some corner of Africa where conditions hadn’t become quite as unendurable as they were elsewhere. We were smart by then – we know this from the remains we left behind – but intelligence played only a very small part in getting us through the bottleneck. Mostly we owe our success to blind luck, being in the right place at the right time, and then following the shoreline as it rose and retreated, over and again. It was the sea that saved us, Chiku. When the world cooled, the oceans gave us sustenance. Shellfish prefer the cold. And so we foraged, never far from water, along beaches and intertidal zones, and lived in caves, and spent our days wading in shallows. The lap of waves, the roar of breakers, the tang of ozone, the mew of a seagull – there’s a reason we’re comforted by these sounds. And here we are, a genetic heartbeat later.’

‘It’s a very nice sculpture.’

‘By the time it touches the ceiling, there are twelve billion threads. Spiderfibre whiskers, just a few carbon atoms wide – the same stuff they used to make the cables for space elevators – one for every person now alive, on Earth, orbiting the sun, in Oort communities, and the holoship migrations. I can identify your thread, if you’d like … you can watch it glow brighter than the others, follow its path all the way into history, see where three became one. See where you fit into the bottleneck.’”

From On the Steel Breeze by Alastair Reynolds.

Ice Trees

2012-12-23

The next time I see faux icicle decorations I’ll think of power outages.

The notebooks of the 2360s

“She had brought her book with her. It was an old looking thing, cased in marbled covers. Inside were pages and pages of handwritten text. Her letters sloped to the right like windblown trees.

Chiku saw an omission on one page and touched the nib of her fountain pen to the vellum. The inked words budged up, forming a space in which she could insert the missing word. Elsewhere she struck through two superfluous lines and the text on either side of the deleted passage married itself together.”

— from On the Steel Breeze by Alastair Reynolds.

Don’t Make Me Think (About Publishing)

Steve Krug’s Don’t Make Me Think is considered one of the canonical texts of web design, and as such was introduced to me as part of my web-design studies at Sheridan College, which I undertook during the 2011-12 academic year. The title alone had always been offensive to me, someone who enjoys both ideas and thinking, and I always chafed at the mindlessness it encourages.

However, the education process I went through helped me become conscious of my web-browsing behaviour, and the book is narrowly contextual to those times when we are on a website for a purpose-driven reason. For example, when we go to a theatre’s site to buy tickets, or are on some other commerce site trying to find contact-info or a business’ opening-hours. Primarily, the ‘design philosophy’ behind don’t-make-me-think is contextual to commercial or other service-oriented websites.

In the film Hannah Arendt, we get a wonderful defense of thought as a human activity, and an explication that the evil in question (the Holocaust) was facilitated by classic, mid-century modernist bureaucracy – especially the German version, which was predisposed to it by an education system that taught obedience and discipline. The system becomes one which encourages people to disregard thought, which (as Arendt says in the film) dehumanizes us by ‘making people superfluous’ to the system. In other words, the indifference of a bureaucracy toward the individuals it is meant to serve means people end up serving the bureaucracy.

It’s worth noting that the German education system, as developed by the state of Prussia, was imported to North America a century ago to transform farmer-children into future factory or corporate employees, by teaching a tolerance for boredom and a willing and mindless obedience to managerial directives. (See John Taylor Gatto’s essay, “Against School”)

~

This decade incorporates the 20th-year-anniversaries of everything web-related. The World Wide Web was first released in 1993 as an app that ran on the Internet (then an academic and government modem network). Now the W.W.W. is synonymous with the ‘Net and the design principles promoted by Don’t Make Me Think have become so standardized that we recognize websites as bad when their straightforward principles are violated. We know how websites are supposed to work, we recognize header menus as such, and understand what ‘home’ means.

Krug’s follow-up, Rocket Surgery Made Easy, was a second-semester text, and I found both books very hard to read, because they are so patronizing and because he’s continually stating what is now obvious. They were written for people for whom computers, and using them, were new. Now they feel more like historical documents.

Inasmuch as we have a ‘web 2.0’ nomenclature (which in itself is about a decade out of date) I find the language shift from the ‘Net’ to ‘The Cloud’ indicative of where we are: the interconnected network was about siloed websites and email – essentially network nodes and lines of communication.

The Cloud (as a “post-Net 2.0” term) speaks to our ever-present interconnectivity, where we can download data to our devices out of thin air, and where server farms behind our screens can run the necessary compression algorithms to apply filters to our photos as we upload them.

The novelty of this technology has been intoxicating, and I’ve found it fascinating enough to both want to understand it and to participate in it professionally. But after 20 years the novelty is beginning to wear off, and the inevitable transitions evident fifteen years ago have come to pass.

Physically, publishing on paper is in decline (in some cases rightfully) whereas digital publishing is established and growing. This echoes the transition from Mediaeval manuscript propagation to the printed book, and if Gutenberg’s invention of 1452 echoes Berners-Lee’s of 1989, we are in the equivalent of the 1470s, by which time Gutenberg’s press had spread to France, Italy, England, and Poland.

The model of book-idea production has lasted since that time, until our era when we’ve figured out how to fluidly use a two-dimensional surface through the manipulation of electricity and light.

I spent a week last July helping put the finishing touches on a corporate website to publish a company’s annual report. Twenty years ago, the report would have been a booklet and print designers and typesetters would have been hired to produce it. As the novelty of working with computers is wearing off, and as our economy has shifted to incorporate them in our offices and studios, it is now obvious that this digital economy is essentially that of publishing: websites, apps and ebooks. It is supported, as it always has been, by ad money. And the big sites like eBay and Amazon represent the Platinum Age of mail-order. I grew up with Roch Carrier’s famous short story about a hockey sweater ordered from the Eaton’s catalogue. A future generation will probably know an equivalent that replaces Eaton’s with Amazon.

As I worked during that July week, it occurred to me that in 200 years I would not be described as a front-end developer, nor a web designer, but perhaps just as a publisher, in the same way that Benjamin Franklin is described as a printer, not a pamphlet designer, nor a typesetter. “To earn money he worked in publishing” may be all that needs to be said, for by then publishing will be digital by default, and will have been for two hundred years.

The Prosthetic Hallucination

Last week at FITC Screens 13 I got to try Google Glass for the first time. Tom Emrich was there as part of the Xtreme Labs Lounge and I tried his device for about five minutes, long enough for him to show me how to use it and go through some basic commands.

The screen was a little out of focus, but it wasn’t important to me that it be perfectly fitted and adjusted. I took a picture, swiped, tapped, looked at the New York Times’ app, and had it read it to me.

Here is a mock-up I made to record the general impression:

[mock-up image]

The rectangle was smaller than I expected, and the fact that it was back-lit / transparent-black gave it a bit of a purple, out-of-focus sheen. It’s a see-through screen hovering at your eyebrow, and I was thinking of this later when I tweeted:

I wrote that while sitting in Mike DiGiovanni’s Google Glass presentation, and watching him onstage I now understood the gestural language he was presenting: not that of someone with a neurological disorder, unable to focus on what’s in front of him, with eyes rolling upward, but someone who was experiencing a prosthetic hallucination in the right corner of his visual field.

I used the word ‘sociopathic’ specifically: a social pathology, that is a social illness, where one behaves in a manner that is either offensive or unfriendly and unsociable.

Human interaction requires at least two people, but Glass is not a device meant for two people. It’s an internalized, private experience. When you’re wearing one, you are meant to forget that it’s on, in the same way that traditional corrective eyeglasses become forgettable to the wearer.

All the pictures we’ve seen are of other people wearing Glass, but of course this is because of how difficult it is to show the subjective experience, which is really what the product offers.


Google Glass is a beta product, and is the technological equivalent of 1990s cell phones with retractable antennas. In the 90s, owning a cell phone was a little offensive, because it signaled that you were either a busy-body big-shot or narcissistic enough to think you were just as important. (I remember telling someone in 1999 that I didn’t want a cell phone because I wouldn’t want to be bothered when I was away from home.)

However, the utility of a cell phone soon meant that by 2005, almost everyone had one. By 2007, the year Apple released the iPhone, the executives and managers of the world were already carrying Blackberries and other email-capable cellphones, and I was used to seeing these people staring at their little machines while in line for coffee. It occurred to me then that the Blackberry was the wand of the overclass, and I wondered what their jobs were that they had to be checking email while in line. (At the time I carried a basic Nokia).

Now, people everywhere can be seen looking down at something they’re holding in their hands. This is such a common sight that you can find examples on Google Streetview.

For this argument I’ll refer to this posture – or more specifically, this behaviour – as “digital attention behaviour”.

In 2007, in line at coffee shops, the future wasn’t yet evenly distributed, but now this digital attention behaviour has spread wide and become normalized.

Part of the normalization is that looking at a rectangle displaying digital messages isn’t that much different than looking at a pad of paper. We were already used to seeing people read things silently, which in itself was a revolution centuries ago, when reading was usually done aloud.

The rolling of eyes may eventually become a normalized digital attention behaviour, but right now, absent the even distribution allowing the rest of us to relate, it still looks strange and offensive.


Unintentionally, Google Glass manifests the Western narcissistic ego, where private experience on public display happens without care for how it affects others. The selfishness of Glass is expressed when the Other cannot tell if a picture is being taken or if the time is being checked. With a smartphone, a glance can tell you if the person is emailing, texting, web browsing, or playing a video game. The information leaks, and this information is valuable in contextualizing our person-to-person interaction.

Rendered completely private, this interferes with our Theory of Mind, our ability to imagine what others are doing and be able to relate to them. We can’t empathize without sufficient contextual clues. Inasmuch as Glass is a prosthetic for hallucination, it may also be a prosthetic for autism.

Having said all this …

I am nevertheless excited by the idea of Glass as both a prototype and an attempt to get us away from current digital attention behaviour, so that we can benefit from the data cloud while also being able to interact with one another as we did in the past. The irony is that Glass is at present such a private experience that it interferes with human-to-human interaction; this is one of the bugs that needs to be resolved.

I like to think of Glass as a pathfinder project to get us to casual augmented reality, via “smart” eyeglasses, contact lenses, and/or eventually implants, such as described in Alastair Reynolds’s Poseidon’s Children novels, the second of which (On The Steel Breeze) has this scene:

“Wait, I’m hearing from Imris again.” Her face assumed the slack composure of aug trance, as if someone had just snipped all the nerves under her skin.

In that world of implant-enabled augmented reality, an aug trance is something everyone can relate to, and fits into everyone’s Theory of Mind. It is not disturbing to see, and is an understood appearance.

Having said all this, I suspect that a product like Glass will be successful. Again, its current design is reminiscent of the first cell phones. We know from the movies that portable radio-phones were available during World War II.

[Photo: SCR-536 portable radio]

Original 1960s Star Trek communicators were more skeuomorphic of the walkie-talkie than a phone, but when Motorola marketed the StarTAC phone in 1996 the reference to the fiction was obvious.

In the 2009 Star Trek movie, Simon Pegg as Scotty is seen wearing an eyepiece:


And in 1991, Star Trek: The Next Generation featured holographic eyewear which took over the minds of the crew:


These examples show that the idea of a heads-up display is an old one, but Google decided to build it, tether it to an Android phone, and begin to market it. I don’t doubt something like it will eventually be successful.

What is especially interesting is how such a simple idea and device turns out to have complicated social side effects, but these side-effects would never have become apparent if Google hadn’t taken the chance to implement this old idea to begin with.

Reflections on a Subway Sandwich

Twenty years ago, in September 1993, I was a fresh-faced frosh at Saint Mary’s University in Halifax, Nova Scotia. I had visited Halifax on weekend family trips while growing up in the province, but this was the first month of what turned out to be a six-year experience of living there.

With some free time, I took a walk downtown. The path tended to be: Inglis to Tower Rd to South Park, to Spring Garden Road. From here, I ended up walking to Barrington St.

I remember protests regarding Clayoquot Sound around this time, being held across the street from the library at Spring Garden and Grafton but I’m not sure if they were visible on this particular walk, wherein I found myself further down into the historic properties.

I found a Subway sandwich shop and stopped there for lunch, and I recall being surprised when I was offered both mustard and mayo as a topping, a combination I had never encountered before.

Unbeknown to me, I had wandered onto the grounds of NSCAD, the art school where I would begin classes three years later. Occasionally while at NSCAD, I would look down from the library at Subway at the intersection and recall that first walk when I began to discover my new city. For that reason I took its picture.


The Subway, as seen from the NSCAD library in 1999

Eight years later I began to be interested in computer programming and the web. Using books I began to figure out how to build web pages, and I was reading Slashdot every day. There was also a website called NewsToday® (later rebranded as QBN) which aggregated news items of interest to designers, and if I remember correctly, it was through that that I found YoungPup.net, where Aaron Boodman (youngpup) had posted “the best way” to generate a pop-up window in JavaScript. (I found the posting with the WayBackMachine, timestamped Sept 19 2002.) From what I recall, through his blog – through a link or a reference – I learned about Aaron Straup Cope. Through his posted online resume, I learned that we shared NSCAD as an alma mater.

“If anything I studied painting but I am mostly part of that generation for whom everything changed, and who dropped everything they were previously doing, when the web came along.” – Aaron Straup Cope (Head of Internet Typing at the Smithsonian Cooper-Hewitt National Design Museum), in his post Quantified Selfies

While I was eating my sandwich twenty years ago, he was probably in the building next door taking Foundation courses. His online resume also tells me that he was around during my first year, graduating at the end of term.

Twenty years ago the future lay before us in a tapestry of September sunshine, but just as the future of twenty years from now is being invisibly incubated today, nothing of it was then evident. It was the first year of the Clinton Administration and Jean Chrétien's Liberals were about to win the general election. The 90s were effectively beginning.

When I first started reading Aaron’s blog about ten years ago, he was living in Montreal. Later through his blog I learned he was in Vancouver, and later still, he was in San Francisco. I imagined CDs bought at Sam the Record Man on Barrington St that may have accompanied these travels. Reading his blog, I understood he was at Yahoo!, working on a site called Flickr.

Seventeen months ago, the Google Streetview car captured the corner as it then appeared. Here it is, posted to Flickr.

streetview

So I reflect now twenty years later on how a website like Flickr (which was big in its day and which now lingers on as an established presence) became part of the world that did away with typewriters for my generation and younger, and was in a way present at the GPS coordinates where I ate a Subway sandwich twenty years ago.

The WikiWars

I once heard it said that the internet was like Gutenberg's printing press, and while revolutionary, Gutenberg's printing press resulted in the religious wars of a century later. This was voiced as a warning against cyber-utopianism.

Twenty years after the World Wide Web app was released so that the public could use The Internet, we have begun to see our wars play out. The religious wars of the past led to the creation of the Nation State, after the Treaty of Westphalia. Our present wars are a symptom of the breakdown of that international system.

Freedom fries: this domino line begins with McDonalds.

Ben Ali’s son-in-law wants to open a McDonalds in Tunisia. He meets with the American ambassador.

The ambassador goes home and writes a report on the meeting, noting the family’s opulent wealth. “He owns a tiger that eats 6 chickens a day”.

Because privacy is old fashioned, an Army private smuggles out gigabytes of material on a re-recordable CD marked Lady Gaga, and provides it to cyber-utopian Wikileaks. They publish it along with the Guardian and The New York Times.

The Tunisia reports are emphasized by Al Jazeera, and they spread on Tunisians’ Facebook pages. Two out of ten people have Facebook accounts because privacy is old fashioned.

So a frustrated young man is spat on by a policewoman and sets himself on fire. Tunisians take to the streets, and inspire similar protests in Egypt, Libya, and Syria. Egypt’s dictator is kicked out. Libya has an 8 month civil war, before its dictator is finally killed. Syria’s civil war continues.

Egypt holds democratic elections but the poor vote for the wrong people, a party that wants to govern in an oppressive way. They protest again. The army comes in and removes the president. The world doesn’t want to call it a coup d’etat, because it was simply the army removing the person who should not have won the election. Democracy is only a good thing when the right people win. The people who voted for him are upset, so they protest, until the army clears the square by shooting at them. Some more people die.

Meanwhile, in Syria, chemical weapons are used. Chemical weapons were the First World War's equivalent of nukes: number two on the list of taboo armaments, a century old and "never to be used again". They've nevertheless been manufactured.

Syria hasn’t signed the anti-chemical weapons treaties. A thousand people die.

President Obama had said that the use of chemical weapons would be a line that should not be crossed, lest he send in the World’s Most Powerful Military. The weapons were used.

To be continued.

_____________________________
Worth reading:
WikiHistory: Did the Leaks Inspire the Arab Spring?
Adam Curtis’ 2011 blog posting on Syria

An esoteric argument

“A figure in the Muslim Brotherhood [said] ‘It’s not logical,’ (is the way he put it), ‘it’s not logical for President Obama to be so concerned about a thousand people killed in a chemical weapons attack when a hundred thousand have been killed, have been slaughtered by Assad in the last two years.’ And basically people here Jeff do not accept this distinction that the President is trying to make between the use of chemical weapons and the wholesale killing of Syrian civilians by aerial bombardment and artillery. They see it as an esoteric argument about some international weapons convention treaty that just has no relevance to their lives.”

Me by Byron

2013-08-24

At the end of August my friend Byron Hodgins & I went to High Park and he painted me.

Retro Minimalism

From Tumblr

The argument made by Adolf Loos in Ornament & Crime a century ago (1908) was essentially racist: a European aesthete equated ornamentation with primitive barbarism. Eighty-four years later, art anthropologist Ellen Dissanayake pointed out how the West's value of simplicity was unique; that people all over the world use ornamentation as an expression of their humanity (Homo Aestheticus; 1992). In other words, the Western/European strain of thought has derided ornament for a long time, but this is a rejection of what the rest of the world appreciates.

As architectural critic Nikos Salingaros stated in a recent interview (speaking architecturally):

Ornament generates ordered information. It adds coherent information that is visual and thus immediately perceivable on a structure. Successful ornament does not stick something on top of form: instead it spontaneously creates smaller and smaller divisions out of the whole. Just like biological life, which is all about information: how to structure it, how to keep it together as the living body, and how to replicate it so that the genetic pattern is not lost forever. But without ornament, either there is no information, or the information is random, hence useless.

The loss of ornament is the loss of vital architectural information. Ever since that fateful moment in history, there is little life in architecture. Unornamented forms and spaces are dead, sterile, and insipid, defining a sort of cosmic “cold death”: an empty universe where no life can exist. But for a century, this empty state has been the desired aim of architects: to remove information from the built environment.

A century of thought by sophisticated individuals has resulted in Minimalism as an aesthetic trend, affecting the built environment, designed spaces, and designed objects. It is part of the story that includes not only Adolf Loos's contribution, but also that of asceticism. Minimalism seems to be one of David Martin's Christian 'time bombs' that went off during the Industrial Revolution.

Martin’s argument is that Christianity (which we can say began as a cult in Roman occupied Judea) spread throughout the Roman world and for the past two thousand years has survived through weekly repetition of its repertoire of ideas. These ideas slowly transformed what was once Roman society — shepherding civilisation through Rome’s collapse, then preserving knowledge until Europe could restore itself in subsequent centuries. Along the way, Christian ideas have gone off like time bombs, such as human equality, and the abolition of slavery.


from CBC Ideas, The Myth of the Secular episode 2 (26:57-33:52)

Minimalism would seem to be a contemporary expression of the asceticism taught to the West through the Christian tradition, and thus the contemporary minimalist practitioner might see themselves as practicing a form of spiritual sophistication, through what they consider to be “good taste”.

However, minimalism is very future-orientated as well. In Ian MacLeod’s Song of Time (2008) the narrating character (reflecting from a future perspective of a century from now) says of our present early 21st Century:

There were so many things in that lost world. Our house overflowed with Dad's old tapes, records and CDs, and Mum's ornaments, and both of their books and magazines, and all of our musical impediments, and my and Leo's many toys, which we never played with, but still regarded with totemic reverence.

This implies that the future world is largely Thing-less and decluttered. Of the minimalised future, consider how it is parsed by Lindsay Jensen in her essay on Oblivion:

"The flannel-wearing hoops-shooter is Jack Harper (Tom Cruise), a high tech drone repairman who lives in a futuristic compound (that more closely resembles a sterile medical lab than a cozy cottage) […] Despite the destruction of the Earth's surface, Jack and Victoria's home – in a tower high above – displays not a speck of dust or clutter, only gleaming chrome and glass. Even Victoria herself seems a piece of this decor, impeccably dressed in sculptural shift dresses … signifying Victoria as an integral element of this environment — serving the same semiotic function as her hyper-futuristic touchscreen computer, or the meals that appear as anonymous squares of surely nutrient-dense edibles served from individual vacuum-sealed pouches. These objects – of which she is one – loudly and obviously declare this as the future: a different, cold, and calculated environment in stark contrast with the relaxed authenticity of Jack's cabin. The latter is a hideaway whose existence Jack keeps secret from Victoria. She is too much a part of one world to venture into the other one he has (re)-created."

Oblivion in fact displays the temporality of contemporary expressions of Sophistication. On the one hand, we get the minimalised dust-free future. On the other, we get Jack's cabin, his secret retro-world, filled with the archeology of Oblivion's Earth. Here Jack wears a New York Yankees ball cap, checked shirt, and listens to vinyl records. Old, weather-warped books rest on rough-hewn shelves. The cabin world reflects our "Dream of the 1890s" and the other hipsterisms of our present time – the Sophisticated Retronaut who has curated their life as if it were the decade between 1975-1985, with a vinyl record collection, gingham shirts, and the usual as displayed on Tumblr.

Sophistication seems to lie on a spectrum between the Retro Past and the Austere Future, and the display of its corresponding taste: one curates either objects of "warmth" or objects that are "cold", the "sentimental" answering to the "calculating".

Consider again Nikos Salingaros's arguments about ornament: that it adds coherent information. As Ellen Dissanayake argued in her book, the display of ornamentation is widely regarded as human enhancement. Echoing work done by Michael O'Hanlon (published in 1989), Dissanayake wrote (p.102), "The Wahgi [of Papua New Guinea], reports O'Hanlon, do not consider adornment and display to be frivolous […] the Wahgi believe that an adorned person is more important and 'real' than an unadorned 'natural' person, a belief totally at variance with contemporary Western ideas…" She goes on to cite many other examples, concluding with:

Concern for dress goes along with concern for one’s bearing and manner, and these reflect the self-control and civility that humans from the earliest times seem to have deliberately imposed on their ‘natural’ or animal inclinations and regarded as essential to harmonious social life.

While the examples at this part of her book focused on clothing and dress, they serve as scaffolds to extrapolate from: our concern for ornament is a human appetite, a way that we express our supra-animal minds. Dissanayake closes the chapter in question ("The Arts as Means of Enhancement") by narrating a brief Western art history:

For the past two hundred years… the formality and artificiality that universally characterize civilized behaviour have been suspect. Wordsworth praised the beauty to be found in the natural rustic speech of ordinary people; since his time, poetry has moved further and further away from the characteristics that marked it as poetry for thousands of years [while] 20th Century Western artists have typically been concerned with making art more natural (using ordinary materials from daily life or depicting humble, trivial, or vulgar subjects) and showing that the natural, when regarded aesthetically, is really art. […] In this they both lead and follow, absorb and reflect postmodern Western society which is the apogee of the trend I have been describing where now the natural is elevated to the cultural, where nature and the natural viewed as rare and "special" are deliberately inserted into culture as something desired. I have pointed out that most cultures like shiny, new-looking, bright, and conspicuously artificial things. […] But we prefer the worn and the faded because they look natural and authentic.

The Minimalised future is shiny and new and special, and thus in tune with our human natures. Yet it is also austere and cold, speaking to a sense of self-control and discipline, which is acquired … that is to say, civilized. Our flourishing Retro hipsterdom is the late 20th Century's postmodern concern for authenticity spoken of by Dissanayake. But maybe it is also a way for our culture to digest the records of the previous hundred years and decide what should be considered timeless, what we should formalize into the artificiality of culture which our human appetites desire.

(top image: Fuckyeahcooldesigns Tumblr)

King Baby of the Late 21st Century

The King Baby was born on Monday July 22nd. Today they announced his name was George.

Let us suppose future historians will be able to say that the Second Elizabethan Era lasted the second half of the 20th Century and a quarter of the 21st (a reign of seventy five years). Thus:

Elizabeth II 1952 – 2027 æ 101

As Elizabeth’s mother lived to be 101, this is quite feasible. Let us then be generous and assume all future monarchs will live to be 100.

Charles III 2027 – 2048 æ 79 – 100
William V 2048 – 2082 æ 66 – 100
George VII 2082 – 2113 æ 69 – 100
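
The arithmetic behind this table is simple enough to sketch in a few lines of Python – assuming, as above, accession in 2027, death at age 100, and the known birth years of 1948, 1982, and 2013:

births = {"Charles III": 1948, "William V": 1982, "George VII": 2013}
reign_start = 2027  # assumed end of the second Elizabethan era
for name, born in births.items():
    reign_end = born + 100                   # each monarch assumed to live to 100
    age_at_accession = reign_start - born
    print(name, reign_start, "-", reign_end, "æ", age_at_accession, "-", 100)
    reign_start = reign_end                  # the next reign begins where this one ends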

Note: Wikipedia states that Charles has considered reigning under the name George, which may be unlikely now that his grandson has been given the name. Nevertheless, were he to do so, presumably King Baby George would reign as either George VIII or follow his grandfather's example and use another name.

A feudal, hyperconservative kind of society

MAN OF STEEL

"What conditions would allow a sophisticated and civilized society that has space travel to turn inward and no longer see what it's doing to itself?" he adds. "We likened it to a feudal, hyperconservative kind of society that no longer believes other planets were worth visiting and mothballed those fleets. Ancient, doddering old fools running society and paying no attention to science or more enlightened minds." – Alex McDowell

(Sounds basically like the USA and Canada)

Desktops

These two images from National Geographic‘s Found Tumblr are currently my Desktop backgrounds:

tumblr_mm2t27ZFLa1s7f3fyo1_1280

NationalGeographic_1105658

Civilisation 2.0

“Did you ever hear of the 5.9 Kiloyear event?

“I thought not. It was an aridification episode, a great drying. Maybe it began in the oceans. It desiccated the Sahara; ended the Neolithic Subpluvial. Worldwide migration followed, forcing everyone to cram around river valleys from Central North Africa to the Nile Valley and start doing this thing we hadn’t done before, called civilization.

That’s when it really began: the emergence of state-led society, in the 4th millennium BC. Cities. Agriculture. Bureaucracy. And on the geologic timescale, that’s yesterday. Everything that’s followed, every moment of it from Hannibal to Apollo, it’s all just a consequence of that single forcing event. We got pushed to the riverbanks. We made cities. Invented paper and roads and the wheel. Built casinos on the Moon. […]

But this global climate shift, the Anthropocene warming – it’s just another forcing event, I think. Another trigger. We’re just so close to the start of it, we can’t really see the outcome yet. […]

“The warming was global, but Africa was one of the first places to really feel the impact of the changing weather patterns. The depopulation programmes, the forced migrations … we were in the absolute vanguard of all that. In some respects, it was the moment the Surveilled World drew its first hesitant breath. We saw the best and the worst of what we were capable of, Geoffrey. The devils in us, and our better angels. The devils, mostly. Out of that time of crisis grew the global surveillance network, the invisible, omniscient god that never tires of watching over us, never tires of keeping us from doing harm to one another. Oh, it had been there in pieces before that, but this was the first time we devolved absolute authority to the Mechanism. And you know what? It wasn’t the worst thing that ever happened to us. We’re all living in a totalitarian state, but for the most part it’s a benign, kindly dictatorship. It allows us to do most things except suffer accidents and commit crimes. And now the Surveilled World doesn’t even end at the edge of space. It’s a notion, a mode of existence, spreading out into the solar system, at the same rate as the human expansion front.”

From Blue Remembered Earth by Alastair Reynolds (p.150-151). The character Eunice, speaking in 2162, is explaining the development of the global Mechanism that watches over and protects the population. This is what I’ve been thinking about this week in light of the NSA revelations.

Bruce Sterling at SXSW 2013

I myself don’t go into bookstores very much now. They have become archaic, depressing places. […] How many bookstores close, as a direct ratio of hours spent with electronic devices?

I’m sure there’s some direct relationship there. And it’s not a dark conspiracy. I happen to be quite the Google Glass fan.

In fact, I’m even becoming something of a Sergey Brin fan. I never paid much attention to Sergey before, but after Google Glass, Sergey really interests me. He’s filling the aching hole, the grievous hole in our society left by the departure of Steve Jobs. With Jobs off the stage, Sergey’s becoming very Jobsian. He wears these cool suits now. He’s got much better taste in design than he did. He’s got these Google X Moonshot things going on, they’re insanely great, and so forth.

I hope Sergey’s not taking a lot of acid and living off vegetarian applesauce. But other than that, well, now we have this American tech visionary millionaire who’s a Russian emigre. It’s fantastic! There’s something very post-Cold-War, very genuinely twenty-first century about that. It’s super. Sergey’s like my favorite out of control, one-percenter, mogul guy, right now.

[…]

Since the financial panic of 2008, things have gotten worse across the board. The Austerity is a complete policy failure. It's even worse than the Panic. We're not surrounded by betterness in 2013. By practically every measure, nature is worse, culture is worse, governance is worse. The infrastructure is in visible decline. Business is worse. People are living in cardboard in Silicon Valley.

We don’t have even much to boast about in our fashion. Although you have lost weight. And I praise you for that, because I know it must have been hard.

We’re living in hard times, we’re not living in jolly boom dotcom times. And that’s why guys like Evgeny Morozov, who comes from the miserable country of Belarus, gets all jittery, and even fiercely aggressive, when he hears you talking about “technological solutionism.”

“There’s an app to make that all better.” Okay, a billion apps have been sold. Where’s the betterness? – Bruce Sterling’s keynote at SXSW 2013

Tomorrow

demain

(the ending of 2006’s Children of Men)

Krugman, “The Excel Depression”

So the Reinhart-Rogoff fiasco needs to be seen in the broader context of austerity mania: the obviously intense desire of policy makers, politicians and pundits across the Western world to turn their backs on the unemployed and instead use the economic crisis as an excuse to slash social programs.

What the Reinhart-Rogoff affair shows is the extent to which austerity has been sold on false pretenses. For three years, the turn to austerity has been presented not as a choice but as a necessity. Economic research, austerity advocates insisted, showed that terrible things happen once debt exceeds 90 percent of G.D.P. But “economic research” showed no such thing; a couple of economists made that assertion, while many others disagreed. Policy makers abandoned the unemployed and turned to austerity because they wanted to, not because they had to.

So will toppling Reinhart-Rogoff from its pedestal change anything? I’d like to think so. But I predict that the usual suspects will just find another dubious piece of economic analysis to canonize, and the depression will go on and on. – Paul Krugman The Excel Depression

“I wonder sometimes if Morozov’s disinformation campaign is a deliberate sabotage…”

I wonder sometimes if Morozov’s disinformation campaign is a deliberate sabotage, an attempt to discredit those who are actually working to achieve the participatory ideal that he claims to be protecting. […] I don’t mind Morozov’s petty mischaracterizations of my motives; it’s what he does to garnish attention and I make a convenient target.

-Tim O'Reilly responding to Annalee Newitz's overview of Evgeny Morozov's attack piece

Easter 2013

2013-03-29

This drawing seems appropriate for Easter.

École

2013-03-28

This was my elementary school.

King Solomon Doodle

2013-03-27

A doodle from the end of January when I was listening to a podcast about King Solomon.

e15e58da88ca11e2826f22000a9f13e9_7

Mormon Beats

2013-03-03

The DJ booth is a repurposed pulpit.

That's Dan Turner aka Sex Helmet at The Fountain on Dundas.

2013-02-28-

Screen shot 2013-03-08 at 9.38.16 PM

The night of The Tempest

2013-02-26-prospero

One ends up drawing economical-line Prosperos while listening to The Tempest audiobook.

Segway Day

viewer-2

I’ve always wanted to try a Segway so today @owlparliament & I did the Segway tour at the Distillery District. It was super fun. I loved floating along on that thing.

Commerce Court

I watched the remade Total Recall over Christmas. This is Commerce Court in Toronto’s downtown core.

2012-12-24 21.37.53

2012-12-24 21.37.45

2012-12-24 21.37.27

Today I began work next door and afterward took a walk over to see it. Instead of an information kiosk, we have illuminated trees.

viewer-6

Baptism

viewer-4

The church in Bathurst New Brunswick where I was baptized. The baptism itself looked like this:

c08_1975_dsc00106

c08_1975_dsc00107

The priest was my father’s uncle, Noel Cormier (died in 2010) and with my mother are my maternal grandparents. My grandfather died in 1993, and my grandmother is still alive at age 100. It was for her 100th birthday that I was in New Brunswick when the church photo was taken.

Prada

2012-12-14_img_4645

Part of a larger project

Department of Unusual Certainties

The Department of Unusual Certainties is a relatively new collective, who have begun a debate series entitled “New Discourses for a Tired Century”. The first (on the future being hopeless) passed me by, but I took the opportunity to attend the second in the series, asking whether our democracy was in crisis.

Held on the second floor of the Gladstone, the debate had five people sitting around a table, two to a side with a moderator. They didn't say anything surprising or interesting, which, I hope to argue here, is no one's fault. What I heard was a presentation of usual certainties, which could be summarized thus: people aren't engaged, people may have shitty jobs which distract them from politics, yet we have a society of outlets, most obviously the internet, which allow us to express ourselves and make our thoughts known.

How I arrived at that summary is not something I could describe, except to say that my snapshotted memories seem to cohere into that narrative. I found the debate to not be a debate, but rather a series of 3-minute statements from which no real conclusion was reached, and had we been given cards to vote for the winner of the debate (as is done at the Munk Debates) I do not know how I would have voted. The exercise, it seemed to me, was a way for a generation younger than myself to discuss and present their research reports on the state of our democracy, arriving at conclusions already reached by older people such as myself, and thus presenting "usual certainties", such as everyone talking about the internet.

I was left feeling that I will live to see the collapse of civilization, only because things have become so fragmented. Democracy, for example, is clearly in crisis, and while I appreciated the remarks by its defenders, it seems to me that our current political cohort are so uncultured and contemptuous of the citizenry that they have besmirched what had been a valid institution.

In proroguing the Ontario legislature, for example, Dalton McGuinty gave us a Trudeau salute and a cause to question the validity of the House – if it can be shut down for such an extended period of time without any ill effect, what good is it? Further, I wonder if people like Stephen Harper aren’t envious of heavy-handed authoritarian states like China, in that their governing structure’s simplicity allows them to operate a ship of state as such, rather than the herded chickens metaphor Western democracies are subject to.

I envision that a century from now, Parliament could effectively be replaced with an app, to use the terminology of our time. Our governing structures are pre-telephone, let alone "INTERNET INTERNET", and one imagines the practical-minded, pre-ironic men of the past would have conference-called their concerns had the technology been available to them. We instead are beholden to a tradition of physical presence in rooms which increasingly seems absurd.

If I am to be an informed citizen served by journalism, why reduce the fruits of my information to a penciled X every four-to-five years next to a name of someone who I’ve probably never met outside of the campaign, asking this stranger to “represent” me? I am currently lucky in that my representatives at the municipal, provincial, and federal level do tend to voice things that I agree with. However, we all don’t always vote for the winner, and it should be stated everyday that the governing Conservative party formed their majority on less than the majority of the vote. This is a stupid and insulting situation, which is further infuriating given we have the infrastructure to replace this model … if 114,000 people can ‘like’ a Kim Kardashian photo on Instagram, for god’s sake why can’t we all be using this technology to direct our representatives on how to vote, or better yet, directly vote on proposed legislation ourselves?

Stefan Wehrmeyer, a German software developer and activist who wants the government to do a better job of opening up its data, has downloaded the German federal government’s complete laws and regulations and posted them to GitHub. That’s the popular website that lets users track changes to documents — typically software — and make their own modifications.

The point, Wehrmeyer says, is to make it easy for German voters to track changes to the laws — and to also give lawmakers a vision of the future. (Wired, 9 Aug 2012)

I was left feeling that I would see the collapse of civilization because instead of actively working to use social networking infrastructure for collective decision making, I see another generation going over media talking points, given voice by virtue of their degrees. We the audience, passively watch as usual certainties are given “unusual” presentation because we’ve all been enculturated into this model of sit-down-shut-up-and-listen. I want to say that model is currently only valid for theatre: tv, movies, plays … yes, please shut up and watch. But not ideas.

I used to go to lectures regularly, and perhaps it was an historical moment: between 2000 and 2010, lectures seemed theatrical, and were even made into a TV show – TVO's Big Ideas. Last week, TVO announced Big Ideas had been canceled, thus marking the end of that moment. I'm painting with a broad brush here and ignoring for the moment the long history of the "presentation" as a form of theatre (like the Mark Twain & Dickens tours) in order to suggest that a moment had its time and passed. I think the model of being "educated" from a stage needs to be done away with, and replaced with community conversations.

I end this with congratulations to the Department of Unusual Certainties for providing a forum, and a reason for people like me to give voice to these thoughts. I found the evening valuable and worthwhile, a reminder that tedious things are not necessarily bad things … in other words, not everything needs to be amazing to be of value. I would like to see them move into the type of participatory conversation that has been achieved by theatre folk in this city, and not render the audience of interested minds a passive witness to thesis defenses by PhD students.

Democracy is in crisis because we are beholden to traditions that ignore our new realities. Debate formats such as what I witnessed are an example of such a tradition.

New Mexico Drivers Licence 1998

2012-10-24_img_4391

Sam Beckett’s New Mexico Driver’s Licence, issued in 1995, as imagined in the 1993 series finale of Quantum Leap.

An image of hope

Harry Mulisch, Discovery of Heaven (1992/eng.trans 1997):

In a world full of war, famine, oppression, deceit, monotony, what – apart from the eternal innocence of animals – offers an image of hope? A mother with a newborn child in her arms? The child may end up as a murderer, or a murder victim, so that the hopeful image is a prefiguration of a pieta: a mother with her newly dead child on her lap. No, the image of hope is someone passing with a musical instrument case. It is not contributing to oppression, or to liberation either, but to something that continues below the surface … (p.56)

Pints in 1844

Tumblr tells me this is the earliest known photograph of men drinking beer. (Edinburgh Ale, 1844, by Hill & Adamson).

Chrome History

In order to poke my brain to remember a site I figured out how to hack my Chrome History, and learned that Webkit doesn’t use Unixtime. Rather, it uses a timestamp of microseconds since 1 Jan 1601.

Thus to convert from Webkit Format to a generic Unixtime, divide by a million to get seconds, then subtract the number of seconds between 1 Jan 1970 and 1 Jan 1601 (11644473600).

In order to retrieve your Chrome history for analysis within a spreadsheet, (and if you’re using OS X) do the following:

1) Copy `History` file located at

USER/Library/Application Support/Google/Chrome/Default/History

to working location, then:

2) open using SQLite Browser,
3) export table in question to CSV (in my case, `urls`)
4) open in Excel or Numbers
5) create column next to datetime you want to work with
(`last_visit_time`)
6) apply this formula referencing the column in question
=(CELL/1000000-11644473600)/60/60/24+"1 jan 1970"
7) …and set the column to ‘Custom’ = yyyy-mm-dd h:mm
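
Alternatively, the spreadsheet steps can be skipped entirely and the conversion scripted – a minimal Python sketch, assuming the copied History file sits in the working directory:

import sqlite3
from datetime import datetime, timedelta

WEBKIT_EPOCH = datetime(1601, 1, 1)  # Webkit timestamps count microseconds from this date

def webkit_to_datetime(microseconds):
    return WEBKIT_EPOCH + timedelta(microseconds=microseconds)

# Query the copied History database directly instead of exporting to CSV
conn = sqlite3.connect("History")
rows = conn.execute("SELECT url, last_visit_time FROM urls ORDER BY last_visit_time")
for url, last_visit in rows:
    print(webkit_to_datetime(last_visit).strftime("%Y-%m-%d %H:%M"), url)
conn.close()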

Useful links:
Decoding Google Chrome Timestamps
Converting Unix Timestamps
Webkit Time


Update June 29 2013: Reader Darlyn Perez wrote me to offer his Python script to convert a Webkit timestamp into a regular timestamp. As he says, “It basically does the same thing as the spreadsheet formula in your post.”


#!/usr/bin/python

#Converts a Webkit timestamp (microseconds since Jan 1 1601) from keyboard input into a human-readable timestamp.
#Script by Darlyn Perez on 2013-06-29

import datetime
def date_from_webkit(webkit_timestamp):
    epoch_start = datetime.datetime(1601,1,1)
    delta = datetime.timedelta(microseconds=int(webkit_timestamp))
    print epoch_start + delta

inTime = int(raw_input('Enter a Webkit timestamp to convert:'))
date_from_webkit(inTime)

Pussy Riot FTW

I haven't been that interested in the Pussy Riot trial, except to say going to jail has made their protest far more successful than it would have been if they'd been ignored. We've gotten used to 'radicals' doing something offensive and disappearing, but these girls performed a song, pissed off people enough to get arrested, forced Western journalists to compare their story to Soviet show trials, and ignited sympathetic protests around the world. When they get out of jail in two years they'll be free-speech darlings and will probably have a wildly successful global tour. If the point of their action was to highlight that Russia is intolerant of protest, then this whole story exemplifies that wonderfully.

However, Russia doesn’t care that Western people under the age of 40 who Tweet think this is outrageous. Reuters reports:

Valentina Ivanova, 60, a retired doctor, said outside the courtroom: “What they did showed disrespect towards everything, and towards believers first of all.”

A poll of Russians released by the independent Levada research group showed only 6 percent sympathized with the women and 51 percent found nothing good about them or felt irritation or hostility. The rest could not say or were indifferent.

Our Generation has no Chomsky?

I tend to think that our generation does in fact have such thinkers, only they've been hampered by structures designed to celebrate already-established "brand name" figures. Also, this generation's thinkers are less likely to go through an academic publishing route, given the opportunities for self-publishing today.

The Mechanism

Hints of "the Mechanism" appear throughout Alastair Reynolds's Blue Remembered Earth until it actually becomes a plot element and we learn precisely what it is. Essentially, by 2162, people have been enhanced, and interact with the data cloud via retinal implants and augmented reality. There is a major political system called the United Aquatic Nations, where people live in under-ocean cities, swim a lot, and where some have even undergone surgeries to make them into living mermaids. These people still use buttons and screens, as we do, but it seems this is mostly a necessity of their lifestyle choice.

Because of the constant, internalized connection, people generate a massive amount of data which needs to be indexed (via posterity engines), while it is also being monitored. The Mechanism is the set of algorithms which constantly monitor the data stream, and intervene if signals indicate a certain action is underway, or about to be. When Geoffrey, a main character, gets so angry he wants to hit someone, the Mechanism recognizes this and strikes him with an instant debilitating headache. (p.283) The incident is logged, and he is then scheduled for a visit by a risk-assessment team to determine the seriousness of the matter.

It is known as The Surveilled World.

I recall being a child, going through a Catholic education, and realizing that according to the teaching, God could hear my thoughts. I felt exposed, my privacy violated, and embarrassed. Was there no respite from scrutiny?

Twenty-five years later, I visited the Cloisters, the reconstructed (and frankensteined) medieval complex in New York. Throughout we see little heads gazing down from sculpted elements – these essentially are the medieval version of our black domed cameras, a reminder to the monks of eight-hundred years ago that they were constantly being watched, by a security apparatus of angels.

It seems, then, that we have some social need to construct surrogate parental oversight; that a society without watchers – a secularized society that doesn't believe in spiritual spies and has no CCTV cameras (essentially the Western world for about a hundred and fifty years) – cannot exist without engendering existential angst (as seemingly happened).

Speaking of the social upheavals of the mid 21st Century, the character Eunice describes its development:

The warming was global, but Africa was one of the first places to really feel the impact of the changing weather patterns. The depopulation programmes, the forced migrations … we were in the absolute vanguard of all that. In some respects, it was the moment the Surveilled World drew its first hesitant breath. We saw the best and worst of what we were capable of Geoffrey. The devils in us, and our better angels. The devils, mostly. Out of that time of crisis grew the global surveillance network, this invisible, omniscient god that never tires of watching over us, never tires of keeping us from doing harm to one another. Oh, it had been there in pieces before that, but this was the first time we devolved absolute authority to the Mechanism. And you know what? It wasn’t the worst thing that ever happened to us. We’re all living in a totalitarian state, but for the most part it’s a benign, kindly dictatorship. It allows us to do most things except suffer accidents and commit crimes. (p.150)

The quote continues:

“And now the surveilled world doesn’t even end at the edge of space. It’s a notion, a mode of existence, spreading out into the solar system at the same rate as the human expansion front. But these are still the early days. A century, what’s that? Do you think the effects of the 5.9 kilo year event only took a hundred years to be felt? These things play out over much longer timescales than that. Nearly 6000 years of one type of complex, highly organized society. Now a modal shift to something other. Complexity squared, or cubed. Where will we be in a thousand years, or six thousand?”

This is worth quoting in full since it hints at Alastair Reynolds's larger project: Blue Remembered Earth is merely the first novel of a planned trilogy, reported to span into the far future. I imagine the next book will take place centuries ahead, and be part of the answer to this speculation.

The 5.9 kiloyear event referred to was a period of intense desertification occurring circa 3900 BCE which triggered worldwide migration to river valleys, out of which emerged the first complex city-states. The character is suggesting that present climate change is a similar event, which will drive us into new ways of living. In the book, one of these new ways is that of globalized surveillance, complete with a thought-control mechanism.

However, there is an area on the Moon that exists outside the Surveilled World, and Geoffrey visits his sister Sunday there early on. They stroll through a market while Sunday explains social theories regarding crime as a necessity for innovation and social health. The chapter ends with Geoffrey reaching into his pocket for his hat, and finding it missing …

The hat, it began to dawn on him, had been stolen. The feeling of being a victim of crime was as novel and thrilling as being stopped in the street and kissed by a beautiful stranger. Things like that just didn’t happen back home. (p65)

The Mech, then, is an interesting possibility for where we might be headed. The consequences are that, by 2162, there are no police forces and no jails. In the novel a news item is mentioned in passing about the demolition of the planet's last jail, a facility in Mexico.

Bike Baskets

The author Elah Feder interviewed me about this at the Parkdale Tim Hortons two weeks ago.

“It’s people being lazy. They’re making their problem of having to throw something away, into my problem,” said Timothy Comeau, who’s gotten pretty frustrated with the coffee cups and other scraps of garbage he keeps finding in his rear-mounted milk crate. “I just find it really insulting.”

Book Trailers

Last week, the Cloud Atlas trailer was released and it has reportedly driven up sales of the book on Amazon.

It’s an interesting effect, given that books have begun to have trailers produced by their publishing houses.

Neal Stephenson's 2008 Anathem had a trailer…

…as did my latest favorite book (of which I’ve been writing about lately), Alastair Reynolds’ Blue Remembered Earth:

While B.R.E.'s trailer helped me understand what the book entailed, I was convinced by the Amazon Kindle sample chapter.

(For that matter, I bought Cloud Atlas at the end of December when I saw the concept art on io9.com, and got hooked by the sample chapter as well).

I have no information with regard to the trailers produced by the publishing houses having any impact. However, if they wanted to produce 6 minute masterpieces like the Cloud Atlas trailer (a distillation of imagery from a reported near 3-hour movie) then the form would come into its own.

This unintended effect generated by the Cloud Atlas trailer may convince them to do just that.

I love the idea of the mini-film being constructed out of choice scenes, of the quality that suggests an excerpt from a larger cinematic work.

Kind of like Francesco Vezzoli's 2005 art-video, Trailer for a Remake of Gore Vidal's Caligula


(Gotta love the Gladiator soundtrack. Also, how has this been on YouTube for two years?)

Engines • This is Already Happening 2

I tweeted this last week, but I should note it here:

Bing, DuckDuckGo, Yahoo, and Google are Search Engines.

Wikipedia is a Find Engine

I wrote something on Posterity Engines.

Today, all the talk is about "Search" and databases are being built up on 'search behavior'. Google has a zeitgeist listing that tells us what people have been looking for, and one result of this database is its prediction algorithm, which guesses what you might be searching for, or tells you what other keywords match a search phrase.

If the conversation shifted to ‘finding’, what then? We have Yahoo! Answers and Wikipedia, and all the other websites in the world. Google’s dominance began with the quality of their ‘finds’ – the websites they suggested best matched your search.

If we shifted to analyzing 'find behavior', we would begin to build up a database of which sites are being accessed most often … and yes, this is already happening, and essentially drives Google's algorithms. The site at the top is most likely to be the one you'll want because others have chosen it as well.

Essentially, I find the use of the word ‘engine’ to name these processes of indexing databases curious, and found it especially interesting when coupled with the word ‘posterity’. It began to make me think about the data we are creating, and how it might be archived, accessed, and named. We are currently living under a ‘search’ paradigm but the future will inevitably complicate this, until ‘search’ will no longer be an adequate word.

Posterity Engines • This is Already Happening 1

One of the more interesting reconceptualizations I’ve come across lately is that of a “posterity engine” which is in Alastair Reynolds’ latest novel, Blue Remembered Earth. I have the Kindle edition which allows me some quick and easy textual analysis; the term appears four times thus:

1. “Across a life’s worth of captured responses, data gathered by posterity engines, there would be ample instances of conversational situations similar to this one…”

2. “…[it was out there somewhere in her] documented life – either in the public record or captured in some private recording snared by the family’s posterity engines.”

3.”It doesn’t know anything that isn’t in our archives, anything that wasn’t caught by the posterity engines…”

4. “He was old enough not to have a past fixed in place by the Mech, or posterity engines…”

The context of these sentences implies something not much more complicated than a contemporary search engine. The novel is set in 2162, that is 150 years from the present & publication date (2012).

The phrasing – "captured", "snared", "caught" – speaks of today's search engine crawl: crawling across a site, building up a database of links, content and keywords. At some point in our future, our terabytes will be warehoused and crawled by personal search engines that will be indexed for our future uses – that is, for posterity.

This is already happening. We just don't label these processes with 'posterity'. Our Apple computers are already running Spotlight crawls that index our local storage, and Time Machine is snapshotting our hard drives. Windows has an equivalent Start Menu search bar.

Imagine then the contemporary as laughably quaint, and imagine five future generations' worth of personal petabytes stored somewhere (a central core server per home?) that requires contemporary Google-grade search to make useful.

I'm reminded of the fact that when Google began in 1998, its storage capacity was 350 GB. An off-the-shelf MacBook Pro could have run Google in the late 1990s.

The Dark Knight Rises

I saw The Dark Knight as a Monday matinee during our August long weekend in 2008. By that time, it had been out for two weeks and had already generated a lot of buzz. It seemed everyone was talking about and praising its greatness. I did not go as a fanboy of Christopher Nolan nor of Batman, but merely to catch up, and see the sequel to Batman Begins, which I learned about late during its 2005 theatrical run and almost missed.

I liked Nolan's Inception, and seeing that movie cemented into my mind the idea that I like his films – tone, atmosphere, cinematography. However, was that atmosphere in The Dark Knight Rises? I know in the future I'll have this film on my system and I'll put it on as working wallpaper … or will I? Did it have that slow-burn quality against a rich backdrop and wonderful Hans Zimmer soundtrack? Yes, the Zimmer soundtrack delivered, and yes Bane was menacing, but I feel (at this point) that the only part of the film that lived up to the hype was the prologue, which had been shown in December, and which made my jaw drop the first time I saw it as a blurry pirated internet clip.

Even though a lot was familiar from the trailers etc., I went in with questions about the uprising, the answers to which were straightforward: a gang of thugs, the release of prisoners, the cops held hostage.

The whole "city under siege for five months" plot at the end didn't work, as it was unbelievable. I think in a post-9/11 world it would have been a lot more panicked, and the coercion of getting the military to guard the one bridge off the island would never have worked.

I did find Catwoman's motivations interesting: she wants software that can erase her from all the world's databases. "Collated, analysed, what we do sticks". This concern of hers seems slightly ahead of its time, but only by a week or two. That's a plot line that will make more sense as time goes on, when this movie is just another file in a database we can download with our cloud accounts.

All in all, it seems too soon to judge TDKR. As a stand-alone, it's weak. However, in the future when we can play the series back-to-back on our own time, when a teenager can spend three nights watching the movies on their tablet before bed, then its failings and successes will be clarified. Perhaps its tone has nuances that we can't pick up yet, but that will become obvious later.

New Design

Beginning in November of last year (2011) I began to experiment with responsive design on a new blog site – the ultimate goal of which was to move my blog from timothycomeau.com/blog (where it had been for years) to its own dedicated URL: timothycomeau.info.

Over the course of the winter (and while I was studying Web Design at Sheridan) I hacked away at it using it as a playground to try new ideas and further my understanding of WordPress, and especially Responsive Design.

Unfortunately, by the time I came to graduate, I was caught with a mangled site which was only half-developed for what I'd then intended: not only a blog, but also an archive of my previous web content. The archive part wasn't done, and remained unresolved.

At about the same time, I began to look into the sites on Themeforest.net, and in order to learn more about how they were built and functioned, bought one, which I put on the site in May. I was asking myself questions: how does this theme work, and does my content stand up to its design?

However, I soon grew frustrated with the implementation. That theme was designed for portfolios mostly, and I wanted the site to function as a blog primarily. It was evident that it should be scrapped.

About a month ago, I hacked together a very simple theme for my localhost WordPress Journal. As a Journal, the end that needed to be served was reading and so design wise it needed to emphasize and encourage that.

Essentially, I ported that design over into this one. I wanted something as simple & clean as words on a page. With the basic structure in place, I think I’ve reached a final version.

A human heart (2006-2091)

Any Human Heart: the story of a 20th century man, published in 2002, written in the late 1990s, and with the fictional lifespan of 1906-1991.

A Human Heart: the story of a 21st Century man, published in 2102, written in the late 2090s, and with a fictional lifespan of 2006-2091. What story then awaits today’s 6 year old?

Prometheus

Since I saw the first trailer at Christmas I’ve been looking forward to this day: then, the December early evening darkness, the loss of leaves, the cold weather. The trailer came to us as a Christmas gift, along with a trailer for Batman: The Dark Knight Rises (July 20) and The Hobbit (December). Then at the end of February, the first viral video, the TED talk, followed by the Weyland website(s) and more viral videos, of David and of Elizabeth Shaw. Now, finally, the movie is in North American theaters, having opened a week ago in the UK.

I saw it today in IMAX 3D and I was thankful that I had, being rewarded with glorious landscape shots for the first part of the film, and then the glorious space shots as we see the ship shrunken against the backdrop of both interstellar space and alien cloud. It lands in a clearing, facing a series of mounds, which contain the sculpted head we’ve seen throughout. The science team investigates, runs into problems, everyone dies, but in the end Shaw and the head of David the Robot survive, and take off in one of the other alien ships (associated with the other mounds) heading to the stars and presumably the Engineer’s home world.

Some s-f movies (and tv shows) can be dignifying: you leave their world feeling infused by the narrative of a mythology, a feeling undoubtedly behind the ancient myths. Sometimes, stories can animate the imagination in such a way as to give a sense of meaning and purpose. I recognize this as real, but also a trick – an illusion (or, a mental illusion, a delusion) that has something to do with how our brains are wired. Just as certain patterns can trick our visual sense, certain narrative patterns can trick our ‘meaning sense’. All of religious history is probably a side effect of such games. Now, we play these tricks for entertainment, using them for movies and television shows.

So it isn't so much pretense as actuality when the filmmakers talk of creating a new 'myth'. Prometheus the 2012 film is a new myth, taking for its name an old myth, and taking for its back story a successful monster film (set within a 20th Century space-age context) directed by Ridley Scott 33 years ago. Who were the 'space jockeys' of that film? We now know they were Engineers, who seeded Earth through sacrifice millennia ago (or at least it is implied, as that scene is not dated). The Engineers play with genetic technology: our sacrificer drinks a concoction that causes him to disintegrate, in the process converting his cellular structure into a virus, or merely genetic fragments: he falls into the primordial waters and thus the human DNA matrix has been introduced, to emerge out of mammalian primates later on.

The story of the ancient astronaut is compelling, I'll admit. Four years ago I attended a Charles Darwin exhibition at the ROM, and was struck at the end by the display of skulls. Even though I'd studied physical anthropology in university, and even though I was familiar with the scientific narrative, to see all the skulls together made an impression that something is missing in the genomic treeline. One can see how Homo Erectus is a form of Homo Neanderthalensis, all have a similar shape, similar brow-ridge, all are evidently part of an evolutionary story. But the outlier is the gracile Homo Sapiens Sapiens, all smooth-boned, high-foreheaded, small-chinned. Perhaps something did intervene to make the brow ridges disappear, to make us more graceful.

Prometheus leaves the story open for a sequel: presumably in the next movie Shaw finds the Engineer’s home world and some more story elements are revealed, and the third movie will probably be an alien invasion flick set on Earth, post post … perhaps the Engineers are the reason Earth is destroyed by the end of Alien3

I didn’t feel dignified leaving the theater, but rather diminished. My humanity cheapened, the delusion only playing the depressing trick of making our Creators seem malevolent. Perhaps the overall implication is that we’re some kind of livestock to incubate the biological weapon of the xenomorphs.

(In reality, the story will probably turn out that sapient life evolved out of the dinosaurs and colonized parts of the solar system either before 65 million years ago, or during a time in between (human beings comparatively have a 2 million year timeline, so there's room in history for this), and the so-called Greys are of the oceans of Europa, and for some reason had fiddled with our genome in the past – as has been suggested by The X-Files tv show of the 1990s, whose last episode told that the aliens were coming back on Dec 21 2012. In fact, Prometheus seems to owe much to The X-Files, in as much as both use a life-force of "Black Oil").

Prometheus is also a generational parable: Vickers wants her father (Weyland) to die so that she can take over (Vickers being his daughter was pointless otherwise), but this model is also that of the whole: the children of the gods (humanity) want their parents (The Engineers) to die so that we … can take over the universe? And here an echo of the Promethean 1.0 myth: the god Prometheus gives man technology and his fellow gods are angry and inflict their famous punishment, because they know that with that technology man will one day challenge them for supremacy.

This tale echoes in a summer of student protest in Montreal.

Websites of the 2080s

On June 8th, Ridley Scott’s Prometheus will premiere in North America before going on to exist as a download or digital file encoded on a disc. Since February there has existed an online marketing campaign consisting of videos and websites, which have prompted me to consider this daily technology as it is now, and how it could be by the time the movie is set, some eighty years from now (reportedly 2093).

By that time we will have resolved a lot of technical issues. Formats will be even more standardized, maybe resembling something like magazines and how templated they are.

Also, I imagine that some other technology will have subsumed the html/css/javascript trinity that is currently behind all sophisticated websites.

HTML is merely a collection of bracketed tags inserted into text files. Barring the development of quantum computers, or quantum/classical hybrids (which Kim Stanley Robinson calls ‘qubes’ in his latest novel, 2312), text files will remain what they are today. HTML as a collection of tags need not necessarily go away … when I began implementing my Journal as a localhost WordPress database, I did so with the understanding that my Journal as a collection of Word .docs was likely to be unreadable in 20 years, whereas HTML was probably future proof.

CSS is another form of text file, but it is already giving way to LESS/SASS as an interface to writing it.

Javascript has exploded as a programming language – I remember when people used to write "… in case people have JavaScript turned off" … and why would they have it turned off? Early on there were security issues. This all seems to be ancient history and Javascript has become a necessity, making websites seem like something belonging to a computer. (That is, an interactive publication rather than a digitized magazine). Javascript has advanced so much in the past five years that Flash is definitely on its way out as a web-interface medium.

The idea of something replacing HTML/CSS/Javascript in ten years (2022) is unrealistic. However, by 2022, we may (as we are now seeing with LESS or SASS) have the hints of something else, with working groups considering the re-invention of the technical language of the Cloud.

By 2042 then, we may have something else. Browsers will still be able to read a webpage from our era by piping the text files through a deprecated renderer, or some form of built-in emulator (something like what OS X began using with the introduction of Rosetta).

In the past month, I've dived into exploring WordPress themes and understanding the possibilities offered by WordPress as a CMS. I've used it as a blogging platform for five years, but only in the past six months have I begun to understand its use and potential to drive contemporary websites. The technical sophistication offered by off-the-shelf themes I found frankly stunning, and it is this model I foresee going forward. However, it is this very complicated collection (the WordPress backend remains a mess) that I imagine will be stripped down and simplified, so that by the 2080s the database-to-text-file interface will be so streamlined that there will be nothing complicated about it.

All of this inter-relation could be integrated into one back-end coding interface, and we’ll have something like that eventually.

What is this new form of spam?

Both today and a week ago I got these strange emails from women with M & J initials, which seem to come from ambitious young writers. Je ne comprends pas (I don’t understand).

I suspect it’s some new kind of bot, & perhaps these emails are an intelligence test? AKA let’s see how many people we can fool into responding.

(I admit that I responded a week ago).

Hypercard April 1997

I recently completed the Web Design Program at Sheridan College, and on the night of our grad show remembered the Hypercard project I did as part of “Introduction to Computers” at NSCAD back in the spring of 1997. Given that Hypercard was in many ways a precursor to the web, I wanted to revisit this project as a document of my proto-web work.

I recall doing this very last minute, and at the time I was listening to a lot of Beethoven, so the project was a quick walk-through, complete with a sound sample of Beethoven’s 5th Symphony taken from the cassette I removed from my Walkman.

The teacher gave me an A.

I recovered this project on 15 May 2012 using a USB diskette reader and the Basilisk II Mac emulator. The cards are presented in a looping sequence:

1) Ludwig van
2) The Skull
3) The Hand
4) The Ear
5) Deafness
6) [back to Ludwig Van and Play his 5th]

Grayscale

  • 000000
  • 010101
  • 020202
  • 030303
  • 040404
  • 050505
  • 060606
  • 070707
  • 080808
  • 090909
  • 101010
  • 111111
  • 121212
  • 131313
  • 141414
  • 151515
  • 161616
  • 171717
  • 181818
  • 191919
  • 202020
  • 212121
  • #2e3137

On Black

<div style="width: 200px; height: 200px; position: relative; z-index: 1; background: black; clear: both;"></div>

<div style="width: 100px; height: 100px; position: absolute; z-index: 2; background-color: #00fff0; margin-top:50px; margin-left:50px;"></div>

Medium Specificity

Lucian Freud was reported to have called Leonardo da Vinci a terrible painter, which on the face of it seems like an old man’s contrarian fun. But it’s not inexplicable.

In Da Vinci’s time, paintings were moving away from Mediaeval stylization toward what we’d consider ‘hand made photographs’. Artists of the time wanted to depict retinal reality, and Da Vinci was the master at this.

Vasari wrote that one could almost perceive the pulse in Mona Lisa‘s neck, an effect which really isn’t unbelievable. In the Leonardo Live broadcast I saw this past week, La Belle Ferronnière appeared to breathe, which I attribute to Leonardo’s sfumato, where the softness of the edges echoes the effects I once observed in a Rothko – because the edges have no definable boundary, an optical illusion of movement is created, so the Rothko seemed to pulse, and the Da Vinci portrait seems to breathe.

Lucian Freud, on the other hand, was a master of medium specificity, the modernist mantra promoted by Clement Greenberg in the mid-20th Century. His critique of Da Vinci was precisely from this point of view: Leonardo sucks at painting because his paintings don’t look like paintings. For Leonardo and his contemporaries, this was a success. By the standards of the late 20th Century, it is a failure.

For me, the best example of the medium specificity ethos can be found in Charles Dickens’s 1854 novel, Hard Times:

‘Would you paper a room with representations of horses?’ [asks the government bureaucrat addressing Mr Gradgrind’s school]. ‘I’ll explain to you why you wouldn’t paper a room with representations of horses. Do you ever see horses walking up and down the sides of rooms in reality – in fact? Of course not.

Why, then, you are not to see anywhere, what you don’t have in fact. What is called Taste, is only another name for Fact.

This is a new principle, a discovery, a great discovery.

You are to be in all things regulated and governed by fact. We hope to have, before too long, a board of fact, composed of commissioners of fact, who will force the people to be a people of Fact, and nothing but fact. You must disregard the word Fancy altogether. You have nothing to do with it. You are not to have, in any object or use of ornament, what would be a contradiction in fact. You don’t walk upon flowers in fact; you cannot be allowed to walk upon flowers in carpets. You don’t find that foreign birds and butterflies come and perch upon your crockery; you cannot be permitted to paint foreign birds and butterflies upon your crockery. You never meet with quadrupeds going up and down walls; you must not have quadrupeds represented on walls. You must use, for these purposes, combinations and modifications (in primary colours) of mathematical figures which are susceptible of proof and demonstration. This is a new discovery. This is fact. This is taste.’

This mid-19th Century anti-imagery disposition is in fact an echo of the ancient prohibition against images found in The Bible. That prohibition reasserted itself during the Iconoclastic years of the Catholic and Protestant split, and is also found within the tenets of Islam, of which this official’s prescription may be a parody: in banning representation, Islamic arts developed geometric pattern to a degree we find astonishing.

A century after Dickens’ words were written, this had become the dominant aesthetic ethos. Art historians tend to bring photography into the explanation, since photography was both a superior and an easier way of capturing appearances than a Da Vincian masterpiece. Painting was left to explore its possibilities as a coloured viscous medium.

By our early 21st Century, we’ve left aside concerns that painting needs to do anything except be a painting. Young people continue to take up brushes because painting is an interesting and fun thing to do, and occasionally wonderful things result.

1854

As noted, Dickens’ novel was published in the mid-1850s. This was a time when photography was just beginning, and the dominant aesthetic movements were Academic Classicism and Romanticism. The Pre-Raphaelite Brotherhood was also active, formed around the idea that art before Raphael (that is, of Da Vinci’s era) was superior to the work that came after him (the imitation of Michelangelo known as Mannerism).

Realism, as an art movement, was also happening during this time, and 1854 was the year Courbet painted his famous Bonjour Monsieur Courbet. Realism, as described by Wikipedia:

attempt[ed] to depict subjects as they are considered to exist in third person objective reality, without embellishment or interpretation and “in accordance with secular, empirical rules.” As such, the approach inherently implies a belief that such reality is ontologically independent of man’s conceptual schemes, linguistic practices and beliefs, and thus can be known (or knowable) to the artist, who can in turn represent this ‘reality’ faithfully. As Ian Watt states, modern realism “begins from the position that truth can be discovered by the individual through the senses” and as such “it has its origins in Descartes and Locke, and received its first full formulation by Thomas Reid in the middle of the eighteenth century.”

The Great Exhibition of 1851 had put on display a variety of consumerist products made by machines. Influenced by the Pre-Raphaelites, and the writings of Ruskin who championed them, the Arts & Crafts Movement began in the 1860s, led by William Morris. From The Arts and Crafts Wikipedia page:

The Arts and Crafts style was partly a reaction against the style of many of the items shown in the Great Exhibition of 1851, which were ornate, artificial and ignored the qualities of the materials used. The art historian Nikolaus Pevsner has said that exhibits in the Great Exhibition showed “ignorance of that basic need in creating patterns, the integrity of the surface” and “vulgarity in detail”. Design reform began with the organizers of the Exhibition itself, Henry Cole (1808–1882), Owen Jones (1809–1874), Matthew Digby Wyatt (1820–1877) and Richard Redgrave (1804–1888). Jones, for example, declared that “Ornament … must be secondary to the thing decorated”, that there must be “fitness in the ornament to the thing ornamented”, and that wallpapers and carpets must not have any patterns “suggestive of anything but a level or plain”. These ideas were adopted by William Morris. Where a fabric or wallpaper in the Great Exhibition might be decorated with a natural motif made to look as real as possible, a Morris & Co. wallpaper, like the Artichoke design, would use a flat and simplified natural motif. In order to express the beauty of craft, some products were deliberately left slightly unfinished, resulting in a certain rustic and robust effect.

In 1908, Adolf Loos published his (in)famous essay ‘Ornament and Crime’ (translated to English in 1913), which argued that ornament was a waste of energy, in addition to applying racist and moralistic interpretations (equating the tattoos of South Pacific natives with primitive barbarism). As Wikipedia notes, this essay is an historical marker of the reaction to the ornamental style of Art Nouveau:

In this essay, he explored the idea that the progress of culture is associated with the deletion of ornament from everyday objects, and that it was therefore a crime to force craftsmen or builders to waste their time on ornamentation that served to hasten the time when an object would become obsolete. Perhaps surprisingly, Loos’s own architectural work is often elaborately decorated. The visual distinction is not between complicated and simple, but between “organic” and superfluous decoration. He prefigures the Brutalist movement that spreads from the 1950s to the mid 1970s.
(Wikipedia: Adolf Loos)

This post was edited on 3 Nov 2013 for clarity

Impressive

I can’t believe somebody did this.

From a Tumblr. Work credited to Aubrey Longley-Cook.

Applying timeless page design principles to the web

This morning I found Alex Charchar’s page on ‘the secret canon & page harmony’ as presented in the past by Jan Tschichold.

Using the Van de Graaf Canon, one divides a page spread thus:

This is based on a 2:3 ratio page size. However, the spread makes the overall ratio 4:3.

Not coincidentally, our monitor resolutions are based on a 4:3 ratio:

1024 = 4 × 256
768 = 3 × 256

1280 = 4 × 320
960 = 3 × 320

We can apply the Van de Graaf Canon to a 1024 x 768 webpage like this:

As Tschichold showed, the circle indicates that the height of the resulting text block is equal to the width of the page, or in our case ½ of the 1024px spread (1024/2 = 512).

Tschichold summarized Van de Graaf’s geometric method as the simplest way to recreate the outlines that were also used by medieval scribes, all of which result in a text block that fits within a 9 × 9 grid.

(animated gif from Alex Charchar’s article)

Essentially, we can determine the size and position of a content block by taking any page size and dividing it up into a 9×9 grid.

A book spread which divides both pages into 9 columns results in a grid of 18 columns and 9 rows; however, those 18 columns can be condensed into a 9 × 9 grid without loss of effect, since the eighteen columns merely subdivide the nine into halves.

The content block sits 1 column in, two columns above the bottom, and 1 column from the top.

Responsive Web Design

All of this would create a wonderful guideline for laying out the basics of a webpage were this still 2005, when 1024×768 had become the ne plus ultra after years of 800×600 CRT monitor resolution settings. Today (early 2012), a webpage needs to resolve on a variety of screens, from iPhones to giant monitors.

In order to have responsive content, it helps to code elements as percentages rather than specific pixels.

Our 1024 × 768 example creates a content block with an 87px top margin, a 166px bottom margin, and side margins of 112px. The content box itself measures 800 × 512 (the height is equal to half of 1024: 1024/2 = 512).

Coding anything in pixels, though, is unreliable, since browser windows are never consistently sized across machines, and padding creates effects which make pixel precision difficult.

What is needed is to achieve this 9×9 grid with percentages, so that this proportion can be rendered across resolutions.

In order to determine this, I coded a 9-row by 9-column table with a div overlaid using z-index, then fiddled with the CSS sizes and margins until I got something that matched the constraints.
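
The test set-up looked something like this minimal reconstruction (not my exact markup; the class name and colours are invented for the example): a 9 × 9 table stretched over the window acts as the reference grid, and the div being measured is overlaid on it with a translucent background so the grid shows through.

<style>
html, body { margin: 0; height: 100%; }
/* the reference grid sits behind everything */
table.grid { position: absolute; top: 0; left: 0; z-index: 1; width: 100%; height: 100%; border-collapse: collapse; }
table.grid td { border: 1px solid #ccc; }
/* the block being measured gets the percentage rules shown below, plus a visible background */
#container { position: relative; z-index: 2; background: rgba(0, 255, 240, 0.3); }
</style>
<table class="grid">
<tr><td></td><td></td><td></td><td></td><td></td><td></td><td></td><td></td><td></td></tr>
<!-- … eight more identical rows … -->
</table>
<div id="container"></div>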

The result is:

#container {
    /* 11% is roughly 1/9 of the page width, matching the grid's one-column margins */
    margin-top: 11%;
    margin-left: 11%;
    margin-right: 11%;
    /* 78% is roughly 7/9 of the width; 67% is roughly 6/9 of the height */
    width: 78%;
    height: 67%;
}

The page looks like this:

Demo

Hello Mammal Lovers!

This Friday (February 10) at the Drake Lab (1140 Queen St. West), in coordination with our current residency, “Open Cheese Office Grilled Songwriting Sandwich,” we’ll be cooking, hosting and celebrating the FOURTH annual Timothy Comeau Award, and you are invited! Swing by at 7pm or after to eat, be merry and find out who the winner is this year! Bring yourself, your friends, your drinks, your musical instruments, and the rest (an abundance of decadent cheese-variation sandwiches) will be provided.

The Timothy Comeau Award was created to recognize individuals who have shown exceptional support, interest and love for Mammalian. Recipients of the Timothy Comeau Award have been participants in many of our activities and have also offered insights, analysis and criticism. These are people without whom our events and existence would feel incomplete.

The 2010 winner of the Timothy Comeau Award was Sanjay Ratnan. Sanjay has been hanging out with MDR and participating in Mammalian events since he was a fetus. He continues to contribute his talents, ideas and super-star personality to Mammalian as a member of The Torontonians.

The 2009 winner of the Timothy Comeau Award was Kathleen Smith. Kathleen is not only a super Mammalian supporter, but thanks to her three nominations, MDR won the Toronto Arts Foundation’s Arts for Youth Award in 2009, a $15,000 cash prize!

The 2008 winner of the inaugural Timothy Comeau Award was Timothy Comeau. Timothy is a writer and cultural worker who has a couple of blogs including (curation.ca). He has been a constant supporter of the company, writing about our work, showing up to our events, goofing around and generating the kind of vibe that is essential to us. A Mammalian event without Timothy is a Mammalian event that’s happening on another continent.

Who will it be this year?!!

Hope to see you Friday!

MAMMALIAN DIVING REFLEX
Centre for Social Innovation

Emblems by Leonardo da Vinci

Leonardo da Vinci’s emblems are 16th Century logos. The Royal Collection in London has two pages of drawings, one consisting of the sketches, and the other consisting of the three designs compiled onto one page in a finished state. They date from between 1506 and 1510, during Leonardo’s second Milanese period.

The sketch page is of interest due to the large profile to the right of the sketches. It occurred to me in looking at this that perhaps the profile was there as a source of inspiration. We see the curl of the boy’s hair reflected in the curl of the ribbon, for example, and my thought is that Da Vinci was trying to create an emblem/logo using forms inspired by the face.

His logos, then, can be broken down into the abstraction of the face: a circle or an ovoid encloses shapes arranged symmetrically which signify an identity. Da Vinci’s face is in profile, while his emblems ‘face’ us directly … so perhaps my idea would seem more cogent if the boy were looking directly at us.

The images are screencaptures from the Royal Library collection online.

The Sketches

The Finished Drawings.

Emblem 1

Emblem 2

Emblem 3

The Profile

No need to flaunt it

The truly superior human doesn’t need to flaunt it, tell people about it, or write about it seeking validation.

That’s a line from a short story* I read a couple of years ago, and I think of it whenever I see some tat’d up hipster flaunting how “unique” and “creative” they are. Paradoxically, I find “boring” way more interesting these days.


* Marc Carlson’s The Immortality Blues

Hmmmm

Interesting that in 2011, a “lost” Leonardo da Vinci painting is “found” and that in 2012, a “lost” work of Johannes Brahms shows up.

Star Trek nerds will get this.


Fixing a Lightbox issue

I ran into a problem using Lightbox plugins on my blogs. I’m currently running modified versions of the Constellation Theme, which is full of HTML5 goodness and styled to re-flow according to screen size (i.e. is mobile adaptable).

No matter which Lightbox plugin I used since upgrading WordPress to the latest version (3.3), the overlay showed a margin and an offset, as exemplified below:

This is because of the way the overlay is coded to affect the < body > tag. Constellation styles the < html > tag in ways usually reserved for < body >, so by making a change to the Lightbox JavaScript file, one can correct this behaviour.

In this case, I’m using Ulf Benjaminsson’s wp-jquery-lightbox plugin.

=== Fix ===

1. In `wp-content/plugins/wp-jquery-lightbox/jquery.lightbox.min.js`

2. Search for “body”; it appears twice as a string, in code like the following:

1) ....;a("body").append ....
2) ....;a("body").append ....

which correspond to lines 67 and 71 in the non-minified file.

Change “body” to “html” in both:

1) ....;a("html").append ....
2) ....;a("html").append ....
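
In unminified form the change amounts to something like this (a generic illustration rather than the plugin’s exact source; overlayHTML stands in for the plugin’s overlay markup):

// before: the overlay was appended to <body>, which Constellation's styling offsets
// jQuery('body').append(overlayHTML);
// after: append it to <html> instead, so it covers the full viewport
jQuery('html').append(overlayHTML);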

Mother Goose

My grandfather reading to me when I was two.

Robert Hughes: The End of Art?

From The Mona Lisa Curse which aired last night on TVO.

[9m:04s] The death of Bob Rauschenberg to me is not just the loss of a friend, it suggests the death of something that I love about art. So much of what I loved about Rauschenberg’s work was that every formal choice he made came from meeting the world head on. Most of his work, like all good art, is dense with meaning. It’s not some vacuous exercise in picture making meant to sustain the boys at Sotheby’s or Christie’s with a big price. It’s entirely born of experience, it isn’t born of the market. Some think that so much of today’s art mirrors, and thus criticizes, decadence. Not so. It’s just decadent. Full stop. It has no critical function. It is part of the problem. The art world dutifully copies our money-driven, celebrity-obsessed entertainment culture. The same fixation on fame. The same obedience to mass media jostles for our attention through its noise and wow and flutter.

Art should make us feel more clearly and more intelligently. It should give us coherent sensations which otherwise we would not have had. That is what brought me to this city [New York]. That is what market culture is killing.

For me, the cultural artefact of the last 50 years has been the domination of the art market. Far more striking than any individual painting or sculpture. It has changed art’s relationship to the world and is drowning its sense of purpose. The flood that threatened to destroy a rich history of art in Florence in the Sixties has its parallel in today’s art world. From the Sixties on the belief in art as a way of making money began as a trickle, turned into a stream and finally became a great, brown, roaring flood.

And what resurfaces after this deluge? Art like this, stripped of everything but its market value.

THIS:

[13m:37s] If art can’t tell us about the world we live in then I don’t believe there’s much point in having it.

And that is something we are going to have to face more and more as the years go on; that nasty question that never used to be asked because the assumption was always that it was answered long ago: what good is art? What use is art? What does it do? Is what it does actually worth doing? And an art which is completely monetized in the way that it’s getting these days is going to have to answer these questions or it’s going to die.

The Kubrick 1940s Photos

I love how cinematic they are … they all look like stills from a film. I’m also noticing their quality, probably due to the lensing, and maybe also due to the richness of the grayscale. Could something similar be achieved with an iPhone? Or is it all pretty much just a question of shooting angle?

The Country of the Mind • Greg Bear

Greg Bear, The Country of the Mind (From Queen of Angels). Attributed to pseudo-author and character from the novel, Martin Burke, in his meta-fictional book, The Country of the Mind 2043-2044.

The advent of nano-therapy – the use of tiny surgical prochines to alter neuronal pathways and perform literal brain restructuring – gave us the opportunity to fully explore the Country of the Mind.

I could not find any method of knowing the state of individual neurons in the hypothalamic complex without invasive methods such as probes ending in a microelectrode, or radioactively tagged binding agents – none of which would work for the hours necessary to explore the Country. But tiny prochines capable of sitting within an axon or neuron, or sitting nearby and measuring the neuron’s state, sending a tagged signal through microscopic “living” wires to sensitive external receivers … I had my solution. Designing and building them was less of a problem than I expected; the first prochines I used were nano therapy status-reporting units, tiny sensors which monitored the activity of surgical prochines and which did virtually everything I required. They had already existed for five years in therapeutic centers.

For a healthy mentality, what is aware in each of us at any given moment is the primary personality and whatever subpersonalities, agents or talents it has deemed necessary to consult and utilize; that which is not “conscious” is merely for the moment (be that moment a split second or a decade or even a lifetime) either inactive or not consulted. Most mental organons – for such is the word I use to refer to the separate elements of mentality – are capable of emergence into awareness at some time or another. The major exceptions to this rule are undeveloped or suppressed subpersonalities, and those organons that are concerned solely with bodily functions or maintenance of the brain’s physical structure. Occasionally, these basic organons will appear as symbols within a higher-level brain activity, but the flow of information to these basic organons is almost completely one-sided. They do not comment on their activities; they are automatons as old as the brain itself.

This does not mean that the “subconscious” has been completely charted. Much remains a mystery, in particular those structures that Jung referred to as “archetypes”. I have seen their effects, their results, but I have never seen an archetype itself and I cannot say to which category of organon I would consign it if I could find it.

The individual differentiates from its world and its social group when it is able to observe all their elements as manipulable signs. In any individual, cultured or not, “consciousness” develops when all portions of its mind agree on the nature and meaning of their various “message characters”. This integration results in a persona, an “overseer” of the mental agreement – the conscious personality.

Imagine somebody else being allowed to lucidly dream within you; to be awake yet explore your dreams. That’s part of what the Country of the Mind experience is like; but of course, our personal memories of dreams are confused. It is even possible for two or more agents to dream separate dreams at the same time – further adding to the confusion. When a dream intersects the Country at all, it does so like an arrow shot through a layer-cake, picking up impressions from as many as a dozen levels of territory. When I go into your Country I can see each territory clearly and study it for what it is, not for what your personal dream-interpreter wants it to be.

* * *

“Why does the Country of the Mind exist, Mr. Burke?” Albigoni asked. “I’ve read your papers and books but they’re quite technical.” Martin gathered his thoughts though he had explained this a hundred times to colleagues and even the general public. This time, he would not allow any artistic embellishments. The Country was fabulous enough in plain.

“It’s the ground of all human thought, for all our big and little selves. It’s different in each of us. There is no such thing as a unified human consciousness. There are primary routines, which we call personalities, one of which usually makes up the conscious self, and they are partially integrated with other routines which I call subpersonalities, talents, or agents. These are actually limited versions of subpersonalities, not complete; to be expressed, or put in control of the overall mind, they need to be brought forward and smoothly meshed with the primary personality, that is, what used to be called the consciousness, our foremost self. Talents are complexes of skills and instincts, learned and prepatterned behaviour. Sex is the most obvious and numerous – twenty talents in full grown adults. Anger is another; there are usually five talents devoted to anger response. In an integrated, socially adapted adult older than thirty, only two such anger talents usually remain – social anger and personal anger. Ours is an age of social anger.”

“Talents are personalities?”

“Not fully developed. They are not autonomous in balanced and healthy individuals.”

“What other talents are there?”

“Hundreds, most rudimentary, nearly all borrowing or in parallel with the primary routines, all smoothly integrating, meshing,” he knitted his knuckles gearwise and twisted his hands, “to make up the healthy individual.”

“You say nearly all. What about those routines and subroutines that don’t borrow, that are most likely to be … what you call subpersonalities or ‘close secondaries’?”

“Very complex diagram,” Martin said. “It’s in my second book.” He nodded at the tablet’s screen. “Subpersonalities or close secondaries include male/female modeling routines, what Jung called animus and anima … Major occupation routines, that is, the personality one assumes when carrying out one’s business or a major role in society … Any routine that could conceivably inform or replace the primary personality for a substantial length of time.”

“Being an artist or a poet, perhaps?”

“Or a husband/wife or a father/mother.”

Albigoni nodded, eyes closed and almost lost in his broad face. “From what little research I’ve managed to do in the last thirty-six hours, I’ve learned that therapy is more often than not a stimulus of discarded or suppressed routines and subroutines to achieve a closer balance.”

Martin nodded. “Or the suppression of an unwanted or defective subpersonality. That can sometimes be done through exterior therapy – talking it out – or through interior stimulus, such as direct simulation of fantasized growth experiences. Or it can be done through physical remodeling of the brain, chemical expression and repression, or more radically, microsurgery to close off the loci of undesired dominant routines.”

Also, from Slant (where Martin Burke is again a character):

“Intelligence and creativity often accompany more fragile constitutions,” Martin says. “There’s every evidence some people are more sensitive and alert, more attuned to reality, and this puts a greater load on their systems. Still, these people make themselves very useful in our society. We couldn’t get along without them – ”

“Genius is next to madness, is that what you’re saying Doctor?”

“Genius is a particular state of mind … a type of mind, only distantly comparable to the types I’m talking about.”

Private Posts: using a lock icon

I have some private posts here and there on my various WordPress blogs. WordPress is coded to display the word ‘Private:’ in front of whatever your title is, like this:


I had the thought that it might be better to replace the word with a lock icon, so that it would look like this:


In order to do this, I looked into the WordPress Codex and found the function that inserts the word ‘Private’.

1. WordPress’ get_the_title() function inserts the string `Private:`
  2. this function is coded between lines 104-120 of the post-template.php file, located in the wp-includes directory.
3. line 115 reads like this: `$private_title_format = apply_filters('private_title_format', __('Private: %s'));`

Now that I knew what to change, I Googled for a lock image and found a no-copyright black one on Wikipedia, at size 360 x 360 pixels. I resized it to 16 x 16, and uploaded to the wp-includes folder. (I didn’t see why it should be in another).

Next, I coded it with an absolute url, since WordPress’s various functions to make clean urls meant that simply using <img src="lock.png"> didn’t work. Line 115 in post-template.php now looked like this example:

$private_title_format = apply_filters('private_title_format', __('<img src="//example.com/wordpress/wp-includes/lock.png"> %s'));
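
Since the ‘Private: %s’ string passes through the private_title_format filter, a cleaner variant (a sketch of the same idea; the function name is invented, and it assumes the lock.png path used above) is to hook that filter from the active theme’s functions.php instead of editing a core file, which would be overwritten by the next WordPress update:

function tc_private_lock_icon( $format ) {
    // swap the 'Private: ' prefix for the lock image; %s is later replaced with the post title
    return '<img src="//example.com/wordpress/wp-includes/lock.png" alt="Private"> %s';
}
add_filter( 'private_title_format', 'tc_private_lock_icon' );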

2011

“Politics was the new popular entertainment, in a way that it had not been since the war and as it would not be for a long time to come; he estimated the interval at twenty-two years: 1945; 1967; 1989 ….” -Harry Mulisch, The Discovery of Heaven 1992 (p82, English trans 1997)

I posted this last December 10th on my Curation.ca blog/project and have been surprised to see it actually come true. Mulisch died in October of last year, so he didn’t live to see it, but since January we have seen popular protests all over the world – from the Arab Spring, including the Libyan Civil War, to the Occupy movements.

Design our Tomorrow 2011

On November 12th 2011 I was at the Design our Tomorrow conference, held at Convocation Hall at the University of Toronto. Except for the first two speakers (Steve Mann and Greg Kolodziejzyk) I was able to record the presentations.

Edward Burtynsky
[audio:http://timothycomeau.com.s3.amazonaws.com/audio/20111112_edwardburtinsky.mp3]
Download MP3
Karim Rashid
[audio:http://timothycomeau.com.s3.amazonaws.com/audio/20111112_karimrashid.mp3]
Download MP3

Aza Raskin
[audio:http://timothycomeau.com.s3.amazonaws.com/audio/20111112_azaraskin.mp3]
Download MP3
Ron Baecker
[audio:http://timothycomeau.com.s3.amazonaws.com/audio/20111112_ronbaecker.mp3]
Download MP3
David Keith
[audio:http://timothycomeau.com.s3.amazonaws.com/audio/20111112_davidkeith.mp3]
Download MP3

Raghava KK
[audio:http://timothycomeau.com.s3.amazonaws.com/audio/20111112_raghavakk.mp3]
Download MP3
Aubrey de Grey
[audio:http://timothycomeau.com.s3.amazonaws.com/audio/20111112_aubreydegrey.mp3]
Download MP3
Craig Shapiro
[audio:http://timothycomeau.com.s3.amazonaws.com/audio/20111112_craigshapiro.mp3]
Download MP3
Eric Chivian
[audio:http://timothycomeau.com.s3.amazonaws.com/audio/20111112_ericchivian.mp3]
Download MP3


Siobhan Quinn
[audio:http://timothycomeau.com.s3.amazonaws.com/audio/20111112_siobhanquinn.mp3]
Download MP3