Monday, October 19, 2009

Social Singularity

Over a year ago, I made the very purposeful decision to take the twitter application off of my facebook profile (this was before "selective twitter" was up and running). I was noticing that there were things I wanted to say to my twitter followers that made sense to them and to that medium, but that the old friends, family, and grad school crowd wouldn't find relevant, or even comprehensible. Though it felt odd at first, I came to the conclusion that I wanted to keep those worlds separate: the people who knew the way I currently think (as a brand strategist and pseudo-cultural anthropologist) were captured on one medium, while those with whom I relate in a different (though no less authentic) way were connected to me on another. Not exclusively, of course, but separate to some extent.

And for a while, this plan worked very well.

Recently, my brother started a twitter account. I was amazed, excited, and then alarmed. Last night, my wife and I talked about her entrance into the blogosphere as an aspiring novelist, and the various ways to get connected, promote her books, and build a community. And lo, the conversation overlapped with the distinct worlds I had created for myself. Today, a friend from grad school started following my tweets. The fastest-growing demographic segment on Facebook is women over 55. The norm now is that your parents are on the same platforms as you, sending you links, "liking" your status updates, commenting on your posts, and re-tweeting your insights and confessions.

We knew that this, like all trends, was inevitable. That was the whole idea behind it, right? The Influencers and their trickle-down technologies must eventually either move on to a different, less crowded space, or concede that those circles of distinction will become so crowded that every personal network is populated by people from the previously separate spheres of your life.

This convergence of the members of disparate real-life and/or digital circles is known as Social Singularity.

The implications of this go hand-in-hand with the past decade's obsession with the notion of the "democratization of exclusivity", or "massclusivity". When we all have access, several things happen. Innovation and evolution become necessary as new platforms are born and grow. The heavily populated social spaces either lose their appeal or evolve into hubs of active, though uncontrollable, communication (see: YouTube & the gross increase in Twitter spam). And most importantly, in a very positive way, the newcomers to the digital social space make it a more viable place to develop new means of contact and community, as less education and myth-dispelling is necessary when people jump in and experience things for themselves.

Monday, October 12, 2009

Truth in Creativity, by Maurice Sendak

For those who don't know me, I have two kids: Sadie, 6, and Charlie, almost 3. We love reading. LOVE it. I will spare you the column-that-could-be about the very best in kid lit and instead give you my thoughts on the amazing rediscovery the world is having of the 1963 Maurice Sendak classic, Where the Wild Things Are.

First off, the film looks insanely delicious. God bless Spike Jonze and any other director who chooses costumes over CGI (also a column-that-could-be). But more importantly, the release of the film has given many of us dads a chance to dust off a copy of the book and share it with our kids with a bit more care - lingering over the illustrations, considering the simplicity of the words, and asking on each page, "what do you think Max is feeling here?"

This is what I love. In a 10-sentence book, Sendak nails childhood. I mean, NAILS it. Who hasn't felt angry and wild and filled with feelings too big for their body? And yes, yes, of course, "let[ting] the wild rumpus start." If for nothing else than the magic of those three words. But clearly for reminding us of what it means to feel alone and isolated and wronged, and of the need to rebel.

But these days I find myself intrigued by the feelings of loneliness Max experiences when the rumpus is over and he wants to be "where someone loves him best of all." The part of the book most of us don't remember or consider. The time-to-go-home part. The part that completes Max's journey. The part that brings remorse and need and closure to the experience of being a kid.

This book is a study in truth expressed creatively. Find the truth of a human experience (and not the kind of truth we Brand Strategists like to throw on a creative brief really quickly - you know, the daypart/website-pattern/general-generational characteristics - but a real, essential understanding of what it's like to be human: kids have big, wild feelings, need to get them out, and then need to know that they can have those feelings and still be loved best of all), and then tell that story in a surprising, tender, delightful, and daringly original way.

It's what any creative endeavor should strive for, be it a book, song, painting, classroom lesson, brainstorming session, or, yes, advertising & marketing too.

So, thanks, Maurice Sendak. From my kids to my colleagues, thanks.

Wednesday, June 24, 2009

Apple Calls it Quits (a "what if" exploration)

It is never a good sign when a blog entry begins with a disclaimer. And yet, I offer up the following: I am not a technologist, but I love technology. My wife and I still use our VCR (sometimes). My iPhone was a birthday present, but I almost cried when I dropped it within the first week of having it. I do not have HDTV. Clearly not an expert, I am, I'd say, an observer. A lite-social-networking, twittering, blogging account planner. I like the conversation. I am fascinated more by the social implications of technology than by the wizardry of the mechanism. I offer this up as a canvas, a backdrop for the conversation I'm keen on having.

Unless you've been living on a planet other than ours for the past couple of weeks, you will undoubtedly have been witness to the launch of the iPhone 3GS (the greatest, fastest, most powerful of the 3G line, now with cut and paste!). The frenzy this launch stirred amuses and confuses me. People were once again lined up, huffy and aggressive, outside Apple and AT&T stores for hours. I mean, like 12 hours or more. And others had been expressing for months their plans to get the new device right away. Give their first-gen iPhone to the missus, to a kid, to eBay. It didn't really matter, so long as there was a place for it to go to justify the new purchase.

But as I sat there with my own suddenly ancient piece of handheld networking iCapability, I wondered: what if the folks at Apple just closed their doors? What if Steve Jobs pulled a Wonka, and just…no more. Set aside the obvious economic ramifications (unemployment, the cost of switching various hardwares) and the disappointment of MacJunkies like myself, who would miss out on new designs and on the feeling of "us" that comes with aligning yourself with this anti-PC community. And let's limit this to personal technologies, so the discussion doesn't meander into the need for medical advancements or better aviation. If we're just talking about our everyday, humdrum hardware needs and usage, I wonder what we'd be missing. Aren't our computers fast enough? Can't we find what we hope to find? Don't we have enough access? Enough games? Couldn't we conceivably have full, prosperous, connected lives if personal technologies did not advance beyond this point?

I know Ray Kurzweil's answer. Help me out here, followers of the Singularity - what would the argument be? That it is inevitable that technology must continue, must advance, because the merging of humanity and machinery will bring about our best-self evolution? But I'm posing this: what if we put on the brakes at this point? What if we stop now?

What if we thought more about Sherry Turkle's approach? What if we questioned the relationships people are developing with technology? Turkle argues that:

Our new intimacies with our machines create a world where it makes sense to speak of a new state of the self. When someone says, 'I'm on my cell,' 'online,' 'on instant messaging,' or 'on the web,' these phrases suggest a new placement of the subject, a subject wired into social existence through technology, a tethered self. I think of tethering as the way we connect to always-on communication devices and to the people and things we reach through them.

Wow. Look at the big brain on Brett! (Aside: I am fully prepared to be schooled on both Kurzweil and Turkle. I don't mean in any way to dumb down or reduce their thinking; both are worth studying and understanding in any way possible.) The truth is, I wonder if we need new technologies to keep driving us toward a new kind of connection. Some would argue that the new connectivity exchanges human contact for honest conversation - that although we don't see one another, we feel freer to express our truest thoughts, to let our id run free, to say what's really on our minds, because it's not really us, it's our online selves. The counterargument is that people need to learn to interact eye to eye, that our humanness is critical to our survival as a species. What I'm asking is: shouldn't we work on making direct, personal, actual human connections, decisions, and negotiations before lining up to get the next gadget that will do it all for us? It's like Jeff Goldblum's Dr. Ian Malcolm says in Jurassic Park: "Yeah, but your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should."

Now the twist. Although it seems I'm arguing against the development of more technology, I'm really just trying to start a dialogue. Keep in mind, I love my laptop, my iPhone, the ability to text. To tweet. To post to tumblr. To videochat my wife and two kids from the road. To cleverly update my status on facebook. And so on. So, talk to me. Tell me. Instruct me. I could use it. I'm earnestly inquiring as to what we'd be missing if we paused personal technology development at this point in time and made do with where we've arrived.

Tuesday, June 9, 2009

two simple words

I just bought a new notebook. A return to the Moleskine blank-page journal after an ill-considered departure in which a sketchbook took its place. (No knock against the sketchbook, but there's something about that hard black cover, the size, the experience of it.) Ach. This is all beside the point.

When I open it up, I write the same two words on the first page as I've written in my notebooks for the past few years: "what if..." (the ellipsis is optional)

I daresay these are potentially the two most powerful words in the English language. They hold more potential energy than any others I can think of. They ask us to consider. They go beyond the norm, expanding our sense of what's possible.

A much, much, much smarter man than myself, Sandy Goldberg, Professor of Philosophy at Northwestern, likes these words too. When I met him at last year's Idea Festival, I was struck by our shared devotion to these two small words. He led a too-short discussion about the power of "what if..." in which he described the phrase as an important tool in a philosopher's arsenal of possibility.

The problem with not indulging our ability to wonder, with not allowing "what if" thinking to seep into our processes, is that "we end up taking the efficient cognitive path in substitution for imaginative thinking." Professor Goldberg's words there. His example was walking through a forest for the first time. We can either explore and take paths untaken, or we can go where we clearly see others have been. Typically, we take the path others have taken, and soon our footsteps have worn a clear, visible trail through the woods - helpful in some regards, hazardous in others. You see, with a well-worn path through the woods, the next interloper tends not to consider a new path, and instead walks safely to the other side without a second thought.

Our brains work in similar ways. In thinking in ways that have worked well for us in the past, not only do we stop considering the untrod possibilities, but we actually have a hard time exploring the non-path even when we force ourselves to try.

I work as a strategic planner. It's the dreaming up of ideas that makes my job interesting. In recent months in particular, I have had cause (good fortune) to offer up my thoughts about several brands' (which shall remain unnamed) current and future positions. As I presented the "what if" directions, they were met with various reactions. For the most part, it was "we like the way you're thinking about this", which is great, but a couple of responses were more like "no. that's not us." Which is fine. My ideas are far from perfect. My batting average on these things is somewhere along the Mendoza Line, so I'm hardly in a position to disagree.

I'm not bothered by the rejection of the ideas as much as by the velocity with which a couple of them were delivered. I wish I could get clients to pause and consider. To ask "what if" a lot more. As I've said, rejection is not my issue. I'm fond of the ensuing conversation. I'm interested in getting us into the pursuit of the thing: the collaboration that whittles the idea down or morphs it or (yes, truly) sinks it altogether.

Here's my challenge to you: if you're on the agency-side of things, I dare you to include in every strategy brainstorm, creative brief, client presentation, and concepting session (yes, creatives - you glorious, misunderstood geniuses - you too) one moment, one slide, one devoted, focused consideration of "what if..."

If you're on the client-side of the map, I implore you to contemplate. Have vision. Consider. Challenge. Discuss. But most importantly, allow. Allow "what if..." to have its moment. Give it a beat. Allow for the possibility of something different. Allow for magic. For "my god, I never thought of that." Allow for absurdities that translate into consumer consideration, which translates into commitment, which translates into staunch declarations and onomatopoetic rapture. But also allow for the unthinkable. The horrendous. The belly-flop. Give some space to experimentation. Not committed dollars. Not even a greenlight. Just allow for the idea that something not found in flowcharts, graphs, and telephone surveys might be alarmingly relevant and staggeringly effective.

To each of you, I say good luck. I hope you take me up on my "what if" challenge. I happen to think the unknown is where innovation lies. And I think you'd be surprised at what can be inspired by opening up a conversation with two simple words. If you'd like to see them again, they're written on the first page of my Moleskine.

Thursday, February 12, 2009

in the eye of the beholder

So I'm looking for a second car. Our family owns a CR-V, which is perfect for our little foursome. But, increasingly, the world seems like it would be significantly easier to handle if we had a second car. A commuter. One with a backseat in which I can take my 5- and 2-year-olds to various Seattle places while mom gets going hither and thither. All of which leads to Ed.

Ed and I worked together in the Advanced Planning department at Nissan/Infiniti, nearly 5 years ago now, though both of us have since moved on. He is still one of the smartest people I know, not just about cars (which he is), but about consumer habits and human tendencies. (There you go, Ed - no more compliments for another 5 years.) So it seemed only fitting to ask Ed his opinion about what car might fit my aforementioned needs - one that hopefully doesn't end in the word "Civic" (no slight to Honda; as noted, we drive & love their products, I just happen to be a car-personality guy, and Honda isn't first on my list from that standpoint).

Today, Ed continued our ongoing conversation by proposing Nissan's Cube to me. When you look at the import, it's hard not to notice the asymmetrical rear window which wraps around the passenger side of the car. The website claims: "symmetry is so last year." I'm undecided on the design choice, but it led to a great conversation with Ed about nature, symmetry, balance, and perception.

Here's the gist of it. Ed claimed that nature is not symmetrical. He recalled hiking to the top of a mountain in San Diego, looking out over the majesty of the area, and finding the view interrupted by the straight lines and right angles of a casino. To him, this was indicative of the great truth that nature is asymmetrical. I tried to argue with him - that's poor architecture, more than a solid proof point - and I pointed out the cyclical nature of seasons, of day following night following day, and of an annual trip we all take once around the sun. Like clockwork. But Ed raised the counterpoint that those were examples of balance, not symmetry.

Symmetry is roughly defined as "the quality of being made up of exactly similar parts facing each other or around an axis."

So, he had me there.

True symmetry, he pointed out, is creepy to look at. I recall hearing once that symmetry in faces is actually what draws us to them. It's why Denzel is "Denzel", a People magazine most-beautiful person, and Forest Whitaker is simply a brilliant actor. (The audacity! Simply a brilliant, Oscar-winning actor - my apologies, Mr. Whitaker.) I looked at some facial symmetry fun, but then did my own experiments using Photo Booth. It wasn't horrible, but I can see how it isn't quite the state of "perfection" we think of.
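If you want to recreate the Photo Booth experiment yourself, the mirroring trick is easy to sketch in code. Here's a minimal version in Python using the Pillow imaging library (my own sketch, not anything Apple ships; "face.jpg" is just a placeholder for whatever head-on portrait you want to test):

    # Manufacture a "perfectly symmetrical" face by reflecting one half
    # of a portrait onto the other, Photo Booth style.
    from PIL import Image, ImageOps

    portrait = Image.open("face.jpg")  # placeholder: any head-on portrait
    w, h = portrait.size

    # Take the left half of the face...
    left_half = portrait.crop((0, 0, w // 2, h))

    # ...mirror it so it can stand in for the right half...
    mirrored = ImageOps.mirror(left_half)

    # ...and stitch the two halves into one fully symmetrical face.
    symmetrical = Image.new("RGB", (left_half.width * 2, h))
    symmetrical.paste(left_half, (0, 0))
    symmetrical.paste(mirrored, (left_half.width, 0))
    symmetrical.save("face_symmetrical.jpg")

Run it once anchored on the left half and once on the right and you get two different "perfect" faces, neither of which quite looks like the person you started with. Which, I suppose, is Ed's point.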

So, I changed my hypothesis because I could see that Ed was going to be all "science-y" and to the letter about this. The new hypothesis was that any conclusions about symmetry in nature must take into account the human propensity to see symmetry where there is asymmetry. Let's call it "Perceived Symmetry". We, as humans, are attracted most to the things we perceive as symmetrical. It's natural to do so. It explains the beauty of snowflakes, butterfly wings, the majesty of Mt. Rainier.

It also explains a lot of artistic and creative design choices. If we didn't perceive Michelangelo's David to be symmetrical, we might not be quite as fascinated by his beauty. Imagine David with a scrawny arm, or eyes like Marty Feldman. You can't because it ruins the perfection we associate with that work. The glorious iPhone is shaped to symmetric delight, as are most of Apple's award-winning designs. Even looking around my desk, the lovelier items are designed with a sense of symmetry. SIGG water bottle, Swingline stapler, desk lamp.

The point here is that an asymmetrical design choice can be used very effectively. But I believe it is counter-natural; meaning, beauty is typically found in perceived symmetry. And yet, what would Marilyn Monroe's beautiful face be without that amazing mole on her lower left cheek? Or Van Gogh's single white iris in a field of blue? As Ed points out, "artificial things often become endearing in their asymmetry because they become less intimidating." A slight imperfection here or there gives something a touch of character, shows that there really is no such thing as perfection (that's a subjective measure anyway), and makes the object (or person) more human through its quality of being juuuuust off. And here, I'm thinking of the beautiful design of Karim Rashid's Yum Bowl, amongst others.

So, where are we left after all this? Without conclusion, I'm afraid, but with an interesting consideration of the notion of perception, and the unsatisfying concession that beauty truly is in the eye of the beholder. Even in a rear windshield.

Monday, February 9, 2009

comedy ain't for just anyone

Far be it from me to pronounce myself any sort of comic genius. I am reminded of Eugene Levy's Dr. Pearl from Waiting for Guffman, who professed to not being the class clown, but to sitting next to him. And studying him. So take my critique and analysis with a grain of salt.

Today I watched "Francesco Vezzoli's project Greed, a faux ad campaign for an imagined perfume." This is meant to be a satire, thumbing its nose at the idea of promotion in the luxury-product world. But it's terrible. Just terrible. I tweeted as much, and my friend's response was enough to get my engines running. He wondered how this could have been avoided. His snarkiness aside, there's something to this line of questioning: he wondered if a different director or fight choreographer might have helped the end result.

The whole experience actually made me consider two things:
1) Broad, self-aware slapstick comedy will never lead to laughter.
2) Comedy is hard. It is an art form that ought to be respected on the level of intense drama.

First part first. Think about Will Ferrell. I know, the last thing he needs is promotion. But he's widely considered one of the funnier people in the universe. He excels at both nuanced, life-or-death-stakes realism and ridiculously over-the-top, fully-committed humor. When his man enters a room in an electric wheelchair and opens a 2-inch cell phone, he fully enters the reality of that physicality. The women in the Greed spot, by contrast, look like they've been told not to worry about what it all means - just roll on the floor and try to grab the perfume bottle. What surprises me is that both Portman and Williams have proven themselves to be gifted, nuanced actors. I wish I'd had the chance to see something more carefully performed, which, I hypothesize, would have ended up significantly more biting and at least halfway amusing.

Which leads me to the second point. There's a reason some "comedy" stars are bankable: they're funny, and they're aware that comedy takes work. They know timing, commitment, delivery, and subtlety can make or break a moment. Which is interesting, because I'll bet if you asked Steve Carell and Ralph Fiennes how to make a single moment work, they'd have similar answers. I'm just saying, being put in a wacky situation and making some faces doesn't make something funny, in the same way that asking Jack Black to play the priest in Doubt would probably have been the undoing of that movie (please see: King Kong).

I guess the bottom line is this: much as we'd like to believe that making people laugh is as easy as having a clever idea, it's hard work that requires a good amount of talent and an understanding of process. And it does help to just be funny.