Optimism

I remember heading up the stairs at the Ace Hotel in New York City (one of the finer things about that fine city) and, stumbling – almost literally – over this:

Image

I think I took a photo at the time but I can’t find it now. And this one is better anyhow. Turns out the staff at the Ace really like this idea, so much so that they also print it on the back of their keycards:

Image

And they’re not alone, as Tumblr proves. Artist Martin Creed has made a habit of constructing various signs saying just the same thing everywhere from Edinburgh to just a few blocks north of the Ace in Times Square.

And the affirmation reminds us that things now are as they should be, and that the future will work out just fine.

What do we call this? In fact, the definition of the word ‘optimism’ is a belief that the current state of affairs is the best it could be, and that the future will be too. And yet sometimes we fail to see this truth about now until later. It’s when we look back at our past that we can see what we should perhaps have valued more at the time.

Do our modern lifestyles make it easier or harder to be optimistic? I’m not necessarily contrasting optimism with pessimism (the belief that things are not as they should be and may not be in the future). What I’m contrasting it with is distraction. This dream that Steve Jobs et al have had – to put a computer in the hands of every man, woman and child – seems to rob us of our ability to understand where we are now. Every day as I travel into work and back I see people staring endlessly at tiny iPhone screens, as disengaged from the real world as we could be. And most of the time, I’m one of them. Perhaps on occasion, what’s on the screen of the anonymous commuter is photos of family and friends from a million miles away via the miracle of Facebook and the internet.

But a lot of the time it’s work email or Angry Birds. Why do we do this to ourselves? Why must we constantly stare at these alternative universes?

I suppose the answer is where we started out: that sometimes we do not wish to contemplate now, and prefer neither the past nor the future but some other timeless realm. It’s escapism, but often escaping to somewhere as uninspiring as a work email.

Still, the reassurance remains: everything will work out as it should, even if you’re reading this on the 5.27 to Sevenoaks.

The worst form of government

Image

Anyone who is a bit of a smart arse, like me, will recognise the quote above. It’s from Winston Churchill and it is about Democracy.

You would be forgiven for thinking the full quote is ‘Democracy is the worst form of government, except for all the others’, as that is how the punchline is usually delivered. In fact the quote is

Democracy is the worst form of government except for all those others that have been tried from time to time

I remember learning about democracy at school. The ancient Athenians, I was told, had the purest form of democracy because literally all the people were called to vote on the major issues. It was an entirely participatory democracy, not a representative one. The voice of the people spoke directly on all the issues (so long as ‘the people’ only meant rich Greek men, of course).

We don’t do this any more because it would be impractical… but of course, now, with the internet and our clever phones, we probably could do it again if we really wanted to. We could also, when it comes to it, do any number of other things to make decisions which would likely be far more representative than what we do now. Why couldn’t we use sampling to decide? Why couldn’t we analyse Google’s search logs? We might not trust Google, but we don’t trust MPs either.

When I talk about polling or sampling, I don’t mean the sort of polling where Rupert at Piers Rupert and Tristan PR asks two mates what they think about designer handbags and then sells it to a newspaper as a story (“66% of women like handbags”). I mean honest-to-goodness polling. Ask a scientifically balanced sample their opinion and then act on it. It’d probably still end up being fairer than what we do now, and a damn sight more practical, and cheaper. After all, that’s how they decide what’s on the telly.
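
To make the arithmetic concrete, here’s a minimal sketch (in Python, with an invented electorate and a made-up 55% level of support – none of it real data) of why a properly drawn sample of around a thousand people is enough:

```python
import math
import random

def poll(population, sample_size=1000, seed=None):
    """Ask a random sample a yes/no question and estimate the population-wide
    share of 'yes', with a 95% margin of error."""
    rng = random.Random(seed)
    sample = rng.sample(population, sample_size)
    p = sum(sample) / sample_size                          # proportion saying 'yes'
    margin = 1.96 * math.sqrt(p * (1 - p) / sample_size)   # 95% confidence half-width
    return p, margin

# A hypothetical electorate of one million voters, ~55% of whom back some policy.
rng = random.Random(0)
electorate = [rng.random() < 0.55 for _ in range(1_000_000)]

share, moe = poll(electorate, sample_size=1000, seed=1)
print(f"Estimated support: {share:.1%} ± {moe:.1%}")       # roughly 55% ± 3%
```

The point is the margin of error: a random sample of a thousand people lands within about three percentage points of the true figure, however large the electorate is.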

The obvious concern is that we wouldn’t have any safeguards against tampering. Well, why not? We don’t think people tamper with general elections now, do we?

The truth is that we do already govern by polling. Not officially, perhaps, but political parties often use polling and sampling to set policies and decide agendas. This is how they figure out which laws they’ll pass and how to spend your money. But most people would shy away from it being a formal mechanism (polling suggests).

Why? Because it’s not what we’ve done in the past. And we must always do what we’ve done in the past.

As I see it, the most fascinating example of this very conservative bias towards the past is US politics. Americans agree on nothing. They are bitterly divided on almost everything. Unless it was said more than 100 years ago, or by a Kennedy or by Dr King. In which case, everyone agrees on it.

This reverence for the past is amazing. Especially for a country so wedded to the future. The American Dream is about tomorrow but the American consensus is firmly about the past.

Of course, revolutionary America was gifted with many fine men, many great public speakers and many heroic soldiers. If only the leaders of today who match these qualities could gain the same respect.

And as we look at the hideous turmoil in Egypt, can we ever imagine that the statesmen, the soldiers and common people will be revered as such decisive change makers? I certainly hope so, but they should not be forging that future based on the model of dead presidents or prime ministers from years past. Where the army can remove elected officials in the name of democracy, perhaps we need to think more originally about what it means to be a democracy.

Now that we can all talk to each other all the time, couldn’t there be, and shouldn’t there be, a more challenging re-evaluation of what it means to operate a society? Why cling so fervently to ruling through putting folded pieces of paper in metal boxes?

Let’s look for one of those forms of government that is less bad than democracy to try from time to time, or rather let’s look for a version of democracy which delivers fairness but also the progress needed by countries growing up in an era of big data, of mass communication, of mass participation and of political despair.

Two tribes

Image: Guinness ‘Surfer’

A side effect of the digital revolution has been the closing of the perceived gap between product thinking and communications thinking. Not always with desirable results.

Watching these worlds collide has long been a fascination for me, as they are such distinctly different approaches, require such different skills and temperaments and are typically bought by distinct clients.

Yet, they have much in common too.

Whilst the staff are rarely transferable between the disciplines, they are very much the same breed: recognisable by their dress, their attitude, their enthusiasm, their intelligence and creativity. And the technical tools and techniques used are often similar too: creative briefs, Photoshop, brainstorms and so on.

Many other worlds use these tools and techniques. You might find Post-it notes and Adobe in packaging, PR, management consulting and other parts of the professional services world, but you’re unlikely to find an agency that does packaging, consulting and PR. Or, at least, one that does them all well.

One reason why we see so many try to combine the art of the digital communicator and the art of the digital product developer could be that a single company able to do both could create, launch and grow digital products without the need for external support. That’s a powerful dream, although I don’t know how often it has been fulfilled in reality.

By treating these two disciplines as if they should be accomplished in a single place, do we run the risk of losing the best of each, and the opportunity to properly assess how elements of each would enrich the other?

Let’s look at the two and see what, if anything, each can contribute to the other.

And this is the hard bit. How do we capture the heart of each discipline succinctly, without jargon, and in a way that practitioners can both agree with for themselves and understand for the other?

Skill One: experience of the thing
Product designers (and that’s a pretty broad church) are responsible for developing the thing itself. The more obvious (if not easy) part of this is in developing things that the end consumer will love, that they will find intuitive, rewarding and so on; a thing that users will continue to want to use. In this case the user’s relationship to the thing is more or less obvious. The user experiences it directly. Of course, that doesn’t mean that each user has the same experience, even if they all see the same thing. Experience depends on the user’s skill, the user’s context and the task the user is trying to achieve. As Lou Carbone puts it, the experience is not related to what the user thinks of the thing, but rather how the user feels about themselves once they’ve interacted with the thing.

The science of trying to design for users despite their differing skills, contexts and so on is quite well developed, using research to understand and group people and tools like user journeys and personas to codify them. Whether it’s a bottle opener or an app, we also need to push product developers to consider the aesthetic as much as the function of the product. The impact on the user is driven as much by emotional triggers in the product as by how successfully it can be used to achieve a task.
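
As a concrete (and entirely hypothetical – the fields and the example are mine, not a standard template) sketch of what codifying people into a persona can look like:

```python
from dataclasses import dataclass, field

@dataclass
class Persona:
    """A minimal way of recording what research tells us about a group of users:
    who they are, where they use the thing, what they are trying to do, and what
    would make them feel good about it."""
    name: str
    skill_level: str                      # e.g. "novice", "expert"
    context: str                          # where and when they use the thing
    goals: list[str] = field(default_factory=list)               # functional needs
    emotional_triggers: list[str] = field(default_factory=list)  # emotional / aesthetic needs

commuter = Persona(
    name="Distracted commuter",
    skill_level="intermediate",
    context="standing on a crowded train, one hand free, patchy signal",
    goals=["check the next train", "reply to one email"],
    emotional_triggers=["feels in control", "doesn't feel stupid when something fails"],
)
```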

Meaning is – of course – socially constructed. And so to understand how objects will be interpreted and understood, we need to understand this social context: norms, reference points and so on. I’m sure owning the first gas lamps was a sign of being cutting edge. Now it would be a statement about being traditional. How do we compare the effect of owning an iPhone in New Jersey with owning one in Shanghai, and so on? When done well, a measure of this social context of use, and of understanding the meaning derived from objects, is included in the definition of user experience.

Skill Two: communication of the thing
Modern communications skills are no less important or complex than the skills required to design a product in the first place.

By definition, this is not the design of something which is directly experienced.

At the heart of it, the communicator is trying to precondition the audience to have a different reaction to the product (or service) and is doing so in the absence of a direct experience of the product. In fact, it is odd for communications about the thing and the thing itself to be present at the same time (think of those awful brand posters in Barclays branches or adverts for Rank Screen Advertising).

How can external stimuli change how we react to stimuli, real or imagined? Fundamentally, it must help the receiver to create links amongst their understandings of meaning. It is a process which results in new associations. What is it that makes Prada posh or Pepsi Max precocious? These are truths which have been created by marketing.

In order to do this, a communications designer must have some grip on the inexact art of stimulus and response. Why is it inexact? Because meaning is ever mutating and audiences are not amorphous. If I talk about ‘pretty little colleens’, it is a phrase which some will recognise exactly (it has been a lyric in a pop song), others will recognise generically (Colleen taken as a stereotypically Irish name), and others (those called Colleen) will identify with specifically. Your distance from the various references will affect the extent to which you understand it at all.

So to a real degree, the communications planner must understand how embedded and related these social norms are and construct their message to match this.

So arguably meaning/context is even more important for communications than it is for product.

At the same time, the communicator is most often working to do a lot in a very short time frame, whether the turn of a page, or the 10 watched seconds of a 30-second TV commercial. This has led to lots of techniques focussed on gaining maximum traction with minimal transaction: big ideas, single-minded propositions, visual identities and so on.

In contrast, product designers are often working to reduce the amount of time consumers spend with their offspring, and to make each moment delightful and uninterruptive.

I think all of us who have seen the inner workings of the development of a truly phenomenal advertising campaign are in awe of the planners and creatives who can translate such a disparate context in such a challenging medium into so much meaning. Guinness’ ‘Surfer’, VCCP’s years of work for O2, the great Levi’s ads.

But why would the people that do one of these things be good at the other, and for what reason (other than commercial) would you want both under the same roof?

What I’ve seen is people from each discipline trying their hand at the other. This has rarely ended well. And, worse, I’ve seen managers from one discipline try and manage the other, but without changing their approach. This never works, except in the pitch room where all is roses, and the only possible impact of crushing together yin and yang, of fusing these atoms, is a joyful integration and 5% off the cheque.

I’ll wait for a braver clever agency person to show me the way. Perhaps even in the comments. Go on, you know you want to.

Lean blog post

I’ve always been a fan of Lean. Most recently Lean Startup has been very influential on me, as it has been on many others. On my desk sits an unread copy of Lean Analytics, and now, I see we have also got ‘lean strategy’, meaning not the strategy of using lean techniques, but rather the use of lean techniques to develop strategy.

In this post, Andy Whitlock of Made by Many espouses the change in mindset of strategists getting involved in new lean techniques.

Is this still strategy? Not, perhaps, in its traditional sense. Many would say lean is almost an ‘anti-strategy’ movement, in the sense of favouring action over long planning cycles. This would make ‘lean strategy’ an aggressive contradiction in terms, a self-destructive oxymoron. And that sounds like something we should avoid.

But forgetting all that, what is it Andy is really saying? He seems to have three themes:

1/ The ideas can come from anywhere – strategists need to come to believe that their team is just as capable of generating useful insight as they are, and need to learn the language of their colleagues so they can take part more

2/ Don’t cook up big intellectual theories to confuse people – provide information that can actually be used to do things.

3/ Propose intermediate hypotheses and watch them adapt, rather than trying to solve the whole problem upfront. Very lean.

Perhaps it was always a good idea for planners to be less precious (1), less overly intellectual (2) and less hell-bent on grand unifying theories (3). But if the lean movement underscores those needs, so much the better.

For my money, the job of the strategist / planner remains the same: make complicated things seem simpler and figure out how to solve the problem. Doing that in rapid iterations and tests is certainly unusual for the average planner, but should be welcomed by most once they’ve got over the shock of the whole thing, since it gives us a chance to break down and learn about problems in pieces, as humans really do, removing the often unrealistic expectation of a sudden, blinding flash of inspirational light.

Broken Windows


The Broken Windows theory of criminology was popularised in Malcolm Gladwell’s 2000 book, The Tipping Point. The theory says that urban environments where vandalism and dereliction are present redefine social norms (reducing the pride people take in their communities), leading to greater crime.

In his book, Gladwell highlights the effect Giuliani’s zero-tolerance policy on minor crime had in reversing New York City’s years-long reputation for being dirty and dangerous.

I think the same magnified effect of small issues applies just as well to another kind of Windows.

When Microsoft launched Windows Phone in 2010, they achieved something similar in a change of attitude. Screw the total number of features, apps or whatever: the Windows Phone team reversed a decade-long (Pocket PC first came out in 2000, Windows Mobile in 2003) trend of releasing software with lots of little bugs in it. And in doing that, they gave many of us hope that Microsoft could really rival Android and iOS in the mobile phone OS market.

Anyone who lived with a Windows Mobile device (Windows Mobile 2003, Windows Mobile 5, 6, 6.5) will remember these little bastards: the overwhelming feeling when one thing or another just failed to work wasn’t anger, it was resignation.

I’m not talking about UI failures – although there were certainly plenty of those. My favourite bit of non-user-centred thinking must be the snooze menu for the built-in alarm clock, which required the navigation of a pop-up submenu – this for a user that you can guarantee is half asleep. In later versions, SMS messages were threaded, but with new messages appearing at the bottom of a list which would open at the top – often taking several minutes to scroll down to.

No, I’m talking about full-on bugs. In our office at the time, where these phones were standard issue, we gave up asking why people had failed to return calls, had hung up mid-sentence (I still believe the phone would drop the call if it received an email with attachment), or sent garbled and incoherent emails and texts.

I remember a conversation with the ‘mobile expert’ from our firm back in 2008 when he told me that the way to keep your WinMo phone working well was to completely wipe it and re-install everything each month.

You just learned that every so often, the phone would let you down and the only thing to do would be to suck it up.

It was a disaster. Ballmer even admitted as much in public.

Despite all this, the platform was pretty successful, commanding up to 20% of the marketplace – because it was only competing with BlackBerry (which was a bit more expensive and required a server for enterprise customers to get their mail) and Symbian, which was late to make any kind of leap to the enterprise.

I’m sure Microsoft would kill to have the same share with Windows Phone today that they once had with Windows Mobile. And the fact is that Windows Phone – a completely re-designed mobile platform – deserves to be a serious competitor in the marketplace. It’s really good.

But the best thing about WP7 when it came out was that it wasn’t buggy. It didn’t have multi-tasking (WM did), it didn’t have copy and paste (ditto) or all sorts of other features. But at least it didn’t have any bugs. Things would straightforwardly work. Calls could be made. The screen wouldn’t stop responding or go all laggy. The UI was consistent. In fact, the UI was excellent and intuitive. So good in fact that it’s ended up on Windows 8, but that’s another story.

For once, it felt like the Microsoft team behind the product really understood the need for quality in what was released. Better quality, not more features. When Microsoft updated the OS to 7.5 (codenamed Mango) they brought a host of new features and capabilities to the platform and, once again, maintained the quality. Of course it was and is an uphill struggle for the OS. Clearly it’s been slow to grow. But the people that have it like it, and that’s a great starting point.

So now it’s two years later, and Microsoft has recently launched WP8 for new hardware and a final WP7.8 update for older hardware.

Whilst it brings a couple of new features and a new start screen, WP8 is really an engineering-led change for Microsoft, building on a long story which dates back to before the somewhat calamitous release of Windows Vista.

Vista had been intended to improve the overall user experience of Windows, making a big step forward from Windows XP. As it happens, the user experience of the Windows Vista interface was very compelling. Unfortunately the performance – the most important element of any user experience – was not up to scratch, frustrating many users of the new OS.

By contrast, Windows 7 went on to be Microsoft’s best and most successful OS, and it did this by making the heart of the operating system as small and efficient as possible and therefore dramatically improving the actual user experience. Project lead Steven Sinofsky did this by taking advantage of the ‘MinWin’ project, which had been running at Redmond for many years to cut down the core of Windows NT.

With Windows Phone 8, Microsoft has replatformed – almost invisibly – their phones from Windows CE (a somewhat dated and clunky core) to a version of Windows NT (a long-standing but highly efficient system), just like Windows 7 and Windows 8.

There is no doubt that this is an amazing engineering achievement, even though it has come at the cost of WP8 moving along very little from WP7 in terms of what the user sees. But it also seems to have come at the cost of quality in delivery, and not just the delivery of WP8, but WP7.8 too.

Nokia was kind enough to send me a Lumia 820 device early on. Aside from using the highlight colour for the button actions as well as the tiles, the device can only really be told apart from the Lumia 800 by the removable back cover and the size (for my money, a bit too big). The screen’s actually the same resolution (but bigger, so it drains the battery faster). It’s got NFC and wireless charging, both of which are cool. But it crashes. About every six hours, meaning I’ve got pretty good at taking the removable cover off. And the music player hangs the system. And you know what I thought straight away? This is like having Windows Mobile back. Broken windows.

Because it’s careless. As I said earlier, I’m sure it’s a major engineering triumph, but from a user’s point of view it’s taken a year to make a phone that’s bigger, has worse battery life, crashes (often at night, making its use as an alarm clock somewhat questionable), hangs, doesn’t have Gorilla Glass (the 800 does), has a much worse desktop sync client and doesn’t look as nice.

And the 7.8 update, a sort of parting shot to keep a Microsoft promise about upgrade cycles, is full of bugs. So now my 800 is broken too. The live tiles don’t work, mine at least is crashing regularly and there are small careless errors dotted here and there. Take for example my Music tile which has recently renamed itself (somewhat accurately) ‘Crowded House’!


Forgive me for saying that it doesn’t feel like a year well spent. A year in an industry which (Android at least) is moving ahead very quickly. Yes, we want new features but what I personally want more than anything is quality. Each new product should have fewer bugs than its predecessor, not more. And every time I find a ‘little bug’, it shakes the faith I have in Microsoft to win in phones.

Surely a successful phone is the most important key to Microsoft’s long-term consumer strategy. So why isn’t it their top priority to get it right?

Meaning

Image

 

Why is this image so powerful?

Even before the caption is read and the context explained, it has a very strong visual resonance.

Obama is obviously an historical figure, a phenomenal orator, a symbol of humanity and intellect. With his image comes a huge amount of that recollection and meaning: the victory speech after the Iowa caucuses in January 2008 (“They said this day would never come”), the victory speeches after the two elections (2012: “The role of citizens in our democracy does not end with your vote”), the basketball, the humour of the Correspondents’ Dinner, the battle with Donald Trump and so on.

The pose is somewhat humble and inquiring and Obama is alone, looking confident but curious.

But that is only the start of the meaning. That Obama is sitting on the bus where Rosa Parks once made her historic protest changes the picture altogether. Parks’ refusal to give up her seat to a white passenger became a powerful weapon in the civil rights movement’s campaign which would eventually change laws and end segregation; her act is a landmark in the continuing struggle for racial equality in the US.

The bus is in a museum now, of course, and Rosa Parks herself died in 2005. But the act lives on as a strong symbol. Fifty-three years and a few weeks after Parks’ original ‘disobedience’ led to her arrest, Obama was sworn in as US president, the ultimate proof that – whatever racism remains in America – it is not ingrained in the institutions through which the country is run.

Race at times seems to be the least part of the Obama presidency. I’m sure that’s how it should be. However, this image reminds us that it is no small achievement for a country that – within living memory – built racism into its laws, to elect a president who might have been the victim of such segregation.

Needs

Image: Oliver Twist

‘Needs’ is a brilliant word. Five letters long and yet it means so many different things to so many different people; it can encompass a huge range of planning challenges and can, perhaps, lead us to some interesting thinking about how to make things people really love, and how to communicate things in a way which will really captivate.

The starting point for this discussion must surely be Maslow. In creating a classification of the natural order of human needs, he helped the expansion of the concept of need beyond the purely physical and into a somewhat grey area between needs and wants. Of course, he is also responsible for the concept of a hierarchy among needs with some being of a higher order than others, whether that has pejorative implications or not.

Humans are physical, but also emotional, social, ambitious and so on. And if ‘needs’ are to include all of these factors (which surely they must), then they will encompass every aspect that could be important to us in creating a product or communication which has some resonance with our customer. And of course, needs must be personal, so we will necessarily have to try and understand which needs are commonly shared.

In user experience, we may use ‘needs’ to codify user requirements. But this can be layered. If we’re designing an interface, we need to know which tasks or functions a user needs to be able to carry out – e.g. I need to be able to update my address details. All users also face straightforward usability needs. However, in thinking about the broader concept of the product, we need to consider emotional needs in addition to the functional needs of the user. How can the product resonate on an emotional level, as well as a functional one?

Here there are two concepts which may well be very useful. The first is the idea that emotional response can be classified and codified. Here the work of Plutchik seems very relevant. In classifying the emotions exhaustively, Plutchik raises an interesting question: what is the emotional reaction that we are looking to achieve, and how does the product foster that emotion or suppress its opposite? Does the product in question look to surprise or reassure, to reduce frustration or drive trust?
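
As an illustration, here’s a minimal sketch (in Python) of Plutchik’s four opposing pairs of primary emotions; the helper that turns them into a design question is simply my own framing:

```python
# Plutchik's eight primary emotions form four opposing pairs.
PLUTCHIK_OPPOSITES = {
    "joy": "sadness",
    "trust": "disgust",
    "fear": "anger",
    "surprise": "anticipation",
}
# Make the mapping symmetric so either emotion in a pair can be looked up.
PLUTCHIK_OPPOSITES.update({v: k for k, v in list(PLUTCHIK_OPPOSITES.items())})

def design_question(target_emotion: str) -> str:
    """Frame the product question: foster the target emotion, or suppress its opposite?"""
    opposite = PLUTCHIK_OPPOSITES[target_emotion]
    return f"Does the product foster {target_emotion}, or suppress {opposite}?"

print(design_question("trust"))     # Does the product foster trust, or suppress disgust?
print(design_question("surprise"))  # Does the product foster surprise, or suppress anticipation?
```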

However, it seems reasonable to suggest that such an emotional resonance can only be properly described in context: What is surprising or fear-inducing in the light-snack market is presumably different from the mobile phone market.

And can we extend this to include the ability of a product (or communication) to offset an emotional reaction that already exists to some other object – i.e. where the emotional response is in the problem the product or communication looks to resolve?

The second framework which is quite powerful here is the concept of the derivation of emotional response. So the choice of the right shirt for a night out relates to the need to impress the opposite sex, which derives from the need to continue the species and ultimately (perhaps in all cases!) from the fear of death. I’ve tried to discuss this briefly before. I’m still convinced there is value in this. If we can understand how an emotional response is driven, we can better understand how to respond to it. After all, if your product can’t be traced back to a real underlying human need (or multiple needs) of this sort (and here perhaps we’d be less keen on a Maslow-style hierarchy), then what use is it?

In this post, Northern Planner discusses his fundamental planning beliefs. One of them is that:

Behind every business problem is a very human behavioural problem you need to change. The art of strategy is making people care enough to behave differently

When they don’t want to be sold to anymore, if they ever did, we need to start with what they’re interested in and work back from there. Real problems and tensions in real lives

Good quote (and the reason I started writing this post). I agree entirely, apart perhaps from describing the behaviour in question as a behavioural ‘problem’. It seems to me, it’s only a ‘problem’ from the point of view of the business being discussed. From the user’s point of view, it’s just a behaviour.

Perhaps there’s little new in directing our strategies to meeting needs, but I suspect we can all benefit from being more curious in how we dissect that need in the first place.

UPDATE. I had originally meant to kick off this post with the following video. Genius, speaks for itself etc. etc.

Music


“Music is, to me, proof of the existence of God. It is so extraordinarily full of magic, and in tough times of my life I can listen to music and it makes such a difference.”

The quote above from the great American novelist Kurt Vonnegut is made more compelling by Vonnegut being, at times, humanist, atheist and agnostic. The idea itself seems broadly accessible to all: that – in its various forms – we can make deep, almost spiritual, connections with music.

But of course, taste in music varies considerably. Why is that, and how does your preference get defined?

For my own part, two things appear to be true.

  1. If I listen to any music for long enough, I can learn to like it and then find it enjoyable.
  2. I don’t typically fancy doing (1)

So perhaps if I’d been brought up in a house and school full of Iron Maiden and AC/DC, I’d be a metal fan now; or perhaps, in different circumstances, I’d be a classical music boff, or a devotee of rap?

What I tend to find, in fact, is that what I’m listening to today is either:

  1. The music I learned to like in my teens
  2. Music that I’ve got to from the music I listened to in my teens (from The Smiths to Johnny Marr to The The)
  3. Music which has been pumped out of the radio or TV so often that I’ve come to like it.

A good example of how tastes can develop is Martin Stephenson. When I was growing up in Newcastle, Stephenson and his group the Daintees were local heroes, signed to local label Kitchenware alongside other local favourites Prefab Sprout. At the time, the band were destined for the charts, groomed and PR-ed for it.

I fell in love with the band, bought all their albums, saw them live and kept on listening to them for years, even though they quietly disappeared.

Then a few years ago, he started playing concerts again. At first I went along to see the old songs. But there had been a lot of water under this particular bridge. Still a fantastic showman, Stephenson had transformed into a much more eclectic performer, mixing many musical influences, mysticism, the wisdom of therapy, addiction and religion. Songs would blend, narratives would drift off. It is hypnotic. The Stephenson of 2012 is – of course – a totally different performer to the Stephenson signed to Kitchenware in 1982. Like the philosopher’s axe, all the parts have been replaced; only the name and the memories remain.

I wonder, if I’d never heard the old Stephenson, would I love the new one? I doubt it. Liking the original gave me enough licence and patience, I suppose, to learn to love the descendant. And for that, I’m very lucky.

What about the catchy, manufactured pop that floats out of youth radio (to which I still, erroneously, listen)? This fantastic New Yorker article takes a peek behind the scenes at how much of this music is made. Certainly ‘manufactured’ doesn’t feel like a stretch. But then again, have you ever actually tried to write a song? It’s not easy, and part of the reason it can feel so difficult is that the result can appear flat and unseductive. These techniques pull those levers deliberately and simply add in meaning later. The idea that the singer must write the songs, and write them from the heart, is a relatively recent one.

But if we think about it slightly differently, manufacturing music through chord sequences and hooks is all about designing tracks which can be very quickly taken on board. It’s about giving your audience a sprinkling of good reasons for putting in the work to get, literally, hooked. And there are as many hooks in ‘A Rush and a Push’ as there are in ‘Diamonds’. The story about how and where the song was created, and the conviction of the lyrics, are merely extensions of the meaning that can be attributed to it. Noel Gallagher claims to have written most of Definitely Maybe while manning an NCP car park. Is this preferable to a studio in LA? And of course (witness the X Factor final last night), much of this can be manufactured too.
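
To make ‘manufacturing through chord sequences’ slightly more concrete, here’s a minimal sketch (in Python) of the much-reused I-V-vi-IV pop progression, transposed into any major key; the function names are just my own:

```python
# The twelve pitch classes, and the major scale as semitone offsets from the root.
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
MAJOR_SCALE = [0, 2, 4, 5, 7, 9, 11]

def chord_root(key: str, degree: int) -> str:
    """Return the root note of a given scale degree (1-7) in a major key."""
    start = NOTES.index(key)
    return NOTES[(start + MAJOR_SCALE[degree - 1]) % 12]

def four_chord_progression(key: str) -> list[str]:
    """The I-V-vi-IV progression behind a large share of pop hits (vi is minor)."""
    return [
        chord_root(key, 1),
        chord_root(key, 5),
        chord_root(key, 6) + "m",
        chord_root(key, 4),
    ]

print(four_chord_progression("C"))  # ['C', 'G', 'Am', 'F']
print(four_chord_progression("G"))  # ['G', 'D', 'Em', 'C']
```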

So music, and our preference for it, is fascinating. I’m sure similar parallels could be drawn for film, theatre, books and the like.

At different stages of our lives, we may also care more or less how the music we listen to, the books we read and the films we watch define us or support our self-image. Like the character Marcus in ‘About a Boy’, the choice of rap music reflects a desire to fit into a group, as well as just a joy of the experience. Like the boys behind the bike sheds coughing their way into an addiction to Embassy No 1, how many young music fans have to invest time to learn to love the ‘right’ acts? Is this pretence? I don’t believe so, as the effort required is to build the initial relationship, which then builds through familiarity.

Perhaps an interesting spin-off question is how closely this method of liking, exploring and becoming tuned in is reflected in brands (or rather, non-media brands). How do we learn to love, and how far will we explore beyond our preferred repertoire?

Presumably some of the same principles are true:

  1. We can embrace or reject the brands our parents loved
  2. Once we’ve become used to a brand, we stick with it as it develops, as Apple has
  3. We could perhaps define genres of brands and observe tendencies to favour one brand type over another
  4. We need an incentive to try new brands, a chance to sample
  5. We can use brands to define ourselves
  6. The back story of the brand can be just as important as the qualities it manifests

Kodak moments

This wonderful clip from Mad Men on Amelia’s blog started me thinking again about Kodak, both a case study of success and failure in product innovation, and – now – a cautionary tale for businesses facing change. But what, if anything, can we actually learn from it?

One of the best studies of the primacy of experience in product design is that of George Eastman, Kodak and ‘You push the button and we’ll do the rest’. The company grew hugely successful on the back of that thought, successfully navigating several technology changes in its early years and developing a massively dominant position in the film marketplace.

As recently as 1976, Kodak held 90% of film sales and 85% of camera sales in the US. This level of market leadership is often interpreted as having caused complacency among management, equipping Kodak badly for battle with its Japanese rival Fujifilm.

In fact, it was Kodak themselves who invented the digital camera in 1975 and the first megapixel sensor in 1986, the innovations which would prove a major component of their downfall.

The engineer who made the invention is quoted by the New York Times as describing the management reaction: “That’s cute – but don’t tell anyone about it” (because of the threat it posed to the core film business).

Whilst it is tempting to think of a beleaguered Kodak being overrun by digital innovation, the company was among the first movers in this new product space, ranking #1 in the US for digital camera sales in 2005 and manufacturing the first Apple offering in digital photography, the ill-fated Apple QuickTake. From that year onward the company’s position fell by a place or two annually as new entrants, eventually including mobile phones (now the most popular cameras), took hold of the marketplace – making better and lower-priced products whilst generating better profits through lower costs and better efficiency.

Whatever the future of Kodak now, it seems certain it will not be in the consumer imaging market where it once dominated so thoroughly, even though other brands (such as Canon and Nikon) did manage to make the leap from film to digital successfully. Why is that?

It would be easy to assume that it was management incompetence or poor decision making. Indeed, there are some examples of magnificent own goals.

In 1996, Kodak introduced Advantix, a hybrid film and digital product, developed at a cost of around $0.5bn, that allowed images to be previewed. This was later written off.

In 1988, Kodak commenced an expensive ($5.8bn) experiment, acquiring Sterling Drug in an attempt to diversify beyond the chemicals in its products to the chemicals in pharmaceuticals. This marriage proved mismatched, leading to the disposal of Sterling Drug in pieces at a large write-down.

Perhaps these several billion dollars would have provided enough of a reserve to keep Kodak in the game a little longer.

Of course, the prolonged death of the business was down to many thousands of decisions. But at the heart appears to be a lack of enthusiasm for the digital products, even when the company was successful with them, and a business which was simply geared up to do a different thing and had too much invested (financially and emotionally) in seeing film prosper and digital fail.

George Fisher (CEO 1993-2000) described, on leaving the role, the company he had found:

“It was mired in debt. It had haphazardly diversified into pharmaceuticals and other areas it knew little about. Its growth, save the revenues it added on with ill-starred acquisitions, was flat. It was a high-cost manufacturer, with a bloated staff and a sleepy culture that was slow to make decisions. And it regarded digital photography as the enemy, an evil juggernaut that would kill the chemical-based film and paper business that had fuelled Kodak’s sales and profits for decades.”

At this remove, Kodak’s demise seems inevitable. How would you avoid it in your own business?

  • Avoid entrenched conservatism
  • Allow entrepreneurial voices to speak out
  • Read the tea leaves a little better
  • and so on

All nice sentiments, but how many are really practical? The systems put in place in all large corporations are there to keep the super-tanker on course, to move people to adherence with a common belief set. Every incentive in business is designed for more sales next year and greater cost efficiency. In fact, Kodak, in continuing to research at all, despite its apparently unassailable position, almost seems like proof that macro-economics, not company structures, are what can continue to deliver innovation, especially after the departure of the visionary founder.

Anyone for the iPhone 6?

Patently

For me, the patent wars in which so many major brands are currently embroiled are fascinating because of the underlying biases they expose. Read any story about XYZ Corp winning a legal battle and scroll down to the comments and you’ll find acres of diatribe about just how immoral it is for XYZ Corp to take such a matter to court, as if they were suing the council for an uneven pavement.

Now, I’m not the biggest fan of Apple in the world. And it’s certainly easy to see their business practices as aggressive at times. But how can anyone keep a straight face and defend Samsung as having not copied the iPhone, either in principle or by debating the intention to copy?

Just after the Apple iPhone 3, Samsung released the Galaxy Ace Plus:

And then, when the iPhone 4 came out, they released the Galaxy S2:

Incidentally, here are the USB plug adapters in the US:

Perhaps it is a cultural thing. Perhaps all property is theft. Perhaps it was a homage. These may be valid points, but to claim that there was not a causal effect between one and the other is to treat your audience as plain daft.