Is it just me?

Amid the phenomenal surprise of the new… 3G iPhone, Jobs also slipped some other news into the Worldwide Developers Conference keynote. It seems Apple is re-releasing an old favourite from Microsoft:

Yes, it’s the sick older sister of Windows 98: the ill-fated ‘Millennium Edition’ of Windows, which barely made it into the noughties.

This new platform, MobileMe, is a ‘breakthrough web 2.0 app interface’ allowing users to access their calendar and mail over the internet:

[WWDC keynote slide]

And here, once again, Apple shows its tremendous audacity: re-inventing Outlook Web Access some five years after Microsoft built it, and declaring themselves ground-breaking and market-leading.

You’d be forgiven for thinking they were poking fun at their own addicted fanboys with the ridiculous ‘me’ reference.

The structure of advertising revolutions

Thomas Kuhn wrote an incredible book called ‘The Structure of Scientific Revolutions’. It’s probably the only book I studied at university that I still think about now.

In the book, Kuhn investigates what really happens in science; how the step changes in understanding really get incorporated into the overall body of knowledge.

The view you might get if you speak to a scientist about all this, or believe what you were told at school, is that the process is essentially rational and evolutionary. As more observations come to light which challenge the conventional thinking, scientists learn to re-examine their theories and come up with better ones.

Kuhn found the truth to be quite different. He drew a distinction between ‘normal science’ (in which observations that can’t be fitted into prevailing theories tend – counter-intuitively – to be abandoned) and ‘revolutionary science’, where a newcomer shakes the entire foundation of science, leading to a difficult period of change which the entire scientific body eventually takes on board.

Thus we see that scientists will in fact keep adding exceptions and caveats to their experiments and results in order to maintain the status quo of overall thinking. Probably the best example of all this was the complex array of assumptions and exceptions (and the invention of a considerable number of epicycles) needed to explain why, if the earth was indeed at the centre of the solar system, the planets’ orbits didn’t seem to fit together. In this phase, it is the scientist’s job to house the new information in the old framework, rather than question anything (the ‘things don’t work like that around here’ school of science).

At this point, a maverick will come along and propose a counter theory which doesn’t need all of these exceptions and caveats to work, and – grudgingly and slowly – the new paradigm will be adopted. This is ‘revolutionary science’. Of course, when I say maverick, I’m referring to Galileo, Einstein and so on.

It’s Kuhn’s book which popularised the idea that science might not be all it was cracked up to be and elevated the phrase and concept of ‘paradigm shift’ to everyday English.

And thus – and I’m sure you’re way ahead of me here – we get to our world: marketing, advertising and communication.

I’m really interested at the moment in trying to pick apart and understand the different social changes that have affected our generation and previous generations. There is a truism that it’s difficult to really remove yourself from the time you’re in – can we really imagine what it was like to live through a time when presidents made promises about going to the moon, or when we were engaged in a cold war with the Soviets? They’re not small things, and they define their time, but can we understand them now?

The same is true of media and communication. When I was growing up, the thought of TV disappearing was incomprehensible. It would have been the end of my little A-team-obsessed world. For us, TV was part of the evolution of man and we pitied those who went before, not really understanding how they got through the long, cold evenings.

And I think we are long overdue a revolution on the scale of heliocentricity or relativity in how we feel about that box in the corner of the room and the corporations who pump content into it.

Here (and here if you prefer video), Clay Shirky does a great job of exposing the TV myth for what it really is. In the time before TV, he argues, people had a ‘cognitive surplus’ which TV, with its soap operas and sitcoms filled in nicely. There’s a great Kurt Vonnegut quote on this:

‘TV is … providing artificial friends and relatives to lonely people. What it is… is recurrent families. The same friends and relatives come back week after week after week and they’re wittier, and they’re better looking and they’re richer and they’re more interesting than your real friends and relatives’

Shirky tells the story of being interviewed by a TV producer about Wikipedia – in particular about the evolution of Pluto’s page when it was recently declared no longer to be a planet. She asked, ‘Where do people find the time?’. His response: ‘No one in TV gets to ask that question’.

Now, he argues, people are taking that cognitive surplus back.

Do advertising and communications agencies really understand what’s happening here? All I hear is TV thinking applied to the web: let’s put banners here; let’s pay these people to talk to those people; let’s try to ‘do some community’. Or those hideous ideas about getting and managing crowd-sourcing, or ‘operating communities of interest’.

As with Kuhn, one paradigm cannot necessarily be understood from inside the other. There’s going to have to be a lot of fresh thinking going on if companies are going to keep paying these agencies all of these dollars.

But of course, Shirky isn’t really worried about what the big dumb agencies (as George Parker calls them) don’t know. He’s more concerned with the potential upside.

Given the size of the ‘surplus’ currently being spent on TV, imagine what people could do with all that time. More LolCats? Five more Wikipedia projects? 1,000 more? A million?
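Shirky’s own back-of-envelope figures give a sense of the scale here (the numbers below are approximate, as I recall them from his talk, so treat them as rough orders of magnitude):

```python
# Rough scale of the 'cognitive surplus', using the approximate figures
# from Shirky's talk: all of Wikipedia represents ~100 million hours of
# human thought, while the US alone watches ~200 billion hours of TV a year.
WIKIPEDIA_HOURS = 100e6
US_TV_HOURS_PER_YEAR = 200e9

wikipedias_per_year = US_TV_HOURS_PER_YEAR / WIKIPEDIA_HOURS
print(f"US TV-watching alone ≈ {wikipedias_per_year:,.0f} Wikipedia projects per year")
```

On those assumptions, American TV time alone is worth something like 2,000 Wikipedias every single year – which rather puts the ‘a million?’ question into perspective.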

 

Third time lucky

3.0

Amelia’s amusing analysis of Web 1.0 and Web 2.0 came, coincidentally, on the same day that I was at a conference thingy and had been having exactly that discussion: what was 2.0, and how much of it was pure marketing sentiment? I couldn’t disagree more with the ‘pure marketing’ view. I think 2.0 is a radical shift in society. It is, as Amelia says, the shift from an internet of geeks and possibility to an internet of the mass market and reality.

What, if anything, 3.0 means is another matter. Clearly there is a quasi-technical meaning being discussed (as on the Wikipedia page), but surely we should be concerning ourselves instead with the social impact of these changes.

  • Expectations about data integration will go through the roof. Just as information became ubiquitous in 2.0, the joining and manipulation of data will become ubiquitous now. Brands will have to respond to this. Expect some powerful moves in traditionally data-orientated services, particularly financial services.
  • The ladder of involvement will continue, with a new rung added above ‘blogger’ or ‘publisher’ for ‘providers of utility’.
  • Concepts of enterprises and the borders of corporations will continue to be challenged

Amazon and Google (and to a certain extent, Microsoft) have clearly started their engines to take advantage of this next generation, with app development, elastic computing, utility computing and so on the subject of much debate this week.

There’s a powerful version of inverted marketing too (where consumers are rewarded for hand-raising) which feels like the inevitable consequence of abstracting and linking data.

How will it impact your brand?

In my day

[Photo: Roy Hattersley]

A good day, yesterday, for things fitting together and falling into place (to mix up the metaphors a little). The day started with Amelia’s amazing piece in the Spectator. However much you’re into new media, there’s no denying how cool it is to read people you know in august titles like that, especially when Amelia’s piece is longer and more prominent than – for example – Roy Hattersley’s.

The article itself was about the generational gap emerging in technology adoption, and it’s, well… us. Younger people (under 20) find technology to be simply a fact of life*, and the older groups (55+) are not cynical about it and have the time and money to explore and adopt.

Slightly ironically, the slow adopters are the group that Coupland named as “Generation X” (today’s 30-somethings) who have the healthy cynicism that comes from having seen the bubble burst once already but without the older generation’s (or indeed Mr C’s) resources to explore and experiment.

A very convincing argument (although one which I suspect is still slightly class-bound), and it reinforced many of the points I heard later in the day in a fantastic presentation about social media which Antony gave at Conchango (where I work).

There were hundreds of interesting ideas in that presentation but if I had to pick just three, they would be:

  1. What is happening in the way we communicate really is nothing less than a revolution. As Antony put it, the name may seem overblown because it’s been used too much and too randomly, but we must stand by it. As with other revolutions in the past, the full effects may take years to become apparent, but Web 1.0, Web 2.0… whatever, is as big as the printing press, as big as the Enlightenment. As Cluetrain would have it: “deal with it”
  2. When we give people the tools, whether they’re 5 or 55, they take to them. Why? Because we are hard-wired to communicate. It’s not clever graphics or gimmickry; it is the need for sociality, and it isn’t going anywhere.
  3. Advertising agencies act like they’re getting the message as they jump on every bandwagon through web-two-point-zero-ville, but they are wearing the clothes of the revolutionary without sharing the beliefs. Driven by fear and the desire to return to the well-trodden paths of old, clients and agencies are missing the huge opportunities they have to actually deliver the basics of marketing through network thinking.

There was a huge amount more of course, plus a look at how Spannerworks is helping clients get to grips with what can be achieved with a positive approach to the new realities.

Finally, I was able to listen to Forrester’s take on what web 2.0 means within enterprises. This is a huge topic in its own right, obviously, and one that’s moving very quickly, driven by a bizarre mix of tiny software companies like Six Apart and huge vendors like Microsoft and Oracle.

Two points from that resonated, both of which have been talked about in a number of places before but which were really crystallized today.

  1. Getting to grips with scale. No matter how big your company is, it’s absolutely tiny in the domain of the internet (Antony also made this point). Again, this is a “get used to it” sort of a moment for the large corporates.
  2. Back to demographics. Who’s likely to be making the decision about corporate take-up of “web 2.0” styles of knowledge management? The IT and operations directors, who are unlikely to consume social media and even less likely to contribute to it. And who are the next generation of recruits coming into our companies? A group who see these tools as part of day-to-day life! So expect some very quickly changing attitudes as the new recruits gain their voice.

* Antony recounted a story of a focus group where 11-14 year olds were asked what they would do without the internet. The questioner was met by a series of completely blank looks, as the group found the prospect unimaginable.

Thinking about the future

The late, great Kurt Vonnegut

Several discussions today have reminded me of the great quotes from Alan Kay and Samuel Goldwyn about making predictions.

Alan Kay (inventor of the term ‘object-oriented programming’):

The best way to predict the future is to invent it.

Samuel Goldwyn (who also coined “a verbal contract isn’t worth the paper it isn’t written on”):

Never make predictions, especially about the future.

And a few more I’ve found since in trying to remember those ones:

  • “640K ought to be enough for anybody.” Bill Gates in 1981 (potentially apocryphal)

  • “We don’t like their sound, and guitar music is on the way out.”  Decca Recording Co. rejecting the Beatles, 1962

  • “History is merely a list of surprises. It can only prepare us to be surprised yet again.” and “She was a fool, and so am I, and so is anyone who thinks he sees what God is doing.” Kurt Vonnegut

  • “I never predict anything and I never will.” Paul Gascoigne

No room for manoeuvre

Chinese Room - illustration 

I’m as big a fan of Ray Kurzweil as the next man, but this post by Northern Planner – fast turning into a favourite (and extremely prolific) blogger – reminded me of something I meant to do a post about ages ago.

Kurzweil and other futurologists often talk about how long it will be before computers become “conscious”. You can see how the thinking goes: computers used to be a bit shit, then they became good enough to do sums, then came the internet. Soon, computers will be able to perform more calculations than our own brains, and soon after that, they’ll have more computing power than the whole lot of us put together.

This is basically Moore’s law: the power of computers (or, more to the point, their power-to-cost ratio) will double every 12–18 months. And the rule continues to hold. Mathematically, the effect is that useful computing power grows like 2^n, where ‘n’ is the number of 12–18 month periods. It’s dizzying growth that will keep us in awe of the power of the machines. But it will never amount to consciousness – just as no amount of cheesecake will ever build an elephant – they’re different sorts of things.
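To see how quickly that 2^n compounding runs away, here’s a minimal sketch (the function name and the 18-month doubling period are just illustrative choices; the law itself is usually quoted anywhere from 12 to 24 months):

```python
# The compounding effect of 'power doubles every 12-18 months':
# after k doubling periods, capability has grown by a factor of 2**k.
def growth_factor(years, months_per_doubling=18):
    doublings = years * 12 / months_per_doubling
    return 2 ** doublings

# Fifteen years at 18-month doublings is ten doublings: a 1024x increase.
print(round(growth_factor(15)))
```

A thousand-fold increase in fifteen years is staggering, but it’s still just more of the same thing: faster symbol shuffling, not a change of kind.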

Considering it’s our finest feature, humans are oddly ill-disposed to feel protective of our consciousness, and people find it very hard not to think of it as some higher order of information processing.

But it’s not.

Luckily, there is an absolutely fabulous analogy to help us understand. It comes from John Searle’s Chinese Room argument.

Searle asks us to imagine a Chinese man wandering through the wilderness who comes across a large room. This completely sealed box has two slots on the outside, as well as a pad and a pen. A sign pointing at the top slot invites passers-by to submit questions, written in Cantonese, into the top slot. Our wandering man does this, asking the room a series of questions: directions, common facts, popular-culture questions. Each time, an appropriate or correct answer pops out of the bottom slot.

What do we conclude? That the room understands the questions? That there is someone inside who understands?

Now let me tell you what’s actually going on inside the room. As questions come in, a young YTS trainee from Hull (who only speaks English) takes them and checks them against a series of books. All the slips of paper contain strange, incomprehensible symbols (Cantonese characters). When he finds an exact match, it includes a long number; he then takes this number over to the other side of the room and looks it up in a different set of books. There he finds the number linked to a different set of symbols. He traces these onto a new piece of paper and pushes them back through the slot.

Will our YTS trainee ever learn Cantonese? How could he? All he ever gets is syntax; he would never get a foothold on even the first rung of semantics.

This is what modern computers do. And a better, faster processor is just a better, faster YTS trainee.
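In fact, the trainee’s entire job can be pictured as a bare lookup table. The question/answer pairings below are invented for illustration; the point is that nothing in the process involves any understanding:

```python
# A toy Chinese Room: the trainee's whole job as an exact-match lookup.
# The pairings are invented for illustration; nothing here 'understands'.
RULE_BOOK = {
    "你好嗎？": "我很好，謝謝。",   # "How are you?" -> "I'm fine, thanks."
    "現在幾點？": "三點鐘。",       # "What time is it?" -> "Three o'clock."
}

def chinese_room(question: str) -> str:
    # Pure syntax: match the incoming symbols, copy out the paired reply.
    # A bigger rule book or a faster lookup adds no semantics at all.
    return RULE_BOOK.get(question, "？")

print(chinese_room("你好嗎？"))
```

Scale the rule book up a millionfold and run it a millionfold faster, and it is still the same operation: symbols in, symbols out.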

(illustration stolen from here)

The Bill and Steve show

Gates and Jobs (read the body language!)

When it comes to technology innovation, it’s interesting to hear some people still talk about a “five-year plan”. After all, YouTube went from zero to $1.65bn in 18 months. Google only lost its beta tag seven years ago. Paradigms can change literally overnight.

In this fascinating interview, Jobs and Gates reveal that they haven’t got a clear view five years out, although both broadly expect hardware to continue to evolve in a fairly linear way. Gates is sticking to his software-only stance (noting the exceptions of the Xbox; the new and very exciting surface computing – although I’m not sure why that couldn’t be a pure software play for Microsoft; a new meeting-conference device called RoundTable; and, of course, the ill-fated Zune).

Jobs, meanwhile, is clearly still in hardware-plus-software mode, but he sees software as the driver, simply noting that he will continue to make the “nice boxes”. He sees the iPod’s dominance, for example, as a result of great software. And indeed, the majority of the criticism of the Zune has been software-related, and that certainly would seem to be its biggest barrier to adoption.

Both believe that users will continue to have multiple devices. Basically this means a laptop (or tablet), a mobile phone (or “post-PC” device, as Jobs calls it, presumably to make the iPhone seem even more significant) and home-entertainment equipment, which will include what’s been done with Media Center but will also, surely, extend to ubiquitous computing devices like the Surface product mentioned above.

As well as 3D visual interfaces, which have not yet lived up to their promise, Gates identified changes in input method as a big driver. In fact he talked about several different input methods: the multi-touch approach that Jeff Han has been on about for years, which appears in Surface and on the iPhone; what I would call passive video input – again on Surface, this is cameras which map how devices relate to each other (well worth watching the demo for that), and possibly also whatever this conference tool uses to identify who’s speaking or presenting; and finally a general nod in the direction of natural-language input.

Jobs was very tight-lipped about innovation, although he dropped a few hints about improving .Mac, most likely in some sort of 2.0, SNS kind of direction. Hugh MacLeod also spots a veiled comment from Gates about re-entering the internet space with renewed vigour. I’d guess he’s referring to the Live Services platform, but who knows – perhaps there’s something else about to be launched. I wonder if Hugh knows more than he’s letting on.

Both men seemed surprisingly oblivious to the threat posed by SaaS to their desktop operating systems, with both citing a mix of local applications and cloud services in support of their positions.

Interestingly, Jobs also argued that a turning point for Apple’s corporate strategy was when they realised that their success was not contingent on Microsoft’s failure, although his attempt to characterise the “Mac vs PC” advertising as not attacking Microsoft was rather unsuccessful.