Not to be outdone by the ARF Conference taking place across The Pond, London's researchers also discussed the hot topic of neuroscience today - but, as Teresa Lynch reports, they did so without the aid of electrodes or fMRI.
Yesterday in the Behavioural Economics session we were told that the 'System 1 thought process' was what stopped rational people buying logically. Today in the Neuroscience session we had three separate papers on how to tell what this part of the brain was up to, without a headset or scanner in sight.
What we were offered in place of techie equipment was the skill and expertise of the contributors. First up were Kate Hamilton, MD of Gravity Planning & Research, and Susan Martle, Owner of Smartle (prize for the best company name?). They explained the usefulness of NLP meta-programmes in working out what drives individual consumers. Apparently some of us are 'towards people' and some are 'away from people'; the difference being that, for example, on a diet the former want to 'become slim' whilst the latter want to 'avoid being fat'. They conducted some research on the audience and showed that some of us work 'through time' and some 'in time', which has a major impact on how well-organized we are. This seemed pretty spot-on to your correspondent, since their experiment had me bang to rights.
The next speaker also tested his audience: Phil Barden, MD of Decode Marketing, showed us some dots of light which we correctly identified as the outline of someone walking. In showing that we could do this he proved that the entire audience were in fact human beings - possibly a first for this conference. He also explained that the brain is a pretty lazy organ, with most estimates putting the proportion of time it runs on autopilot at between 85 and 90 percent.
Given this simple fact it's not surprising that researchers can pick up on telltale twitches and conduct what's known as facial coding. John Habershon, Director of Momentum Research, presented a paper on how we give ourselves away unconsciously all the time with involuntary micro expressions. It seems we can't tell our faces quickly enough not to show pleasure, confusion or any other emotion, and that this is a rich source of qualitative evidence to back up what respondents say they feel about stimuli such as new products, packaging and advertising material. John said the only equipment researchers need to make these assessments is their own eyes and their own brains.
Two interesting points were raised in the Q&A: the first was that old conference reliable, 'Isn't this what the good researchers have always done?'; the second was a heated defence of words still 'having some meaning', which was agreed on by parties in the audience and on the podium. In many ways this was a similar conclusion to that of the ARF, which over in the US has been recommending that neuroscience be used alongside more traditional research for the foreseeable future.
All articles 2006-20 written and edited by Mel Crowther and/or Nick Thomas unless otherwise stated.