
Our projects

(some of our bigger ones)

Below we showcase some of our bigger projects.


CONSUMER BEHAVIOUR OF MOBILE AND TV

ONGOING

 

How do people consume television content and advertising in the age of the smartphone?

Understanding how attention is distributed across screens is crucial to digital platforms. Facebook, in collaboration with i2 media research and Eye Square, conducted an extensive ethnographic study collecting ecologically valid data from six countries worldwide, looking at people's eye behaviour while watching TV with their smartphone available.

The project investigated simultaneous cross-device ad consumption using a distinctive combination of design, focus and methodology grounded in behavioural neuroscience: real-time, continuous measurement of where people were looking (via a mobile eye-tracking device) while they watched TV in their own living rooms, with or without other people present.

Results from the UK sample showed how fragmented attention to media can be. Only about two thirds of the ads shown on TV received any attention at all, with most distractions coming from smartphones.

The study showed that a proper understanding of how user attention fragments during TV consumption would allow digital platforms to act as a genuine supplement to TV plans. The research was presented by Prof Jonathan Freeman (i2 media) and Stephen J. Gray (Facebook) at Facebook's Cross Channel Measurement event in February 2018; watch the full results here: https://www.facebook.com/FacebookMarketingUK/videos/1534023306697012/?t=1533


IMMERSIVE UX

ONGOING

(November 2017 - May 2018)

Evaluating Immersive User Experience and Audience Impact

Conducted in collaboration with Nesta and funded by Digital Catapult, the project aims to investigate innovative research and user testing techniques suitable for assessing immersive technology content experiences (e.g. VR and AR). The ultimate goal is to define a repeatable methodology for measuring the impact of immersive content on consumer audiences, and to understand what this may mean for businesses in the future.


EYE TRACKING & DRIVING BEHAVIOUR

COMPLETED

(December 2016 - September 2017)

Investigating road safety with eye tracking technology 

In an effort to highlight the importance of developing safe driving practices and produce valuable Road Safety Education resources, East Riding of Yorkshire Council (ERYC) and the Safer Roads Humber Partnership commissioned i2 media research to conduct an academic study to explore how drivers of varying experience and training differ in how they view the roads as they drive, and how distractions affect their behaviour.

The study, conducted using portable and fixed eye trackers, tested the impact of driving experience on the gaze behaviour of over 40 people in different driving conditions. The results, in the form of video output, are used by ERYC to support behaviour change in educational and outreach activities.



UNDERSTANDING THE EFFECT OF SHOPPER MINDSET ON OUTDOOR ADVERTISEMENT

COMPLETED (February 2014 - October 2015)

The role of time pressure and focus on consumer behaviour  

In 2015 i2 media worked with Exterion Media on a project exploring if and how mindset influences the way people on the high street attend to and remember bus advertising. In earlier research, i2 media had identified two factors that drive retail engagement and propensity to respond to advertising: time pressure and focus.

i2's time pressure and focus model of consumer behaviour

Initially we conducted an abstract test in i2's lab of the impact of mindset (high/low time pressure and focus) on later recognition of incidental background stimuli (representing advertising) that were unrelated to the main task. The results showed that participants whose focus was broad and who were not under time pressure were significantly more likely to notice the background stimuli ('advertising').

Moving away from the lab, but retaining experimental control, we designed a new study using fixed-viewpoint video footage shot on a high street in Glasgow. At various points in the video stimuli, incidentally to participants' main task, buses passed by displaying advertising. Using eye-tracking technology, we identified how people in different mindsets looked at and remembered the bus advertising shown during the task. Compared to those in focused, time-pressured mindsets, participants in open mindsets looked at bus site advertising for longer and were later able to remember more accurately the bus site advertising to which they were exposed.

In a subsequent study, i2 also looked at the role of design salience by modifying the formal properties of the ads shown to participants (colour, font, etc.). As expected, bus adverts designed to be particularly salient were memorable to a greater proportion of people than bus adverts with low design salience. Memorability (recognition accuracy) was lower when participants were in focused, time-pressured mindsets than in open mindsets.

Interestingly, however, adverts with high design salience were found to compensate for this deficit: when design salience was high, having a focused, time-pressured mindset was less deleterious to remembering the advert.

i2 media's in-depth research enabled robust, evidence-based recommendations to optimise the effectiveness of Exterion Media's bus campaigns.


MINDSCAPE

COMPLETED

(September 2015 - August 2016)

The fusion of psychology and wireless sensor networks to enhance users' experience using their personal data

MINDScape is a 12-month project by i2 media research ltd. and HW Communications, partly funded by Innovate UK, the UK's innovation agency sponsored by the Department for Business, Innovation & Skills. MINDScape is developing a system in which offers and communications to shoppers are personalised based on shopper behaviour as sensed in store and online.


MINDSEE

COMPLETED

(October 2013 - September 2016)

Symbiotic Mind Computer Interaction for Information Seeking

MindSee is an exciting research project which looks into advancing symbiotic interaction in the area of information seeking. The MindSee project is a cutting-edge collaboration involving experts from leading European universities and research companies, part-funded by the European Commission's 7th Framework Programme.

The project is developing a novel symbiotic information retrieval system aimed at lowering the workload of researchers and making the information retrieval process more effective. MindSee capitalises on recent advances in BCI (Brain-Computer Interaction), fusing EEG and peripheral physiology signals (EDR, facial EMG, eye gaze and pupillometry) to unobtrusively detect implicit user responses of which users are not aware. Combining this approach with machine learning, the MindSee system aims to better predict user intentions and exploration needs, providing a major advance in human-machine symbiosis.

 


CEEDs

COMPLETED

(September 2010 - February 2015)

The Collective Experience of Empathic Data Systems

What is CEEDs?

CEEDs was a 48-month Integrated Project, part-funded by the EC's 7th Framework Programme. The project combined basic science research, technology innovation and high-impact user research methods to develop a virtual-reality-based system to improve humans' abilities to process information, and to experience and understand large, complex data sets.

OK… But what are the problems CEEDs is addressing?

There's a long answer and a short answer. Let's try the short one… In a wide range of specialist areas – such as astronomy, neuroscience, archaeology, history and economics – experts need to make sense of and find meaning in very large and complex data sets. Finding meaningful patterns in these large data sets is challenging. By comparison, looking for a needle in a haystack could seem pretty simple! Foraging for meaning in large data sets is a bottleneck that is becoming more challenging as scientific research creates and works with bigger and bigger data sets (the data deluge). And it's not just scientists who are affected. In everyday life, we are confronted by increasingly complex environments requiring difficult decisions and rapid responses; think of trying to get the shopping done at the supermarket in a rush. CEEDs provides new tools for 'human-computer interaction' that assist our everyday decision making and information foraging.

OK, I can see the problem… What about the solution?

CEEDs is proposing a radical solution, based on integrating work in many scientific and technological areas. The solution has two parts. First, a new synthetic reality (SR) system allows people to consciously experience properties of large data sets, dramatically extending current work in virtual reality, which tends to enable experiences of simple environments such as offices, houses, or landscapes. Second, CEEDs exploits the power and potential of the unconscious mind. It turns out that only a small subset of sensory input reaches conscious awareness, yet the remainder is still processed by the brain, and this subconscious processing is very good at detecting novel patterns and salient (meaningful) signals. CEEDs monitors signals of discovery or surprise in these subconscious processes when users are experiencing innovative, artistic visualisations of large data sets, and where it identifies such signals, CEEDs uses them to direct users to areas of potential interest in the visualisations.

How will CEEDs identify signals of surprise or discovery?

CEEDs uses a wide range of unobtrusive, multi-modal wearable technologies to measure people's reactions to visualisations of large data sets in specially built virtual, or synthetic, reality environments. CEEDs measures a range of variables, including users' heart rate, skin conductance, eye gaze and observable behaviours (such as where people point or reach to, or navigate towards). By monitoring these measures, CEEDs identifies users' implicit (subconscious) responses to different features of visualisations of massive datasets. The implicit responses are then used to guide users' discovery of patterns and meaning within the datasets.

WAVE

Introducing… WAVE: immersive well-being solutions connecting people with nature

Content coming soon…
