

Our projects.

(some of our bigger ones)

Below we showcase some of our bigger projects.

MINDSCAPE

ONGOING

(September 2015 - August 2016)

The fusion of psychology and wireless sensor networks for enhancing users’ experience using their personal data

Mindscape is a 12-month project by i2 media research ltd. and HW Communications, partly funded by Innovate UK, the UK's innovation agency sponsored by the Department for Business, Innovation & Skills. Mindscape is developing a system in which offers and communications to shoppers are personalised based on shopper behaviour sensed in store and online.
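As a purely illustrative sketch of what behaviour-based personalisation can look like (a hypothetical example, not the actual Mindscape system; the event types, categories and offer rules are invented for illustration), sensed in-store and online events could be summarised per shopper and matched against simple offer rules:

# Hypothetical sketch of behaviour-based offer personalisation.
# Event types, categories and rules are illustrative assumptions,
# not the actual Mindscape system.
from collections import Counter

def personalised_offers(events):
    """events: list of (channel, category) tuples, e.g. ('in_store', 'shoes')."""
    by_category = Counter(category for _, category in events)
    offers = []
    for category, count in by_category.items():
        if count >= 3:
            # Repeated interest across channels triggers a discount offer.
            offers.append(f"10% off {category} (frequent interest)")
        elif any(channel == "online" for channel, c in events if c == category):
            # Otherwise, remind the shopper of items browsed online.
            offers.append(f"Reminder: items you browsed online in {category}")
    return offers

shopper_events = [
    ("in_store", "shoes"), ("in_store", "shoes"), ("online", "shoes"),
    ("online", "coats"), ("in_store", "coats"),
]
print(personalised_offers(shopper_events))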

MINDSEE

ONGOING

(October 2013 - September 2016)

Symbiotic Mind Computer Interaction for Information Seeking

MindSee is an exciting research project that looks into advancing symbiotic interaction in the area of information seeking. The MindSee project is a cutting-edge collaboration involving experts from leading European universities and research companies.

MindSee will last 36 months and is part-funded by the European Commission's 7th Framework Programme. The project will develop a novel symbiotic information retrieval system aimed at lowering researchers' workload and making the information retrieval process more effective. MindSee will capitalise on recent advances in BCI (Brain-Computer Interfaces), fusing EEG and peripheral physiology signals (EDR, facial EMG, eye gaze and pupillometry) to unobtrusively detect implicit responses of which users are not aware. By combining this approach with machine learning, the MindSee system will aim to better predict user intentions and exploration needs, providing a major advance in human-machine symbiosis.
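As a rough illustration of the kind of fusion described above (a minimal hypothetical sketch, not MindSee's actual pipeline; the features, window sizes and "relevance" labels are invented for the example), per-window features from EEG and peripheral channels can be concatenated and fed to an off-the-shelf classifier:

# Hypothetical sketch of multimodal fusion for implicit relevance prediction.
# Feature choices, window length and labels are illustrative assumptions only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def band_power_features(eeg_window):
    """Crude per-channel band-power proxy: variance of each EEG channel."""
    return eeg_window.var(axis=1)

def peripheral_features(edr, emg, pupil):
    """Simple summary statistics for peripheral physiology signals."""
    return np.array([edr.mean(), edr.std(), emg.mean(), pupil.mean()])

# Simulated data: 200 one-second windows, 8 EEG channels at 256 Hz.
n_windows, n_channels, sr = 200, 8, 256
eeg = rng.standard_normal((n_windows, n_channels, sr))
edr = rng.standard_normal((n_windows, sr))
emg = rng.standard_normal((n_windows, sr))
pupil = rng.standard_normal((n_windows, sr))
relevant = rng.integers(0, 2, n_windows)  # illustrative implicit "relevance" label

# Fuse: concatenate EEG and peripheral features per window.
X = np.array([
    np.concatenate([band_power_features(eeg[i]),
                    peripheral_features(edr[i], emg[i], pupil[i])])
    for i in range(n_windows)
])

clf = LogisticRegression(max_iter=1000)
print("CV accuracy:", cross_val_score(clf, X, relevant, cv=5).mean())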

MindSee partners are: University of Helsinki, Aalto University, University of Padova, Technische Universität Berlin and i2 media research ltd (based at Goldsmiths, University of London). The team of partners brings together expertise in: machine learning and probabilistic modeling (Aalto); basic and strategic research on information technology (UH); human-technology interaction and applied cognitive science (UNIPD); research on Brain-Computer Interfaces (TUB); and research on user experience and consumer behaviour (from identifying user needs to designing market entry strategies) (i2 media research).

MindSee Coordinator Prof Giulio Jacucci said: “It is a real pleasure to lead such a well-qualified team in this ground-breaking Future and Emerging Technology research project. We’ve hit the ground running, and are making excellent progress towards our objective of better, easier online information retrieval experiences based on symbiotic human-machine interaction”.

i2 media research’s Managing Director Prof. Jonathan Freeman said: “It is really exciting to be working with some of our long-term research partners and new partners in MindSee. i2’s initial activities have focused on framing our user research around our time-pressure by focus model of consumer and searcher behaviour. The model applies to diverse contexts (from retail to online information retrieval), and we are delighted to see the value it brings to the design of new services. We are confident that its application in MindSee to the domain of online information search will help the project to develop a next generation of search experiences for researchers, and for the public at large when it is applied to online search more generally in future”.

CEEDs

Complete

(September 2010 - February 2015)

The Collective Experience of Empathic Data Systems

What is CEEDs?

CEEDs was a 48-month Integrated Project, part-funded by the EC’s 7th Framework Programme. The project combined basic science research, technology innovation and high-impact user research methods to develop a virtual reality-based system to improve humans’ abilities to process information, and to experience and understand large, complex data sets.

OK… But what are the problems CEEDs is addressing?

There’s a long answer and a short answer. Let’s try the short one… In a wide range of specialist areas – such as astronomy, neuroscience, archaeology, history and economics – experts need to make sense of and find meaning in very large and complex data sets. Finding meaningful patterns in these large data sets is challenging. By comparison, looking for a needle in a haystack could seem pretty simple! Foraging for meaning in large data sets is a bottleneck that is becoming more challenging as scientific research creates and works with bigger and bigger data sets (the data deluge). And it’s not just scientists who are affected. In everyday life, we are confronted by increasingly complex environments requiring difficult decisions and rapid responses; think of trying to get the shopping done at the supermarket in a rush. CEEDs provides new tools for ‘human-computer interaction’ that assist our everyday decision making and information foraging.

OK, I can see the problem… What about the solution?

CEEDs is proposing a radical solution, based on integrating work in many scientific and technological areas. The solution has two parts. First, a new synthetic reality (SR) system allows people to consciously experience properties of large data sets, dramatically extending current work in virtual reality, which tends to enable experiences of simple environments such as offices, houses, or landscapes. Second, CEEDs exploits the power and potential of the unconscious mind. It turns out that only a small subset of sensory input reaches conscious awareness, yet the remainder is still processed by the brain. And this subconscious processing is very good at detecting novel patterns and salient (meaningful) signals. CEEDs monitors signals of discovery or surprise in these subconscious processes while users are experiencing innovative, artistic visualisations of large data sets. Where it identifies such signals, CEEDs uses them to direct users to areas of potential interest in the visualisations.

How will CEEDs identify signals of surprise or discovery?

CEEDs uses a wide range of unobtrusive, multi-modal wearable technologies to measure people’s reactions to visualisations of large data sets in specially built virtual, or synthetic, reality environments. CEEDs measures a range of variables, including users’ heart rate, skin conductance, eye gaze, and observable behaviours (such as where people point, reach, or navigate towards). By monitoring these measures, CEEDs identifies users’ implicit (subconscious) responses to different features of visualisations of massive datasets. The implicit responses are then used to guide users’ discovery of patterns and meaning within the datasets.
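To make the idea concrete, here is a minimal hypothetical sketch (not the actual CEEDs system; the region names, measures and weights are invented for illustration) of how per-region implicit responses might be combined into an interest score used to highlight areas of a visualisation:

# Hypothetical sketch: per-region gaze dwell and physiological arousal are
# combined into an "interest" score, and the top-scoring regions are flagged
# for highlighting. Names, weights and thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class RegionSignals:
    name: str
    gaze_dwell_s: float            # seconds of gaze dwell on the region
    skin_conductance_peak: float   # peak phasic skin-conductance amplitude (uS)
    heart_rate_change: float       # change in heart rate while viewing (bpm)

def interest_score(r: RegionSignals) -> float:
    """Weighted combination of implicit measures (weights are made up)."""
    return (0.5 * r.gaze_dwell_s
            + 2.0 * r.skin_conductance_peak
            + 0.1 * abs(r.heart_rate_change))

regions = [
    RegionSignals("cluster_A", gaze_dwell_s=4.2, skin_conductance_peak=0.8, heart_rate_change=3.0),
    RegionSignals("cluster_B", gaze_dwell_s=1.1, skin_conductance_peak=0.1, heart_rate_change=-0.5),
    RegionSignals("cluster_C", gaze_dwell_s=2.7, skin_conductance_peak=1.4, heart_rate_change=5.2),
]

# Highlight the regions with the strongest implicit response.
for r in sorted(regions, key=interest_score, reverse=True)[:2]:
    print(f"highlight {r.name} (score {interest_score(r):.2f})")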