Gawain Morrison, CEO at Belfast-based startup Sensum, has launched music project Mu_ that uses body movements, emotions and heart rate to compose songs.
An interactive music project, Mu_, that produces a unique ‘biosong’ recently had its official launch at the CultureTech digital festival in Derry. Gawain Morrison, CEO at Sensum, which specialises in measuring customer response to media, says the project puts the user in the role of both conductor and orchestra.
Why was Sensum founded?
Coming from a creative media background, we wanted to show what can be achieved creatively with this kind of human-to-machine interaction. The thought that your body can create its own unique piece of music and artwork really appealed to us, and what’s more emotive than a piece of music?
We ran a project called Biosuite in 2011, where we created an emotional response horror film in which the audience’s physiological response generated a unique cinematic experience each time it was viewed.
We had so much interest in the project, along with feedback from industry and media, that we decided to prototype a mobile solution that could wirelessly capture an individual’s physiology and display the data for analysis.
How can humans benefit from measuring an audience’s emotional and cognitive responses to media?
Humans are emotional beings. We like to think we are rational, but we’re not really. It is estimated that at least 90% of our decisions are based on our emotions. With Sensum we measure conscious and non-conscious responses to media, and we visualise the data in ways that allow experts to gain deeper insights into what’s happening with the audience when they watch their media.
If you consider that in the advertising industry alone it can cost between £60-250k for a single broadcast of a 30-second advertisement, imagine the savings and efficiencies if you knew exactly how long your most effective advertisement should be.
Your sensors come from Shimmer Research in Dublin, a provider of wearable sensor technology. How do they work?
We use galvanic skin response [GSR], heart rate and accelerometers to measure audience engagement and create interactive entertainment. The GSR sensor measures micro-level changes in perspiration from the fingers and the heart rate sensor measures the variation between heartbeats, while the accelerometer measures movement in the X, Y and Z axes.
If it’s for interactive entertainment, we set variables and triggers for different levels, which then change an event. The potential of the platform is to allow an array of different sensor types to feed into it, allowing users to see how they respond.
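The level-and-trigger idea described above can be sketched in a few lines. This is a hypothetical illustration, not Sensum’s actual code: the threshold values, band names and triggered events are all invented for the example.

```python
def classify_level(value, thresholds):
    """Return the index of the band a sensor reading falls into,
    given an ascending list of threshold values."""
    level = 0
    for t in thresholds:
        if value >= t:
            level += 1
    return level

# Illustrative GSR thresholds (microsiemens) splitting readings into
# three bands: calm, engaged, excited. Real values would be calibrated.
GSR_THRESHOLDS = [2.0, 5.0]

def trigger_event(gsr_value):
    """Map a GSR reading to a (hypothetical) audio event."""
    events = ["ambient_pad", "add_percussion", "full_mix"]
    return events[classify_level(gsr_value, GSR_THRESHOLDS)]
```

Crossing a threshold changes the band index, which in turn changes the triggered event, so a rising or falling signal steps the experience between states.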
How do you collect and analyse users’ data and why do you think it’s accurate?
We wirelessly capture the signals from the sensors to our app, and the data is then displayed via the Sensum dashboard, either for real-time or retrospective reporting. Shimmer provide these medical-grade sensors outputting raw data that we collect and display using our algorithm and visualisations.
It means that we can concentrate on making the data easy to understand and allow media producers to gain deeper insights into how emotionally engaging their content is.
You recently launched your latest interactive music event at the CultureTech festival in Derry. How do the sensors use body movements, emotions and heart rate to compose a ‘biosong’?
Everyone’s emotions drive physiological responses. The heart rate controls the tempo of the track, the GSR brings different instruments and sounds in and out depending on its levels, while each of the axes of the accelerometer controls a different effect, like a wah, tremolo or phaser, affecting different sounds depending on how you move your arm. Feel like a boogie? Then you’ll create something very different from someone who stands still and breathes deeply.
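One plausible way to express the mappings just described, purely as an illustration: heart rate sets tempo, a normalised GSR level decides how many layered loops play, and each accelerometer axis drives one effect’s depth. The layer names, normalisation and effect assignments here are assumptions, not Sensum’s published design.

```python
def biosong_parameters(heart_rate_bpm, gsr_level, accel):
    """Map physiological readings to musical parameters.

    heart_rate_bpm: beats per minute
    gsr_level: normalised skin-response level in [0.0, 1.0]
    accel: (x, y, z) tuple of accelerometer readings in [-1.0, 1.0]
    """
    # Heart rate sets the tempo directly: one beat per heartbeat.
    tempo = heart_rate_bpm

    # Higher GSR brings more layered instrument loops into the mix.
    layers = ["drums", "bass", "pads", "lead"]  # hypothetical loop names
    active = layers[: max(1, round(gsr_level * len(layers)))]

    # Each accelerometer axis controls one effect's depth (0.0-1.0).
    x, y, z = accel
    effects = {
        "wah": min(abs(x), 1.0),
        "tremolo": min(abs(y), 1.0),
        "phaser": min(abs(z), 1.0),
    }
    return {"tempo": tempo, "layers": active, "effects": effects}
```

So a still, calm listener yields a slow, sparse, dry mix, while someone dancing with a racing pulse produces a fast track with every layer and effect engaged.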
What songs have you come up with so far?
We’ve a library of audio loops. Different people will trigger and affect these sounds in different ways, so the songs will be unique, as will the artwork that is generated. We also want to allow a more organic approach to song creation where instrumentation, sequencing and effects are all generated over longer periods of time and movement. But once you start down this path your mind goes in a million different ways so we’ll need to see what people think of this version and see where that takes us.
What other plans do you have to further develop Sensum?
This last year we’ve been raising finance for the company, developing new product ideas, trialling the platform in our key target markets of broadcast, marketing and advertising, working with the BBC on the NW200, and speaking at events and shows. Given the interest in the product and where its development is going, you’re going to see a lot happen over the next 12 months, so watch this space.
What are the challenges when it comes to carrying out the next stages?
The main challenge for us has been being ahead of the curve until now; when we created Biosuite in 2011 no one had heard of Google Glass or smartwatches, and the thought of using physiology to do anything was seen as futuristic and slightly nuts. Samsung launched their smartwatch earlier this week, rumours are rife about the Apple smartwatch and Google is sinking a lot of money into the glasses, so we weren’t that nuts. All industries want to understand this kind of data and we’re providing the visual interface to do what they want… learn or create, it’s up to them, and it’s going to be fun seeing where it all goes.