
M+M: Movement + Meaning Middleware


Turning dance into data

School of Interactive Arts and Technology, Simon Fraser University


Movement is an essential part of human communication; we prefer face-to-face interaction because we can see another person’s reaction, observe their gestures, and read their body language. Computers, which have come far in interpreting written language and audible speech, still can’t notice a scowl, interpret a hand wave, or see the poetry in dance. Until now.

Movement and Meaning middleware (M+M) is a Research Software Platform that gives computers the ability to understand human movement by quantifying it – turning fluid analog movement into a new form of digital data. The software also allows people to use this data to manipulate robots, control game characters and avatars, teach manual skills, critique student dancers, and so on.



Interpreting movement as human

How does it work? M+M draws on a wide range of commonly available technologies – mobile phone accelerometers, motion-capture cameras, motion-sensing game controllers (such as the Microsoft Kinect and Nintendo Wii Remote), and wearable devices (such as Fitbit) – to measure the positions of a person’s body, limbs, head, or hands as they move. M+M merges these sequences of individual positions into a single digital representation. That representation can then be fed into other systems, where it is used either to mimic human movement or as a benchmark against which to evaluate it.
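As a rough illustration of the data flow described above – not M+M’s actual API – the sketch below merges timestamped samples from several hypothetical sensors into one time-ordered sequence, then scores a performed movement against a benchmark with a simple mean-distance measure. All names (`Sample`, `merge_streams`, `deviation`) are invented for this example.

```python
from dataclasses import dataclass

@dataclass
class Sample:
    timestamp: float   # seconds since capture started
    sensor: str        # which device produced the reading
    position: tuple    # (x, y, z) of one tracked point

def merge_streams(*streams):
    """Interleave per-sensor sample streams into one time-ordered sequence."""
    merged = [s for stream in streams for s in stream]
    return sorted(merged, key=lambda s: s.timestamp)

def deviation(performed, benchmark):
    """Mean Euclidean distance between paired samples of equal-length sequences."""
    def dist(a, b):
        return sum((p - q) ** 2 for p, q in zip(a.position, b.position)) ** 0.5
    pairs = list(zip(performed, benchmark))
    return sum(dist(a, b) for a, b in pairs) / len(pairs)

# Two tiny streams from different (hypothetical) sensors:
phone = [Sample(0.0, "accelerometer", (0.0, 1.0, 0.0)),
         Sample(0.2, "accelerometer", (0.1, 1.1, 0.0))]
kinect = [Sample(0.1, "kinect", (0.0, 1.0, 0.1))]

timeline = merge_streams(phone, kinect)
```

A real pipeline would also handle clock skew between devices, resample to a common rate, and use a movement-aware comparison (e.g. dynamic time warping) rather than naive pairing; the sketch only shows the merge-then-compare shape of the idea.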

Potential in both arts and science

The analysis of human movement has great potential to help researchers in social, cultural, technical and medical applications – from improving Olympic athletes’ performance, through controlling the motion of computer-generated actors, to understanding motor-control diseases like Parkinson’s. M+M can contribute to the arts as well through interactive installations like the 2014 “Longing and Forgetting” exhibit in Surrey, BC, where audience members use their mobile devices to control the artwork’s characters. It’s also a natural technology to support further advancements in telesurgery, where Canada is already a leader.

Collaborative creativity

M+M is the result of a collaboration between the School of Interactive Arts and Technology at Simon Fraser University; Credo Interactive, a company developing animation and choreographic software for human movement; and H+ Technologies, creators of customizable, intuitive gesture-based systems and holographic displays.

A number of the M+M components are now available through the CANARIE software registry: the M+M platform, sensor input software, a real-time movement database, and connectivity software that allows M+M to interact with other software systems.

Funding for the development of M+M was provided through CANARIE’s Research Software Program.

Additional Resources

Technical specifications and M+M Fact Sheet
M+M website