Between 30th July and 3rd August the interns here at Bristol VR Lab were working with the Perception Neuron motion capture suits by Noitom. These suits capture a user's real-world movements as data that digital applications can use, for purposes such as realistic animation.

Our interns had no previous experience with this technology, so began the week by getting to grips with the provided software and experiencing just what the motion capture suits could do. Once the suits were strapped on, it was simply a case of connecting them via micro-USB to the computer and running Axis Neuron, a program provided by Noitom. Axis Neuron provides live playback of the suit's movements, as well as allowing users to record animations, which was the first thing the interns practiced with, before exporting these files to Unity. The animations worked with many different existing models inside Unity, so before long there was a scarily realistic dance troupe composed mostly of bears.

The interns' next thought was how they could achieve live capture of movement instead of using previously recorded data. Fortunately, Noitom provides a Unity SDK, allowing easy communication between Unity and Axis Neuron with only a few clicks. The live data was smooth and instantaneous, and translated perfectly into Unity, so before long the interns were thinking up ideas for controlling games using purely their body movements.
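
Under the hood the concept is simple: each frame, the latest bone rotations streamed from the suit are applied to a rigged model. The sketch below illustrates this idea in Unity C#; note that FetchLatestRotations() is a hypothetical placeholder for wherever the streamed data arrives, not the Noitom SDK's real API, which handles all of this wiring for you.

```csharp
using UnityEngine;

// A minimal sketch of the live-streaming idea, not the Noitom SDK's actual API.
// FetchLatestRotations() is a hypothetical stand-in for the per-frame bone data
// that the SDK delivers from Axis Neuron.
public class LiveMocapApplier : MonoBehaviour
{
    public Transform[] bones; // model bones, ordered to match the incoming stream

    void Update()
    {
        Quaternion[] latest = FetchLatestRotations();
        int count = Mathf.Min(bones.Length, latest.Length);
        for (int i = 0; i < count; i++)
            bones[i].localRotation = latest[i]; // drive the rig from live data
    }

    // Placeholder: in practice the SDK supplies this data from the suit.
    Quaternion[] FetchLatestRotations()
    {
        var rotations = new Quaternion[bones.Length];
        for (int i = 0; i < rotations.Length; i++)
            rotations[i] = Quaternion.identity; // stub; real data comes from the suit
        return rotations;
    }
}
```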

An initial restriction of the Perception Neuron suit was the length of the data cable, which limited the user's range. To circumvent this, the interns set up a wireless connection between the suits and Axis Neuron over WiFi. This was straightforward, but it introduced the need for a separate power source, since the suits were no longer powered over USB from the computers. The interns solved this problem initially with backpack PCs, and later by simply using portable mobile chargers. Immediately the range of motion increased, with movement detectable from almost anywhere in the office.

With the underlying tech ready and working, the interns set about creating miniature applications to demonstrate how the suit could be used. These demos were:

ColourChange

This demo changed the colour of both the player and various interactable objects based on their position in the world. This encouraged people to take advantage of the full motion tracking and walk around their environment, observing the colours change as they moved.
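
The core trick can be sketched in a few lines of Unity C#: read an object's world position, normalise each axis against an assumed play-area size, and use the results as RGB channels. This is a minimal illustration, not the interns' actual code.

```csharp
using UnityEngine;

// Rough sketch of the ColourChange idea, assuming a play area anchored at the
// world origin. Attach to the player or any interactable object with a Renderer.
[RequireComponent(typeof(Renderer))]
public class PositionColour : MonoBehaviour
{
    public Vector3 areaSize = new Vector3(10f, 3f, 10f); // assumed play-area size

    Renderer rend;

    void Start()
    {
        rend = GetComponent<Renderer>();
    }

    void Update()
    {
        // Map each world-position axis into 0..1 and use it as an RGB channel.
        Vector3 p = transform.position;
        float r = Mathf.InverseLerp(0f, areaSize.x, p.x);
        float g = Mathf.InverseLerp(0f, areaSize.y, p.y);
        float b = Mathf.InverseLerp(0f, areaSize.z, p.z);
        rend.material.color = new Color(r, g, b);
    }
}
```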

Godzilla Simulator

Here the user could stomp around a large, basic city, with individual building pieces taking damage and eventually breaking as they were knocked around.
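
One plausible way to implement such a breakable building piece is a small health pool drained by collision impacts, as in the hedged sketch below; the damage model and thresholds are assumptions made for illustration, not the interns' actual code.

```csharp
using UnityEngine;

// Illustrative sketch of a destructible building piece: it accumulates damage
// from collisions and is destroyed once it has taken enough.
public class BreakableBuildingPiece : MonoBehaviour
{
    public float health = 10f;
    public float minImpactSpeed = 1f; // ignore gentle nudges

    void OnCollisionEnter(Collision collision)
    {
        float impact = collision.relativeVelocity.magnitude;
        if (impact < minImpactSpeed) return;

        health -= impact; // harder knocks do more damage
        if (health <= 0f)
            Destroy(gameObject); // the piece finally breaks
    }
}
```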

Crush

Here the player got to move around an environment stamping on virtual ants until they 'exploded' into little red cubes. This demo focused on using the player's feet, an ability commonly unavailable in other tracking setups, such as standard virtual reality systems.
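
A mechanic like this might look something like the following sketch, where a hypothetical "Foot" tag marks the tracked feet and each squashed ant spawns a burst of small red cubes; these details are illustrative guesses rather than the interns' actual code.

```csharp
using UnityEngine;

// Sketch of the Crush mechanic: when something tagged "Foot" touches an ant
// (whose collider is set as a trigger), the ant is replaced by red cubes.
public class SquashableAnt : MonoBehaviour
{
    public int cubeCount = 8;
    public float burstForce = 2f;

    void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("Foot")) return;

        for (int i = 0; i < cubeCount; i++)
        {
            GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
            cube.transform.position = transform.position;
            cube.transform.localScale = Vector3.one * 0.05f;
            cube.GetComponent<Renderer>().material.color = Color.red;
            var body = cube.AddComponent<Rigidbody>();
            body.AddForce(Random.insideUnitSphere * burstForce, ForceMode.Impulse);
        }
        Destroy(gameObject); // the ant 'explodes'
    }
}
```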

“Competitive flying game” (working title)

With two suits hooked up to the computer, players made flapping motions with their arms to propel themselves up the screen; if they stopped flapping they would descend back down, and the first to reach the finish line was declared the winner.
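
A simple way to sketch the flap mechanic is to measure a tracked hand's vertical velocity and treat fast downward strokes as upward impulses. The snippet below is an illustrative guess at such logic, with made-up threshold values.

```csharp
using UnityEngine;

// Sketch of the flapping mechanic: watch a tracked hand's vertical speed and
// treat quick downward strokes as flaps that push the player upwards, with
// gravity pulling them back down. All values here are illustrative guesses.
public class FlapController : MonoBehaviour
{
    public Transform hand;             // a hand bone driven by the mocap suit
    public float flapThreshold = 1.5f; // downward hand speed that counts as a flap
    public float flapImpulse = 3f;
    public float gravity = 2f;

    float verticalSpeed;
    Vector3 lastHandPos;

    void Start()
    {
        lastHandPos = hand.position;
    }

    void Update()
    {
        float handVelY = (hand.position.y - lastHandPos.y) / Time.deltaTime;
        lastHandPos = hand.position;

        if (handVelY < -flapThreshold)   // a quick downward stroke...
            verticalSpeed = flapImpulse; // ...boosts the player upwards

        verticalSpeed -= gravity * Time.deltaTime; // stop flapping and you sink
        transform.position += Vector3.up * verticalSpeed * Time.deltaTime;
    }
}
```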

If you're lucky enough to have access to a Perception Neuron kit, you can download the demo here.


On Friday 3rd August, as part of our new Tech Play Friday events, we invited in a dancer by the name of Antonia Purdie to record her movements using the motion capture kit. We'll be playing around with the data at a later point, so watch this space to see what creature we might transform her into!