This $2500 Suit Could Bring Motion Capture to the Masses

Rokoko Smartsuit Pro. Credit: Rokoko
Danish firm Rokoko built the Smartsuit Pro, an affordable, easy-to-use solution for motion capture.

Jakob Baslev didn't take up badminton for professional networking, but when he went looking for a technical partner to bring his vision to life, he found what he was looking for in his local shuttlecock league. 

Matias Søndergaard was working for a 3D printing company when Baslev, who had a filmmaking background, pitched him his idea: to revolutionize motion capture by building a suit that is easy to use and cost-effective, and that sets filmmakers free from the studio. Together, they founded Rokoko. The result of their collaboration launches today, with the Smartsuit Pro available to the public for $2,495.

This could open up motion capture for music videos and independent films in a way that has never been possible before.

Traditional optical motion capture can yield some amazing storytelling moments—the most famous being Gollum from the Lord of the Rings trilogy—but it is a cumbersome and complicated process. It requires many cameras recording a large number of markers on an actor, a stage with very controlled, even lighting, and a tremendous amount of data. Even once you have gone through the expense of building or renting a mocap stage, you still need an experienced team of experts to wrangle the captured data; the system often gets confused about what it is processing. For instance, if an actor waves their hand in front of their face, the system can lose track of the markers mounted on the actor's arm: do they belong to the arm or the face? Anyone who has experimented with automated tracking software knows this problem well, and it's only exacerbated by the large volume of tracking required for good mocap.

Having talked his film school into building a motion capture stage, Baslev knew these problems firsthand, and he knew there had to be a better way. Working with Søndergaard, he formed a team of 13 people split between Copenhagen and San Francisco. They built the Smartsuit Pro to offer everything you need for motion capture, without the need for an expensive, complicated stage.

The key is a network of 19 sensors built into the suit itself, like a nervous system. These sensors measure acceleration, orientation, and relative position, much like the sensors in your smartphone, and are wired to a hub in the lower back of the suit that transmits the data via WiFi or a cellular connection.
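Under the hood, inertial capture like this reconstructs a pose by composing each sensor's orientation down the skeleton, rather than observing markers from the outside. Here is a toy sketch of that idea, with made-up numbers and a two-bone arm; this is an illustration of the general technique, not Rokoko's actual code:

```python
import numpy as np

def quat_mul(q, r):
    # Hamilton product of two quaternions stored as (w, x, y, z)
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def rotate(q, v):
    # Rotate vector v by unit quaternion q (computes q * v * q^-1)
    qv = np.concatenate(([0.0], v))
    qc = q * np.array([1.0, -1.0, -1.0, -1.0])
    return quat_mul(quat_mul(q, qv), qc)[1:]

def pose_chain(bones):
    """Walk a bone chain from the root outward.
    bones: list of (sensor_orientation_quat, bone_offset_vector)."""
    world_q = np.array([1.0, 0.0, 0.0, 0.0])
    pos = np.zeros(3)
    joints = [pos.copy()]
    for local_q, offset in bones:
        world_q = quat_mul(world_q, local_q)   # accumulate orientation down the chain
        pos = pos + rotate(world_q, np.array(offset, dtype=float))
        joints.append(pos.copy())
    return joints

# Shoulder-to-wrist example: straight upper arm, forearm bent 90 degrees at the elbow
s = np.sqrt(0.5)
elbow_90 = np.array([s, 0.0, 0.0, s])          # 90-degree rotation about z
joints = pose_chain([(np.array([1.0, 0.0, 0.0, 0.0]), (1.0, 0.0, 0.0)),
                     (elbow_90, (1.0, 0.0, 0.0))])
# wrist ends up at (1, 1, 0): one unit out, then one unit up
```

Each sensor only needs to report its own orientation; positions fall out of walking the skeleton, which is why no external cameras are required.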


Hooked up to your network over WiFi, the Smartsuit can interface live with game engines like Unity and Unreal, as well as with VFX tools like Maya and MotionBuilder. For After Effects users, the system can record to the .fbx format, which brings the 3D motion data straight into AE for animating.
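In practice, a live link like this usually means an engine-side listener receiving streamed pose frames over the network and applying them to a rig. The sketch below is only an illustration: the port number and JSON frame schema are invented for the example, not taken from Rokoko's documentation.

```python
import json
import socket

# Invented frame layout for illustration; the real stream format may differ.
SAMPLE_FRAME = json.dumps({
    "timestamp": 12.345,
    "joints": {
        "leftForeArm":  {"rotation": [0.98, 0.0, 0.17, 0.0]},
        "rightForeArm": {"rotation": [1.0, 0.0, 0.0, 0.0]},
    },
})

def parse_frame(payload):
    """Pull per-joint rotation quaternions out of one streamed frame."""
    data = json.loads(payload)
    return {name: tuple(j["rotation"]) for name, j in data["joints"].items()}

def listen(port=14043):
    """Receive frames over UDP and hand each one to the rig-driving side."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        payload, _ = sock.recvfrom(65535)
        joints = parse_frame(payload)
        # ...apply `joints` to a character rig via the engine's own API
```

Because only joint rotations travel over the wire, the bandwidth needed for a live performance is tiny compared to streaming video.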

The company also offers an extensive SDK for users to build new application interfaces. Because the suit records sensor data, not images, the files remain very small and highly accurate: three minutes of full motion recording typically comes in under 100 MB.
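That file-size claim is easy to sanity-check with back-of-the-envelope numbers. The sample rate and per-sensor payload below are assumptions for the sake of the arithmetic, not Rokoko's published figures:

```python
# Assumed: 100 Hz sampling, and per sensor a rotation quaternion (4 floats)
# plus an acceleration vector (3 floats), stored as 4-byte floats.
SENSORS = 19
FLOATS_PER_SENSOR = 4 + 3
BYTES_PER_FLOAT = 4
SAMPLE_RATE_HZ = 100
SECONDS = 3 * 60                      # three minutes

frame_bytes = SENSORS * FLOATS_PER_SENSOR * BYTES_PER_FLOAT
total_mb = frame_bytes * SAMPLE_RATE_HZ * SECONDS / 1e6
print(f"{total_mb:.1f} MB")           # about 9.6 MB, well under 100 MB
```

Even with generous headroom for timestamps and metadata, pure sensor data stays an order of magnitude below the equivalent multi-camera footage.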

Rokoko Smartsuit Pro arm. Credit: Rokoko

In addition to letting filmmakers avoid the expense of a stage, the other huge benefit of the Smartsuit is the freedom it gives you to work in a variety of locations and lighting conditions.

While hard, contrasty direct light often makes traditional mocap difficult, the company was able to demonstrate its real-time Unity integration while standing in a darkened room next to the bright light of a window. Since the Smartsuit has its sensors built in and doesn't rely on camera capture, it had no problem accurately mirroring the performer's movements in the game engine. This frees filmmakers to shoot in a wide variety of real locations while adding in motion capture characters.


Additionally, the suit is easy to maintain: just remove the sensors by unzipping their storage pouches and put the suit in the washing machine like a sports jersey. Rokoko's future plans include the eventual release of gloves for finger tracking, bringing the total number of data points up to 33. The company also plans to develop the suit as a full-body interface for VR applications, alongside further integration with the gaming and motion picture industries.

Available now from Rokoko for $2,495.

Tech Specs

  • 19 data points
  • WiFi connectivity
  • Rotation, acceleration, and position data
  • Machine washable
  • Small file sizes and .fbx recording


Comments


It's cool that this tech can be sold at this price, but judging by the video it looks too early to actually buy one of those. Idk if the jittering was because of software or the suit though.

December 13, 2016 at 10:07AM


Nice, there are other options that are cheaper and more configurable, which work really well

December 13, 2016 at 12:57PM, Edited December 13, 12:57PM

Lee MJ Daley
Animator / 3D & Motion designer / Lighting cameraman

I would say a *$250* suit would bring mocap to the masses...

December 13, 2016 at 6:55PM


Hi, thank you for this information. I will share my experiences with mocap for my film.

I am currently using motion capture, and this new suit looks like absolutely incredible technology that does free you from regular mocap constraints. I was a bit surprised by the price, though (I understand that professional motion capture technology for film is expensive). At over two thousand dollars, it is unaffordable for many indie productions with micro budgets, where that same $2,000 has to cover everything: production, advertising, and so on, not just the suit or the motion capture part, which is only one element of the whole thing. You can't spend two grand on a suit on an indie budget (below $10,000) and still expect to have a film; you can expect to have great motion capture, and that's it. Only large indie productions and big studios with budgets of $10,000–20,000 or more can buy this without going over budget and into debt. As such, it is marketable not to the masses but only to pro, big-budget productions.

For my film, with less than five grand in budget, I had to find ways around that costly mocap problem. I wanted mocap as good as this suit, or better, at a tenth of the cost (about $300–500):

- Get a Kinect sensor (Microsoft); version 2 is better, with more sensors and precision. There are also PlayStation Eye cameras, which do exactly the same thing, but you need three of them (to triangulate the room's 3D depth) to equal one Kinect. The power of the sensor lies in depth capture: it records a 3D depth image of your room, and of you inside it, via infrared, and this depth is converted into a depth video that you can use. The software then recognizes your body and your joints (arms, legs, knees, elbows, head, etc.) in that video, so you actually capture your whole body's animation.

- That's just the capture part; then I had to find low-cost software to interpret the data into an FBX mocap skeleton file. I tried Brekel, which is a good one, but as with this suit there can be many mocap problems, like slipping/sliding feet (the animation is unstable and very twitchy, and you must clean the mocap data), which was not satisfactory. I also tried OpenNI, NI mate, the Kinect SDK, Fastmocap, iClone, Windows mocap-to-BVH tools, etc.; none gave results that looked like the performance (they were stiff-looking, slid all over, and had tons of weird artifacts: basically a robotic, cartoony look despite being true human motion capture). The one that struck me, though I had to refine its output, was iPi Mocap Studio. It was very accurate, with no foot sliding and very few jittery off-keyframes in the arms or legs. You can even smooth the motion and retarget the whole skeleton.

As for my 'suit', I created a sort of fake one:

- For body mocap, the sensor did the job (as good as this suit once the mocap data was cleaned up: very organic, smooth, and human-like).

- For facial mocap, I took a bicycle helmet and attached a $10 NTSC mini-camera (30 grams or less) to a light metal rod in front of my face. This had to be as light and as tight as possible so I could capture my facial expressions while moving in front of the Kinect, recording body and face in one performance. Don't do separate passes just for the face or just for the body; it looks bad and tacked-on when reconstructed in the mocap software (again, a jerky, robotic look). Since facial mocap is just a 2D video recording of my face, with no depth, I used the Warcraft-style setup. (The Warcraft movie has the best organic, subtly realistic facial mocap ever on the orcs; their performances were truly human-like. The same goes for Rise of the Planet of the Apes, where the facial mocap on the 3D apes is stellar.) Two cameras are far more precise at capturing nuanced facial expression than a single camera straight in front of the face, because they record the left and right sides of the face as two independent videos, and their angles capture the depth of your muscles moving not only up, down, left, and right, but also backwards and forwards in 3D space. The dots on your face move slightly forward and backward, which greatly increases the mocap precision of the acting performance (a single camera does not capture this, which is why low-budget single-camera facial mocap suffers from that robotic look).

Buy two cameras ($10 each, $20 total) and place them at 45° to the left and right of your face. This way you have two video feeds of your face, and you can put ink dots on your face and track them in your video tracking software (Blender's tracker is perfect for this; you can track each video in turn). I then drew the dots on my face, and my facial expressions would drive the 3D facial mocap (by tracking the facial dots).

This $2,000 suit does not capture facial mocap, which is a big problem: you must buy more mocap gear on top of it (and facial mocap is the most expensive part, so you'd blow way past a $2,000 budget). No 'facial suit' exists yet, though one could easily be invented: strap tight, ninja-style headgear with facial sensors onto someone's face and you'd have a facial mocap suit. My total setup cost less than $300 and captures as well as, or better than, a sensor suit. That's roughly a tenth of the price of a suit like this.

December 16, 2016 at 1:56PM


I'd be interested to learn more about what you've done, do you have links to photos of your setup or a link to the results from this setup? I'd love to see your system in action and the finished footage.

December 20, 2016 at 1:14PM

Matthew Balthrop
Chief Creative Director

Agree. I really enjoy the ease of use with iPisoft. I'd love to see this bicycle helmet and NTSC mini-camera setup! I don't need to see the cleanup, just the physical setup. I'm working my way through a similar build for fun myself, but would love any advice/insight on that helmet setup you can give. Specs on the power source, how you handle the data, any good tips to save some time would be nice. Are you going with a Raspberry Pi setup?

December 27, 2016 at 10:49AM, Edited December 27, 10:51AM

Remy Sanchez
VFX Producer/Artist/Writer/Director

Hey I have been wanting to work on a project for a few years now that would require mocap but shelved it due to cost. Your setup might help fix that! Do you have any video online of it in action? I would love to do more reading into what you have going on and see if I could reasonably use it for myself. Cheers!

December 28, 2016 at 1:40PM

Christian Bark
Cinema Window-shopper (for now)

Zero studios are hiring employees right now. GL getting ppl to buy that. lol

March 8, 2017 at 12:39PM, Edited March 8, 12:39PM