There’s an interesting article on the University of Michigan news service I found via Bruce Sterling’s blog. Worth reading, and be sure to check the “See more images” link under the line drawing. It includes a video of the process; sure looks like Maya on the monitor. The operator appears to use a MicroScribe to get the point cloud into the app, but the data appears to be NURBS (which might make sense for export to the rapid prototyper, since I’m guessing it went out as an IGES file or a surface converted to .stl). I’d like to know how he made the conversion, or whether they have a plug-in that does it on the fly. Or maybe there’s a feature of which I’m unaware. Damn. Going to have to find out.
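For what it’s worth, any NURBS-to-.stl export has to do roughly the same thing under the hood: sample the surface over a UV grid, triangulate the quads, and write out facets with normals. Here’s a hypothetical Python sketch of that last step — a simple curved patch stands in for real NURBS evaluation, and this is my guess at the general shape of the process, not whatever Alias/Maya actually does:

```python
import math

def patch(u, v):
    # Stand-in for evaluating a NURBS surface at (u, v);
    # a gentle sine bump so the output isn't flat.
    return (u, v, 0.25 * math.sin(math.pi * u) * math.sin(math.pi * v))

def facet_normal(a, b, c):
    # Normalized cross product of two triangle edges.
    ux, uy, uz = (b[i] - a[i] for i in range(3))
    vx, vy, vz = (c[i] - a[i] for i in range(3))
    n = (uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx)
    mag = math.sqrt(sum(x * x for x in n)) or 1.0
    return tuple(x / mag for x in n)

def surface_to_stl(path, steps=16):
    # Sample the patch on a steps-by-steps UV grid, split each
    # quad into two triangles, and emit ASCII STL facets.
    with open(path, "w") as f:
        f.write("solid patch\n")
        for i in range(steps):
            for j in range(steps):
                u0, u1 = i / steps, (i + 1) / steps
                v0, v1 = j / steps, (j + 1) / steps
                p00, p10 = patch(u0, v0), patch(u1, v0)
                p01, p11 = patch(u0, v1), patch(u1, v1)
                for tri in ((p00, p10, p11), (p00, p11, p01)):
                    nx, ny, nz = facet_normal(*tri)
                    f.write(f"facet normal {nx:.6f} {ny:.6f} {nz:.6f}\n")
                    f.write("  outer loop\n")
                    for vx, vy, vz in tri:
                        f.write(f"    vertex {vx:.6f} {vy:.6f} {vz:.6f}\n")
                    f.write("  endloop\nendfacet\n")
        f.write("endsolid patch\n")

surface_to_stl("patch.stl")
```

The tessellation density (`steps`) is the knob a plug-in would expose: coarser means smaller files, finer means a smoother print off the rapid prototyper.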
(edit: I figure it must be Alias Studio. I didn’t notice the monitor was a Cintiq; I was instead looking at what appears to be an animation bar at the bottom of the app.)