Elastic Implicit Skinning (SIGGRAPH ASIA 2014)


The video accompanying our SIGGRAPH ASIA 2014 research paper: "Robust Iso-Surface Tracking for Interactive Character Skinning"

More on the official Elastic Implicit Skinning project's page:
http://rodolphe-vaillant.fr/permalink...

Related work:
https://vimeo.com/65966757 // Original Implicit Skinning
https://vimeo.com/57928521 // Blending operators enabling our work

---------------------------------------------------------------------------------------------
Remarks:

This work is a RESEARCH project, which means:

- Elastic Implicit Skinning is implemented as a standalone CUDA application; there are currently no plugins for Blender, 3ds Max, Maya... yet. However, if you are interested in this technology we are open to collaborations (contact us for more details: http://rodolphe-vaillant.fr, http://www.irit.fr/~Loic.Barthe/)

- 90% of the programming, animation and rendering work was done by a single person who has never been specifically trained in any artistic work (there is of course a team doing the theoretical research, and the 3D characters are provided by third parties)

---------------------------------------------------------------------------------------------
Paper abstract:

We present a novel approach for interactive character skinning, which takes advantage of the best features of the recent implicit skinning method and makes it robust to extreme character movements, such as those involving contacts between different parts of the skin and sharp bending angles at joints. While keeping the basic idea of implicit skinning, namely approximating the character at each animation step by a 3D scalar field in which mesh vertices are appropriately re-projected, we depart from the processing pipeline used so far. Instead of being bound by an initial skinning solution, used to initialize the shape at each time step, we use the skin mesh to directly track iso-surfaces of the field over time. Achieving this requires solving two challenging problems: firstly, all contact surfaces generated between skin parts should be captured as iso-surfaces of the implicit field; secondly, the tracking method should capture elastic skin effects when the joints bend, but also ensure that the skin comes back to its rest shape when the character returns to rest. Our solutions to these problems include: new composition operators enabling the combination of blending effects with local self-contact between implicit surfaces, and a tangential relaxation scheme derived from the as-rigid-as-possible energy to solve the tracking problem. The result is a very robust interactive system that can handle contacts in a way that is visually plausible, exhibits the global effect of skin elasticity (sliding), and is suitable for use in a production pipeline.
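The re-projection step mentioned above — moving each mesh vertex onto a target iso-surface of the 3D scalar field — can be illustrated with a minimal sketch. This is NOT the paper's actual solver (which uses analytic field gradients, composition operators and a tangential relaxation scheme); it is a toy Python illustration, assuming a hypothetical spherical field `scalar_field` and a simple Newton-like march along the finite-difference gradient:

```python
import numpy as np

def scalar_field(p):
    # Hypothetical toy field: value 0 on the unit sphere, positive inside.
    return 1.0 - np.linalg.norm(p)

def gradient(p, eps=1e-5):
    # Central finite-difference gradient of the field at point p.
    g = np.zeros(3)
    for i in range(3):
        d = np.zeros(3)
        d[i] = eps
        g[i] = (scalar_field(p + d) - scalar_field(p - d)) / (2.0 * eps)
    return g

def project_to_iso(p, iso, steps=20, tol=1e-6):
    # Newton-like march of vertex p along the field gradient
    # until the field value reaches the target iso-value.
    for _ in range(steps):
        f = scalar_field(p)
        if abs(f - iso) < tol:
            break
        g = gradient(p)
        p = p + (iso - f) * g / max(np.dot(g, g), 1e-12)
    return p

# A vertex outside the surface is pulled back onto the 0-iso-surface.
v = project_to_iso(np.array([0.0, 0.0, 2.0]), iso=0.0)
```

In the actual method, every skin vertex is re-projected this way at each animation step, and the tangential relaxation then redistributes vertices along the surface to mimic skin elasticity.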

---------------------------------------------------------------------------------------------
Authors:
Rodolphe Vaillant(1,2), Gaël Guennebaud(3), Loïc Barthe(1), Brian Wyvill(2), Marie-Paule Cani(4)

(1)IRIT - Université de Toulouse, (2)University of Victoria, (3)Inria Bordeaux, (4)LJK - Grenoble Universités - Inria

---------------------------------------------------------------------------------------------
Acknowledgments:

We thank the artists, companies and universities who provided us with nice 3D models and animations. Laura Paiardini (http://laurapaiardini.rd-h.fr/) from Inria Grenoble provided us with the skeleton animation of the Armadillo, and Garrett Pond (http://garrettpond.blogspot.ca) from Brigham Young University (BYU) modeled the Jeff model (http://vimeo.com/61555876#t=14s), which is the big guy from the short movie 'Owned' (http://animation.byu.edu/content/owned). Finally, the famous Armadillo model comes from the Stanford University 3D scan repository.

We also thank the Blender Foundation, as 90% of our renderings were done with Blender using either the Cycles renderer or the internal renderer.

This work has been partially funded by the IM&M project http://www.irit.fr/~Loic.Barthe/imm.php (ANR-11-JS02-007) and the advanced grant EXPRESSIVE from the European Research Council. Partial funding also comes from the Natural Sciences and Engineering Research Council of Canada, the GRAND NCE, Canada, and Intel Corp. Finally, this work received partial support from the Royal Society Wolfson Research Merit Award.
