Adult2Child: Dynamic Scaling Laws to Create Child-like Motion

Citation:

Dong, Y., Paryani, S., Rana, N., Aloba, A., Anthony, L., and Jain, E. 2017. Adult2Child: dynamic scaling laws to create child-like motion. In MIG '17, November 8–10, 2017, Barcelona, Spain, pages 1–10.  [pdf]

Abstract:

“Child characters are widely used in animations and games; however, child motion capture databases are less easily available than those involving adult actors. Previous studies have shown that there is a perceivable difference in adult and child motion based on point light displays, so it may not be appropriate to just use adult motion data on child characters. Due to the costs associated with motion capture of child actors, it would be beneficial if we could create a child motion corpus by translating adult motion into child-like motion. Previous works have proposed dynamic scaling laws to transfer motion from one character to its scaled version. In this paper, we conduct a perception study to understand if this procedure can be applied to translate adult motion into child-like motion. Viewers were shown three types of point light display videos: adult motion, child motion, and dynamically scaled adult motion and asked to identify if the translated motion belongs to a child or an adult. We found that the use of dynamic scaling led to an increase in the number of people identifying the motion as belonging to a child compared to the original adult motion. Our findings suggest that although the dynamic scaling method is not a final solution to translate adult motion into child-like motion, it is nevertheless an intermediate step in the right direction. To better illustrate the original and dynamically scaled motions for the purposes of this paper, we rendered the dynamically scaled motion on an androgynous manikin character.”
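The abstract does not spell out the scaling laws used, but classic dynamic scaling of motion (as in prior character-animation work) scales lengths by the limb-length ratio ρ and durations by √ρ, so a smaller character moves proportionally faster. The sketch below is a minimal, hypothetical illustration of that idea on a joint-position clip; the function name, array layout, and resampling scheme are assumptions, not the authors' implementation.

```python
import numpy as np

def dynamic_scale(positions, fps, length_ratio):
    """Illustrative dynamic scaling of a motion clip (not the paper's code).

    positions    : (frames, joints, 3) array of joint positions
    fps          : capture frame rate (unchanged by scaling)
    length_ratio : rho = small_character_length / original_length (< 1)
    """
    rho = length_ratio
    scaled = positions * rho          # lengths scale linearly by rho
    time_scale = np.sqrt(rho)         # durations scale by sqrt(rho)

    n_in = scaled.shape[0]
    n_out = max(2, int(round(n_in * time_scale)))
    # Resample each joint coordinate onto the shorter timeline, so the
    # clip plays back in sqrt(rho) of the original wall-clock time.
    t_in = np.linspace(0.0, 1.0, n_in)
    t_out = np.linspace(0.0, 1.0, n_out)
    flat = scaled.reshape(n_in, -1)
    resampled = np.stack(
        [np.interp(t_out, t_in, flat[:, k]) for k in range(flat.shape[1])],
        axis=1,
    )
    return resampled.reshape(n_out, *positions.shape[1:]), fps
```

With ρ = 0.25, for example, positions shrink to a quarter of their original extent and a 10-frame clip is resampled to 5 frames at the same frame rate, i.e. it plays twice as fast.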

File attachments:

Dong-et-al-MIG2017
Dong-et-al-MIG2017-presentation-slides