
Animating responsive characters with dynamic constraints in near-unactuated coordinates

Published: 01 December 2008

Abstract

This paper presents a technique to enhance a kinematically controlled virtual character with a generic class of dynamic responses to small perturbations. Given an input motion sequence, our technique can synthesize reactive motion to arbitrary external forces with a specific style customized to the input motion. Our method re-parameterizes the motion degrees of freedom based on joint actuations in the input motion. By only enforcing the equations of motion in the less actuated coordinates, our approach can create physically responsive motion based on kinematic pose control without explicitly computing the joint actuations. We demonstrate the simplicity and robustness of our technique by showing a variety of examples generated with the same set of parameters. Our formulation focuses on the type of perturbations that significantly disrupt the upper body poses and dynamics, but have limited effect on the whole-body balance state.
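
To make the abstract's central idea concrete, the projection it describes can be sketched in standard rigid-body dynamics notation. This is a minimal reconstruction, not the paper's exact formulation: the symbols M, C, J_c, tau, f, and the basis P are illustrative assumptions introduced here.

    % Full-body equations of motion: q are the generalized coordinates,
    % \tau the joint torques, and f an external perturbation acting
    % through a Jacobian J_c.
    M(q)\,\ddot{q} + C(q,\dot{q}) = \tau + J_c^{\top} f
    % Let the columns of P span the "near-unactuated" directions estimated
    % from the input motion, i.e. directions in which the recorded joint
    % actuation is small, so that P^{\top}\tau \approx 0. Projecting the
    % dynamics onto this subspace eliminates the torque term:
    P^{\top}\big( M(q)\,\ddot{q} + C(q,\dot{q}) - J_c^{\top} f \big) \approx 0

Under this reading, enforcing only the projected constraint lets the kinematic pose controller govern the remaining, strongly actuated directions, yielding a physically plausible response to f without ever solving for the joint actuations, consistent with the abstract's claim.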

Supplementary Material

JPG File (a112-ye-mp4_hi.jpg)
MOV File (a112-ye-mp4_hi.mov)

    Published In

    ACM Transactions on Graphics  Volume 27, Issue 5
    December 2008
    552 pages
    ISSN:0730-0301
    EISSN:1557-7368
    DOI:10.1145/1409060
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 01 December 2008
    Published in TOG Volume 27, Issue 5

    Author Tags

    1. motion capture
    2. physically based animation

    Qualifiers

    • Research-article

    Cited By

    • (2021) Visual Quality of 3D Meshes With Diffuse Colors in Virtual Reality: Subjective and Objective Evaluation. IEEE Transactions on Visualization and Computer Graphics 27(3), 2202-2219. DOI: 10.1109/TVCG.2020.3036153. Online publication date: 1-Mar-2021.
    • (2019) Low Dimensional Motor Skill Learning Using Coactivation. Proceedings of the 12th ACM SIGGRAPH Conference on Motion, Interaction and Games, 1-10. DOI: 10.1145/3359566.3360071. Online publication date: 28-Oct-2019.
    • (2019) Synthesis of biologically realistic human motion using joint torque actuation. ACM Transactions on Graphics 38(4), 1-12. DOI: 10.1145/3306346.3322966. Online publication date: 12-Jul-2019.
    • (2018) Real-time locomotion with character-fluid interactions. Proceedings of the 11th ACM SIGGRAPH Conference on Motion, Interaction and Games, 1-8. DOI: 10.1145/3274247.3274515. Online publication date: 8-Nov-2018.
    • (2017) MechVR. Proceedings of the 10th International Conference on Motion in Games, 1-5. DOI: 10.1145/3136457.3136468. Online publication date: 8-Nov-2017.
    • (2016) Data-driven inverse dynamics for human motion. ACM Transactions on Graphics 35(6), 1-12. DOI: 10.1145/2980179.2982440. Online publication date: 5-Dec-2016.
    • (2016) A synergy-based control solution for overactuated characters: Application to throwing. Computer Animation and Virtual Worlds 28(6). DOI: 10.1002/cav.1743. Online publication date: 24-Nov-2016.
    • (2013) A virtual reality setup for controllable, stylized real-time interactions between humans and avatars with sparse Gaussian process dynamical models. Proceedings of the ACM Symposium on Applied Perception, 41-44. DOI: 10.1145/2492494.2492515. Online publication date: 22-Aug-2013.
    • (2013) Synthesizing Two-character Interactions by Merging Captured Interaction Samples with their Spacetime Relationships. Computer Graphics Forum 32(7), 41-50. DOI: 10.1111/cgf.12210. Online publication date: 25-Nov-2013.
    • (2013) Real-Time Reactive Biped Characters. Transactions on Computational Science XVIII, 155-171. DOI: 10.1007/978-3-642-38803-3_9. Online publication date: 2013.
