Sunday, July 5, 2015

Data-Driven Biped Control

Yoonsang Lee, Sungeun Kim, Jehee Lee
SIGGRAPH 2010


Our data-driven controller allows a physically simulated biped character to reproduce challenging motor skills captured in motion data.

Abstract
We present a dynamic controller to physically simulate under-actuated three-dimensional full-body biped locomotion. Our data-driven controller takes motion capture reference data and reproduces realistic human locomotion through real-time, physically based simulation. The key idea is to modulate the reference trajectory continuously and seamlessly so that even a simple dynamic tracking controller can follow it while maintaining balance. In our framework, biped control can be facilitated by a large array of existing data-driven animation techniques because our controller can take a stream of reference data generated on the fly at runtime. We demonstrate the effectiveness of our approach through examples in which bipeds turn, spin, and walk while steering their direction interactively.
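To picture the key idea, the per-frame control loop can be thought of as a joint-space PD tracking controller whose reference pose is adjusted for balance before tracking torques are computed. The sketch below is a minimal illustration of that idea only, not the paper's actual formulation: the `modulate_reference` function, its gain, and the toy center-of-mass feedback are hypothetical placeholders standing in for the paper's continuous, seamless reference modulation.

```python
import numpy as np

def pd_tracking_torque(q, dq, q_ref, dq_ref, kp, kd):
    """Joint-space PD torque that tracks the (modulated) reference pose."""
    return kp * (q_ref - q) + kd * (dq_ref - dq)

def modulate_reference(q_ref, com_err, stance_joints, gain=0.2):
    """Hypothetical balance feedback: tilt the stance-leg reference angles in
    proportion to the horizontal center-of-mass error, so that simply tracking
    the adjusted reference also steers the character back toward balance.
    (A stand-in for the paper's continuous reference modulation.)"""
    q_mod = q_ref.copy()
    q_mod[stance_joints] -= gain * com_err
    return q_mod

if __name__ == "__main__":
    n_joints = 6                      # toy planar leg, not the full biped
    kp, kd = 300.0, 30.0              # tracking gains (illustrative values)
    q = np.zeros(n_joints)            # current simulated joint angles
    dq = np.zeros(n_joints)           # current joint velocities
    q_ref = 0.1 * np.ones(n_joints)   # one frame of streamed reference data
    dq_ref = np.zeros(n_joints)
    com_err = 0.05                    # horizontal CoM offset from reference
    stance_joints = [0, 1]            # e.g. stance-leg hip and ankle indices

    q_mod = modulate_reference(q_ref, com_err, stance_joints)
    tau = pd_tracking_torque(q, dq, q_mod, dq_ref, kp, kd)
    print("tracking torques:", tau)   # torques handed to the physics simulator
```

In an actual simulation loop, the modulation and torque computation would run every frame against the streamed reference motion, which is what lets the same simple tracking controller follow interactively generated reference data.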

Paper
Download : pdf (1.4MB)

Video

Full video : mov (60.2MB)

Spinning example : 
- original speed - mov (1.2MB)
- 1/2 speed - mov (2.5MB)
- 1/3 speed - mov (3.8MB)
- 1/4 speed - mov (4.8MB)

Slides
SIGGRAPH 2010 talk slides : pptx (2.2MB, without video) / zip (132MB, with video)

Data
Reference motion capture data : zip (0.7MB)
