Social-Event-Driven Camera Control for Multicharacter Animations


Overview


In a virtual world, virtual characters can interact with one another, and a character may leave one group to join another. The interactions among individuals and groups often produce interesting events over the course of an animation. The goal of this paper is to discover social events involving mutual interactions or group activities in multicharacter animations, and to automatically plan a smooth camera motion that views interesting events suggested by our system or relevant events specified by a user. Inspired by sociology studies, we borrow knowledge from proxemics, social force models, and social network analysis to model the dynamic relations among social events and among the participants within each event. By analyzing the variation of relation strength among participants and the spatiotemporal correlation among events, we discover salient social events in a motion clip and generate an overview video of these events with smooth camera motion computed by simulated annealing optimization. We tested our approach on a variety of motions performed by multiple characters. Our user study shows that our results are preferred in 66.19 percent of comparisons against a camera control approach without event analysis, and are comparable (51.79 percent) to professional results by an artist.
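As a concrete illustration of the optimization stage described above, the sketch below uses simulated annealing to smooth a one-dimensional camera path that tracks per-frame event positions. This is a minimal toy example, not the paper's implementation: `path_cost`, `anneal_path`, the Gaussian perturbation, and the linear cooling schedule are all hypothetical simplifications of the actual energy terms and camera parameters.

```python
import math
import random

def path_cost(path, targets, smooth_w=1.0):
    # Viewing term: keep the camera close to each frame's event position.
    view = sum((p - t) ** 2 for p, t in zip(path, targets))
    # Smoothness term: penalize camera acceleration (second difference).
    accel = sum((path[i - 1] - 2.0 * path[i] + path[i + 1]) ** 2
                for i in range(1, len(path) - 1))
    return view + smooth_w * accel

def anneal_path(targets, iters=20000, t0=1.0, seed=0):
    rng = random.Random(seed)
    path = list(targets)              # start at the raw event positions
    cost = path_cost(path, targets)
    best_path, best_cost = path[:], cost
    for k in range(iters):
        temp = t0 * (1.0 - k / iters) + 1e-9   # linear cooling schedule
        i = rng.randrange(len(path))
        cand = path[:]
        cand[i] += rng.gauss(0.0, 0.1)         # perturb one camera sample
        c = path_cost(cand, targets)
        # Accept improvements always; worse moves with Boltzmann probability.
        if c < cost or rng.random() < math.exp((cost - c) / temp):
            path, cost = cand, c
            if cost < best_cost:
                best_path, best_cost = path[:], cost
    return best_path, best_cost

targets = [0.0, 0.0, 2.0, 2.0, 0.0]   # hypothetical per-frame event positions
path, best_cost = anneal_path(targets)
```

The trade-off between the viewing and smoothness terms is controlled by `smooth_w`: a larger weight yields a calmer camera that lags events, while a smaller weight tracks events tightly at the cost of jerky motion.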


Publications


·        I-Cheng Yeh, Wen-Chieh Lin, Tong-Yee Lee, Hsin-Ju Han, Jehee Lee, and Manmyung Kim, “Social-Event-Driven Camera Control for Multicharacter Animations,” IEEE Transactions on Visualization and Computer Graphics (to appear).


Real-time rendering results


Demo Video [QuickTime MP4]