Spatially and Temporally Optimized Video Stabilization

Yu-Shuen Wang1, Feng Liu2, Pu-Sheng Hsu3, Tong-Yee Lee3

IEEE Transactions on Visualization and Computer Graphics

1National Chiao Tung University, Taiwan   2Portland State University   3National Cheng Kung University, Taiwan

The original (a) and the stabilized (b) feature trajectories are shown in the spatial-temporal coordinate system. We smooth feature trajectories while retaining the neighboring feature offsets (c) in each frame to handle parallax. We then warp each video frame based on the stabilized features and crop the maximal overlapping area to produce the result (d).



Properly handling parallax is important for video stabilization. Existing methods that achieve this goal require either 3D reconstruction or long feature trajectories to enforce subspace or epipolar geometry constraints. In this paper, we present a robust and efficient technique that works on general videos. It achieves high-quality camera motion even on videos where 3D reconstruction is difficult or long feature trajectories are not available. We represent each trajectory as a Bézier curve and maintain the spatial relations between trajectories by preserving the original offsets of neighboring curves. Our technique formulates stabilization as a spatial-temporal optimization problem that finds smooth feature trajectories and avoids visual distortion. The Bézier representation enforces strong smoothness of each feature trajectory and reduces the number of variables in the optimization problem. We also stabilize videos in a streaming fashion to achieve scalability. Experiments show that our technique produces high-quality camera motion on a variety of challenging videos that are difficult for existing methods.
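To illustrate the Bézier representation of a feature trajectory, the sketch below fits a low-degree Bézier curve to a noisy 2D trajectory by least squares on the Bernstein basis. This is only a minimal illustration of the curve representation, not the paper's full spatial-temporal optimization (it omits the neighbor-offset preservation and distortion terms); the function names and the choice of a cubic curve are our own assumptions.

```python
import numpy as np
from math import comb

def bernstein_matrix(n_ctrl, ts):
    """Bernstein basis matrix of shape (len(ts), n_ctrl) for a
    degree (n_ctrl - 1) Bézier curve sampled at parameters ts in [0, 1]."""
    d = n_ctrl - 1
    return np.stack(
        [comb(d, k) * ts**k * (1.0 - ts)**(d - k) for k in range(n_ctrl)],
        axis=1,
    )

def fit_bezier(points, n_ctrl=4):
    """Least-squares fit of a Bézier curve to a trajectory.

    points : (T, 2) array of feature positions over T frames.
    Returns the (n_ctrl, 2) control points and the (T, 2) smoothed
    trajectory obtained by evaluating the fitted curve at each frame.
    Note: a single low-degree curve strongly smooths the trajectory,
    which is the effect the Bézier representation is used for."""
    ts = np.linspace(0.0, 1.0, len(points))
    B = bernstein_matrix(n_ctrl, ts)
    ctrl, *_ = np.linalg.lstsq(B, points, rcond=None)
    return ctrl, B @ ctrl
```

In this toy setting, fitting a cubic (4 control points) to a 60-frame noisy trajectory suppresses most of the frame-to-frame jitter while following the underlying camera path; the fitted control points are the optimization variables, which is far fewer than one position per frame.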


