Estimating Cloth Simulation Parameters from Video
Authors
Abstract
Cloth simulations are notoriously difficult to tune due to the
many parameters that must be adjusted to achieve the look of a
particular fabric. In this paper, we present an algorithm for
estimating the parameters of a cloth simulation from video data of
real fabric. A perceptually motivated metric based on matching
between folds is used to compare video of real cloth with
simulation. This metric compares two video sequences of cloth and
returns a number that measures the differences in their folds.
Simulated annealing is used to minimize the frame-by-frame error,
as measured by this metric, between a given simulation and
the real-world footage. To estimate all the cloth parameters, we
identify simple static and dynamic calibration experiments that
use small swatches of the fabric. To demonstrate the power of this
approach, we use our algorithm to find the parameters for four
different fabrics. We show the match between the video footage
and simulated motion on the calibration experiments, on new video
sequences for the swatches, and on a simulation of a full skirt.
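The parameter search described above can be pictured as a standard simulated-annealing loop over the simulator's parameter vector, with the fold-matching metric, summed frame by frame against the calibration video, serving as the objective. The Python sketch below is only illustrative and is not the authors' implementation: the parameter names (stretch, bend, damping), the perturbation scheme, and the toy objective all stand in for the real cloth simulator and fold-comparison metric.

    import math
    import random

    def simulated_annealing(objective, init_params, iters=2000,
                            t0=1.0, cooling=0.995, step=0.1):
        """Minimize `objective` over a dict of positive cloth parameters
        by perturbing one coordinate at a time under a geometric
        cooling schedule."""
        current = dict(init_params)
        current_err = objective(current)
        best, best_err = dict(current), current_err
        temp = t0
        for _ in range(iters):
            # Perturb one randomly chosen parameter multiplicatively.
            candidate = dict(current)
            key = random.choice(list(candidate))
            candidate[key] *= math.exp(random.uniform(-step, step))
            err = objective(candidate)
            delta = err - current_err
            # Always accept improvements; accept worse states with
            # probability exp(-delta / temp) to escape local minima.
            if delta < 0 or random.random() < math.exp(-delta / max(temp, 1e-12)):
                current, current_err = candidate, err
                if err < best_err:
                    best, best_err = dict(candidate), err
            temp *= cooling
        return best, best_err

    if __name__ == "__main__":
        # Toy objective standing in for the fold-matching error summed over
        # frames; in practice it would run the cloth simulator with the
        # candidate parameters and compare its folds against the video.
        target = {"stretch": 5000.0, "bend": 0.01, "damping": 2.0}
        def toy_error(p):
            return sum((math.log(p[k]) - math.log(target[k])) ** 2 for k in target)
        start = {"stretch": 1000.0, "bend": 0.1, "damping": 1.0}
        best, err = simulated_annealing(toy_error, start)
        print(best, err)

In practice the objective would invoke the cloth simulator with the candidate parameters on the calibration experiment and return the fold-matching error against the recorded footage, so each evaluation is expensive and the annealing schedule matters.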
Paper
Download the paper:
Download the video:
Results
Side-by-side comparison of the optimized waving test with the original
video. The cloth on the left is the actual video data and the cloth
on the right is the simulation.
Quicktime (2MB)
AVI (1.2MB)
Validation with the robot arm to show that the difference in
motion is due to the parameters and not merely the driving motion.
Quicktime (2MB)
AVI (1MB)
Four skirts simulated with the parameters recovered from the above
optimization.
Quicktime (3MB)
AVI (3MB)
Validation on real skirts: