Flow-based Video Synthesis and Editing
Authors
Abstract
This paper presents a novel algorithm for synthesizing and editing
video of natural phenomena that exhibit continuous flow patterns.
The algorithm analyzes the motion of textured particles in the
input video along user-specified flow lines, and synthesizes
seamless video of arbitrary length by enforcing temporal
continuity along a second set of user-specified flow lines. The
algorithm is simple to implement and use. We used this technique
to edit video of waterfalls, rivers, flames, and smoke.
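For intuition, here is a minimal, hypothetical sketch (in Python/NumPy) of the kind of particle compositing the abstract describes: texture particles are spawned periodically, advected along an output flow line, and textured by replaying patches observed along an input flow line. The function name, the flow-line representation (integer (y, x) polylines, one vertex per frame of travel), and the uniform blending are illustrative assumptions, not the paper's actual implementation, which uses feathered blending and more careful seam handling.

```python
import numpy as np

def synthesize_frame(t, input_video, in_line, out_line, K, patch=16):
    """Render output frame t by compositing texture particles (sketch).

    A particle is spawned at the start of the output flow line every K
    frames (K controls the periodicity of the result).  Each particle
    replays the texture observed along the input flow line, starting at
    a per-particle input frame, so texture appears to advect seamlessly
    for as long as we keep rendering.
    """
    num_frames, H, W, C = input_video.shape
    acc = np.zeros((H, W, C), dtype=np.float32)
    wgt = np.zeros((H, W, 1), dtype=np.float32)
    h = patch // 2

    for s in range(0, t + 1, K):              # spawn times of live particles
        age = t - s                           # steps travelled along the line
        if age >= len(out_line):
            continue                          # particle has left the flow line
        # Per-particle input start frame (illustrative choice).
        s0 = np.random.default_rng(s).integers(num_frames)
        src_t = (s0 + age) % num_frames
        sy, sx = in_line[min(age, len(in_line) - 1)]
        dy, dx = out_line[age]
        if min(sy, sx, dy, dx) < h or sy + h > H or sx + h > W \
                or dy + h > H or dx + h > W:
            continue                          # skip patches falling off the frame
        acc[dy - h:dy + h, dx - h:dx + h] += input_video[src_t, sy - h:sy + h, sx - h:sx + h]
        wgt[dy - h:dy + h, dx - h:dx + h] += 1.0

    # Uncovered pixels fall back to the input frame (a simplification).
    out = np.where(wgt > 0, acc / np.maximum(wgt, 1e-6),
                   input_video[t % num_frames].astype(np.float32))
    return out.astype(input_video.dtype)
```

Rendering frames for t = 0, 1, 2, ... gives a sequence of arbitrary length; editing amounts to drawing different output flow lines (for example, adding channels to a waterfall) while keeping the input flow lines fixed.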
Paper
Download the paper (to appear in SIGGRAPH 2004):
Download the final video:
Results
Waterfall:
- Infinite Sequence. DivX (4.5MB)
- Editing to add two extra channels. DivX (MB)
Niagara Falls:
- Infinite Sequence. DivX (3.2MB)
- Editing to remove the foam. DivX (2MB)
- Editing to remove the foam. DivX (3.2MB)
Stream:
- Infinite Sequence. DivX (2.1MB)
- Adding a log into the stream. DivX (2.4MB)
Fire:
- Infinite Sequence. DivX (2MB)
- Fire blowing in the wind. DivX (2.2MB)
- Changing the periodicity by varying K. DivX (3.3MB)
Smoke:
- Infinite Sequence. DivX (2MB)
- Adding an extra chimney and changing the smoke direction. DivX (2.4MB)
Extensions - 3D terrain and particle systems
Video control using particle systems:
- DivX (24MB). We automatically generate the edited (output) flow lines using a 3D particle system (in Maya).
Our algorithm renders the output sequences using the texture from the input sequence.
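One plausible way to connect the Maya particle simulation to the 2D synthesis step is to project each particle's 3D trajectory into the rendered view and use the resulting polyline as an output flow line. The sketch below assumes world-space trajectories exported per particle and a 3x4 camera projection matrix P; this interface is an illustrative assumption, not the actual pipeline.

```python
import numpy as np

def trajectories_to_flow_lines(trajectories, P):
    """Project 3D particle trajectories into 2D output flow lines.

    trajectories: list of (T_i, 3) arrays of world-space positions,
                  one per particle (e.g. exported from a simulation).
    P:            3x4 camera projection matrix for the rendered view.
    Returns a list of (T_i, 2) arrays of integer (y, x) image
    coordinates, usable as output flow lines in the synthesis sketch.
    """
    flow_lines = []
    for traj in trajectories:
        homo = np.hstack([traj, np.ones((len(traj), 1))])    # (T, 4)
        proj = homo @ P.T                                     # (T, 3)
        xy = proj[:, :2] / proj[:, 2:3]                       # perspective divide
        flow_lines.append(np.rint(xy[:, ::-1]).astype(int))   # (y, x) order
    return flow_lines
```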
City scene with a moving camera, 3D terrain and particle systems:
- DivX (2.1MB). We modeled the city scene in Maya with a 3D terrain and a moving camera. The output flow lines were obtained from particles that move on the terrain and fall under gravity (see the sketch at the end of this section). The input flow lines were obtained from the video clip of Niagara Falls (single view).
Making of the city scene:
- DivX (8.8MB). Showing the different stages: model, particles, input video and the rendered result.
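As a rough illustration of the gravity-driven terrain particles mentioned above, the sketch below steps one particle forward under gravity and clamps it to a heightfield; the integration scheme, grid sampling, and parameters are assumptions for illustration, not the actual Maya setup. The resulting path can be projected into the image with the routine above to form an output flow line.

```python
import numpy as np

def simulate_terrain_particle(p0, heightfield, cell=1.0, g=9.8, dt=0.05, steps=200):
    """Advance one particle under gravity, clamped to a heightfield terrain.

    p0:          initial (x, y, z) world-space position.
    heightfield: 2D array of terrain heights on a regular grid of spacing `cell`.
    Returns an array of positions tracing the particle's path.
    """
    p = np.array(p0, dtype=float)
    v = np.zeros(3)
    path = [p.copy()]
    for _ in range(steps):
        v[2] -= g * dt                  # gravity pulls the particle down
        p += v * dt
        # Nearest-sample lookup of the terrain height under the particle.
        i = int(np.clip(p[0] / cell, 0, heightfield.shape[0] - 1))
        j = int(np.clip(p[1] / cell, 0, heightfield.shape[1] - 1))
        ground = heightfield[i, j]
        if p[2] < ground:               # keep the particle on the surface
            p[2] = ground
            v[2] = 0.0
        path.append(p.copy())
    return np.array(path)
```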