MattePro: A Professional Matting Model for Anything

Chenyi Zhang*, Yiheng Lin*, Ting Liu, Xiaochao Qu, Luoqi Liu, Fang Zhao, Yao Zhao, Yunchao Wei+

Beijing Jiaotong University | MT Lab, Meitu Inc | Nanjing University

Introduction

Figure: a matting method that supports various forms of interaction.

Results

| Model | AIM-500 MSE ↓ | AIM-500 SAD ↓ | Comp-1K MSE ↓ | Comp-1K SAD ↓ | P3M-500-NP MSE ↓ | P3M-500-NP SAD ↓ |
|---|---|---|---|---|---|---|
| MatteAnything | 36.97 | 81.50 | 51.33 | 157.97 | 28.98 | 58.91 |
| MattingAnything | 14.48 | 42.63 | 26.76 | 123.60 | 9.21 | 25.82 |
| SmartMatting | 7.65 | 25.32 | 31.23 | 122.38 | 3.73 | 14.11 |
| SEMatte | 7.07 | 24.30 | 7.44 | 57.00 | 2.76 | 10.88 |
| MattePro (Ours) | 3.93 | 18.75 | 5.67 | 39.09 | 2.17 | 10.11 |

Quick Installation

Run the following commands to install the required packages.

conda create -n mattepro python==3.10
conda activate mattepro
pip install torch==2.4.1 torchvision==0.19.1 torchaudio==2.4.1 --index-url https://download.pytorch.org/whl/cu121
pip install -r requirements.txt

To install detectron2, please follow its documentation, or run the following command:

python -m pip install 'git+https://github.com/facebookresearch/detectron2.git'
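
To confirm the environment is set up correctly, a quick sanity check like the one below can help. This is an optional suggestion, not part of the official instructions.

```python
# Optional sanity check: confirm the core dependencies import and that the
# CUDA build of PyTorch is active (assumes the install commands above succeeded).
import torch
import torchvision
import detectron2

print("torch:", torch.__version__)             # expected 2.4.1 from the pinned install
print("torchvision:", torchvision.__version__)
print("detectron2:", detectron2.__version__)
print("CUDA available:", torch.cuda.is_available())
```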

Demo

Image Matting


Step 1.
Download the pretrained weights of MattePro and MEMatte, and place them in the weights directory.

weights/
├── MattePro.pth
└── MeMatte.pth

Step 2.

python demo_image_matting.py

Tips for Click Mode (see the sketch below for how such clicks could be captured):
Click the left mouse button on a foreground region
Click the middle (scroll-wheel) mouse button on an unknown region
Click the right mouse button on a background region
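
The snippet below is only a minimal illustration of how such prompts could be collected with OpenCV; the actual demo_image_matting.py may capture and encode clicks differently, and the label names and image path here are hypothetical.

```python
# Minimal sketch (not the demo's actual code): collect foreground / unknown /
# background clicks with OpenCV mouse events and store them as labelled points.
import cv2

clicks = []  # list of (x, y, label); the label names are illustrative only

def on_mouse(event, x, y, flags, param):
    if event == cv2.EVENT_LBUTTONDOWN:      # left click -> foreground point
        clicks.append((x, y, "foreground"))
    elif event == cv2.EVENT_MBUTTONDOWN:    # middle (scroll-wheel) click -> unknown point
        clicks.append((x, y, "unknown"))
    elif event == cv2.EVENT_RBUTTONDOWN:    # right click -> background point
        clicks.append((x, y, "background"))

image = cv2.imread("examples/demo.jpg")     # hypothetical example image path
cv2.namedWindow("MattePro clicks")
cv2.setMouseCallback("MattePro clicks", on_mouse)
while True:
    cv2.imshow("MattePro clicks", image)
    if cv2.waitKey(20) & 0xFF == 27:        # press Esc to stop collecting clicks
        break
cv2.destroyAllWindows()
print(clicks)                               # these points would be passed to the model as prompts
```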

Video Matting

Demo videos: example_1.mov, example_2.mov

Step 1.
Download the weights of BiRefNet and place them in the weights directory.

weights/
├── MattePro.pth
├── MeMatte.pth
└── BiRefNet-HRSOD_D-epoch_130.pth

Step 2.

python demo_video_matting.py --input-path 'videos/hair.mp4'

Note: this mode is still under research and currently supports only salient video matting.
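
For reference, a per-frame loop in the spirit of this mode might look like the sketch below. `predict_alpha` is a placeholder, not a function from this repository, and the real demo_video_matting.py may differ.

```python
# Rough sketch (not the actual demo_video_matting.py): read a video frame by frame,
# run a per-frame matting call (placeholder), and write the alpha mattes back out.
import cv2

def predict_alpha(frame):
    # Placeholder for the real pipeline (e.g. BiRefNet saliency -> MattePro refinement).
    raise NotImplementedError

cap = cv2.VideoCapture("videos/hair.mp4")
fps = cap.get(cv2.CAP_PROP_FPS)
w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
fourcc = cv2.VideoWriter_fourcc(*"mp4v")
out = cv2.VideoWriter("alpha.mp4", fourcc, fps, (w, h), False)  # False -> single-channel output

while True:
    ok, frame = cap.read()
    if not ok:
        break
    alpha = predict_alpha(frame)            # uint8 HxW alpha matte in [0, 255]
    out.write(alpha)

cap.release()
out.release()
```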

Evaluate

examples:

B-Box Matting Mode

python evaluate.py --testset AIM500

Note: if your GPU memory is limited, please set pachify to True.
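
For context, the reported MSE and SAD are standard alpha-matting measures. The sketch below assumes alpha values in [0, 1] and may not match the exact scaling used by evaluate.py.

```python
# Minimal sketch of the two reported metrics on a predicted vs. ground-truth alpha matte.
import numpy as np

def compute_mse(pred, gt):
    # Mean squared error over all pixels.
    return float(np.mean((pred - gt) ** 2))

def compute_sad(pred, gt):
    # Sum of absolute differences, conventionally reported in thousands.
    return float(np.sum(np.abs(pred - gt)) / 1000.0)

pred = np.random.rand(512, 512)  # stand-in for a predicted alpha matte
gt = np.random.rand(512, 512)    # stand-in for the ground-truth alpha matte
print(compute_mse(pred, gt), compute_sad(pred, gt))
```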

Train

MattePro is trained with a two-stage workflow.

First stage

This stage can be trained within 24 hours on a single GPU (memory > 10 GB).

Step 1: Download Dataset

Segmentation Dataset: DUTS, DIS5K

Matting Dataset: DIM, Distinctions-646, AM-2K, P3M-10k, Transparent-460

Background Images: COCO2017 (training images)

Then configure the dataset paths in config.yml.
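
The exact keys in config.yml are defined by this repository; the check below is only an illustration with hypothetical key names, to verify that the configured dataset directories actually exist on disk.

```python
# Hedged helper: the key names are hypothetical, check config.yml for the real ones.
# Requires PyYAML (pip install pyyaml).
import os
import yaml

with open("config.yml") as f:
    cfg = yaml.safe_load(f)

# Hypothetical keys for illustration only; use whatever keys config.yml defines.
for key in ["DUTS", "DIS5K", "DIM", "Distinctions-646", "AM-2K",
            "P3M-10k", "Transparent-460", "COCO2017"]:
    path = cfg.get(key)
    print(key, path, "OK" if path and os.path.isdir(path) else "MISSING")
```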

Step 2: Pretrained SAM2 Checkpoint

Download the SAM2-Large checkpoint and place it in the pretrained directory.

pretrained/
└── sam2.1_hiera_large.pt
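
Optionally (this is a suggestion, not an official step), you can verify that the downloaded checkpoint file is intact and loadable before starting training:

```python
# Optional check: make sure the SAM2 checkpoint loads without errors.
import torch

ckpt = torch.load("pretrained/sam2.1_hiera_large.pt", map_location="cpu")
# The exact structure of the checkpoint dict depends on the SAM2 release.
print(type(ckpt), list(ckpt.keys())[:5] if isinstance(ckpt, dict) else "")
```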

Step 3:

python train.py

Second stage

For the second stage, you can refer to MEMatte for training, or directly use our pre-trained MEMatte checkpoint. You can even replace MEMatte with any trimap-based matting model (e.g., ViTMatte or AEMatter, though this requires some minor modifications to the code), and it will still produce excellent results.
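
If you do swap in a trimap-based model, one common way to bridge the gap is to derive a trimap from a coarse mask by erosion and dilation. The sketch below shows this standard practice under that assumption; it is not necessarily how MattePro's own code does it.

```python
# Sketch of a standard mask-to-trimap conversion for trimap-based models such as ViTMatte.
import cv2
import numpy as np

def mask_to_trimap(mask, kernel_size=15):
    """mask: uint8 HxW binary mask (0 or 255). Returns a trimap with values 0/128/255."""
    kernel = np.ones((kernel_size, kernel_size), np.uint8)
    fg = cv2.erode(mask, kernel)          # confident foreground after shrinking the mask
    bg = cv2.dilate(mask, kernel)         # everything outside the dilated mask is confident background
    trimap = np.full_like(mask, 128)      # start from "unknown" everywhere
    trimap[fg == 255] = 255
    trimap[bg == 0] = 0
    return trimap
```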

License

The code is released under the MIT License. It is a short, permissive software license. Basically, you can do whatever you want as long as you include the original copyright and license notice in any copy of the software/source.

Acknowledgement

Our project is developed based on SEMatte, MEMatte, SimpleClick, and BiRefNet. Thanks for their wonderful work!
If you have any questions, feel free to contact us at zchenyi007@gmail.com or 23120298@bjtu.edu.cn.
