- HRNet model is now supported.
- We use configuration files to store most options that were previously in the argument parser. The definitions of the options are detailed in ```config/defaults.py``` (a configuration sketch follows this list).
- We conform to PyTorch practice in data preprocessing (RGB [0, 1], subtract mean, divide std); a normalization sketch follows this list.
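
The exact option set lives in ```config/defaults.py```; as a hedged illustration, the sketch below shows how a yacs-style defaults file and per-experiment overrides typically fit together. The option names and the YAML filename are assumptions for illustration, not the repository's actual keys.

```python
# Hypothetical excerpt in the spirit of config/defaults.py, using a yacs-style
# CfgNode. The option names below are illustrative, not the repository's keys.
import os
from yacs.config import CfgNode as CN

_C = CN()
_C.DATASET = CN()
_C.DATASET.root_dataset = "./data/"        # where images and annotations live

_C.MODEL = CN()
_C.MODEL.arch_encoder = "resnet50dilated"  # encoder backbone name
_C.MODEL.arch_decoder = "ppm_deepsup"      # decoder head name

_C.TRAIN = CN()
_C.TRAIN.batch_size_per_gpu = 2
_C.TRAIN.lr_encoder = 0.02

def get_cfg_defaults():
    """Return a fresh copy of the default configuration."""
    return _C.clone()

# Typical usage: start from the defaults, then layer a per-experiment YAML
# and command-line style overrides on top of them.
cfg = get_cfg_defaults()
if os.path.isfile("config/my_experiment.yaml"):        # hypothetical file
    cfg.merge_from_file("config/my_experiment.yaml")
cfg.merge_from_list(["TRAIN.batch_size_per_gpu", 4])   # override a single key
cfg.freeze()
```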
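The preprocessing convention above is the standard torchvision pipeline: scale an RGB image to [0, 1], then subtract the per-channel mean and divide by the per-channel std. A minimal sketch follows; the mean/std values are the common ImageNet statistics and are an assumption here, so the repository's exact numbers may differ.

```python
# Minimal sketch of the RGB [0, 1] -> normalize convention described above.
# The mean/std values are the usual ImageNet statistics (assumed here).
import torch
from torchvision import transforms
from PIL import Image

normalize = transforms.Normalize(
    mean=[0.485, 0.456, 0.406],   # per-channel mean (RGB)
    std=[0.229, 0.224, 0.225],    # per-channel std (RGB)
)

preprocess = transforms.Compose([
    transforms.ToTensor(),  # PIL RGB (0-255) -> float tensor in [0, 1], CxHxW
    normalize,              # (x - mean) / std, channel-wise
])

# Example: run a dummy RGB image through the pipeline.
img = Image.new("RGB", (512, 384), color=(128, 64, 32))
tensor = preprocess(img)
print(tensor.shape)  # torch.Size([3, 384, 512])
```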
## Highlights
Decoder:
- UPerNet (Pyramid Pooling + FPN head, see [UPerNet](https://arxiv.org/abs/1807.10221) for details).
## Performance:
IMPORTANT: The base ResNet in our repository is customized (different from the one in torchvision). The base models will be automatically downloaded when needed.
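
The download mechanism itself is repo-specific; below is a minimal sketch of the usual PyTorch approach, which caches a checkpoint fetched from a URL with `torch.hub.load_state_dict_from_url`. The URL and function name are placeholders, not the repository's actual ones.

```python
# Sketch of automatic weight download via torch.hub's download cache.
# WEIGHTS_URL is a placeholder; the repository resolves its own URLs.
import torch

WEIGHTS_URL = "https://example.com/path/to/resnet50-custom.pth"  # placeholder

def load_pretrained_encoder(model: torch.nn.Module) -> torch.nn.Module:
    """Download (and cache) the checkpoint, then load it into the model."""
    state_dict = torch.hub.load_state_dict_from_url(
        WEIGHTS_URL, map_location="cpu", progress=True
    )
    # strict=False because a customized ResNet may not match torchvision's keys.
    model.load_state_dict(state_dict, strict=False)
    return model
```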