Fix/freeu suppress complexhalf warning #12511
Open
SrijanUpadhyay wants to merge 2 commits into huggingface:main from SrijanUpadhyay:fix/freeu-suppress-complexhalf-warning
Conversation
This commit adds configurations and setup scripts to resolve NCCL timeout issues during DeepSpeed ZeRO-2 training on H200 GPUs. The changes include:

- Extended NCCL and DeepSpeed timeouts
- Optimized bucket sizes for gradient communication
- CPU and dataloader optimizations
- System shared-memory improvements
- Enhanced debugging capabilities

The implementation provides:

1. DeepSpeed ZeRO-2 configuration (ds_config_zero2.json)
2. Environment setup script (setup_training_env.sh)
3. Accelerate configuration (accelerate_config.yaml)

These changes improve training stability on H200 GPUs with high-resolution data and aggressive configurations.
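The config files themselves are not shown on this page. As a rough illustration only, a training entry point could combine the pieces the commit describes along these lines; every value below is hypothetical, not taken from the commit:

```python
import datetime
import json
import os

import torch.distributed as dist

# Hypothetical environment tuning; the commit's setup_training_env.sh is
# not visible here. NCCL_DEBUG and blocking-wait are standard NCCL knobs.
os.environ.setdefault("NCCL_DEBUG", "INFO")       # enhanced debugging output
os.environ.setdefault("NCCL_BLOCKING_WAIT", "1")  # surface timeouts as errors

# Hypothetical stand-in for ds_config_zero2.json: ZeRO stage 2 with
# explicit bucket sizes for gradient communication.
ds_config = {
    "train_micro_batch_size_per_gpu": "auto",
    "zero_optimization": {
        "stage": 2,
        "allgather_bucket_size": 200_000_000,
        "reduce_bucket_size": 200_000_000,
    },
}
with open("ds_config_zero2.json", "w") as f:
    json.dump(ds_config, f, indent=2)

# Extend the collective timeout so long steps on high-resolution data
# don't trip the NCCL watchdog (expects the usual torchrun env vars).
dist.init_process_group(backend="nccl", timeout=datetime.timedelta(hours=2))
```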
When using FreeU with half-precision (torch.float16) models, PyTorch may emit UserWarnings about experimental ComplexHalf support during FFT operations. This change locally suppresses that specific warning in the fourier_filter function to avoid flooding user logs while preserving behavior.

- Added warnings import
- Added local warning suppression around fftn/ifftn calls when dtype is float16
- Only suppresses the specific ComplexHalf experimental warning
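A minimal sketch of how such a local suppression can look. The filter body below follows the fourier_filter in diffusers' torch_utils; the exact diff in this PR may differ, and the warning-message pattern is PyTorch's standard ComplexHalf notice:

```python
import warnings

import torch
import torch.fft as fft


def fourier_filter(x_in: torch.Tensor, threshold: int, scale: float) -> torch.Tensor:
    """FFT-based low-frequency scaling used by FreeU (sketch)."""
    x = x_in
    B, C, H, W = x.shape

    # cuFFT half-precision support requires power-of-2 signal sizes,
    # so non-power-of-2 inputs are upcast to float32 first.
    if (W & (W - 1)) != 0 or (H & (H - 1)) != 0:
        x = x.to(dtype=torch.float32)

    with warnings.catch_warnings():
        if x.dtype == torch.float16:
            # float16 FFTs produce ComplexHalf tensors, and PyTorch warns
            # on every call that ComplexHalf support is experimental.
            # Ignore only that specific UserWarning, only in this scope.
            warnings.filterwarnings(
                "ignore",
                message=".*ComplexHalf support is experimental.*",
                category=UserWarning,
            )
        x_freq = fft.fftn(x, dim=(-2, -1))
        x_freq = fft.fftshift(x_freq, dim=(-2, -1))

        # Scale the low-frequency center of the spectrum by `scale`.
        mask = torch.ones((B, C, H, W), device=x.device)
        crow, ccol = H // 2, W // 2
        mask[..., crow - threshold : crow + threshold,
              ccol - threshold : ccol + threshold] = scale
        x_freq = x_freq * mask

        x_freq = fft.ifftshift(x_freq, dim=(-2, -1))
        x_filtered = fft.ifftn(x_freq, dim=(-2, -1)).real

    return x_filtered.to(dtype=x_in.dtype)
```

Keeping the filter inside warnings.catch_warnings() restores the global warning state on exit, so the suppression never leaks beyond this one function.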