Add Dream 7B Diffusion Large Language Model Pipeline #12091


Open

dg845 wants to merge 28 commits into huggingface:main from dg845:dream-7b-pipeline

Conversation

Contributor

@dg845 commented Aug 7, 2025

What does this PR do?

This PR implements a pipeline for the Dream 7B diffusion large language model (blog post, weights and code, repo). Dream is a masked (discrete) diffusion model for text which claims to perform comparably to similarly sized SOTA autoregressive LLMs such as Qwen 2.5 7B on NLP tasks and to outperform them on planning tasks.

Fixes #12017.
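For context, masked-diffusion decoding generates text by starting from a fully masked completion and iteratively unmasking tokens over a fixed number of denoising steps, rather than sampling left-to-right. Below is a minimal, hypothetical sketch of one common variant of that loop (confidence-based unmasking); the function, argument names, and model interface are illustrative assumptions, not the API this PR adds:

```python
# Minimal sketch of masked-diffusion text generation. All names here are
# illustrative assumptions; this is not the pipeline API added by this PR.
import torch


@torch.no_grad()
def masked_diffusion_generate(model, tokenizer, prompt, gen_len=64, num_steps=16):
    """Iteratively unmask a completion, committing the most confident tokens first."""
    device = next(model.parameters()).device
    mask_id = tokenizer.mask_token_id  # assumes the tokenizer defines a mask token
    prompt_ids = tokenizer(prompt, return_tensors="pt").input_ids.to(device)
    # Start from the prompt followed by a fully masked completion.
    x = torch.cat(
        [prompt_ids, torch.full((1, gen_len), mask_id, dtype=torch.long, device=device)],
        dim=1,
    )
    per_step = max(1, gen_len // num_steps)
    for _ in range(num_steps):
        masked = x[0] == mask_id
        if not masked.any():
            break
        logits = model(input_ids=x).logits  # assumes a bidirectional forward over x
        conf, pred = logits[0].softmax(dim=-1).max(dim=-1)
        conf[~masked] = float("-inf")  # only consider positions that are still masked
        k = min(per_step, int(masked.sum()))
        top = conf.topk(k).indices  # most confident masked positions this step
        x[0, top] = pred[top]
    return tokenizer.decode(x[0, prompt_ids.shape[1]:], skip_special_tokens=True)
```

Committing several, possibly non-adjacent, tokens per step is what distinguishes this from autoregressive decoding.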

Before submitting

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag members/contributors who may be interested in your PR.

@yiyixuxu
@a-r-r-o-w
@ntoxeg

@dg845 mentioned this pull request Aug 7, 2025
Contributor Author

dg845 commented Aug 15, 2025

@yiyixuxu @a-r-r-o-w the Dream 7B model uses a transformers-style custom tokenizer, which I believe is based on GPT2Tokenizer but with different pre-tokenization rules. Since this tokenizer is not in transformers, should I open a PR there to add it? And if so, do you think the Dream transformer should also be added to transformers? The original transformer implementation is also transformers-compatible.
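For illustration of the mechanism involved: GPT2Tokenizer pre-tokenizes text with a compiled regex (`self.pat`) before running BPE, so different pre-tokenization rules can be swapped in by a small subclass while the rest of the machinery is reused. A minimal, hypothetical sketch (the class name and regex below are assumptions, not Dream's actual rules):

```python
# Hypothetical sketch: customizing GPT2Tokenizer's pre-tokenization rules.
# The class name and regex are illustrative; Dream's actual rules may differ.
import regex as re  # GPT2Tokenizer itself uses the `regex` package
from transformers import GPT2Tokenizer


class DreamStyleTokenizer(GPT2Tokenizer):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        # GPT2Tokenizer splits text with self.pat before applying BPE;
        # overriding it here changes only the pre-tokenization behavior.
        self.pat = re.compile(
            r"""'(?:[sdmt]|ll|ve|re)| ?\p{L}+| ?\p{N}+| ?[^\s\p{L}\p{N}]+|\s+(?!\S)|\s+"""
        )
```

A file like this could live alongside the pipeline rather than requiring a transformers PR.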

Member

In the case of certain custom implementations, we try to implement and keep the relevant files within diffusers; a few existing examples of this can be found in the library.

Maybe in this case you could create a tokenizer_gpt.py file within the pipeline directory to use it? WDYT @yiyixuxu?

I don't think the model implementation can live in transformers if we're using it for diffusion sampling. For example, Cosmos 1.0 was released with both an autoregressive and a diffusion version, and we have two different implementations and PRs supporting it in the two libraries. So, let's maintain it here :)


dg845 added 14 commits August 19, 2025 17:13
Contributor Author

dg845 commented Aug 26, 2025

Hi @a-r-r-o-w, I think this PR is ready for an initial design review :).


@dg845 marked this pull request as ready for review August 26, 2025 01:36