DiscoverAI: post transformers

NeurIPS 2025: Large Language Diffusion Models

Update: 2025-11-29

Description

This research paper introduces LLaDA, an 8-billion-parameter language model built on the masked diffusion model (MDM) architecture, developed specifically to challenge the assumption that core Large Language Model (LLM) capabilities are exclusive to autoregressive models (ARMs). Unlike ARMs, which predict the next token sequentially, LLaDA uses a generative approach with a forward process that progressively masks tokens and a reverse process in which a Transformer network predicts all masked tokens simultaneously. Trained from scratch, LLaDA demonstrates strong scalability and achieves performance comparable to advanced ARM baselines such as LLaMA 3 8B across benchmarks covering general knowledge, math, and code generation. Crucially, its non-autoregressive design enables bidirectional modeling, which allows LLaDA to effectively address the reversal curse and outperform contemporary models, including GPT-4o, on reversal reasoning tasks. These findings suggest that fundamental generative modeling principles, rather than the autoregressive formulation itself, underpin essential LLM capabilities. The work concludes that diffusion models offer a promising new paradigm for building robust, large-scale language models.
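The forward-masking and reverse-prediction loop described above can be sketched in a few lines of plain Python. This is an illustrative toy only: the `steps` schedule, the remasking rule, and the stand-in `toy_predictor` are assumptions for demonstration, not the paper's exact training or sampling procedure (LLaDA's real predictor is a Transformer over token IDs).

```python
import random

MASK = "[MASK]"

def forward_mask(tokens, t, rng):
    """Forward process: mask each token independently with probability t in [0, 1]."""
    return [MASK if rng.random() < t else tok for tok in tokens]

def reverse_generate(length, predictor, steps=4, seed=0):
    """Reverse process (sketch): start fully masked, then at each step predict
    every masked position at once and remask a shrinking fraction, so the
    sequence gradually denoises from t=1 down to t=0."""
    rng = rng_final = random.Random(seed)
    seq = [MASK] * length
    for step in range(steps, 0, -1):
        # Predict all masked positions simultaneously (bidirectional context).
        preds = predictor(seq)
        seq = [preds[i] if tok == MASK else tok for i, tok in enumerate(seq)]
        # Remask with probability t = (step - 1) / steps; t hits 0 on the last step.
        t = (step - 1) / steps
        seq = [MASK if rng.random() < t else tok for tok in seq]
    return seq

# Hypothetical stand-in for the mask-predictor network: fills position i with "tok<i>".
def toy_predictor(seq):
    return [f"tok{i}" for i in range(len(seq))]

result = reverse_generate(6, toy_predictor)
```

Because the final step uses t = 0, no positions are remasked at the end, so `result` contains no `[MASK]` tokens. The key contrast with an ARM is visible in the loop: tokens are filled in parallel across the whole sequence rather than strictly left to right.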


Source:

https://openreview.net/pdf?id=KnqiC0znVF


mcgrof