
1 Daily News

Generated at 2025-11-12 04:57:48

We have 4 news items from different sources.

2 feed

2.1 Does Omni-Modality Really Deliver "1+1>2"? Meituan's UNO-Bench Reveals the Compositional Law Between Unimodal and Omni-Modal Capabilities


2025/11/11 05:34 GMT

Omni-modality is not mere addition; it is a scientific test of whether cross-modal synergy actually holds.

2.2 A Wrong Turn on the Path to AGI: Shanghai AI Lab Finds That Self-Evolving Agents May "Misevolve"


2025/11/11 05:34 GMT

The first systematic study to reveal a hidden risk: "misevolution".

2.3 Beijing Referral | Baidu's ERNIE Bot (Wenxin Yiyan) Foundation Model Team Is Hiring Research Interns Working on Large Models


2025/11/11 05:34 GMT

3 paper

3.1 LLaDA-Rec: Discrete Diffusion for Parallel Semantic ID Generation in Generative Recommendation


2025/11/12 04:57 GMT

Generative recommendation represents each item as a semantic ID, i.e., a sequence of discrete tokens, and generates the next item through autoregressive decoding. While effective, existing autoregressive models face two intrinsic limitations: (1) unidirectional constraints, where causal attention restricts each token to attend only to its predecessors, hindering global semantic modeling; and (2) error accumulation, where the fixed left-to-right generation order causes prediction errors in early tokens to propagate to the predictions of subsequent tokens. To address these issues, we propose LLaDA-Rec, a discrete diffusion framework that reformulates recommendation as parallel semantic ID generation. By combining bidirectional attention with an adaptive generation order, the approach models inter-item and intra-item dependencies more effectively and alleviates error accumulation. Specifically, our approach comprises three key designs: (1) a parallel tokenization scheme that produces semantic IDs for bidirectional modeling, addressing the mismatch between residual quantization and bidirectional architectures; (2) two masking mechanisms at the user-history and next-item levels to capture both inter-item sequential dependencies and intra-item semantic relationships; and (3) an adapted beam search strategy for adaptive-order discrete diffusion decoding, resolving the incompatibility of standard beam search with diffusion-based generation. Experiments on three real-world datasets show that LLaDA-Rec consistently outperforms both ID-based and state-of-the-art generative recommenders, establishing discrete diffusion as a new paradigm for generative recommendation.
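To make the decoding idea in the abstract concrete, here is a minimal sketch of adaptive-order unmasking over a semantic ID with a bidirectional scorer, written in Python/PyTorch. This is not the paper's implementation: `TinyBidirectionalScorer`, the codebook size, the ID length, and the greedy most-confident-first unmasking rule are all illustrative assumptions; LLaDA-Rec's actual tokenizer, user-history/next-item masking, and adapted beam search are only described at a high level in the abstract.

```python
# Illustrative sketch (not the authors' code): adaptive-order unmasking of a
# next-item semantic ID with a bidirectional scorer, in the spirit of the
# discrete-diffusion decoding described in the abstract.
import torch
import torch.nn as nn

VOCAB = 256        # assumed codebook size per semantic-ID level
ID_LEN = 4         # assumed number of tokens per item semantic ID
MASK_ID = VOCAB    # extra token id used as the [MASK] symbol

class TinyBidirectionalScorer(nn.Module):
    """Toy stand-in for a bidirectional (non-causal) denoiser."""
    def __init__(self, d_model=64):
        super().__init__()
        self.emb = nn.Embedding(VOCAB + 1, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, VOCAB)

    def forward(self, ids):                      # ids: (B, ID_LEN)
        h = self.encoder(self.emb(ids))          # full bidirectional attention
        return self.head(h)                      # logits: (B, ID_LEN, VOCAB)

@torch.no_grad()
def decode_semantic_id(model, steps=ID_LEN):
    """Greedy adaptive-order decoding: at each step, commit the masked
    position the model is most confident about, then re-predict the rest."""
    ids = torch.full((1, ID_LEN), MASK_ID, dtype=torch.long)
    for _ in range(steps):
        probs = model(ids).softmax(-1)           # (1, ID_LEN, VOCAB)
        conf, cand = probs.max(-1)               # best token + confidence per slot
        conf[ids != MASK_ID] = -1.0              # ignore already-filled slots
        pos = conf.argmax(-1)                    # most confident masked position
        ids[0, pos] = cand[0, pos]               # commit it; order is data-driven
    return ids

if __name__ == "__main__":
    torch.manual_seed(0)
    print(decode_semantic_id(TinyBidirectionalScorer()))
```

Because every position attends to every other position, a token committed anywhere in the ID immediately conditions all remaining predictions, which is the mechanism the abstract credits for reducing the error accumulation of fixed left-to-right decoding; the paper's adapted beam search would track several such partially unmasked candidates instead of the single greedy one shown here.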