**A scalable CTR model, inspired by LLMs, for exploring scaling laws**
🔗 Paper (RecSys '25) | 💻 Code
Each Unified Attention Block (UAB) contains:
- Self-Attention: Spatiotemporal behavior modeling
- Cross-Attention: User profile-guided importance scoring
- Dual Alignment Attention: Feature selection
- RMSNorm + SwiGLU FFN (LLM-inspired)
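The listed components can be sketched as a single PyTorch module. This is a hypothetical illustration, not the released implementation: the layer layout (pre-norm, residual connections) and head count are assumptions, and the Dual Alignment Attention branch is omitted since its details are in the paper, not this README.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RMSNorm(nn.Module):
    """Root-mean-square normalization (LLM-style, no mean centering)."""
    def __init__(self, d, eps=1e-6):
        super().__init__()
        self.weight = nn.Parameter(torch.ones(d))
        self.eps = eps

    def forward(self, x):
        return self.weight * x * torch.rsqrt(x.pow(2).mean(-1, keepdim=True) + self.eps)

class SwiGLU(nn.Module):
    """SwiGLU feed-forward network: silu-gated hidden layer."""
    def __init__(self, d, hidden):
        super().__init__()
        self.w_gate = nn.Linear(d, hidden, bias=False)
        self.w_val = nn.Linear(d, hidden, bias=False)
        self.w_out = nn.Linear(hidden, d, bias=False)

    def forward(self, x):
        return self.w_out(F.silu(self.w_gate(x)) * self.w_val(x))

class UnifiedAttentionBlock(nn.Module):
    """Sketch of one UAB: self-attention over the behavior sequence,
    cross-attention conditioned on the user profile, then a SwiGLU FFN,
    with pre-RMSNorm and residual connections (assumed arrangement)."""
    def __init__(self, d, n_heads=4):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(d, n_heads, batch_first=True)
        self.cross_attn = nn.MultiheadAttention(d, n_heads, batch_first=True)
        self.norm1, self.norm2, self.norm3 = RMSNorm(d), RMSNorm(d), RMSNorm(d)
        self.ffn = SwiGLU(d, 4 * d)

    def forward(self, seq, profile):
        # Self-attention: spatiotemporal behavior modeling.
        h = self.norm1(seq)
        seq = seq + self.self_attn(h, h, h, need_weights=False)[0]
        # Cross-attention: user-profile-guided importance scoring.
        h = self.norm2(seq)
        seq = seq + self.cross_attn(h, profile, profile, need_weights=False)[0]
        # Position-wise SwiGLU FFN.
        return seq + self.ffn(self.norm3(seq))
```

Blocks of this shape stack cleanly (output shape equals input shape), which is what lets the model scale in depth.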
📌 Input: Target-aware sequence = User behaviors + candidates
📌 Output: P(click | S, p, c) = σ(MLP(E_block[-1, :], e_p, e_other))
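The output formula above can be read as: take the last position of the final block's output, concatenate it with the profile embedding e_p and the other feature embeddings e_other, and pass the result through an MLP with a sigmoid. A minimal sketch, where the MLP width and depth are assumptions:

```python
import torch
import torch.nn as nn

class CTRHead(nn.Module):
    """Hypothetical prediction head for P(click | S, p, c):
    sigma(MLP(E_block[-1, :], e_p, e_other)). The two-layer MLP
    is an illustrative choice, not the released architecture."""
    def __init__(self, d):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(3 * d, d),  # concat of three d-dim embeddings
            nn.ReLU(),
            nn.Linear(d, 1),
        )

    def forward(self, e_block, e_p, e_other):
        # e_block: (batch, seq_len, d) output of the last UAB;
        # take the final position, E_block[-1, :].
        last = e_block[:, -1, :]
        logit = self.mlp(torch.cat([last, e_p, e_other], dim=-1))
        return torch.sigmoid(logit).squeeze(-1)  # probability in [0, 1]
```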
Due to industrial deployment constraints, we release:
- File: `./handle_layer/handle_lib/handle_rec_unit.py`
- Key classes:
  - `Mix1k_SUAN`: for the industrial dataset
  - `Eleme_SUAN`: for the Eleme dataset
- Configs:
  - `exp/user1/Mix1k_SUAN/`: industrial dataset config
  - `exp/user1/Eleme_SUAN/`: Eleme dataset config
```bibtex
@inproceedings{lai2025exploring,
  title={Exploring Scaling Laws of CTR Model for Online Performance Improvement},
  author={Lai, Weijiang and Jin, Beihong and Zhang, Jiongyan and Zheng, Yiyuan and Dong, Jian and Cheng, Jia and Lei, Jun and Wang, Xingxing},
  booktitle={Proceedings of the Nineteenth ACM Conference on Recommender Systems},
  pages={114--123},
  year={2025},
  organization={ACM}
}
```
- Email: [email protected]
- Affiliation: Institute of Software, Chinese Academy of Sciences
- GitHub: https://github.com/laiweijiang/SUAN
⭐ Star us if you find it useful!