Sitemap
A list of all the posts and pages found on the site. For you robots out there, there is an XML version available for digesting as well.
Pages
Posts
Future Blog Post
Published:
This post will show up by default. To disable scheduling of future posts, edit config.yml and set future: false.
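For reference, future is a standard top-level option in a Jekyll site's configuration file; a minimal sketch of the relevant fragment (assuming a stock Jekyll setup) might look like:

```yaml
# config.yml — site configuration
# When false, Jekyll excludes posts dated in the future from the build;
# when true, they are published immediately regardless of date.
future: false
```

You can also override this per build with the --future flag on the jekyll command line.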
Blog Post number 4
Published:
This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.
Blog Post number 3
Published:
This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.
Blog Post number 2
Published:
This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.
Blog Post number 1
Published:
This is a sample blog post. Lorem ipsum I can’t remember the rest of lorem ipsum and don’t have an internet connection right now. Testing testing testing this blog post. Blog posts are cool.
portfolio
Portfolio item number 1
Short description of portfolio item number 1
Portfolio item number 2
Short description of portfolio item number 2
publications
Paper Title Number 2
Published in Journal 1, 2010
This paper is about the number 2. The number 3 is left for future work.
Citation: Your Name, You. (2010). "Paper Title Number 2." Journal 1. 1(2).
Download Paper | Download Slides
Paper Title Number 3
Published in Journal 1, 2015
This paper is about the number 3. The number 4 is left for future work.
Citation: Your Name, You. (2015). "Paper Title Number 3." Journal 1. 1(3).
Download Paper | Download Slides
A Unified Framework for Multi-Domain CTR Prediction via Large Language Models
Published in TOIS, ACM Transactions on Information Systems, 2024
Uni-CTR leverages Large Language Models and pluggable domain networks to address the seesaw phenomenon and scalability challenges in multi-domain CTR prediction, achieving SOTA performance across various scenarios.
Citation: Zichuan Fu, Xiangyang Li, Chuhan Wu, Yichao Wang, Kuicai Dong, Xiangyu Zhao, Mengchen Zhao, Huifeng Guo, and Ruiming Tang. 2024. A Unified Framework for Multi-Domain CTR Prediction via Large Language Models. ACM Trans. Inf. Syst. Just Accepted (October 2024). https://doi.org/10.1145/3698878
Download Paper
LLM4MSR: An LLM-Enhanced Paradigm for Multi-Scenario Recommendation
Published in CIKM’24 (Full Research Paper track), Proceedings of the 33rd ACM International Conference on Information and Knowledge Management, 2024
LLM4MSR enhances multi-scenario recommendation by leveraging LLM for knowledge extraction and hierarchical meta networks, achieving improved performance and interpretability without LLM fine-tuning while maintaining deployment efficiency.
Citation: Yuhao Wang, Yichao Wang, Zichuan Fu, Xiangyang Li, Wanyu Wang, Yuyang Ye, Xiangyu Zhao, Huifeng Guo, and Ruiming Tang. 2024. LLM4MSR: An LLM-Enhanced Paradigm for Multi-Scenario Recommendation. In Proceedings of the 33rd ACM International Conference on Information and Knowledge Management (CIKM '24). Association for Computing Machinery, New York, NY, USA, 2472–2481. https://doi.org/10.1145/3627673.3679743
Download Paper
Model Merging for Knowledge Editing
Published in ACL’25 (Industry Track), Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics, 2025
A two-stage framework combining robust supervised fine-tuning with model merging for efficient knowledge editing in LLMs that preserves general capabilities while outperforming existing methods.
Citation:
Download Paper
Training-free LLM Merging for Multi-task Learning
Published in ACL’25, Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics, 2025
Hi-Merging: a training-free method that merges specialized LLMs into a unified multi-task model using hierarchical pruning and scaling, preserving individual strengths while minimizing parameter conflicts across languages and tasks.
Citation:
Download Paper
talks
Talk 1 on Relevant Topic in Your Field
Published:
This is a description of your talk, which is a markdown file that can be all markdown-ified like any other post. Yay markdown!
Conference Proceeding talk 3 on Relevant Topic in Your Field
Published:
This is a description of your conference proceedings talk, note the different field in type. You can put anything in this field.
teaching
Teaching experience 1
Undergraduate course, University 1, Department, 2014
This is a description of a teaching experience. You can use markdown like any other post.
Teaching experience 2
Workshop, University 1, Department, 2015
This is a description of a teaching experience. You can use markdown like any other post.