A Large Language Model-based Approach for Personalized Search Results Re-ranking in Professional Domains

Authors

  • Tianyu Lu, Computer Science, Northeastern University, MA, USA
  • Zhongwen Zhou, Computer Science, University of California, Berkeley, CA, USA
  • Jiayi Wang, Computer Engineering, Illinois Institute of Technology, IL, USA
  • Yong Wang, Information Technology, University of Aberdeen, Aberdeen, United Kingdom

DOI:

https://doi.org/10.60087/ijls.v1.n2.001

Keywords:

Large Language Models, Professional Search, Personalization, Cross-encoder Re-ranking

Abstract

Search result personalization and re-ranking in professional domains present significant challenges due to the complexity of domain-specific terminology and varying user expertise levels. This paper proposes a novel framework integrating Large Language Models (LLMs) with personalized search re-ranking for professional domains. The framework incorporates four key components: LLM-based user profile construction, professional domain knowledge encoding, cross-encoder re-ranking, and dynamic weight allocation. The user profile construction module utilizes historical interactions and professional behaviors to generate comprehensive user representations, while the domain knowledge encoding module captures specialized terminology and relationships. A cross-encoder architecture performs deep semantic matching between queries and documents, with results optimized through a dynamic weight allocation strategy. Experimental evaluation on three professional datasets (MedSearch, LegalDoc, and TechQuery) demonstrates significant improvements over existing methods, achieving 15.2% higher nDCG@10 and 12.8% better MRR compared to traditional ranking approaches. The framework maintains stable performance under varying query loads while effectively handling domain-specific terminology and user expertise variations. Ablation studies reveal the substantial impact of each component, with the LLM-based user profile construction and domain knowledge encoding contributing 7.5% and 6.3% improvements respectively. The proposed approach establishes new benchmarks for professional search systems while providing insights into effective integration of LLMs in domain-specific information retrieval.
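The abstract's core scoring idea can be made concrete with a small sketch. The paper does not publish code, so everything below is hypothetical: `cross_encoder_score`, `profile_affinity`, and `dynamic_weights` are toy stand-ins for the framework's cross-encoder re-ranking, LLM-based user profile, and dynamic weight allocation, illustrating only how the two signals might be combined per query.

```python
def cross_encoder_score(query: str, doc: str) -> float:
    """Stand-in for deep query-document semantic matching: plain
    token overlap in place of a trained cross-encoder."""
    q, d = set(query.lower().split()), set(doc.lower().split())
    return len(q & d) / max(len(q), 1)

def profile_affinity(doc: str, profile: set) -> float:
    """Stand-in for the LLM-built user profile: fraction of profile
    terms (e.g. drawn from historical interactions) found in the doc."""
    d = set(doc.lower().split())
    return len(profile & d) / max(len(profile), 1)

def dynamic_weights(query: str):
    """Toy dynamic allocation: longer, more specific queries lean on
    semantic relevance; short queries lean on the user profile."""
    w_rel = min(1.0, len(query.split()) / 5)
    return w_rel, 1.0 - w_rel

def rerank(query: str, docs: list, profile: set) -> list:
    """Order documents by a weighted blend of the two signals."""
    w_rel, w_pers = dynamic_weights(query)
    return sorted(
        docs,
        key=lambda d: w_rel * cross_encoder_score(query, d)
                      + w_pers * profile_affinity(d, profile),
        reverse=True,
    )

docs = [
    "general overview of cardiology",
    "myocardial infarction treatment guidelines for clinicians",
]
ranked = rerank("myocardial infarction treatment", docs,
                profile={"clinicians", "guidelines"})
```

In the actual framework the relevance signal would come from a trained cross-encoder over query-document pairs and the profile from LLM-generated user representations; the sketch only shows the weighted combination step.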

Published

2024-11-23

How to Cite

A Large Language Model-based Approach for Personalized Search Results Re-ranking in Professional Domains. (2024). The International Journal of Language Studies (ISSN: 3078-2244), 1(2), 1-6. https://doi.org/10.60087/ijls.v1.n2.001
