
No.24-06 Generative Sequential Recommendation

May 22, 2024 · 1 min read

In this talk, we first introduce the Sequential Recommendation problem and draw parallels between language modelling and recommender systems. To set the stage, we also briefly cover state-of-the-art methods such as BERT4Rec, describe the traditional “score-and-rank” approach for producing recommendations, and outline typical recommendation goals (accuracy, diversity, novelty, etc.).
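To make the “score-and-rank” idea concrete, here is a minimal, hypothetical sketch in Python: a sequence encoder (standing in for a model such as BERT4Rec) produces one user vector, every catalogue item is scored against it, and the top-k items form the recommendation list. The placeholder encoder and all names are illustrative, not taken from any particular library.

```python
import numpy as np

rng = np.random.default_rng(0)
num_items, dim = 1_000, 64

# Stand-ins for learned item embeddings; a real system would train these.
item_embeddings = rng.normal(size=(num_items, dim))

def encode_sequence(history):
    """Placeholder encoder: mean of the history's item embeddings.
    A real model (e.g. BERT4Rec) would run a Transformer over the sequence."""
    return item_embeddings[history].mean(axis=0)

def score_and_rank(history, k=10):
    user_vec = encode_sequence(history)            # (dim,)
    scores = item_embeddings @ user_vec            # one relevance score per catalogue item
    top_k = np.argpartition(-scores, k)[:k]        # unordered top-k candidates
    return top_k[np.argsort(-scores[top_k])]       # ranked recommendation list

print(score_and_rank(history=[3, 42, 7], k=5))
```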

Moving forward, we then demonstrate why the “score-and-rank” approach may fail when recommendation goals include beyond-accuracy objectives, such as diversity. We show how this problem can be addressed with a generative recommendation approach, and how generative recommendation models can be optimised for almost any recommendation goal using reinforcement learning.
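The sketch below illustrates, in a simplified and hypothetical form, how a generative recommender can be trained for a list-level objective with a REINFORCE-style update: a slate is sampled item by item, a reward combining accuracy with a simple diversity bonus scores the whole list, and the policy gradient updates the model. The toy model, reward, and hyperparameters are placeholders, not the speaker's actual method.

```python
import torch
import torch.nn.functional as F

num_items, dim, slate_size = 1_000, 32, 5
item_emb = torch.nn.Embedding(num_items, dim)      # toy "policy" parameters
user_vec = torch.randn(dim)                        # stand-in for an encoded user history
optimizer = torch.optim.Adam(item_emb.parameters(), lr=1e-3)

def generate_slate():
    """Sample a recommendation list item by item, keeping log-probs for the gradient."""
    chosen, log_probs = [], []
    for _ in range(slate_size):
        logits = item_emb.weight @ user_vec        # score every catalogue item
        probs = F.softmax(logits, dim=-1)
        item = torch.multinomial(probs, 1).item()  # sample the next item in the list
        log_probs.append(torch.log(probs[item]))
        chosen.append(item)
    return chosen, torch.stack(log_probs)

def slate_reward(slate, relevant=frozenset({3, 42, 7})):
    """Illustrative list-level reward: relevance hits plus a simple diversity bonus."""
    accuracy = len(set(slate) & relevant)
    diversity = len(set(slate)) / len(slate)       # penalises repeated items
    return accuracy + diversity

slate, log_probs = generate_slate()
loss = -slate_reward(slate) * log_probs.sum()      # REINFORCE: maximise expected reward
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

Because the reward is computed on the generated list as a whole, any recommendation goal that can be scored (diversity, novelty, coverage, and so on) can, in principle, be plugged into the same loop.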

We also address the challenges specific to generative recommendation models, such as item catalogues that can be far larger than the vocabularies of language models, and data sparsity. Finally, we discuss the role of generative Large Language Models in the future of sequential recommendation.
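As a purely illustrative example of how very large catalogues are often handled in practice (and not necessarily the approach covered in the talk), the following sketch computes the training loss with a sampled softmax: the observed item is classified against a small set of sampled negatives instead of the full catalogue.

```python
import torch
import torch.nn.functional as F

num_items, dim, num_negatives = 100_000, 32, 256      # catalogue larger than a typical LM vocabulary
item_emb = torch.nn.Embedding(num_items, dim)
user_vec = torch.randn(1, dim)                        # stand-in for an encoded interaction history
positive = torch.tensor([12345])                      # the observed next item

# Uniformly sampled negatives stand in for the rest of the catalogue.
negatives = torch.randint(0, num_items, (num_negatives,))
candidates = torch.cat([positive, negatives])         # the positive sits at index 0

logits = user_vec @ item_emb(candidates).T            # (1, 1 + num_negatives)
loss = F.cross_entropy(logits, torch.tensor([0]))     # classify the positive among the sample
loss.backward()                                       # gradients flow only to the sampled items' embeddings
```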

Speaker Bio

Aleksandr Petrov is a final-year PhD candidate at the University of Glasgow, specialising in the use of Transformers for recommendation over large item catalogues. His scholarly contributions include papers in venues such as RecSys, WSDM, ECIR and ToRS. He received the Best Paper Award at RecSys 2023 and was nominated for the Best Student Paper Award at RecSys 2022. Before his PhD, he accumulated over ten years of industry experience at big tech companies such as Amazon and Yandex, and he co-founded a recommendation startup, E-Contenta. Aleksandr also has experience lecturing in Big Data and ML courses, presenting RecSys tutorials, and giving invited talks.

More Details

  • When: Wed 22 May 2024, 3:00-4:00 pm (GMT+10)
  • Speaker: Aleksandr Petrov (University of Glasgow)
  • Host: Dr Ruihong Qiu
  • Venue: Online only
  • Zoom: https://uqz.zoom.us/j/81016859627