
FoMo (Foundation model) lectures

The FoMo lectures are a series of seminars highlighting the work being done in Amsterdam around foundation models: large, highly reusable machine learning models trained on vast amounts of data. These include large language models like GPT and generative vision models like DALL·E 2, Stable Diffusion and Midjourney. The talks alternate between the UvA and the VU, and feature speakers from a variety of domains. They are intended for a technical audience.

13 June 12:00-13:00 FoMo lectures: Gabriel Bénédict and Wouter van Atteveldt
Talk 1: RecFusion: A Binomial Diffusion Process for 1D Data for Recommendation, Gabriel Bénédict
Talk 2: Are LLMs and Transfer Learning a game changer for Computational Social Science?, Wouter van Atteveldt

In room NU-3A06 at the VU* or online at https://vu-live.zoom.us/j/95377231761?pwd=dmFwZHNrSHlOMTdNQVRJKzArZEZZQT09

For more information see below or check: https://pbloem.github.io/fomo/

**********************************************

20 June 12:00-13:00 FoMo lectures, session 3

Room L3.36, Lab42 at UvA or online at https://vu-live.zoom.us/j/95377231761?pwd=dmFwZHNrSHlOMTdNQVRJKzArZEZZQT09

************************************************

27 June 12:00-13:00 FoMo lectures, session 4

Location TBD or online at https://vu-live.zoom.us/j/95377231761?pwd=dmFwZHNrSHlOMTdNQVRJKzArZEZZQT09

*************************************************

Gabriel Bénédict (IR Lab, UvA)

RecFusion: A Binomial Diffusion Process for 1D Data for Recommendation

Generative Information Retrieval (a.k.a. Generative Neural Search, or ChatGPT with attribution and without hallucination) has experienced substantial growth across multiple research communities and has been highly visible in the popular press. Theoretical and empirical work, as well as user-facing products, have been released that retrieve documents via generation (Generative Document Retrieval) or directly generate answers to an input request (Grounded Answer Generation).

A subfield of Generative IR, Generative Recommendations, is still in its infancy. We propose RecFusion, which uses diffusion models to generate recommendations. We benchmark classical diffusion formulations (normal distributions for the forward and backward diffusion processes, U-Nets and the ELBO) against formulations fitted to the RecSys setting: 1D diffusion (user by user), binomial diffusion, and a multinomial loss (as in MultVAE). We also experiment with diffusion guidance to condition the generation of recommendation strips on movie genre (a.k.a. controllable recommendation).
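The binomial (Bernoulli) forward process mentioned in the abstract can be sketched as follows, operating on a single user's binary interaction vector (the "1D, user-by-user" view). This is a minimal illustration of the general binary-diffusion idea, not the paper's exact parameterisation:

```python
import random

def bernoulli_forward(x0, betas, rng):
    """Forward pass of a Bernoulli ("binomial") diffusion on a binary vector:
    at step t each bit is kept with probability 1 - beta_t and resampled
    from Bernoulli(0.5) with probability beta_t, i.e.
    p(x_t = 1 | x_{t-1}) = x_{t-1} * (1 - beta_t) + 0.5 * beta_t."""
    x = list(x0)
    trajectory = [list(x)]
    for beta in betas:
        x = [1 if rng.random() < xi * (1 - beta) + 0.5 * beta else 0 for xi in x]
        trajectory.append(list(x))
    return trajectory

rng = random.Random(0)
x0 = [1, 0, 0, 1, 0, 0, 0, 1]      # one user's binary interaction vector (illustrative)
betas = [0.1, 0.2, 0.3, 0.4, 0.5]  # hypothetical noise schedule
traj = bernoulli_forward(x0, betas, rng)
```

As the schedule progresses, the vector drifts toward uniform Bernoulli(0.5) noise; the generative model is then trained to reverse this corruption step by step.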

Wouter van Atteveldt (Computational Communication Science, VU)

Are LLMs and Transfer Learning a game changer for Computational Social Science?

A core part of computational social science is extracting structured data, such as political topics or stances, from unstructured data such as news articles, Twitter, or TikTok feeds. Although supervised machine learning has been part of our toolkit for at least two decades, it has traditionally suffered from data scarcity: the complex and shifting nature of social science concepts makes them unsuitable for the large standardized data sets common in computer vision and NLP. BERT and other pre-trained models may well be a game changer here, as they can offer valid results even with relatively small data sets. We show empirically how BERT and BERT-NLI can be used for valid measurement of political communication concepts in a number of settings.
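The NLI-based measurement recipe the abstract refers to can be sketched as: phrase each label as a natural-language hypothesis, score every (text, hypothesis) pair with an entailment model, and pick the most entailed label. In the sketch below, `entailment_score` is a hypothetical stand-in (a crude stem-matching heuristic) for a real BERT-NLI model; the label set and hypotheses are likewise made up for illustration:

```python
# Label -> natural-language hypothesis that would be paired with the text
# and fed to an NLI model. Labels and phrasings here are illustrative only.
HYPOTHESES = {
    "economy": "This text is about the economy.",
    "immigration": "This text is about immigration.",
    "climate": "This text is about the climate.",
}

def entailment_score(premise: str, hypothesis: str) -> float:
    """Hypothetical stand-in for a BERT-NLI model's P(entailment).
    A crude stem-overlap check keeps the sketch runnable offline."""
    stems = {"econom", "immigr", "climat"}
    for stem in stems:
        if stem in hypothesis.lower():
            return 1.0 if stem in premise.lower() else 0.0
    return 0.0

def classify(text: str) -> str:
    """Zero-shot classification: return the label whose hypothesis
    the text most strongly entails."""
    return max(HYPOTHESES, key=lambda lbl: entailment_score(text, HYPOTHESES[lbl]))
```

The appeal for social science is that new concepts only require writing new hypotheses, not collecting a large labeled data set; swapping the stub for an actual NLI model leaves the surrounding logic unchanged.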

If you can't make it in person, you can also join online via the following Zoom link:

https://vu-live.zoom.us/j/95377231761?pwd=dmFwZHNrSHlOMTdNQVRJKzArZEZZQT09

We hope to see you there,

Peter Bloem & Cees Snoek

* https://vu.nl/en/about-vu/more-about/new-university-building