Adam R. Kosiorek

Hi, I’m Adam. I’m passionate about machine learning, particularly deep generative modelling and representation learning. I spend a lot of time thinking about what objects are and how to learn about them from vision without any supervision. I’m also interested in various inference methods, neural net architectures including attention and memory, and in everything that can be described as “machine reasoning”. Feel free to reach out if you want to chat about or work on any of these topics!

Alongside my studies, I worked at Bloomberg, Samsung, and IBM on various machine learning projects. I am always looking for exciting research and professional opportunities, so feel free to get in touch.

In my free time I read lots of books, train calisthenics, lift heavy weights, jog, and try to spend as much time as possible hiking somewhere in the mountains.

My CV (updated 03/2020)


Y. Shi, N. Siddharth, P. H. S. Torr, A. R. Kosiorek “Adversarial Masking for Self-Supervised Learning”, ICML, 2022. code

A. R. Kosiorek, H. Strathmann, D. Zoran, P. Moreno, R. Schneider, S. Mokra, D. J. Rezende “NeRF-VAE: A Geometry Aware 3D Scene Generative Model”, ICML, 2021.

K. Stelzner, K. Kersting, A. R. Kosiorek “Decomposing 3D Scenes into Objects via Unsupervised Volume Segmentation”, arXiv, 2021.

A. R. Kosiorek, H. Kim, D. J. Rezende “Conditional Set Generation with Transformers”, Workshop on Object-Oriented Learning at ICML, 2020.

K. Stelzner, K. Kersting, A. R. Kosiorek “Generative Adversarial Set Transformers”, Workshop on Object-Oriented Learning at ICML, 2020.

M. Engelcke, A. R. Kosiorek, O. Parker Jones, I. Posner “GENESIS: Generative Scene Inference and Sampling with Object-Centric Latent Representations”, ICLR, 2020. code

J. Xu, J. F. Ton, H. Kim, A. R. Kosiorek, J. Schwarz, Y. W. Teh “Meta Learning in Function Space”, arXiv, 2019.

A. R. Kosiorek, S. Sabour, Y. W. Teh, G. E. Hinton “Stacked Capsule Autoencoders”, NeurIPS, 2019. code

J. Lee, Y. Lee, J. Kim, A. R. Kosiorek, S. Choi, Y. W. Teh “Set Transformer”, ICML, 2019. code

F. B. Fuchs, O. Groth, A. R. Kosiorek, A. Bewley, M. Wulfmeier, A. Vedaldi, I. Posner “Learning Physics with Neural Stethoscopes”, NeurIPS workshop on Modeling the Physical World: Learning, Perception, and Control, 2018.

A. R. Kosiorek, H. Kim, I. Posner, Y. W. Teh “Sequential Attend, Infer, Repeat: Generative Modelling of Moving Objects”, NeurIPS, 2018. code, NeurIPS spotlight talk, video, poster

T. A. Le\(\circ\), A. R. Kosiorek\(\circ\), N. Siddharth, Y. W. Teh, F. Wood “Revisiting Reweighted Wake-Sleep”, arXiv, 2018.

F. B. Fuchs, O. Groth, A. R. Kosiorek, A. Bewley, M. Wulfmeier, A. Vedaldi, I. Posner “Neural Stethoscopes: Unifying Analytic, Auxiliary and Adversarial Network Probing”, arXiv, 2018.

T. Rainforth, A. R. Kosiorek, T. A. Le, C. J. Maddison, M. Igl, F. Wood, Y. W. Teh, “Tighter Variational Bounds are Not Necessarily Better”, ICML, 2018.

A. R. Kosiorek, A. Bewley, I. Posner, “Hierarchical Attentive Recurrent Tracking”, NeurIPS, 2017. code

N. Dhir\(\circ\), A. R. Kosiorek\(\circ\), I. Posner, “Bayesian Delay Embeddings for Dynamical Systems”, NeurIPS Timeseries Workshop, 2017.

\(\circ\) Equal contribution.


A. R. Kosiorek “Learning Object-Centric Representations”, University of Oxford, 2020. source

Forge - a lightweight, framework-agnostic tool for managing ML experiments.

An implementation of “Attend, Infer, Repeat” by Ali Eslami et al. code