Emiel Hoogeboom
Biography
My name is Emiel Hoogeboom and I am a PhD student at the University of Amsterdam under the supervision of Max Welling, in the UvA-Bosch Delta Lab.
Interests
- Generative Modelling
- Bayesian Inference
- Artificial Intelligence
Education
- MSc in Artificial Intelligence, 2017, University of Amsterdam
- BSc in Aerospace Engineering, 2015, Delft University of Technology
Recent Posts
How to build E(n) Equivariant Normalizing Flows, for points with features?
How do you build the E(n) Equivariant Normalizing Flows from our recent paper? We will discuss 1) Normalizing Flows, 2) Continuous-Time Normalizing Flows, 3) E(n) GNNs, 4) Argmax Flows, and finally 5) our E(n) Flows. Most of these topics are tangential: if you don’t care about one, just read the intuition and skip it :)
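As a minimal illustration of the normalizing-flow idea the post starts from, here is the change-of-variables computation for a toy elementwise affine flow. The function names and the affine choice are illustrative only; the post builds far richer flows than this:

```python
import numpy as np

# Toy normalizing flow: an elementwise affine map z = a * x + b.
# Illustrative sketch only, not the model from the paper.

def affine_forward(x, a, b):
    """Invertible transform z = a * x + b; log|det J| = sum(log|a|)."""
    z = a * x + b
    log_det = np.sum(np.log(np.abs(a)))
    return z, log_det

def affine_inverse(z, a, b):
    return (z - b) / a

def standard_normal_logpdf(z):
    return -0.5 * np.sum(z ** 2) - 0.5 * z.size * np.log(2.0 * np.pi)

def log_prob(x, a, b):
    # Change of variables: log p_X(x) = log p_Z(f(x)) + log|det df/dx|
    z, log_det = affine_forward(x, a, b)
    return standard_normal_logpdf(z) + log_det
```

Stacking many such invertible maps, and replacing the affine transform with the equivariant layers discussed below, is what turns this toy into an E(n) flow.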
Invertible Convolutions
We introduce three types of invertible convolutions: i) emerging convolutions for invertible zero-padded convolutions, ii) invertible periodic convolutions, and iii) stable and flexible 1 × 1 convolutions.
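To give a concrete sense of case iii), a 1 × 1 convolution applies one shared channel-mixing matrix W at every pixel, so it can be inverted exactly with W⁻¹ and its log-determinant is H·W·log|det W|. A minimal numpy sketch (the names are illustrative):

```python
import numpy as np

# A 1x1 convolution mixes channels with a shared matrix W at every
# pixel. It is invertible whenever W is, with log-determinant
# H * W_img * log|det W|.

def conv1x1(x, W):
    """x: array of shape (C, H, W_img); W: (C, C) mixing matrix."""
    C, H, Wimg = x.shape
    z = np.einsum('ij,jhw->ihw', W, x)
    log_det = H * Wimg * np.log(np.abs(np.linalg.det(W)))
    return z, log_det

def conv1x1_inverse(z, W):
    """Exact inverse: apply W^{-1} channel-wise at every pixel."""
    return np.einsum('ij,jhw->ihw', np.linalg.inv(W), z)
```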
Publications
E(n) Equivariant Normalizing Flows
This paper introduces a generative model equivariant to Euclidean symmetries: E(n) Equivariant Normalizing Flows (E-NFs). To construct …
Argmax Flows and Multinomial Diffusion: Learning Categorical Distributions
Generative flows and diffusion models have been predominantly trained on ordinal data, for example natural images. This paper …
E(n) Equivariant Graph Neural Networks
This paper introduces a new model to learn graph neural networks equivariant to rotations, translations, reflections and permutations …
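The core equivariant trick can be sketched in a few lines: coordinates are updated along pairwise difference vectors, weighted by a function of the invariant pairwise distances, so rotating, translating, or reflecting the input transforms the output identically. A toy numpy version (the scalar weight function here is a fixed placeholder, not the learned network from the paper):

```python
import numpy as np

# Toy E(n)-equivariant coordinate update in the spirit of EGNN:
# x_i <- x_i + sum_j (x_i - x_j) * phi(||x_i - x_j||^2).
# phi is a toy function here; in the paper it is a learned MLP.

def egnn_coord_update(x, w=0.1):
    n = x.shape[0]
    out = x.copy()
    for i in range(n):
        for j in range(n):
            if i != j:
                diff = x[i] - x[j]
                d2 = np.dot(diff, diff)          # rotation/translation invariant
                out[i] = out[i] + w * diff * np.tanh(-d2)
    return out
```

Because the weights depend only on distances and the update direction is a difference vector, applying any rotation, reflection, or translation to the input commutes with the layer.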
Self Normalizing Flows
Efficient gradient computation of the Jacobian determinant term is a core problem of the normalizing flow framework. Thus, most …
Thomas Anderson Keller, Jorn W.T. Peters, Priyank Jaini, Emiel Hoogeboom, Patrick Forré, Max Welling
Variational Determinant Estimation with Spherical Normalizing Flows
This paper introduces the Variational Determinant Estimator (VDE), a variational extension of the recently proposed determinant …
The Convolution Exponential and Generalized Sylvester Flows
This paper introduces a new method to build linear flows, by taking the exponential of a linear transformation. This linear …
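The underlying observation can be illustrated in plain matrix form: exp(M) is invertible for any square M, its inverse is exp(−M), and its log-determinant is trace(M). A small numpy sketch using a truncated Taylor series (the paper lifts this to convolution operators; here M is just a dense matrix):

```python
import numpy as np

# exp(M) is always invertible: exp(M)^{-1} = exp(-M) and
# log det exp(M) = trace(M). Truncated Taylor series sketch.

def matrix_exp(M, terms=40):
    E = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for k in range(1, terms):
        term = term @ M / k
        E = E + term
    return E
```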
SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows
Normalizing flows and variational autoencoders are powerful generative models that can represent complicated density functions. …
Learning Discrete Distributions by Dequantization
Media is generally stored digitally and is therefore discrete. Many successful distribution models in deep learning learn a …
Predictive Sampling with Forecasting Autoregressive Models
Autoregressive models (ARMs) currently hold state-of-the-art performance in likelihood-based modeling of image and audio data. …
Learning Likelihoods with Conditional Normalizing Flows
Normalizing Flows (NFs) are able to model complicated distributions p(y) with strong inter-dimensional correlations and high …
Integer Discrete Flows and Lossless Compression
Lossless compression methods shorten the expected representation size of data without loss of information, using a statistical model. …
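One building block that lets flows act directly on integers is additive coupling with a rounded shift, which is exactly invertible with no dequantization noise. A minimal sketch (hypothetical names; in practice the shift comes from a network conditioned on the untouched part of the input):

```python
import numpy as np

# Additive integer coupling: z = x + round(shift) stays on the integer
# lattice and is exactly invertible by subtracting the same rounded shift.

def integer_coupling_forward(x, shift):
    return x + np.round(shift).astype(x.dtype)

def integer_coupling_inverse(z, shift):
    return z - np.round(shift).astype(z.dtype)
```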
Emerging Convolutions for Generative Normalizing Flows
Generative flows are attractive because they admit exact likelihood optimization and efficient image synthesis. Recently, Kingma & …
HexaConv
The effectiveness of Convolutional Neural Networks stems in large part from their ability to exploit the translation invariance that is …