mar'18 | Two papers accepted for ICLR Workshops:
- DeepNCM: Deep Nearest Class Mean Classifiers, with Samantha Guerriero and Barbara Caputo; see OpenReview and the code on GitHub.
- IterGANs for Object Rotation, with Ysbrand Galama; see OpenReview, the extended version on arXiv, and the source code on GitHub.
Projects are usually 6 ECTS; large projects can be 12 ECTS
Requests take around 8 weeks for approval
After approval, start the project
After completion, the supervisor and examiner will grade your project
Note: In total you're entitled to have a maximum of 12 ECTS of project courses (PAI, PAI2, and PAI3) on your study programme!
2017 Computer Vision II, MSc AI
2016 Applied Machine Learning, MSc IS (HCM and DS tracks)
2013-2015 Visual Search Engines, MSc IS (HCM track)
Current Supervision (Outdated)
MSc IS 2017: Daniël Bartolomé-Rojas, Diederik Beker, Michal Kozal, Yu-Ri Tan, Madli Uutma, Bastiaan Waanders, and Aeron Yung
MSc AI 2017: Ysbrand Galama, Thomas Jongstra (finished March'17) and Sébastien Negrijn
BSc AI 2017: Caitlin Lagrand
Teaching & Supervision
MSc AI thesis
Encoding Context in Visual Representations
Abstract: The goal of this project is to study how context is encoded in visual representations.
There is evidence that activation patterns in brain regions related to visual observation are influenced by external context, e.g. by music.
So, depending on the style of music, the same visual stimulus yields different activation patterns.
In this project, we aim to reproduce this behavior by training ConvNets with an additional context input.
The goal is to observe to what extent we obtain similar results, and how these insights can be used to further our understanding of the brain.
This is a joint project with Jorrit Montijn (Netherlands Institute for Neuroscience, Google Scholar).
Followed Computer Vision 1 / Deep Learning
Interest in Machine Learning
Interest in Neuroscience
Willing to work at both NIN and Science Park
Experience with TensorFlow/PyTorch
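The conditioning idea above can be sketched in plain NumPy, assuming a late-fusion design in which a context vector (e.g. a music embedding) is combined linearly with the visual features before classification. All names and dimensions here are illustrative, not part of the actual project:

```python
import numpy as np

rng = np.random.default_rng(0)

def context_logits(img_feat, ctx_feat, W_img, W_ctx, b):
    """Fuse visual features with a context vector via late linear fusion.

    The same visual input paired with different contexts yields
    different activations, mirroring the brain-imaging effect
    described above.
    """
    return img_feat @ W_img + ctx_feat @ W_ctx + b

# illustrative dimensions: 8-d visual features, 3-d context, 5 classes
img = rng.normal(size=(4, 8))   # stand-in for features from a trained ConvNet
W_img = rng.normal(size=(8, 5))
W_ctx = rng.normal(size=(3, 5))
b = np.zeros(5)

# the same images under two different (dummy) contexts
logits_ctx_a = context_logits(img, np.ones((4, 3)), W_img, W_ctx, b)
logits_ctx_b = context_logits(img, np.zeros((4, 3)), W_img, W_ctx, b)
```

In the actual project the fusion would sit inside a ConvNet trained end-to-end (e.g. in TensorFlow or PyTorch), but the core behaviour is the same: identical stimulus, different context, different activations.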
Large Scale Visual Classification Using Class Means
Abstract: The final layer of (almost) any classification network is a soft-max layer.
An alternative is to learn using class prototypes, e.g. the mean representation of each class.
In this project we explore Deep Nearest Mean Classification for large scale classification.
The starting point is the DeepNCM paper, but the goal is to extend DeepNCM and to explore training on large(r) datasets in the order of 3,000 classes and 10M training images.
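The decision rule behind nearest-class-mean classification can be sketched as follows. This is a plain NumPy illustration of the rule only; DeepNCM itself learns the feature extractor and updates the class means during training, and all names below are illustrative:

```python
import numpy as np

def class_means(features, labels, n_classes):
    # the prototype of each class is the mean of its feature vectors
    return np.stack([features[labels == c].mean(axis=0)
                     for c in range(n_classes)])

def ncm_predict(features, means):
    # assign each sample to the class with the nearest mean (Euclidean)
    dists = ((features[:, None, :] - means[None, :, :]) ** 2).sum(axis=2)
    return dists.argmin(axis=1)

# toy two-class example in a 2-d feature space
feats = np.array([[0., 0.], [0., 1.], [10., 10.], [10., 11.]])
labels = np.array([0, 0, 1, 1])
means = class_means(feats, labels, n_classes=2)
preds = ncm_predict(np.array([[0., 0.2], [9., 9.]]), means)  # -> [0, 1]
```

A practical appeal of this rule for large-scale settings is that adding a class only requires computing one more mean, rather than retraining a soft-max layer over all classes.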