Evaluating GANs (Generative Adversarial Networks) is difficult – unlike classification problems, there is no final accuracy metric to compare against, for instance. For my OpenAI Spring Scholars project, I focused on different ways to understand & evaluate image-synthesis GANs, using the approach of Distill’s Activation Atlas.
When I first heard about NST (neural style transfer), & of ‘extracting style’ from an image, I was deeply suspicious - how can an algorithm define style? Define human creativity & artistic expression, after all?… It turns out - pretty easily, actually. (Well, extracting sufficient style features to successfully apply them to a pastiche is pretty easy - defining human creativity is a different topic for another time!)
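For concreteness, the ‘style’ being extracted usually boils down to channel-wise feature correlations (Gram matrices) taken from a pretrained CNN such as VGG. Here is a minimal NumPy sketch of that idea - the random arrays are just stand-ins for real CNN feature maps, not anything from an actual NST pipeline:

```python
import numpy as np

def gram_matrix(features):
    """Channel-wise correlations of a conv feature map with shape (C, H, W)."""
    C, H, W = features.shape
    F = features.reshape(C, H * W)
    return F @ F.T / (C * H * W)

# Stand-in feature maps; in a real NST pipeline these would come from a
# pretrained CNN evaluated on the style image and the generated pastiche.
style_feats = np.random.rand(64, 32, 32)
pastiche_feats = np.random.rand(64, 32, 32)

# The per-layer style loss is just the squared difference of Gram matrices,
# summed over a handful of layers in practice.
style_loss = np.sum((gram_matrix(style_feats) - gram_matrix(pastiche_feats)) ** 2)
print(style_loss)
```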
…or how a Muggle can perform Math Magic.
I recently had to perform a large amount of dimensionality reduction - & as such needed to consider how best to do it - in the end I went with UMAP (Uniform Manifold Approximation and Projection). It is a relatively new technique, so I figured that putting down some thoughts might be of interest.
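To give a sense of how little code the reduction itself takes, here is a rough sketch assuming the umap-learn package, with scikit-learn’s digits dataset standing in for my actual data:

```python
import umap
from sklearn.datasets import load_digits

# Stand-in dataset: 1,797 samples, each a 64-dimensional flattened digit image.
digits = load_digits()

# n_neighbors trades off local vs. global structure; min_dist controls how
# tightly points are packed together in the low-dimensional embedding.
reducer = umap.UMAP(n_neighbors=15, min_dist=0.1, n_components=2, random_state=42)
embedding = reducer.fit_transform(digits.data)

print(embedding.shape)  # (1797, 2)
```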
Generative models can be described in ML terms as learning any kind of data distribution using unsupervised learning. Some examples you might have seen include removing watermarks, transforming zebras into horses (and vice versa), and creating pictures of people who don't exist, among others. When I started diving into this field, the range of methods, as well as what they could do, was confusing to me. After a lot of research, the simple taxonomy developed by Ian Goodfellow remains…
I am by no means an expert in ML. However, I am a former consultant and a newcomer to reading ML papers, in a program that requires a lot of reading them. So you could say that I am an expert in dealing with complicated content that I am not well versed in ;-) So this week I thought I would put down the tips, tricks, hacks & approach that have helped me in tackling ML research papers.
This week I completed Assignment 2 from the awesome Stanford CS231n course. This included implementing (among other things) vectorized backpropagation and batch & layer normalization, and building a CNN to train on CIFAR-10 in both vanilla Python and TensorFlow. Implementing batch normalization - particularly the backward pass - was one of the more surprising parts of the assignment, so I thought I would write about it here.
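To give a flavour of the part that surprised me, here is a rough NumPy sketch of a vectorized batch norm forward pass and the simplified backward pass - an illustration of the technique, not the assignment’s exact code:

```python
import numpy as np

def batchnorm_forward(x, gamma, beta, eps=1e-5):
    """x: (N, D) minibatch. Normalize each feature over the batch dimension."""
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mu) / np.sqrt(var + eps)
    out = gamma * x_hat + beta
    cache = (x_hat, gamma, var, eps)
    return out, cache

def batchnorm_backward(dout, cache):
    """Vectorized backward pass, after simplifying the computational graph."""
    x_hat, gamma, var, eps = cache
    dgamma = np.sum(dout * x_hat, axis=0)
    dbeta = np.sum(dout, axis=0)
    dx_hat = dout * gamma
    # The surprising bit: dx mixes the per-example gradient with two
    # batch-wide correction terms coming from the mean and the variance.
    dx = (dx_hat
          - dx_hat.mean(axis=0)
          - x_hat * (dx_hat * x_hat).mean(axis=0)) / np.sqrt(var + eps)
    return dx, dgamma, dbeta
```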