Neural Style Transfer - in PyTorch & English

When I first heard about NST, and of 'extracting style' from an image, I was deeply suspicious - how can an algorithm define style? Define human creativity & artistic expression, after all? It turns out: pretty easily, actually. (Well, extracting sufficient style features to successfully apply them in a pastiche is pretty easy - defining human creativity is a different topic for another time!)

Read More
Generative Models

Generative models can be described in ML terms as models that learn a data distribution using unsupervised learning. Some examples you might have seen include removing watermarks, transforming zebras into horses (and vice versa), and creating pictures of people who don't exist, among others. When I started diving into this field, the range of methods, as well as what they could do, was confusing to me. After a lot of research, the simple taxonomy developed by Ian Goodfellow remains…

Read More
Reading ML Papers as a newcomer - tips & tricks

I am by no means an expert in ML. However, I am a former consultant and a newcomer to reading ML papers, in a program that requires a lot of reading them. So you could say that I am an expert in dealing with complicated content that I am not well versed in ;-) So, this week I thought I would put down the tips, tricks, hacks & approaches that have helped me in tackling ML research papers.

Read More
ML, Papers, CS230
Getting to grips with Batch Norms

This week I completed Assignment 2 from the awesome Stanford CS231n course. This included implementing (among other things) vectorized backpropagation, batch & layer normalization, and building a CNN to train on CIFAR-10, both in vanilla Python and TensorFlow. Implementing batch normalization - particularly the backward pass - was one of the more surprising parts of the assignment, so I thought I would write about it here.
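As a taste of what the post covers - and this is just a rough NumPy sketch with my own function names, not the assignment's actual API - the forward pass and the compact, fully vectorized backward pass look something like this:

```python
import numpy as np

def batchnorm_forward(x, gamma, beta, eps=1e-5):
    """Normalize a (N, D) mini-batch per feature, then scale & shift."""
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mu) / np.sqrt(var + eps)   # zero mean, unit variance
    out = gamma * x_hat + beta              # learnable scale & shift
    cache = (x_hat, gamma, var, eps)
    return out, cache

def batchnorm_backward(dout, cache):
    """Backward pass with the computational graph collapsed into one expression."""
    x_hat, gamma, var, eps = cache
    N = dout.shape[0]
    dbeta = dout.sum(axis=0)
    dgamma = (dout * x_hat).sum(axis=0)
    dx_hat = dout * gamma
    # The surprising part: mu and var both depend on x, so dx picks up
    # correction terms beyond the obvious dx_hat / std path.
    dx = (1.0 / (N * np.sqrt(var + eps))) * (
        N * dx_hat
        - dx_hat.sum(axis=0)
        - x_hat * (dx_hat * x_hat).sum(axis=0)
    )
    return dx, dgamma, dbeta
```

The one-expression `dx` is what you get after symbolically simplifying the step-by-step chain rule through the mean and variance nodes - which is exactly the part of the assignment the post digs into.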

Read More