We will cover the following tasks in 52 minutes:
In this chapter, let’s go over the course objectives and get a feel for the material we will encounter later. We will look at a few motivating examples of k-means clustering, and then build our own implementation.
In this chapter, we start by cultivating our intuition for the settings where k-means clustering is most appropriate. Next, we generate a two-dimensional dataset containing four distinct blobs. To emphasize that k-means is an unsupervised algorithm, we leave the labels out of the visualization.
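A minimal sketch of generating such a dataset with scikit-learn's make_blobs; the sample size, spread, and random seed here are assumptions, not the course's exact values:

```python
from sklearn.datasets import make_blobs
import matplotlib
matplotlib.use("Agg")  # non-interactive backend, so this runs in a script
import matplotlib.pyplot as plt

# Generate 300 two-dimensional points around four distinct centers
X, y_true = make_blobs(n_samples=300, centers=4,
                       cluster_std=0.60, random_state=0)

# Plot the points WITHOUT their labels, since k-means is unsupervised
plt.scatter(X[:, 0], X[:, 1], s=30)
plt.savefig("blobs.png")
```

Although make_blobs returns the true labels in y_true, we deliberately ignore them when visualizing.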
k-Means: By Example
Now that we have an intuitive understanding of k-means, we will cluster our dataset from Chapter 2 using sklearn.cluster.KMeans. Lastly, we use this intuition to build a mathematically sound formulation of the k-means objective function.
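A sketch of what the KMeans step might look like; the dataset construction mirrors the earlier blob example and its parameters are assumptions:

```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

# Recreate the four-blob dataset (parameters assumed, as before)
X, _ = make_blobs(n_samples=300, centers=4, cluster_std=0.60, random_state=0)

# Fit k-means with the known number of blobs
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0)
labels = kmeans.fit_predict(X)       # cluster assignment for each point
centers = kmeans.cluster_centers_    # one learned center per cluster
```

fit_predict both fits the estimator and returns the per-point cluster labels in one call.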
Plotting the Results
Let’s visualize the results by plotting the data colored according to labels. We will also plot the cluster centers as determined by the k-means estimator:
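One way this plot might be built, coloring points by their assigned label and overlaying the learned centers (the dataset parameters and styling choices are assumptions):

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend, so this runs in a script
import matplotlib.pyplot as plt
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=300, centers=4, cluster_std=0.60, random_state=0)
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)

# Color each point by the cluster label k-means assigned to it
plt.scatter(X[:, 0], X[:, 1], c=kmeans.labels_, s=30, cmap="viridis")
# Overlay the cluster centers determined by the estimator
plt.scatter(kmeans.cluster_centers_[:, 0], kmeans.cluster_centers_[:, 1],
            c="black", s=200, alpha=0.6)
plt.savefig("kmeans_clusters.png")
```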
In this chapter, we break down the workhorse behind k-means: the expectation–maximization (EM) algorithm. Once we understand the fundamentals thoroughly, we implement the EM algorithm from scratch and visualize our results!
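A from-scratch sketch of the EM procedure for k-means; the function name, initialization strategy, and stopping rule here are our own assumptions, not necessarily the course's implementation:

```python
import numpy as np
from scipy.spatial.distance import cdist  # pairwise Euclidean distances
from sklearn.datasets import make_blobs

def find_clusters(X, n_clusters, rseed=2):
    # 1. Initialize centers by picking random data points
    rng = np.random.RandomState(rseed)
    idx = rng.permutation(X.shape[0])[:n_clusters]
    centers = X[idx]
    while True:
        # 2. E-step: assign each point to its nearest center
        labels = cdist(X, centers).argmin(axis=1)
        # 3. M-step: recompute each center as the mean of its points
        #    (note: a cluster that loses all its points would break this
        #    simple version; real implementations guard against that)
        new_centers = np.array([X[labels == k].mean(axis=0)
                                for k in range(n_clusters)])
        # 4. Stop once the centers no longer move
        if np.allclose(centers, new_centers):
            break
        centers = new_centers
    return centers, labels

X, _ = make_blobs(n_samples=300, centers=4, cluster_std=0.60, random_state=0)
centers, labels = find_clusters(X, 4)
```

Alternating the E-step (assignment) and M-step (center update) is exactly the loop that sklearn's KMeans runs under the hood, with extra safeguards and multiple restarts.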
Failure Cases: Suboptimal Local Minimum
Here we look at situations where k-means fails in practice, and try to piece together why.
First, although the E–M procedure is guaranteed to improve the result at each step, there is no assurance that it will reach the globally best solution. For example, if we use a different random seed in our simple procedure, certain starting guesses lead to poor results.
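One way to probe this with sklearn is to disable its usual restarts (n_init=1) and compare runs from different seeds; the inertia_ attribute (sum of squared distances to the nearest center) measures solution quality, and the specific seed values below are arbitrary assumptions:

```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=300, centers=4, cluster_std=0.60, random_state=0)

# A single random start, no restarts: quality depends on the starting guess
single = KMeans(n_clusters=4, init="random", n_init=1, random_state=5).fit(X)

# Ten random restarts, keeping the best result: the usual remedy
best = KMeans(n_clusters=4, init="random", n_init=10, random_state=0).fit(X)

# Comparing single.inertia_ against best.inertia_ across different seeds
# reveals that some starting guesses converge to a worse local minimum
```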
Failure Cases: Number of Clusters
Another common challenge with k-means is that you must tell it how many clusters you expect: it cannot learn the number of clusters from the data. For example, if we ask the algorithm to identify six clusters, it will happily proceed and find the best six clusters.
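A sketch of this behavior on the four-blob data (dataset parameters assumed as before): asking for six clusters yields exactly six, whether or not that matches the data's true structure.

```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

# The data actually contains four blobs...
X, _ = make_blobs(n_samples=300, centers=4, cluster_std=0.60, random_state=0)

# ...but k-means will happily split it into however many clusters we request
labels = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(X)
```

Whether the six-cluster result is meaningful is a question k-means itself cannot answer; techniques such as the elbow method or silhouette analysis are commonly used to choose the number of clusters.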
About the Host (Snehan Kekre)
Snehan hosts Machine Learning courses at Rhyme. He is in his senior year of university at the Minerva Schools at KGI, pursuing a double major in the Natural Sciences and Computational Sciences, with a focus on physics and machine learning. When not applying computational and quantitative methods to identify the structures shaping the world around him, he can sometimes be seen trekking in the mountains of Nepal.