
KNN, or K-Nearest Neighbors (Machine Learning Algorithm)

  • Writer: Danielle Costa Nakano
  • Apr 24, 2019
  • 1 min read

Updated: Dec 14, 2024

Description: KNN maps easily onto real life. If you want to learn about a person you have no information about, you might find out about their close friends and the circles they move in, and infer things about them that way.



KNN can be used for both classification and regression problems, though in industry it is more widely used for classification.
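As a sketch of the regression variant (an illustrative assumption, not code from this post), a query point's value can be predicted as the mean of the targets of its k nearest training points:

```python
# Minimal 1-D KNN regression sketch: predict the mean target of the
# k nearest training points (hypothetical toy data below).

def knn_regress(train_x, train_y, query, k):
    # Sort training points by absolute distance to the query, keep the k closest.
    nearest = sorted(zip(train_x, train_y), key=lambda p: abs(p[0] - query))[:k]
    return sum(y for _, y in nearest) / k

# Hypothetical data where y is roughly 2x.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.0, 9.8]
print(knn_regress(xs, ys, 3.5, k=2))  # averages the targets at x=3 and x=4
```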

If K = 1, the case is simply assigned to the class of its single nearest neighbor. Choosing a good K can be a challenge when building a kNN model.


Algorithm: K-nearest neighbors is a simple algorithm that stores all available cases and classifies a new case by a majority vote of its K nearest neighbors: the new case is assigned to the class most common among those neighbors, where "nearest" is measured by a distance function.
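The algorithm above can be sketched in a few lines of plain Python (Euclidean distance and the toy data are my own illustrative choices):

```python
# KNN classification sketch: store all cases, classify a new case by
# majority vote among its k nearest neighbors.
import math
from collections import Counter

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_classify(cases, query, k):
    # cases is a list of (feature_vector, label) pairs.
    neighbors = sorted(cases, key=lambda c: euclidean(c[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Hypothetical toy data: two small clusters.
cases = [((1.0, 1.0), "A"), ((1.2, 0.8), "A"),
         ((5.0, 5.0), "B"), ((5.1, 4.9), "B")]
print(knn_classify(cases, (1.1, 1.0), k=3))  # "A" wins the vote 2 to 1
```

With k = 1 the same function reproduces the K = 1 case mentioned earlier: the query simply takes its single nearest neighbor's class.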

These distance functions can be Euclidean, Manhattan, Minkowski, or Hamming distance. The first three are used for continuous variables, and the fourth (Hamming) for categorical variables.
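The four distance functions can be written as plain-Python one-liners (a minimal sketch; real libraries use vectorized versions):

```python
# The four distance functions mentioned above.
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def manhattan(a, b):
    return sum(abs(x - y) for x, y in zip(a, b))

def minkowski(a, b, p):
    # Generalizes the other two: p=1 gives Manhattan, p=2 gives Euclidean.
    return sum(abs(x - y) ** p for x, y in zip(a, b)) ** (1 / p)

def hamming(a, b):
    # For categorical variables: the number of positions that differ.
    return sum(x != y for x, y in zip(a, b))

print(euclidean((0, 0), (3, 4)))             # 5.0
print(manhattan((0, 0), (3, 4)))             # 7
print(hamming(("red", "S"), ("red", "M")))   # 1
```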

©2025 by Danielle Costa Nakano.