Comparison of K-Means and K-Medoids Clustering

  • Centroid vs. Medoid:
    • K-means: Represents each cluster by its centroid, the mean of the cluster's points, which need not be an actual data point.
    • K-medoids: Represents each cluster by its medoid, the data point that minimizes the total distance to all other points in the cluster.
  • Robustness to Outliers:
    • K-means: Sensitive to outliers, since extreme values pull the mean.
    • K-medoids: More robust, since a medoid must be an actual data point.
  • Initialization:
    • K-means: Sensitive to initial centroids.
    • K-medoids: Less sensitive to initial medoids.
  • Cluster Shape:
    • K-means: Assumes roughly spherical, similarly sized clusters.
    • K-medoids: Accepts any dissimilarity measure, which makes it more adaptable to non-spherical clusters.
  • Computational Complexity (cost):
    • K-means: Less computationally expensive; each iteration is roughly linear in the number of points.
    • K-medoids: More computationally expensive; the classic PAM algorithm evaluates medoid swaps, which scales poorly with dataset size.
  • Cluster Connectivity:
    • K-means: Partitions the space into regions around centroids; does not model connectivity or density.
    • K-medoids: Also partition-based; it assigns each point to its nearest medoid rather than connecting points by density (density-based connectivity is the domain of algorithms such as DBSCAN).
  • Use Cases:
    • K-means: Well-defined, spherical clusters, computational efficiency.
    • K-medoids: Irregular clusters, robustness to outliers.
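The centroid-versus-medoid distinction above can be shown with a tiny sketch. The 1-D values below are invented for illustration: one extreme outlier drags the mean far from the bulk of the data, while the medoid stays put.

```python
import numpy as np

# Hypothetical 1-D cluster with one extreme outlier.
points = np.array([1.0, 2.0, 3.0, 4.0, 100.0])

# Centroid (k-means style): the mean, pulled toward the outlier.
centroid = points.mean()

# Medoid (k-medoids style): the actual data point minimizing the
# total distance to all other points.
dist_sums = np.abs(points[:, None] - points[None, :]).sum(axis=1)
medoid = points[np.argmin(dist_sums)]

print(centroid)  # 22.0 -- far outside the dense region [1, 4]
print(medoid)    # 3.0  -- an actual data point inside it
```

Because the medoid is constrained to be one of the observed points, a single outlier cannot move it outside the dense part of the cluster.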

In summary, k-means clustering is faster and well suited to spherical clusters, but it is sensitive to outliers.

K-medoids is more robust to outliers, handles arbitrary shapes, but can be computationally more expensive.
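To make the k-medoids idea concrete, here is a minimal sketch of the alternating assign/update variant (sometimes called Voronoi iteration), assuming NumPy and Euclidean distances; the function name and data are illustrative, and edge cases such as empty clusters are not handled.

```python
import numpy as np

def k_medoids(X, k, n_iter=100, seed=0):
    """Minimal k-medoids sketch: alternate assignment and medoid update."""
    rng = np.random.default_rng(seed)
    n = len(X)
    # Pairwise Euclidean distances; any dissimilarity matrix would work,
    # which is one source of k-medoids' flexibility.
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    medoids = rng.choice(n, size=k, replace=False)
    for _ in range(n_iter):
        # Assign each point to its nearest medoid.
        labels = np.argmin(D[:, medoids], axis=1)
        new_medoids = medoids.copy()
        for j in range(k):
            members = np.where(labels == j)[0]
            # New medoid: the member minimizing total in-cluster distance.
            costs = D[np.ix_(members, members)].sum(axis=0)
            new_medoids[j] = members[np.argmin(costs)]
        if np.array_equal(new_medoids, medoids):
            break  # converged: medoids stopped changing
        medoids = new_medoids
    return medoids, labels

# Illustrative data: two small blobs plus a far-away outlier.
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0],
              [10.0, 10.0], [10.0, 11.0], [11.0, 10.0],
              [50.0, 50.0]])
medoids, labels = k_medoids(X, k=2)
print(X[medoids])  # both medoids are actual data points
```

Note that the full PAM algorithm is more careful (it evaluates explicit swap costs), which is exactly why k-medoids tends to cost more than k-means.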
