A Survey of Hierarchical Clustering Algorithms

Volume 5, Issue 3, pp. 229-240. Publication Date: October 15, 2012


Marjan Kuchaki Rafsanjani - Department of Computer Science, Shahid Bahonar University of Kerman, Kerman, Iran.
Zahra Asghari Varzaneh - Department of Computer Science, Shahid Bahonar University of Kerman, Kerman, Iran.
Nasibeh Emami Chukanlo - Department of Computer Science, Shahid Bahonar University of Kerman, Kerman, Iran.


Clustering algorithms partition data points into meaningful groups based on their similarity, in order to extract useful information from the data. They can be divided into several categories: sequential algorithms, hierarchical clustering algorithms, clustering algorithms based on cost function optimization, and others. In this paper, we discuss several hierarchical clustering algorithms and their attributes, and then compare them with one another.
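
To make the hierarchical approach concrete, the following is a minimal, illustrative sketch of bottom-up (agglomerative) hierarchical clustering with a single-link merge criterion. The function names (euclidean, single_link, agglomerative) and the toy data are our own assumptions for illustration, not the implementation of any specific algorithm surveyed in this paper.

```python
# Illustrative sketch of agglomerative (bottom-up) hierarchical clustering
# with a single-link merge criterion. Names and data are hypothetical,
# chosen only to demonstrate the general technique.

def euclidean(a, b):
    # Euclidean distance between two points given as coordinate tuples.
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def single_link(c1, c2):
    # Single-link distance: distance between the two closest members
    # of the two clusters.
    return min(euclidean(p, q) for p in c1 for q in c2)

def agglomerative(points, num_clusters):
    # Start with each point in its own singleton cluster.
    clusters = [[p] for p in points]
    while len(clusters) > num_clusters:
        # Find the pair of clusters with the smallest single-link distance.
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = single_link(clusters[i], clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        # Merge the closest pair (pop j first; i < j, so i is unaffected).
        clusters[i].extend(clusters.pop(j))
    return clusters

if __name__ == "__main__":
    data = [(0, 0), (0, 1), (5, 5), (5, 6), (10, 10)]
    print(agglomerative(data, 2))
```

Repeatedly merging the closest pair of clusters in this way builds the dendrogram from the leaves upward; cutting the process at a chosen number of clusters (here, 2) yields the final grouping.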


