ClusteringTree[{e1, e2, …}]
constructs a weighted tree from the hierarchical clustering of the elements e1, e2, ….
ClusteringTree[{e1 -> v1, e2 -> v2, …}]
represents ei with vi in the constructed graph.
ClusteringTree[{e1, e2, …} -> {v1, v2, …}]
represents ei with vi in the constructed graph.
ClusteringTree[<|label1 -> e1, label2 -> e2, …|>]
represents ei using labels labeli in the constructed graph.
ClusteringTree[data, h]
constructs a weighted tree from the hierarchical clustering of data by joining subclusters at distance less than h.
Details and Options
- ClusteringTree creates a Tree object showing how data points cluster together hierarchically.
- The data elements ei can be numbers; numeric lists, matrices, or tensors; lists of Boolean elements; strings or images; geo positions or geographical entities; colors; as well as combinations of these. If the ei are lists, matrices, or tensors, each must have the same dimensions.
- The result from ClusteringTree is a binary weighted tree, where the weight of each vertex indicates the distance between the two subtrees that have that vertex as root.
- ClusteringTree has the same options as Graph, with the following additions and changes:
  VertexSize	0	size of vertices
- By default, ClusteringTree will preprocess the data automatically unless either a DistanceFunction or a FeatureExtractor is specified.
- ClusterDissimilarityFunction defines the intercluster dissimilarity, given the dissimilarities between member elements.
- Possible settings for ClusterDissimilarityFunction include:
  "Average"	average intercluster dissimilarity
  "Centroid"	distance from cluster centroids
  "Complete"	largest intercluster dissimilarity
  "Median"	distance from cluster medians
  "Single"	smallest intercluster dissimilarity
  "Ward"	Ward's minimum variance dissimilarity
  "WeightedAverage"	weighted average intercluster dissimilarity
  f	a pure function
- The function f defines a distance between any two clusters.
- The function f needs to be a real-valued function of the DistanceMatrix.
  ImageMargins	0.	the margins to leave around the graphic
  PreserveImageOptions	Automatic	whether to preserve image options when displaying new versions of the same graphic
  VertexSize	0	size of vertices
Examples
Basic Examples (5)
Obtain a cluster hierarchy from a list of numbers:
Unify clusters at distance less than 2:
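These first two examples might be sketched as follows (the data values are illustrative, not taken from the original notebook):

```wl
(* cluster a list of numbers into a hierarchy *)
ClusteringTree[{1, 2, 3.5, 10, 11, 12.5}]

(* unify subclusters at distance less than 2 *)
ClusteringTree[{1, 2, 3.5, 10, 11, 12.5}, 2]
```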
Obtain a cluster hierarchy from a list of strings:
Obtain a cluster hierarchy from a list of images:
Obtain a cluster hierarchy from a list of cities:
Obtain a cluster hierarchy from a list of Boolean entries:
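The string and Boolean cases above can be sketched like this (sample data chosen for illustration):

```wl
(* strings are compared with an automatically chosen string distance *)
ClusteringTree[{"apple", "apples", "ample", "orange", "orangutan"}]

(* lists of Boolean values also cluster *)
ClusteringTree[{{True, True, False}, {True, False, False}, {False, False, False}}]
```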
Scope (8)
Obtain a cluster hierarchy from a list of numbers:
Obtain the leaves' labels:
Look at the distance between subclusters by looking at the VertexWeight :
Find the shortest path from the root vertex to the leaf 3.4:
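A sketch of inspecting the subcluster distances through the vertex weights (illustrative data; the weight list covers every vertex, with leaves typically weighted 0):

```wl
tree = ClusteringTree[{1., 2.2, 3.4, 10.}];

(* each internal vertex's weight is the distance between the subtrees it joins *)
AnnotationValue[tree, VertexWeight]
```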
Obtain a cluster hierarchy from a heterogeneous dataset:
Compare it with the cluster hierarchy of the colors:
Generate a list of random colors:
Obtain a cluster hierarchy from the list using the "Centroid" linkage:
Compute the hierarchical clustering from an Association :
Compare it with the hierarchical clustering of its Values :
Compare it with the hierarchical clustering of its Keys :
Obtain a cluster hierarchy by merging clusters at distance less than 0.4:
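These Association and threshold examples might look like the following sketch (keys and values are illustrative):

```wl
(* clustering an Association uses its Values, labeled by its Keys *)
ClusteringTree[<|"a" -> 1.0, "b" -> 1.2, "c" -> 3.7, "d" -> 4.0|>]

(* merge subclusters at distance less than 0.4 *)
ClusteringTree[{0.1, 0.15, 0.3, 0.9, 1.0}, 0.4]
```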
Change the style and the layout of the ClusteringTree :
Obtain a cluster hierarchy from a list of three-dimensional vectors and label the leaves with the total of the corresponding element:
Compare it with the cluster hierarchy of the total of each vector:
Obtain a cluster hierarchy from a list of integers:
Change the vertex labels by using regular polygons:
Options (9)
ClusterDissimilarityFunction (1)
Generate a list of random colors:
Obtain a cluster hierarchy from the list using the "Centroid" linkage:
Obtain a cluster hierarchy from the list using the "Single" linkage:
Obtain a cluster hierarchy from the list using a different ClusterDissimilarityFunction:
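A sketch of the three variants above; the pure function receives the matrix of distances between the two clusters' members, so taking its Max mimics "Complete" linkage:

```wl
colors = RandomColor[10];

ClusteringTree[colors, ClusterDissimilarityFunction -> "Centroid"]
ClusteringTree[colors, ClusterDissimilarityFunction -> "Single"]

(* a pure function of the intercluster distance matrix *)
ClusteringTree[colors, ClusterDissimilarityFunction -> (Max[#] &)]
```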
DistanceFunction (1)
Generate a list of random vectors:
Obtain cluster hierarchies using different DistanceFunction settings:
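For example, comparing two standard distance functions on the same random vectors (illustrative data):

```wl
vecs = RandomReal[1, {8, 3}];

ClusteringTree[vecs, DistanceFunction -> EuclideanDistance]
ClusteringTree[vecs, DistanceFunction -> ManhattanDistance]
```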
FeatureExtractor (1)
Obtain a cluster hierarchy from a list of pictures:
Use a different FeatureExtractor to extract features:
Use the Identity FeatureExtractor to leave the data unchanged:
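The Identity setting can be sketched as follows (random numeric data standing in for the original pictures):

```wl
(* leave the data unchanged rather than preprocessing it automatically *)
ClusteringTree[RandomReal[1, {6, 2}], FeatureExtractor -> Identity]
```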
ImageSize (2)
Specify an explicit image size for the whole tree:
Independent settings for width and height affect the tree bounding box but not its aspect ratio:
Set both ImageSize and AspectRatio explicitly:
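A sketch of the sizing options above (illustrative data and sizes):

```wl
data = {1, 2, 3.5, 10, 12};

(* one number sets the overall width *)
ClusteringTree[data, ImageSize -> 250]

(* explicit width and height, together with an explicit aspect ratio *)
ClusteringTree[data, ImageSize -> {400, 200}, AspectRatio -> 1/2]
```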
VertexLabelStyle (4)
Customize the labels' size:
Customize the labels' color:
Customize several aspects of the labels:
Some expressions, like images, are not affected by FontSize :
Use Magnification to affect every type of expression:
Alternatively, provide explicit labels:
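These label customizations might be sketched as (illustrative strings and styles):

```wl
data = {"ant", "bee", "beetle", "wasp"};

(* style the leaf labels *)
ClusteringTree[data, VertexLabelStyle -> Directive[Blue, Italic, 16]]

(* provide explicit labels using the {e1, ...} -> {v1, ...} form *)
ClusteringTree[data -> {"A", "B", "C", "D"}]
```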
Related Guides
- Graph Construction & Representation
- Cluster Analysis
- Data Visualization
- Distance and Similarity Measures
- Statistical Data Analysis
- Discrete Mathematics
- Natural Language Processing
- Text Analysis
- Sequence Alignment & Comparison
- Unsupervised Machine Learning
- Tree Construction & Representation
Wolfram Research (2016), ClusteringTree, Wolfram Language function, https://reference.wolfram.com/language/ref/ClusteringTree.html (updated 2017).