Distance equalizers are introduced as empirical measures of central tendency that make distances to univariate data as similar as possible. These measures are made precise by means of various so-called fluctuation functions, which account for distances in different ways. Distance equalizers differ from the mean and the median. Moreover, distance equalizers relate to dispersion measures like the median of absolute deviations and allow one to define new dispersion measures. Algorithms as well as closed-form solutions for special cases are given. Computations require performing multiextremal function minimization of several kinds.
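As a minimal illustration of the idea (an assumed formulation, not necessarily the paper's exact definition), one concrete choice of fluctuation function is the variance of the distances |x_i - m|: a distance equalizer is then a point m minimizing that variance. Since such objectives can be multiextremal, the sketch below simply scans a fine grid:

```python
def equalize_distances(xs, grid_size=10001):
    """Return a point m making the distances |x_i - m| as similar as possible,
    measured here (an illustrative choice) by the variance of those distances."""
    lo, hi = min(xs), max(xs)
    best_m, best_f = lo, float("inf")
    for k in range(grid_size):
        m = lo + (hi - lo) * k / (grid_size - 1)
        d = [abs(x - m) for x in xs]
        mean = sum(d) / len(d)
        f = sum((di - mean) ** 2 for di in d)  # fluctuation: variance of distances
        if f < best_f:
            best_f, best_m = f, m
    return best_m

# For two points the distances can be made exactly equal at the midpoint:
print(equalize_distances([0.0, 4.0]))  # → 2.0
```

Note that such an equalizer need not coincide with the mean or the median, and a grid search is only a crude stand-in for the minimization methods the paper develops.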
Distance equalization is shown to enable cluster analysis with a special neighbourhood notion. Cluster computations are reduced to computations of shortest paths with a prescribed number of intermediate nodes in weighted directed graphs.
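The graph subproblem mentioned above can be sketched with a standard layered dynamic program (the graph and weights here are illustrative, and this is only the generic shortest-path routine, not the paper's full clustering reduction): a path with k intermediate nodes uses exactly k + 1 edges, so one relaxes edge by edge for k + 1 rounds.

```python
def shortest_path_k_intermediate(n, edges, s, t, k):
    """Length of a shortest s-t path with exactly k intermediate nodes
    (i.e. k + 1 edges) in a weighted directed graph on nodes 0..n-1."""
    INF = float("inf")
    dist = [INF] * n
    dist[s] = 0.0
    for _ in range(k + 1):  # one relaxation layer per edge on the path
        new = [INF] * n
        for u, v, w in edges:
            if dist[u] + w < new[v]:
                new[v] = dist[u] + w
        dist = new
    return dist[t]

edges = [(0, 1, 1.0), (1, 2, 1.0), (0, 2, 5.0)]
print(shortest_path_k_intermediate(3, edges, 0, 2, 1))  # → 2.0, via the path 0-1-2
print(shortest_path_k_intermediate(3, edges, 0, 2, 0))  # → 5.0, the direct edge
```

The layered structure costs O(k · |E|) time, which is what makes prescribing the number of intermediate nodes tractable.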
A revised version of the paper will appear in the Journal of Statistical Computation and Simulation, vol. 82, 2012.