In this paper, we show that coercive functions attain their minimum on closed subsets of R^n. We then introduce the concept of sequence variance, a statistical metric that quantifies the degree of non-uniformity in a sequence of observations (the irregularity in the spatial arrangement of the points). The metric is computed as the average squared distance between ordered data points. We also introduce the sequence correlation coefficient and examine its properties. Finally, we present a method for detecting outliers in a sequence of data points in Euclidean space.
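As a rough illustration of the sequence-variance idea, the sketch below averages the squared Euclidean distances between successive points of an ordered sequence. This is only one plausible reading of "average squared distance between ordered data points" (successive pairs rather than all pairs); the function name and the exact normalization are assumptions, not the paper's definition.

```python
import numpy as np

def sequence_variance(points):
    """Average squared Euclidean distance between successive points.

    Hypothetical reading of 'sequence variance': take the points in
    their given order and average ||x_{i+1} - x_i||^2 over all
    consecutive pairs.
    """
    pts = np.asarray(points, dtype=float)
    diffs = np.diff(pts, axis=0)                # successive differences
    return float(np.mean(np.sum(diffs**2, axis=1)))

# A perfectly uniform 1-D grid: every gap is 2, so the value is 4.
print(sequence_variance([[0.0], [2.0], [4.0], [6.0]]))  # → 4.0
```

Under this reading, an evenly spaced sequence yields a value equal to the common squared gap, while irregular spacing inflates the average, which matches the stated goal of measuring non-uniformity.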