Specifications

Multivariate Statistics
Multivariate statistics methods let you analyze your data by evaluating groups of variables together. You can:
• Segment data in clusters for further analysis
• Visualize and assess the group-to-group differences in a data set
• Reduce a large set of variables to a more manageable but still representative set
Multivariate statistics tasks supported by the Statistics Toolbox include:
• Factor analysis
• Principal components analysis (PCA)
• Factor rotation
• Cluster analysis (both hierarchical and k-means)
• Discriminant analysis
• Multivariate ANOVA
• Multidimensional scaling (classical, metric, and nonmetric)
• Multivariate plotting
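For example, a minimal sketch of one such workflow, segmenting a data set with k-means clustering and viewing the clusters in a multivariate plot, might look like this (it uses the fisheriris example data set that ships with the toolbox):

    load fisheriris                    % example iris flower measurements (variable meas)
    [idx, centers] = kmeans(meas, 3);  % segment the data into three clusters
    gplotmatrix(meas, [], idx)         % scatter-plot matrix of all variables, colored by cluster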
Descriptive Statistics
Descriptive statistics methods enable you to quickly understand and describe potentially large sets of data. The Statistics Toolbox includes functions for calculating:
• Measures of central tendency (measures of location), including average, median, and various means
• Measures of dispersion (measures of spread), including range, variance, standard deviation, and mean absolute deviation
• Linear and rank correlation
• Results based on data with missing values
• Percentile and quartile estimates
• Bootstrap statistics
• Density estimates (using a kernel smoothing function)
These functions help you summarize the values in a data sample with a few highly relevant numbers.
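By way of illustration, a brief sketch applying several of these functions to a simulated data sample might look like this:

    x = randn(1000, 1);                 % simulated data sample
    m = mean(x);   md = median(x);      % measures of central tendency
    s = std(x);    r  = range(x);       % measures of dispersion
    q = prctile(x, [25 50 75]);         % quartile estimates
    bm = bootstrp(1000, @mean, x);      % bootstrap replicates of the sample mean
    [f, xi] = ksdensity(x);             % kernel-smoothing density estimate
    plot(xi, f)                         % plot the estimated density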
Analysis of Variance
Analysis of variance (ANOVA) lets you determine whether data sets from different groups have different characteristics. You can classify groups using discrete predictor variables. A follow-up multiple comparisons analysis can pinpoint which pairs of groups differ from each other. The Statistics Toolbox includes algorithms for ANOVA and related techniques, including:
• One-way ANOVA with graphics
• Two-way ANOVA for balanced data
• Multiway ANOVA for unbalanced data (both fixed and random effects)
• Analysis of covariance
• One-way multivariate ANOVA
• Nonparametric one- and two-way ANOVA (Kruskal-Wallis, Friedman)
• Multiple comparison of group means, slopes, and intercepts
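For instance, a minimal sketch of a one-way ANOVA followed by a multiple comparison of group means, using made-up example data, might look like this:

    g = [ones(10,1); 2*ones(10,1); 3*ones(10,1)];       % three groups of ten observations
    y = [randn(10,1); randn(10,1)+0.5; randn(10,1)+1];   % responses with shifted group means
    [p, tbl, stats] = anova1(y, g);                      % one-way ANOVA with box plot and table
    c = multcompare(stats);                              % pinpoint which pairs of group means differ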
Hypothesis Testing
Random variation often makes it difficult to determine whether samples taken under different conditions really are different. Hypothesis testing is an effective tool for analyzing whether sample-to-sample differences are significant and require further evaluation or are consistent with random and expected data variation. The Statistics Toolbox supports the most widely used parametric and nonparametric hypothesis testing procedures, such as:
• One- and two-sample t tests
• One-sample z test
• Nonparametric tests for one sample
• Nonparametric tests for two independent samples
• Distribution tests (Jarque-Bera, Lilliefors, and Kolmogorov-Smirnov)
• Comparison of distributions (two-sample Kolmogorov-Smirnov)
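As a sketch, comparing two simulated samples with parametric and nonparametric tests might look like this (the tests use their default 5 percent significance level):

    x = randn(50, 1);                % sample taken under condition A
    y = randn(50, 1) + 0.3;          % sample taken under condition B, mean shifted by 0.3
    [h_t, p_t]   = ttest2(x, y);     % two-sample t test
    p_rs         = ranksum(x, y);    % Wilcoxon rank sum test (nonparametric)
    [h_ks, p_ks] = kstest2(x, y);    % two-sample Kolmogorov-Smirnov distribution test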
Figure: The analysis of covariance (ANOCOVA) tool plots data to assess group-to-group differences and the impact of a predictor variable on a response variable.