# Z-Score – Explanation & Examples

*The definition of the Z-score is:*

**“The Z-score is the number of standard deviations by which an observed value is above or below the mean value.”**

*In this topic, we will discuss the Z-score from the following aspects:*

- What is the Z-score?
- The Z-score formula.
- Z-score properties.
- How to calculate the Z-score?
- The role of the Z-score.
- Practice questions.
- Answer key.

## 1. What is the Z-score?

**The Z-score (standard score)** is the number of standard deviations by which an observed value is above or below the mean value.

The Z-score is positive if the value lies above (greater than) the mean, and negative if the value lies below (smaller than) the mean.

For example, if the Z-score for an individual's height is +1, then his height is 1 standard deviation above the mean height of his population.

On the other hand, if the Z-score for the same individual's weight is -1, then his weight is 1 standard deviation below the mean weight of his population.

**The Z-score** can be 0 if the observed value exactly equals the mean.

The Z-score is used when the distribution of data, plotted as a histogram, nearly follows a normal distribution curve (a bell-shaped symmetrical curve centered around the mean).

### – Example 1

The following are the histograms of heights, physical activity, and mental component summary from a certain population.

The mean value was plotted as a red dashed vertical line for each dataset.

*We see that:*

- The histogram of height nearly follows a normal distribution curve (a bell-shaped symmetrical curve centered around the mean).
- The histogram of the mental component summary shows a left-skewed distribution (a long tail of infrequent small values).
- The histogram of physical activity shows a right-skewed distribution (a long tail of infrequent large values).

The Z-score can be applied to an individual’s height but cannot be applied to an individual’s mental component or physical activity.

However, there are different normal distributions with different means and standard deviations.

### – Example 2

The following are the histograms of heights and weights from a certain population.

The mean value was plotted as a red dashed vertical line for each dataset.

*We see that:*

- The histograms of heights and weights each nearly follow a normal distribution curve (a bell-shaped curve).
- However, the mean value for heights was about 165 cm, while the mean value for weights was about 75 kg.

### – Example 3

In the above example, the mean height was 163 cm with a standard deviation of 9.22 cm, while the mean weight was 73.4 kg with a standard deviation of 13.7 kg.

Assuming that heights and weights from this population follow the normal distribution, we can plot the normal distribution curves for heights and weights as follows:

*We see that:*

- Each normal distribution curve is bell-shaped, peaked, and symmetric about its mean.
- When the standard deviation increases, as for weights, the curve flattens and spreads out.

The Z-score converts all different normal distributions to a standard normal distribution with mean = 0 and standard deviation = 1.

*We see that:*

- The two curves are superimposed over each other.
- Both heights and weights now have mean = 0 and standard deviation = 1.
- The Z-score allows the comparison of values (as heights and weights) from different normal distributions by standardizing their distribution.
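As a minimal sketch, the standardization described above can be reproduced with the means and standard deviations quoted earlier (163 cm and 9.22 cm for heights; 73.4 kg and 13.7 kg for weights). The individual's values below are made up for illustration:

```python
# Standardizing values from two different normal distributions so they
# can be compared on a common (Z-score) scale.
# Means and SDs are from the article; the individual's values are hypothetical.

def z_score(x, mean, sd):
    """Signed number of standard deviations by which x lies above (+) or below (-) the mean."""
    return (x - mean) / sd

height_z = z_score(172.22, 163, 9.22)   # a hypothetical height of 172.22 cm
weight_z = z_score(73.4, 73.4, 13.7)    # a weight exactly at the mean

print(round(height_z, 2))  # 1.0 -> 1 SD above the mean height
print(round(weight_z, 2))  # 0.0 -> exactly at the mean weight
```

On the standardized scale, the two measurements become directly comparable: this person is unusually tall for the population but exactly average in weight.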

## 2. Z-score formula

*The Z-score formula is:*

Z=(x-μ)/σ

*where:*

x is the data point.

μ is the population mean.

σ is the population standard deviation.

When the population mean and the population standard deviation are unknown, the Z-score can be calculated using the sample mean (x̄) and the sample standard deviation (s) as estimates of the population values.
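As a sketch, the formula can be applied directly in code; Python's standard `statistics` module supplies the sample estimates. The sample of heights below is made up for illustration:

```python
import statistics

def z_score(x, mean, sd):
    """Z = (x - mean) / sd : signed number of standard deviations from the mean."""
    return (x - mean) / sd

# Hypothetical sample of heights (cm). Use the population mean and SD
# if they are known; otherwise fall back on the sample estimates below.
heights = [155, 160, 163, 166, 171, 158, 168, 162]
x_bar = statistics.mean(heights)   # sample mean
s = statistics.stdev(heights)      # sample standard deviation (n - 1 denominator)

z = z_score(170, x_bar, s)
print(round(z, 2))  # 1.34 -> 170 cm is about 1.34 SDs above the sample mean
```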

## 3. Z-score properties

As the Z-score forms a standard normal distribution with mean = 0 and standard deviation = 1, it follows the properties of the normal distribution, such as the 68-95-99.7% rule.

*The following is the normal distribution curve for any Z-score:*

The **important properties of the normal distribution** that the Z-score follows are:

- 68% of the data are within 1 standard deviation from the mean.

This means that 68% of the population has a Z-score between -1 and +1. In other words, the probability that a data point from this population lies between Z-scores of -1 and +1 is 68%.

As the normal distribution is symmetric around its mean, 34% (68%/2) of this population has a Z-score between 0 (the mean) and +1, and 34% has a Z-score between -1 and 0.

If we shade the area within 1 standard deviation of the mean (between Z-scores of -1 and +1), the green shaded area represents 68% of the total area under the curve, without any integration. That is, the data between Z-scores of -1 and +1 represent 68% of the total data.

- 95% of the data are within 2 standard deviations from the mean.

This means that 95% of the population has a Z-score between -2 and +2. In other words, the probability that a data point from this population lies between Z-scores of -2 and +2 is 95%.

As the normal distribution is symmetric around its mean, 47.5% (95%/2) of this population has a Z-score between 0 (the mean) and +2, and 47.5% has a Z-score between -2 and 0.

If we shade the area within 2 standard deviations of the mean (between Z-scores of -2 and +2), the red shaded area represents 95% of the total area under the curve, without any integration. That is, the data between Z-scores of -2 and +2 represent 95% of the total data.

- 99.7% of the data are within 3 standard deviations from the mean.

This means that 99.7% of the population has a Z-score between -3 and +3. In other words, the probability that a data point from this population lies between Z-scores of -3 and +3 is 99.7%.

As the normal distribution is symmetric around its mean, 49.85% (99.7%/2) of this population has a Z-score between 0 (the mean) and +3, and 49.85% has a Z-score between -3 and 0.

If we shade the area within 3 standard deviations of the mean (between Z-scores of -3 and +3), the blue shaded area represents 99.7% of the total area under the curve, without any integration. That is, the data between Z-scores of -3 and +3 represent 99.7% of the total data.
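All three percentages can be checked numerically without any manual integration. A minimal sketch using the standard normal CDF, built from `math.erf` in the standard library:

```python
import math

def phi(z):
    """CDF of the standard normal distribution, via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# Probability that a Z-score falls within k standard deviations of the mean.
for k in (1, 2, 3):
    prob = phi(k) - phi(-k)
    print(f"within {k} SD: {prob:.1%}")
# within 1 SD: 68.3%
# within 2 SD: 95.4%
# within 3 SD: 99.7%
```

The printed values are the exact figures the 68-95-99.7% rule rounds off.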

- The proportion (probability) of data larger than the mean equals the proportion of data smaller than the mean: 0.50, or 50%.

This means that 50% of the population has a Z-score greater than 0 and the other half has a Z-score smaller than 0.

In other words, the probability that a data point from this population has a Z-score greater than 0 equals the probability that it has a Z-score less than 0, which is 50%.
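This 50/50 split follows from the symmetry of the standard normal distribution, and can be verified from its CDF. A minimal check, again building the CDF from `math.erf`:

```python
import math

def phi(z):
    """CDF of the standard normal distribution, via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# By symmetry, P(Z < 0) and P(Z > 0) are both exactly one half.
print(phi(0))      # 0.5 -> P(Z < 0)
print(1 - phi(0))  # 0.5 -> P(Z > 0)
```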

*This is plotted as follows:*