In statistics, skewness measures the asymmetry of the probability distribution of a random variable about its mean. There are two main types of skewness: right (positive) skewness and left (negative) skewness.
1. Right (Positive) Skewness:
Right skewness occurs when the tail on the right side of the probability distribution is longer or fatter than the left tail.
The majority of the data points are concentrated on the left side, with a few extremely large values on the right side.
The mean is typically greater than the median in a positively skewed distribution.
Example: Consider the distribution of household income in a country. Most households have moderate to low incomes, but there are a few extremely wealthy households with very high incomes. This creates a right-skewed distribution because the right tail (high incomes) is longer and fatter than the left tail (low incomes).
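The income example above can be checked numerically. The sketch below uses a lognormal distribution as an illustrative stand-in for household incomes (the distribution and its parameters are assumptions for demonstration, not real data) and verifies that the long right tail pulls the mean above the median:

```python
import numpy as np

rng = np.random.default_rng(42)

# Lognormal values as a stand-in for household incomes (right-skewed):
# most values are moderate, a few are extremely large.
incomes = rng.lognormal(mean=10.5, sigma=0.8, size=100_000)

mean_income = incomes.mean()
median_income = np.median(incomes)

# In a right-skewed distribution the mean is pulled above the median
# by the few extremely large values in the right tail.
print(f"mean   = {mean_income:,.0f}")
print(f"median = {median_income:,.0f}")
assert mean_income > median_income
```

Running this prints a mean noticeably larger than the median, matching the pattern described above.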
2. Left (Negative) Skewness:
Left skewness occurs when the tail on the left side of the probability distribution is longer or fatter than the right tail.
The majority of the data points are concentrated on the right side, with a few extremely small values on the left side.
The mean is typically less than the median in a negatively skewed distribution.
Example: Consider the distribution of test scores for a very easy exam. Most students score very high because the exam is not challenging, but a few students might score very low due to random factors. This creates a left-skewed distribution because the left tail (low scores) is longer and fatter than the right tail (high scores).
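The exam-score example can be simulated the same way. This sketch models scores on an easy exam as 100 minus an exponential tail, clipped to the valid range (a hypothetical model chosen only to produce a left skew), and confirms the mean falls below the median:

```python
import numpy as np

rng = np.random.default_rng(0)

# Scores on an easy exam: most students cluster near 100,
# a few score far below. Modeled as 100 minus an exponential
# tail, clipped to [0, 100] (an illustrative assumption).
scores = np.clip(100 - rng.exponential(scale=8, size=100_000), 0, 100)

mean_score = scores.mean()
median_score = np.median(scores)

# The long left tail of low scores drags the mean below the median.
print(f"mean   = {mean_score:.1f}")
print(f"median = {median_score:.1f}")
assert mean_score < median_score
```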
In summary, skewness helps us understand the shape and asymmetry of a data distribution. Right skewness indicates that the data is stretched out to the right, with extreme values on the right side, while left skewness indicates the opposite, with extreme values on the left side.
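Skewness can also be summarized as a single number: the Fisher-Pearson moment coefficient, which is positive for right-skewed data, negative for left-skewed data, and near zero for symmetric data. A minimal sketch (the `sample_skewness` helper is a hypothetical name, and the exponential and normal samples are illustrative):

```python
import numpy as np

def sample_skewness(x):
    """Fisher-Pearson moment coefficient: mean((x - mean)^3) / std^3."""
    x = np.asarray(x, dtype=float)
    d = x - x.mean()
    return (d**3).mean() / (d**2).mean() ** 1.5

rng = np.random.default_rng(1)
right = rng.exponential(size=100_000)   # long right tail
left = -right                           # mirrored: long left tail
symmetric = rng.normal(size=100_000)    # roughly symmetric

print(sample_skewness(right))      # positive
print(sample_skewness(left))       # negative
print(sample_skewness(symmetric))  # close to zero
```

The sign of the coefficient encodes the direction of the longer tail, which is exactly the right-versus-left distinction described above.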