Law of Large Numbers
The Law of Large Numbers is a statistical theorem stating that as the number of trials in a random experiment increases, the sample mean converges to the expected value (population mean) of the underlying probability distribution. In simpler terms, as you collect more data, your average gets closer to the true average of the population.
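For readers who want the formal version, here is the standard statement of the weak law, written in LaTeX; the symbols are just the sample mean and population mean described above:

```latex
% Weak law of large numbers: for i.i.d. draws X_1, ..., X_n with mean \mu,
% the sample mean converges in probability to \mu.
\bar{X}_n = \frac{1}{n} \sum_{i=1}^{n} X_i,
\qquad
\lim_{n \to \infty} \Pr\bigl( \lvert \bar{X}_n - \mu \rvert > \varepsilon \bigr) = 0
\quad \text{for every } \varepsilon > 0.
```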
For example, if you flip a fair coin only a few times, you might get a noticeably unequal split of heads and tails due to random variation. However, if you flip the coin a large number of times, the proportion of heads (and of tails) will approach 50%.
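This is easy to check by simulation. Below is a minimal Python sketch (standard library only; the function name is illustrative) that flips a simulated fair coin and prints the proportion of heads at increasing sample sizes:

```python
import random

def head_proportion(n_flips, seed=0):
    """Flip a fair coin n_flips times and return the proportion of heads."""
    rng = random.Random(seed)  # fixed seed so the run is reproducible
    heads = sum(rng.randint(0, 1) for _ in range(n_flips))
    return heads / n_flips

# The proportion of heads drifts toward 0.5 as the number of flips grows.
for n in (10, 100, 10_000, 1_000_000):
    print(f"{n:>9} flips -> proportion of heads = {head_proportion(n):.4f}")
```

Typically the small samples wander well away from 0.5, while the million-flip proportion lands within a fraction of a percent of it.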
Case 1: A factory produces light bulbs, and the expected lifespan of a bulb is 1000 hours. If you test only a few bulbs, some may last much longer or shorter than 1000 hours. But if you test 10,000 bulbs, their average lifespan will be very close to 1000 hours.
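A quick sketch of the same idea in Python, assuming for illustration that bulb lifespans are exponentially distributed with mean 1000 hours (the example doesn't specify a distribution, so this is just one plausible choice):

```python
import random

def average_lifespan(n_bulbs, mean_hours=1000.0, seed=0):
    """Sample n_bulbs lifespans from an exponential distribution with the
    given mean and return their sample average."""
    rng = random.Random(seed)  # fixed seed so the run is reproducible
    # expovariate takes the rate lambda = 1 / mean
    return sum(rng.expovariate(1.0 / mean_hours) for _ in range(n_bulbs)) / n_bulbs

# Small samples scatter widely around 1000 hours; large samples hug it.
for n in (5, 100, 10_000):
    print(f"{n:>6} bulbs -> average lifespan = {average_lifespan(n):7.1f} hours")
```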
Case 2: In gambling, if a player rolls a die 10 times, some numbers may come up more often than others. However, if the player rolls the die 1,000 times, the relative frequency of each number (1 through 6) will be approximately equal, converging to the probability of 1/6 for each outcome.
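The same experiment takes only a few lines of Python (standard library only; the function name is illustrative), comparing face frequencies after 10 rolls and after 1,000 rolls:

```python
import random
from collections import Counter

def face_frequencies(n_rolls, seed=0):
    """Roll a fair six-sided die n_rolls times and return the relative
    frequency of each face."""
    rng = random.Random(seed)  # fixed seed so the run is reproducible
    counts = Counter(rng.randint(1, 6) for _ in range(n_rolls))
    return {face: counts[face] / n_rolls for face in range(1, 7)}

# With 10 rolls the frequencies are lumpy; with 1,000 they cluster near 1/6 (about 0.167).
for n in (10, 1_000):
    freqs = face_frequencies(n)
    print(f"{n:>5} rolls:", {face: round(p, 3) for face, p in freqs.items()})
```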