What happens to the standard error of a distribution of sample means when the sample size increases?
The standard error decreases as the sample size increases.
The standard error of the mean is SE = sigma/sqrt(n), where sigma is the population standard deviation and n is the sample size.
Because SE is inversely proportional to the square root of n, quadrupling the sample size halves the standard error.
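A minimal simulation sketch of this relationship, assuming an illustrative normal population with sigma = 10 (the value, seed, and trial count are arbitrary choices, not from the question): it draws many samples at each sample size and compares the empirical standard deviation of the sample means against the theoretical sigma/sqrt(n).

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 10.0       # assumed population standard deviation, for illustration
n_trials = 20_000  # number of simulated samples per sample size

for n in [4, 16, 64, 256]:
    # Draw n_trials samples of size n and compute each sample's mean.
    samples = rng.normal(loc=0.0, scale=sigma, size=(n_trials, n))
    sample_means = samples.mean(axis=1)

    # Empirical SE = spread of the sample means; theoretical SE = sigma/sqrt(n).
    empirical_se = sample_means.std(ddof=1)
    theoretical_se = sigma / np.sqrt(n)
    print(f"n={n:4d}  empirical SE={empirical_se:.3f}  "
          f"theoretical SE={theoretical_se:.3f}")
```

Each quadrupling of n (4 to 16 to 64 to 256) should show the standard error roughly halving, matching the 1/sqrt(n) dependence.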