What did the Americans gain from the Treaty of Paris?
America gained its independence from Britain, along with all the lands west to the
Mississippi River and south to the Gulf of Mexico, with the exception of Spanish
Florida, which the United States did not acquire until 1819.