isabelgarcia188
01.04.2021
History

How did Florida become part of the United States?

A. Britain lost Florida to the United States during the American Revolution.
B. The United States bought Florida as part of the Louisiana Purchase.
C. Spain and the United States agreed to the Adams-Onís Treaty.
D. Italy sold Florida to the United States in exchange for a portion of California.
