Is Haiti a colony of the United States?
Updated: May 16, 2023
No, Haiti is not a colony of the United States. Although the United States occupied Haiti from 1915 to 1934, this was a military occupation rather than formal colonization: Haiti was never annexed or administered as a U.S. territory. The occupation aimed to protect American interests and establish political stability, and it ended with the restoration of full Haitian sovereignty in 1934.