Is Haiti a colony of the United States?

Updated: May 16, 2023


No, Haiti is not a colony of the United States. The United States occupied Haiti militarily from 1915 to 1934, but this was an occupation rather than formal colonization: Haiti was never annexed or administered as U.S. territory. The occupation, justified at the time as protecting American interests and establishing stability, ended in 1934 with the withdrawal of U.S. forces and the restoration of full Haitian sovereignty.

