The story of how America acquired Hawaii is far less pristine than the sparkling oceans and beaches on postcards and travel brochures. Hawaii's history is marked by struggle and oppression, as the United States forced the islands to become a US territory.