America's annexation of Hawaii in 1898 extended U.S. territory into the Pacific and resulted from economic integration and the rise of the United States as a Pacific power. For most of the 1800s, leaders in Washington were concerned that Hawaii might become part of a European nation's empire.