American Expansionism refers to the period in U.S. history during which the nation sought to extend its territory and influence across the North American continent and beyond, driven by the belief in Manifest Destiny and the desire for economic growth. This ideology was used to justify the displacement of Native American tribes, the Mexican-American War, and the acquisition of territories such as Texas, California, and Alaska, shaping the geopolitical landscape of the United States.