Noun: western United States
  1. The region of the United States lying to the west of the Mississippi River
    - West

See also: western

Type of: geographic area, geographic region, geographical area, geographical region

Part of: America, the States, U.S., U.S.A., United States, United States of America, US, USA

Encyclopedia: Western United States