

Noun: West Germany  west jur-mu-nee
  1. A republic in north central Europe on the North Sea; established in 1949 from the zones of Germany occupied by the British, French, and Americans after the German defeat; reunified with East Germany in 1990
    - Federal Republic of Germany

Type of: European country, European nation

Encyclopedia: West Germany