Meaning of the word "West Germany" from the English dictionary, with examples, synonyms, and antonyms.

West Germany   noun

Meaning : A republic in north central Europe on the North Sea. Established in 1949 from the zones of Germany occupied by the British, French, and Americans after the German defeat. Reunified with East Germany in 1990.

Synonyms : Federal Republic of Germany