noun
- the official name of Germany.
- (until 1990) the official name of West Germany.