German Empire


Proper noun

German Empire

[Image: the flag of the German Empire.]
[Image: a map showing the location of the German Empire in Europe.]
  1. (history) A former country in central Europe that existed from 1871 to 1918.