Answer:

Japan emerged as a world power, while Germany was weakened and humiliated.

Japan began to gain international recognition after WWI, while Germany was assigned full blame for the war, forcing it to give up its colonies and accept other harsh penalties.

Hope this helps!
