There might be a problem with the way this question is framed. I don’t think we can reasonably call Germany democratic after mid-1933, when all political parties except the Nazis were outlawed.
Are you thinking specifically of events in the 20th or 21st centuries? The phrase “first world” doesn’t come into use until after World War II. If we go further back, we might be able to pinpoint something in the imperial adventures of Western powers.