Friday, January 13, 2012

Was the European colonization of Africa the best thing that ever happened to them?

If Europe had not colonized Africa, they would still be a bunch o' savage natives running around (besides a select few countries such as Egypt and Morocco). As soon as the Africans kicked the Europeans out of Africa, what did they accomplish? Nothing. They just wanted to kick out the Europeans so that they could regress back into a state of violent Anarchy. What do you guys think? Was the colonization of Africa the best thing that ever happened to them?
