White House encourages US firms to invest in Africa

From major oil companies to hamburger restaurants, US firms are investing in Africa, and many American companies are now established in more countries on the continent than ever before.

The BBC’s Matthew Davies has been taking a look at US trade and investment in Africa and how it has changed to adapt to modern times.