Amazon recently confused many of its customers with weird yet harmless email notifications about non-existent baby registry gifts, and now, it seems that the company’s algorithm has led to some potentially serious trouble. Algorithms, of course, have come under fire recently regarding online ad-buying tools (including those of Facebook, Google, and Twitter) that allow anyone to target racists and anti-Semites, and Amazon’s apparent algorithm oopsie is a big one. According to an investigative report from British public-service TV broadcaster Channel 4, the online retailer has been inadvertently suggesting that people purchase bomb-making materials together.
The issue reportedly arose through Amazon’s handy “Frequently Bought Together” feature, which the retailer uses to drive suggestive selling (and impulse buys), and many consumers do find the tool useful. However, Channel 4 reported — in the context of last week’s London tube attack, carried out with an improvised explosive device — that Amazon’s algorithm was suggesting that people purchase multiple standalone items that double as bomb-making ingredients. That’s not a good look, so Amazon has vowed to “review” its website, via the New York Times:
“All products sold on Amazon must adhere to our selling guidelines and we only sell products that comply with U.K. laws.”
“In light of recent events, we are reviewing our website to ensure that all these products are presented in an appropriate manner. We also continue to work closely with police and law enforcement agencies when circumstances arise where we can assist their investigations.”
Channel 4 dug deeper after noticing the algorithm’s suggestions — black powder (gunpowder) and thermite together — and the broadcaster easily added 45 kilograms of black powder to a basket, despite U.K. laws restricting private citizens from purchasing more than 100 grams of the stuff at a time. The algorithm also indicated that steel ball bearings are often purchased in conjunction with “push button switches and battery connectors and cables.” Once again, a flawed algorithm is making the case that machines can’t fully replace human judgment.
You can watch the entire 9-minute Channel 4 report below.