This week, the Silicon Valley giant added another algorithmic screw-up to the list: misidentifying a software engineer as a serial killer. The victim of this latest botch was Hristo Georgiev, an engineer based in Switzerland. Georgiev discovered that a Google search of his name returned a photo of him linked to a Wikipedia entry on a notorious murderer.
“My first reaction was that somebody was trying to pull off some sort of an elaborate prank on me, but after opening the Wikipedia article itself, it turned out that there’s no photo of me there whatsoever,” Georgiev said in a blog post.

Georgiev believes the error was caused by Google’s knowledge graph, the system that generates the infoboxes displayed next to search results. He suspects the algorithm matched his photo to the Wikipedia entry because the killer, who is now dead, shared his name.

Georgiev is far from the first victim of the knowledge graph misfiring. The algorithm has previously generated infoboxes that falsely listed actor Paul Campbell as deceased and described the California Republican Party’s ideology as “Nazism.”

In Georgiev’s case, the issue was swiftly resolved. After he reported the bug to Google, the company removed his image from the killer’s infobox. Georgiev credited the Hacker News community with accelerating the response.

Other victims, however, may not be so lucky. If they never spot the error, or struggle to get it corrected, the misinformation could have troubling consequences. I certainly wouldn’t want a potential employer, client, or partner to see my face next to an article about a serial killer.