Microsoft travel article, seemingly written by AI, suggests vacationers dine at a food bank
The best travel guides steer would-be visitors to hidden treasures, spots that showcase a city's distinctive qualities or illustrate its grandeur while avoiding tourist traps. But a recently published guide to Ottawa from Microsoft Travel, seemingly written by an AI bot, offers what might be the most unusual suggestion of them all. Along with recommendations for local museums and shopping centers, it suggests you pay a visit to the city’s food bank.
And to top that off, it suggests you go there while hungry.
“The organization has been collecting, purchasing, producing, and delivering food to needy people and families in the Ottawa area since 1984,” the guide reads. “We observe how hunger impacts men, women, and children on a daily basis, and how it may be a barrier to achievement. People who come to us have jobs and families to support, as well as expenses to pay. Life is already difficult enough. Consider going into it on an empty stomach.” (Emphasis ours.)
In response to Fast Company’s request for comment about the article, including whether a human editor had ever laid eyes on it before it was published on MSN, a Microsoft spokesperson stated, “This article has been removed and we are investigating how it made it through our review process.”
Later, Microsoft told Fast Company that “the issue was due to human error” and that “the article was not published by an unsupervised AI.” Apparently, it was just a poorly supervised AI: “We combine the power of technology with the experience of content editors to surface stories. In this case, the content was generated through a combination of algorithmic techniques with human review, not a large language model or AI system. We are working to ensure this type of content isn’t posted in future.”
Paris Marx, a Fast Company contributor and self-described “critic of tech futures,” was the first to notice the gaffe.
Travel guides have become a popular target for AI-generated content. Amazon, in particular, has seen a wave of guidebooks offering questionable advice, signal-boosted by planted reviews that land them alongside long-trusted sources like Fodor’s and Lonely Planet.
A New York Times story, looking at the spread of these guides on the site, noted that a search for “Paris Travel Guide 2023” yielded dozens of guides with that exact title. And nearly 200 samples from 64 guidebooks, run through an artificial intelligence detector, were flagged as highly likely to have been generated by a chatbot.
The guidebooks to Russia and Ukraine that it found failed to mention the ongoing war, with one telling readers to “pack your bags and get ready for an unforgettable adventure in one of Eastern Europe’s most captivating destinations.”
BuzzFeed, meanwhile, has run several AI-assisted travel stories of its own, and while the advice isn’t as dubious as showing up at a food bank hungry, it’s not exactly award-winning stuff.
“Now, I know what you’re thinking – ‘Cape May? What is that, some kind of mayonnaise brand?’” one story, cowritten by “Buzzy,” the creative AI assistant, reads. And dozens of destinations, including Charleston, South Carolina; Aruba; Prague; and Amelia Island, Florida, are classified as “hidden gems.” (Charleston alone had 7.68 million visitors last year, making it hardly hidden.)
That’s not to say AI is completely useless when it comes to travel tips. Fast Company’s own Jessica Bursztynsky let ChatGPT pick an itinerary for her on a recent trip to California. She described it, overall, as a “mixed bag.” It tried to send her to a restaurant that apparently didn’t exist and another that was closed, and left her with some big holes in her schedule, but she did acknowledge that its suggestion to kayak a lagoon was a wonderful one.
Today’s generative AI, of course, has displayed plenty of flaws in its short history, from gaslighting and insulting users to offering disturbing advice to teens. So travel gaffes, on the whole, aren’t the worst thing that can happen with the technology.
But that could be little consolation to users who heed the advice of these guides, only to end up wasting their limited travel time and dollars.
Update, August 21, 2023: This article has been updated with an additional statement from Microsoft.