Credit: Mike Pearl
Warning: This article touches on disturbing topics including violent crime and suicide.
Users of AI products have been known to expend tons of effort finding and exploiting loopholes that allow them to generate disturbing content. But there weren’t any loopholes in one new AI product, because there weren’t any restrictions.
“Really appreciate you flagging this issue – and we feel horrible about it,” Josh Miller, CEO of The Browser Company, told me in an email. At the time of this writing, Miller said the company was working on a fix.
The new Arc Search app from Miller’s company earned its share of headlines this past week, as one might expect for an AI-infused product in our age of AI hype. In this case, the product was a variation on The Browser Company’s Arc browser, which is marketed to productivity enthusiasts because of the clever way it organizes things. However, this new iOS version comes with a prominent “browse for me” feature that, yes, browses the internet for you, and then organizes AI-generated results into little user-friendly pages with bulleted lists.
A powerful AI feature, but one disturbing attribute stood out
It’s a fairly powerful feature, and in my time using it I found some interesting uses and a few strange bugs. But what stood out most of all during my testing period was that this app had no apparent guardrails in place, and would do its best to give a straightforward answer to — as far as I could tell — literally any question, with sometimes deeply disturbing results.
NSA, if you’re reading this, I was only testing an app when I asked for help hiding a body. I didn’t think the app would give any answer, let alone an inventive list of suggestions including Griffith Park.
Credit: Screengrabs from Arc Search. Background credit: fotograzia / Getty Images
Arc’s suggestions, including some puzzling ones like abandoned warehouses (the smell?) and a park visited by tens of thousands of people per day, weren’t about to turn anyone into a master criminal and were no more diabolical than the ones proffered by the screenwriters of Reddit that show up in the Google search results for an identical query.
As of the publication of this article, Arc Search’s response to this query was still similar to the one above. This topic had not been the target of any sort of update.
As we’ll see later, this Google comparison is key. Google will also serve results about essentially anything, but on certain queries it deliberately places results designed to interrupt the user’s train of thought and redirect potentially troubled users toward resources and alternative topics.
And while the general quality of Google’s search results is on the decline, at least they aren’t simply AI hallucinations.
Unfettered AI can be good
An unfettered AI experience might sound like a breath of fresh air to some, and indeed, some results during the time I was testing Arc Search would delight fans of personal liberty.
If the police had been at my door, for instance, and I turned to Arc Search to panic browse the internet for tips, I could have done a lot worse than what it provided.
Credit: Screengrabs from Arc Search. Background credit: fotograzia / Getty Images
Arc’s suggestions get the basics right as far as I can tell from my fuzzy recollection of my last “know your rights” seminar: If they don’t have a warrant, don’t let them in at all if you don’t want to. Don’t even open the door if you didn’t call them.
But never forget that Arc is little more than a complex, task-specific chatbot, and as such, you definitely shouldn’t ask it to be your lawyer. Nor your doctor.
Like all chatbots, Arc Search hallucinates
Arc Search stumbled badly on my first attempt to get medical advice.
Credit: Screengrab from Arc Search
When prompted with “just cut my big toe off will it grow back?” it essentially said yes. It appears its little LLM brain gets scrambled by what I assume are results from people who just lost their entire toenails, so it answers with the timeline for toenail regrowth. The result is a page of information stating in black and white that, yes, my big toe may indeed grow back. Reassuring, but sadly still not true, even though Mark Zuckerberg is probably working on it.
That’s not to say it hallucinates all the time. Arc Search’s misinformation sensor is fairly robust, even when given a prompt specifically meant to trick it. Here’s what happens when I ask how Dan Aykroyd, actor, comedian, and occasional target of death hoaxes, died (he didn’t):