ChatGPT caught “gaslighting” TikToker on how to spell simple word
TikTok: yallwatchthis / OpenAI
A TikTok user was left baffled after ChatGPT attempted to ‘gaslight’ them on the spelling of a simple word.
OpenAI’s ChatGPT has been an immensely popular tool for many of its users, despite the significant controversy and debate surrounding the chatbot.
However, while many consider it a reliable source of information, a TikTok user has gone viral after revealing ChatGPT’s struggle to answer one simple question.
When asked about the spelling of “strawberry”, the generative AI tool determined there were only two R’s in the word – a hill ChatGPT was willing to die on.
The user, who goes by ‘YallWatchThis’ on TikTok, took to the social media platform to share his conversation with the chatbot, beginning with him asking how many R’s were in “strawberry”. ChatGPT responded that there were two.
After being corrected, ChatGPT doubled down and questioned whether YallWatchThis was “counting the R in ‘straw’ and ‘berry’ separately.” Even after being asked to break down the spelling, the chatbot remained insistent that YallWatchThis was mistaken.
The pair continued to go back and forth on the word’s spelling, but despite YallWatchThis’ efforts, ChatGPT refused to be swayed (and even spelled “straw” as “strawr” at one point). Reading the conversation out loud for TikTok, it took YallWatchThis over five minutes before he was finally able to get the chatbot to recognize the third R.
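For the record, the count the chatbot struggled with is trivial to verify programmatically – a one-liner in Python settles it:

```python
# Counting letters is easy for ordinary code; the chatbot's difficulty
# lies in how it processes text, not in the task itself.
word = "strawberry"
r_count = word.count("r")
print(f"'{word}' contains {r_count} R's")  # 'strawberry' contains 3 R's
```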
“Not ChatGPT gaslighting you,” one person commented. Another joked that YallWatchThis has successfully “defeated ChatGPT in single combat.”
Meanwhile, a third user shared their own experience asking the chatbot. While it was able to correctly identify three R’s, when the user “told” ChatGPT that there were two, it allegedly “apologized and agreed”.
Shortly after YallWatchThis’ argument with ChatGPT, OpenAI announced plans to launch a new AI model with “greater” problem-solving skills. To the TikToker’s surprise, the new model was coincidentally named ‘Strawberry’.