It’s AI-lingual.
English-speaking ChatGPT users were caught off guard after OpenAI’s chatbot started increasingly injecting Arabic words into its responses, as seen in viral social media posts.
“It did it twice on my phone and once on my work laptop, I’m not even in an Arabic speaking country, nor the Middle East lol,” claimed one flabbergasted GPT trustee on Reddit.
The post included a screenshot of a recipe with one of the ingredients randomly listed in Arabic.
That wasn’t the only alleged slip of the digital tongue. In a viral X post, one flummoxed AI enthusiast recalled how the chatbot decided to plop in some Arabic while helping them write a prompt for a logo.
When asked about the gaffe, the large language model claimed that it “slipped in by mistake,” per the screenshot.
“SLIPPED IN??? It’s a whole different alphabet,” spluttered the confused user in the caption. “Has anyone else had ChatGPT randomly switch languages on them?”
Many Reddit commenters recalled experiencing the same glitch, with some claiming that the multilingual machine had started responding to prompts in Armenian, Hebrew, Spanish, Chinese and Russian.
Commenters were taken aback by the technological tics, which were blamed on everything from “AI hallucinations” to ChatGPT becoming increasingly stupid.
However, as more astute users observed, this so-called digital pidgin actually has to do with how the AI system is programmed. The machine is trained using a cybernetic shorthand called tokens, which correspond to the data it’s attempting to process, whether it’s images, videos, audio clips, or, in this case, text.
For instance, large language models like ChatGPT may represent a shorter word with a single token while splitting a longer word into several, with each of these digital abbreviations denoted by a different number. The more efficient the tokenization, the less computing power is required for training and inference.
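The splitting described above can be sketched with a toy greedy tokenizer. Note that the vocabulary and matching rule here are invented for illustration only; OpenAI’s real tokenizers use byte-pair encoding over vastly larger vocabularies.

```python
# Toy illustration of subword tokenization (NOT OpenAI's actual
# tokenizer): greedily match the longest vocabulary piece, left to right.
# The vocabulary below is made up purely for demonstration.
TOY_VOCAB = {"low": 1, "fat": 2, "yo": 3, "gurt": 4, "-": 5}

def toy_tokenize(text):
    """Split text into the longest known vocabulary pieces."""
    tokens = []
    i = 0
    while i < len(text):
        for j in range(len(text), i, -1):  # try the longest piece first
            piece = text[i:j]
            if piece in TOY_VOCAB:
                tokens.append(piece)
                i = j
                break
        else:
            raise ValueError(f"no token covers {text[i]!r}")
    return tokens

# A short word can be a single token, while a longer word splits in two:
print(toy_tokenize("low"))      # ['low']
print(toy_tokenize("low-fat"))  # ['low', '-', 'fat']
print(toy_tokenize("yogurt"))   # ['yo', 'gurt']
```

In this sketch, “low” costs one token but “yogurt” costs two, which hints at why a model might find one language’s word cheaper to emit than another’s.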
However, because these AI bots are trained on text in many languages, they might throw in a corresponding foreign word that takes fewer tokens to encode, and is therefore more economical for the machine to process.
One Redditor replied to the aforementioned recipe post, explaining that the Arabic word in question means “low,” so the ingredient actually read “low-fat yogurt.”
In another post discussing the Health Insurance Portability and Accountability Act of 1996 (HIPAA), the original poster described how the Arabic phrase translated to “within the USA” so it “did make sense.”
Incidentally, the Arabinglish phenomenon isn’t the first time ChatGPT was caught speaking in a different tongue.
In 2024, the advanced AI chatbot appeared to have an epic meltdown that caused it to start babbling in Spanglish and firing off other gibberish responses.
Per one such example posted to the platform, a user had inquired about which Bill Evans jazz albums it would recommend getting on vinyl.
After rattling off several recommendations, GPT puzzlingly repeated the phrase “happy listening” over and over again like a stuck jukebox, or perhaps — more ominously — like HAL 9000 dying at the end of “2001: A Space Odyssey.”
Others claimed that the virtual assistant was responding to their queries in Spanglish: “Let me encylopease me si there’s more wonderenda tu articulation’s hungry for!” it wrote during one exchange, per a viral screenshot.