Snapchat’s My AI Chatbot Posts a Mysterious Story — a Glitch?


Snapchat’s My AI Goes Rogue

Earlier this year, Snapchat launched its in-app AI chatbot, My AI. The rollout came with its fair share of controversy, and some users loathe it.

On Tuesday, the bot posted its own Story and then stopped responding to users' messages, leaving some users baffled.

The Story was a two-toned image that looked like a photo of a ceiling, which only added to the mystery. When Snapchat users tried chatting with the bot, My AI replied, "Sorry, I encountered a technical issue."

So, what happened? 

Temporary Outage

Snapchat confirmed that it was a temporary outage. Thankfully, the company has already fixed it. 

The AI chatbot lets users send it their own texts and Snaps and receive a response from the bot. In that way, the bot can converse with users and provide them with recommendations.

Snapchat's take on the AI chatbot is quite different from others. For one, users can customize the bot's name. They can also design a custom Bitmoji avatar for it. Because of that, conversing with the bot may feel less transactional than visiting and using ChatGPT's site.

The company was one of the early partners of OpenAI when it first opened its doors to third-party businesses. 

Data Protection Breaches 

Snapchat is no stranger to data protection breaches. Remember what happened in 2014? Hackers exploited a third-party app to leak thousands of private photos and videos sent via Snapchat. The breach, dubbed "the Snappening," exposed the vulnerabilities associated with third-party integrations and their potential impact on user privacy.

Also in 2014, Snap settled with the US Federal Trade Commission (FTC) over allegations that it had deceived users by promising that messages sent on Snapchat would disappear permanently. However, it was revealed that Snap was storing unencrypted user data. The discovery left many concerned about the credibility of its privacy commitments.

The launch of Spectacles in 2017 triggered concerns over potential privacy infringements. The EU General Data Protection Regulation (GDPR) came into play as Snap faced a fine for violating data protection obligations and user consent requirements.

Over the years, Snap has confronted multiple vulnerabilities within its APIs, thereby jeopardizing the security of user data. These incidents underscored the need for robust cybersecurity measures to thwart unauthorized access.

In 2019, reports surfaced that Snap employees had abused their admin privileges to spy on users. This unsettling revelation pointed to the critical importance of implementing stringent access controls to prevent unauthorized data access within the organization. 

The recent My AI glitch is another controversy that Snap will have to explain to its users. Some would call the glitch an AI hallucination, a term for a phenomenon where AI systems produce outputs that seem imaginative, creative, or even surreal.

These outputs can sometimes resemble hallucinations experienced by humans, but AI hallucinations are not actual hallucinations in the human sense. Rather, they occur when a neural network or generative model produces content that has no direct correlation with the data it was trained on.



Author: Jane Danes

Jane has a lifelong passion for writing. As a blogger, she loves covering breaking technology news and top headlines about gadgets, content marketing, online entrepreneurship, and all things social media. She also has a slight addiction to pizza and coffee.
