A new phishing campaign is abusing the names of the well-known AI products ChatGPT and Gemini to trick users into downloading phony iOS apps and divulging Facebook login credentials. The attack begins with phishing emails that appear to offer helpful business, advertising, or account management tools. Instead, the links lead users to fake applications designed to steal login credentials.
How the Phishing Campaign Operates

Shared indicators show that the attackers are promoting malicious iPhone apps through links to the Apple App Store. The apps use reputable AI branding to appear authentic and are marketed as advertising or business support tools. After installation, they prompt users to log in with their Facebook credentials, claiming this is necessary for account access or ad management.
Rather than offering actual services, the apps function as credential-harvesting tools. Usernames and passwords entered into them can be collected by the attackers and used to take over Facebook accounts, particularly those connected to businesses and advertising platforms. Such access is valuable to threat actors because compromised business accounts can be used for ad fraud, scam promotion, or additional phishing activity.
What Makes This Attack Noteworthy?

This campaign demonstrates how cybercriminals are shifting from simple fake websites to mobile apps, which users tend to find more trustworthy. By using names associated with popular AI products, the attackers increase the likelihood that victims will believe the apps are legitimate or authorized tools.
Additionally, the use of Facebook credential prompts suggests a focus on business users, marketers, and advertisers who manage pages or ad accounts from mobile devices.

[Image: Phony AI Applications Phish (Source: SpiderLabs)]

According to SpiderLabs research, the attack also shows that trusted platforms are not impervious to abuse. Even when downloading apps from official marketplaces, users should confirm the publisher, the purpose of the app, and the permissions it requests.
Any app claiming a connection to ChatGPT or Gemini should be examined carefully, particularly if it requests unrelated credentials such as a Facebook login. Users and security teams should watch for phishing emails mentioning AI tools or ad services, unexpected Facebook login prompts, and suspicious app installs.
Employers should also instruct employees not to sign in via email links unless the service has been independently verified.

Indicators of Compromise

Type of Indicator | Context               | Value
URL               | Malicious ChatGPT app | hxxps[://]apps[.]apple[.]com/au/app/ads-gpt/id6759514534
URL               | Malicious Gemini app  | hxxps[://]apps[.]apple[.]com/au/app/geminiai-advertising/id6759005662

This campaign serves as a reminder that brand impersonation remains one of the most effective phishing techniques, particularly when it combines reputable names, mobile platforms, and business account targeting.
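Security teams who want to feed these defanged indicators into a blocklist or threat-intelligence platform first need to convert them back into normal URLs. A minimal Python sketch of one way to do this (the `refang` helper name and the hxxp/bracket conventions handled here are assumptions based on common defanging styles, not part of the SpiderLabs report):

```python
import re

def refang(ioc: str) -> str:
    """Convert a defanged indicator (hxxps, [://], [.]) back to a normal URL."""
    ioc = ioc.replace("hxxp", "http")      # hxxps:// -> https://
    ioc = ioc.replace("[://]", "://")      # restore the scheme separator
    # Restore bracketed dots, tolerating stray whitespace inside the brackets
    ioc = re.sub(r"\[\s*\.\s*\]", ".", ioc)
    return ioc

# The two indicators shared for this campaign
iocs = [
    "hxxps[://]apps[.]apple[.]com/au/app/ads-gpt/id6759514534",
    "hxxps[://]apps[.]apple[.]com/au/app/geminiai-advertising/id6759005662",
]

for ioc in iocs:
    print(refang(ioc))
# -> https://apps.apple.com/au/app/ads-gpt/id6759514534
# -> https://apps.apple.com/au/app/geminiai-advertising/id6759005662
```

Keeping indicators defanged in reports and refanging them only at ingestion time avoids accidental clicks on live malicious links.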