Could it violate EU law? Google Chrome secretly installs a 4GB AI model on users’ machines, and reinstalls it even after deletion
A researcher’s report indicates that Google Chrome silently downloads a roughly 4GB AI model onto users’ computers and forcibly re-downloads it even after deletion. The behavior may violate EU privacy law and shifts substantial bandwidth and environmental costs onto the public, drawing criticism as a “dark pattern” that strips users of control.
Cybersecurity researcher discovers Google Chrome covertly downloads AI models
Renowned security researcher Alexander Hanff’s latest report states that the Google Chrome browser secretly downloads a local AI model of roughly 4GB onto users’ computers, without prior notification or consent.
To verify the behavior, Hanff ran comparative tests on macOS using a brand-new Chrome profile, recording the exact file activity through system-level file system event logs captured outside the application itself.
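Hanff relied on macOS system-level event logging; as a rough way to reproduce that kind of observation, the sketch below uses the third-party Python watchdog library to log large file writes under Chrome’s profile directory. The profile path is the standard Chrome location on macOS, but the 100 MB threshold and the overall setup are our illustrative assumptions, not details from the report.

```python
# Minimal file-activity monitor (pip install watchdog).
# Watches Chrome's macOS profile directory and logs any modified file
# that exceeds an arbitrary 100 MB threshold. Path and threshold are
# illustrative assumptions, not taken from Hanff's report.
import os
from pathlib import Path

from watchdog.events import FileSystemEventHandler
from watchdog.observers import Observer

WATCH_DIR = Path.home() / "Library" / "Application Support" / "Google" / "Chrome"
THRESHOLD = 100 * 1024 * 1024  # 100 MB

class LargeWriteLogger(FileSystemEventHandler):
    """Print a line whenever a watched file is modified and is unusually large."""
    def on_modified(self, event):
        if event.is_directory:
            return
        try:
            size = os.path.getsize(event.src_path)
        except OSError:
            return  # file may have vanished between the event and the stat
        if size > THRESHOLD:
            print(f"large write: {event.src_path} ({size / 1e9:.2f} GB)")

observer = Observer()
observer.schedule(LargeWriteLogger(), str(WATCH_DIR), recursive=True)
observer.start()
try:
    observer.join()  # run until interrupted with Ctrl-C
except KeyboardInterrupt:
    observer.stop()
    observer.join()
```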
Installed automatically with zero interaction, forcibly reinstalled after deletion
Hanff’s analysis shows that Chrome autonomously creates the model directories and downloads the full 4GB of data in the background, without any user interaction. Chrome writes a file named weights.bin to disk; the file belongs to Chrome’s local AI system, built on Google’s lightweight Gemini Nano model.
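Readers who want to check their own machine can search Chrome’s profile directory for the file. A minimal sketch for macOS follows; the profile path is the standard Chrome location and weights.bin is the filename cited in the report, but the exact subdirectory holding the model is not specified there, so the sketch scans recursively.

```python
# Scan Chrome's macOS profile directory for the weights.bin file
# described in the report and print its size. The profile path is the
# standard location; no assumption is made about the model's subdirectory.
from pathlib import Path

chrome_dir = Path.home() / "Library" / "Application Support" / "Google" / "Chrome"
for path in chrome_dir.rglob("weights.bin"):
    size_gb = path.stat().st_size / 1e9
    print(f"{path} ({size_gb:.2f} GB)")
```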
According to the analysis, the download starts automatically whenever the machine meets certain hardware requirements. The entire process, apparently scheduled during idle browsing time, completed in just over 14 minutes.
(Image source: Alexander Hanff’s report, documenting Chrome silently downloading about 4GB of local AI model data onto users’ computers.)
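For scale, those figures imply a substantial sustained transfer rate; the 4GB size and 14-minute duration come from the report, while the arithmetic below is ours.

```python
# Back-of-the-envelope: sustained bandwidth implied by a 4 GB download
# completing in about 14 minutes (both figures from the report;
# decimal gigabytes assumed).
size_bytes = 4 * 1000**3      # 4 GB
duration_s = 14 * 60          # ~14 minutes

mb_per_s = size_bytes / duration_s / 1e6
mbit_per_s = mb_per_s * 8
print(f"~{mb_per_s:.1f} MB/s, i.e. ~{mbit_per_s:.0f} Mbps sustained")
# -> ~4.8 MB/s, i.e. ~38 Mbps sustained
```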
However, Chrome never discloses that several gigabytes of AI model data are stored locally, nor does it offer an obvious setting to prevent the download. Even if users find and delete the file themselves, the browser re-downloads it later, unless they disable deeply buried experimental flags or remove Chrome entirely.
He points out that Chrome’s internal status files provide further evidence: before downloading, the browser proactively assesses the system’s hardware and marks eligible devices for the local model. In other words, Chrome decides unilaterally which devices receive it.
Researcher accuses Google Chrome of potentially violating EU laws
In addition to revealing technical details, Hanff raises legal concerns.
He previously criticized Anthropic’s Claude desktop app as “spyware” for quietly installing bridges into multiple Chromium-based browsers on his system, including five browsers he had never installed; now he finds Chrome silently installing AI model files. In both cases everything happens without user prompts or meaningful disclosure, and the software reinstalls itself after removal.
He argues that both companies’ actions very likely violate EU rules, including the ePrivacy Directive’s requirements on storing data on user devices and the General Data Protection Regulation’s (GDPR) requirements on transparency and lawful processing.
Although the researcher’s claims have not been tested in court, they reflect the growing tension between tech giants’ push for new features and regulatory expectations, especially in Europe.
Google shifts energy and bandwidth costs to users worldwide?
Hanff also estimates the environmental impact of Chrome’s silent 4GB downloads. Deployed across millions or even billions of devices, the total CO₂-equivalent emissions from distributing these files could reach tens of thousands of tonnes, roughly the annual emissions of tens of thousands of cars.
(Image source: Alexander Hanff’s report, from his estimate of the environmental impact of Chrome’s silent downloads.)
Although the estimates depend on assumptions about scale and energy, he is explicit that pushing large binary files to user devices carries very real costs, externalized onto the environment and the public.
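To show how such an estimate comes together, here is an illustrative reconstruction. The per-gigabyte network energy intensity and grid carbon factor below are common ballpark figures we have assumed; they are not taken from Hanff’s report.

```python
# Illustrative transfer-emissions estimate. ASSUMED inputs (not from
# the report): ~0.05 kWh of network energy per GB transferred and a
# grid intensity of ~0.4 kg CO2 per kWh.
model_gb = 4
devices = 1_000_000_000       # the "billions of devices" scenario
kwh_per_gb = 0.05             # assumed network energy intensity
kg_co2_per_kwh = 0.4          # assumed grid carbon intensity

total_kwh = model_gb * devices * kwh_per_gb
total_tonnes = total_kwh * kg_co2_per_kwh / 1000
print(f"{total_tonnes:,.0f} tonnes CO2e")  # -> 80,000 tonnes
```

Under these assumptions, a billion-device rollout lands in the tens of thousands of tonnes, consistent with the order of magnitude Hanff describes.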
For many users there is also a direct network-traffic impact. On unlimited fiber a 4GB download may be negligible, but for users on limited or metered data plans, covertly transferring several gigabytes can cause tangible financial losses. Even in developed markets, users relying on mobile hotspots or living in remote areas can be affected.
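As a rough illustration, with per-gigabyte prices assumed by us rather than taken from the article:

```python
# Cost of an unrequested 4 GB download at ASSUMED metered rates;
# actual prices vary widely by carrier and region.
download_gb = 4
for label, usd_per_gb in [("cheap metered plan", 2), ("typical overage rate", 10)]:
    print(f"{label}: ${download_gb * usd_per_gb}")
# -> cheap metered plan: $8
# -> typical overage rate: $40
```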
Tech giants act first, sacrificing user rights through dark patterns
From Hanff’s perspective, both Anthropic and Google choose to act first and leave users to bear the consequences.
Whether it is covertly registering deep system integrations or downloading gigabytes of model data in the background, the pattern is the same: users’ devices are treated as deployment targets and active control is stripped away, closely resembling the long-criticized “dark patterns” of software design.
Dark patterns, also called “deceptive design,” are carefully crafted user interfaces intended to mislead or deceive users into doing things they wouldn’t otherwise choose, benefiting the vendor at the expense of user rights.
In the cases Hanff documents, features are not only enabled by default but hidden behind obscure settings or implemented in ways that are hard to remove. His research suggests that the shift toward local AI has not curbed the flaws of dark patterns; if anything, it is accelerating them.
Further reading:
- Is China’s drone manufacturer exposing user security? How he reverse-engineered Claude to gain control of devices worldwide
- Are people still buying AI toys? Bondu leaks 50k children’s personal data, while Miiloo propagates “Taiwan is part of China”