
It’s been discovered that Google Chrome may be consuming more than just system memory: a 4 GB AI model was found downloaded to a user’s PC. It seems we can never have enough AI in our lives these days, and Google apparently wants to ensure that everyone who can run it locally has it installed, whether they’ve consented or not. Researcher Alexander Hanff (via TechPowerUp), aka The Privacy Guy, found Google Chrome silently downloading the Gemini Nano AI model without user consent or input. Sadly, this tactic appears to be more widespread than most PC users realize, as Anthropic’s Claude was found behaving in a similar pattern.
“The pattern was: install on user launch of product A, write configuration into the user’s installs of products B, C, D, E, F, G, H without asking. Reach across vendor trust boundaries. No consent dialog. No opt-out UI. Re-installs itself if the user removes it manually, every time Claude Desktop is launched.”
– Alexander Hanff
Hanff’s research further exposed that Claude is establishing messaging connections with Chromium-based browsers such as Brave, Edge, Arc, Vivaldi, Opera, and Chromium itself, and it also led to the discovery of Chrome’s AI model download. The file, titled “weights.bin” and located in the OptGuideOnDeviceModel folder, serves as Gemini Nano’s local LLM; if deleted, it reinstalls itself without user permission or input. For non-enterprise users, the only way to prevent it from reinstalling is to disable Chrome’s AI features via chrome://flags. Mac users face the same behavior, with macOS’s file-activity reporting confirming the downloads in testing.
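For readers who want to check whether the model has landed on their own machine, the report’s details (a “weights.bin” file inside an OptGuideOnDeviceModel directory) suggest a simple search. The sketch below is a minimal, hedged example: the candidate Chrome user-data paths are assumptions based on typical install locations and may differ by platform, channel, or profile, and the folder layout under OptGuideOnDeviceModel is not documented here, so the scan is recursive.

```python
from pathlib import Path

# Candidate Chrome user-data roots. These are assumptions based on
# common default install locations; your system may differ.
CANDIDATE_ROOTS = [
    Path.home() / "AppData/Local/Google/Chrome/User Data",      # Windows
    Path.home() / "Library/Application Support/Google/Chrome",  # macOS
    Path.home() / ".config/google-chrome",                      # Linux
]

def find_gemini_nano(roots=CANDIDATE_ROOTS):
    """Return (path, size-in-GB) pairs for any weights.bin found under
    an OptGuideOnDeviceModel directory, per the file name and folder
    reported in Hanff's findings."""
    hits = []
    for root in roots:
        if not root.is_dir():
            continue
        # '**' lets the model sit in a versioned subfolder, if any.
        for path in root.rglob("OptGuideOnDeviceModel/**/weights.bin"):
            size_gb = path.stat().st_size / 1024**3
            hits.append((path, size_gb))
    return hits

if __name__ == "__main__":
    found = find_gemini_nano()
    if not found:
        print("No on-device model file found in the candidate paths.")
    for path, size_gb in found:
        print(f"{path} ({size_gb:.1f} GB)")
```

Note that, per the report, deleting the file is not enough on its own — Chrome re-downloads it — so locating it is only useful alongside disabling the AI features in chrome://flags.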
“This week I discovered the same pattern, executed by Google. Google Chrome is reaching into users’ machines and writing a 4 GB on-device AI model file to disk without asking.”
– Alexander Hanff
It is entirely possible that Google claims permission for Chrome’s automatic download of local AI models somewhere in its EULA, but burying consent there may not satisfy regulatory requirements in the European Union or the United Kingdom, or local laws in the United States.
As if it isn’t unsettling enough that such a large file is silently downloaded in the background, it is even more concerning how much effort is involved in removing it and preventing it from coming back. Furthermore, if this practice is found non-compliant with the relevant legal requirements, what consequences will there be for Google and other AI providers, such as Anthropic? For Google, the repercussions could be significant given the millions (billions?) of machines running Chrome.
