Will Chrome, Safari, or Firefox add native support for local LLMs in 2024? (Opera just did)
105 · Ṁ20k · Dec 31 · 93% chance

Opera issued a press release titled "Opera becomes the first major browser with built-in access to local AI models."

Will any other major browsers follow suit*?

* Access in beta or dev mode would suffice to resolve this market as yes

  • Update 2024-12-15 (PST): Canary (experimental) versions of browsers do not count; only beta or dev versions will qualify for resolution. (AI summary of creator comment)


Can’t this resolve YES? Chrome Canary at least has built-in support for local Gemini

@jonathan21m Canary is the experimental version of Chrome and is distinct from the dev/beta channels

@Soli Apologies for the incorrect comment. The built-in support for Gemini Nano is actually in regular Chrome right now; it was added in version 131. Here are the links: https://developer.chrome.com/origintrials/#/view_trial/1923599990840623105

https://chromestatus.com/feature/5193953788559360
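The claim above can be probed from a page's own JavaScript. A minimal sketch, assuming the origin-trial-era `ai.summarizer` interface shipped around Chrome 131; the exact names have changed across releases, so treat this surface as an assumption rather than a stable contract:

```javascript
// Probe for Chrome's built-in Summarizer API and, if present, use it.
// `globalThis.ai.summarizer` is the origin-trial-era entry point
// (an assumption tied to Chrome ~131); outside an enrolled origin,
// or outside Chrome entirely, the probe reports the API as missing.
async function summarize(text) {
  const ai = globalThis.ai;
  if (!ai?.summarizer) {
    return null; // API not exposed: wrong browser/channel, or origin not enrolled
  }
  const summarizer = await ai.summarizer.create();
  const summary = await summarizer.summarize(text);
  summarizer.destroy?.(); // free the session if the API provides destroy()
  return summary;
}

summarize("some long article text").then((s) => {
  console.log(s === null ? "Summarizer API not available" : s);
});
```

A null result only means the entry point is absent; per the docs quoted below, even on an enrolled origin the first call may additionally trigger a one-time model download.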

sold Ṁ481 NO

@jonathan21m Looks promising, but I still need to validate when I have more time tomorrow.

The Summarizer API uses a powerful AI model trained to generate high-quality summaries. While the API is built into Chrome, the model is downloaded separately the first time a website uses the API. [source]

bought Ṁ250 YES

@Soli Tested on Chrome stable 131 by visiting https://chrome.dev/web-ai-demos/summarization-api-playground/ : it works, and the console logs show it running on a local TensorFlow Lite (XNNPACK) backend.

bought Ṁ1,000 YES

@jonathan21m Implementation status: In developer trial (Behind a flag); this is three steps behind "Prepare to Ship"

@marvingardens But it's also in origin-trial status, where a site opts in to have it work on its own domain (origin) in stable Chrome, and it's demonstrably usable on stable.

@Lun Demonstrating it on https://lun.horse on chrome 131

@Lun Works for me too! Presumably lun.horse is not in the origin trial?

@lxgr It is in the trial, getting added to the trial is a short form and was immediately/automatically approved.

bought Ṁ1,000 YES

@Lun Cool, thank you!

I'd argue this should still count: the market explicitly accepts beta/dev versions, and this works even in the production release as long as the origin has opted in to the API (which requires nothing from me as a visitor).

bought Ṁ20 NO

Why is this so high...? This market seems miscalibrated!

bought Ṁ300 NO

@Soli Oddly enough, people keep buying YES. Firefox is pretty slow with such features, and so is Chrome! Maybe Safari might do it, but Apple is generally behind the curve, so that seems even more unlikely.

The reason Vivaldi, Edge, and Opera do well with feature geeks is that they ship new stuff faster.

Firefox doesn't even have tab groups, for context! Chrome has nowhere near the integrations of Edge.

is it happening? 👀

bought Ṁ750 YES from 54% to 75%
bought Ṁ100 NO

Can't see Google or Apple adding built-in local model access to their flagship browsers. Gemini, sure, but local? Meh.

@firstuserhere What about Firefox? It could be their differentiator.

@firstuserhere Assuming Apple doesn’t release a local model optimized for their machines 😏

I think this is a poor question, because any browser can already support local LLMs via extensions. If you mean first-party integrations, that is a different question.

@NateIO The linked article clearly says “built-in access to local … ”, which makes it obvious what this market is about :)

@NateIO I changed the title to “native support”; hopefully this resolves any misunderstandings

Google will use Chrome to push their own LLM instead, but the other two might.
