{"provider_name":"Hatena Blog","provider_url":"https://hatena.blog","type":"rich","url":"https://impsbl.hatenablog.jp/entry/CallGeminiNanoLocallyInChrome_en","width":"100%","author_name":"espio999","html":"<iframe src=\"https://hatenablog-parts.com/embed?url=https%3A%2F%2Fimpsbl.hatenablog.jp%2Fentry%2FCallGeminiNanoLocallyInChrome_en\" title=\"Calling built-in Gemini Nano - for Chrome Version 128, 129 or later - Technically Impossible\" class=\"embed-card embed-blogcard\" scrolling=\"no\" frameborder=\"0\" style=\"display: block; width: 100%; height: 190px; max-width: 500px; margin: 10px 0px;\"></iframe>","blog_url":"https://impsbl.hatenablog.jp/","image_url":"https://cdn-ak.f.st-hatena.com/images/fotolife/e/espio999/20240904/20240904222535.png","title":"Calling built-in Gemini Nano - for Chrome Version 128, 129 or later","description":"Installing and running an LLM on an 8GB RAM PC was one of the fun activities last year*1. Now, LLMs are integrated into web browsers, and users can call one locally through the browser's console. developer.chrome.com Users can call the Gemini Nano model with an API in Google Chrome 128 or later. Even though it is a small-size LLM, es\u2026","height":"190","version":"1.0","blog_title":"Technically Impossible","author_url":"https://blog.hatena.ne.jp/espio999/","categories":["AI","English post","Google Chrome","IT"],"published":"2024-09-11 00:00:00"}