Generating words using compression algorithms and random data
Technically Impossible (Hatena Blog), by espio999 — published 2023-09-04
https://impsbl.hatenablog.jp/entry/SpeechOfLZMA_en

As I delved into the overview of large-scale language models since the beginning of the year, the budding idea that emerged was that of language generation as a probability theory. There seems to exist a "true" probability model that produces appropriate words, and the idea is to bring large-scale l…

Categories: AI, English post, IT, Python, Topics