BERT — HTN20190109の日記 (Hatena Blog, published 2025-12-18)
https://htn20190109.hatenablog.com/entry/2025/12/18/230337

- BERT (Bidirectional Encoder Representations from Transformers): a natural language processing model based on a bidirectional Transformer.
- Input is a single sentence or a pair of two sentences.
- The input embedding combines three parts: position embeddings, token embeddings, and segment embeddings.
- After pretraining, fine-tuning is required.
- The class token ([CLS]) and the per-word tokens are used as outputs.
- The pretraining tasks are MLM (Masked Language Model) and NSP (Next Sentence Prediction).
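The three-way embedding sum can be illustrated with a minimal NumPy sketch. The lookup tables here are randomly initialized toy matrices, and the token/segment ids are made-up values (in a real BERT these come from trained weights and a WordPiece tokenizer); only the summation structure matches BERT's input layer.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, max_len, n_segments, d_model = 100, 16, 2, 8

# Hypothetical lookup tables, randomly initialized for illustration only.
tok_emb = rng.normal(size=(vocab_size, d_model))   # token embeddings
pos_emb = rng.normal(size=(max_len, d_model))      # position embeddings
seg_emb = rng.normal(size=(n_segments, d_model))   # segment (sentence A/B) embeddings

def bert_input_embedding(token_ids, segment_ids):
    """Sum token, position, and segment embeddings, as in BERT's input layer."""
    positions = np.arange(len(token_ids))
    return tok_emb[token_ids] + pos_emb[positions] + seg_emb[segment_ids]

# Sentence pair laid out as [CLS] A-tokens [SEP] B-tokens [SEP]
token_ids   = np.array([1, 7, 9, 2, 11, 13, 2])   # toy ids: 1=[CLS], 2=[SEP]
segment_ids = np.array([0, 0, 0, 0, 1, 1, 1])     # 0 = sentence A, 1 = sentence B

emb = bert_input_embedding(token_ids, segment_ids)
print(emb.shape)  # (7, 8): one d_model vector per input token
```

The [CLS] position's final hidden vector is what NSP (and downstream classification fine-tuning) reads, while the per-token vectors feed MLM and token-level tasks.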