{"id":987,"date":"2024-02-25T23:25:41","date_gmt":"2024-02-25T14:25:41","guid":{"rendered":"http:\/\/mukgee.com\/?p=987"},"modified":"2024-02-26T00:02:31","modified_gmt":"2024-02-25T15:02:31","slug":"transformer-%ea%b8%b0%eb%b0%98%ec%9d%98-%eb%8b%a4%ec%96%91%ed%95%9c-%eb%aa%a8%eb%8d%b8%ea%b3%bc-%ec%95%84%ed%82%a4%ed%85%8d%ec%b3%90","status":"publish","type":"post","link":"http:\/\/mukgee.com\/?p=987","title":{"rendered":"Transformer \uae30\ubc18\uc758 \ub2e4\uc591\ud55c \ubaa8\ub378\uacfc \uc544\ud0a4\ud14d\uccd0"},"content":{"rendered":"<p data-pm-slice=\"0 0 []\">LLM \uae30\ubc18\uc758 Application \uc744 \uac1c\ubc1c\ud558\uae30 \uc2dc\uc791\ud588\ub2e4\uba74 \uac00\uc7a5 \ub9ce\uc774 \ubc29\ubb38\ud558\ub294 \ud398\uc774\uc9c0\ub294 \uc544\ub9c8 <a href=\"https:\/\/huggingface.co\/\">\bHugging Face<\/a> \uc77c\uac83 \uac19\uc2b5\ub2c8\ub2e4.<\/p>\n<p>&nbsp;<\/p>\n<p>Transformer \uc758 \ub4f1\uc7a5\uacfc \uc774\ud6c4 GPT \/ BERT \uc758 \uc131\uacf5\uc73c\ub85c \uc815\ub9d0 \ub2e4\uc591\ud55c \uc544\ud0a4\ud14d\uccd0\ub97c \uac00\uc9c4 transformers \ubaa8\ub378\uc774 \ub4f1\uc7a5\ud588\uace0, Huggingface \ub355\ubd84\uc5d0 \b\uc27d\uac8c NLP Task \uc5d0 \uc811\uadfc\ud560 \uc218 \uc788\uac8c \ub418\uc5c8\ub294\ub370\uc694.<\/p>\n<p>\uc624\ub298\uc740 Huggingface\uc5d0 \uc62c\ub77c\uc628 \ub2e4\uc591\ud55c Transformers \ubaa8\ub378\ub4e4\uc5d0 \ub300\ud574 \uc774\uc57c\uae30\ud574\ubcf4\ub824\uace0 \ud569\ub2c8\ub2e4.<\/p>\n<p>&nbsp;<\/p>\n<p><img src=\"https:\/\/devocean.sk.com\/editorImg\/2024\/2\/18\/29096475a3b078fafb8e16e2b4acb25099c1bbeae58b59e2e7d10646c098e23c\" alt=\"image.png\" \/><\/p>\n<h2><strong>1. 
Transfomer \ubaa8\ub378<\/strong><\/h2>\n<p>Attention Is All You Need \ub17c\ubb38\uc5d0\uc11c \uc18c\uac1c\ub41c Transformer\ub294 \uc774\uc804\uc758 \ub9ce\uc740 Seq2Seq Model\ub4e4 \ucc98\ub7fc Encoder-Decoder \uad6c\uc870\ub97c \uac00\uc9c0\uace0 \uc788\uc2b5\ub2c8\ub2e4.<\/p>\n<p>( Transfomer\uc758 \uc790\uc138\ud55c \uc124\uba85\uc740 tensorflow\uc758 <a href=\"https:\/\/blog.tensorflow.org\/2019\/05\/transformer-chatbot-tutorial-with-tensorflow-2.html\">Transform \ud29c\ud1a0\ub9ac\uc5bc \ud398\uc774\uc9c0<\/a>\ub97c \ucc38\uace0 \ubd80\ud0c1\ub4dc\ub9bd\ub2c8\ub2e4. )<\/p>\n<p>&nbsp;<\/p>\n<p>\uc774\ud6c4 Transformer\ub294 \uc67c\ucabd\uc758 Encoder \ubd80\ubd84\uc744 \ud65c\uc6a9\ud55c Encoder-Only \uc544\ud0a4\ud14d\uccd0\uc640 \uc624\ub978\ucabd\uc758 Decoder\ub97c \ud65c\uc6a9\ud55c GPT \uacc4\uc5f4\uc758 Decoder-only \uc544\ud0a4\ud14d\uccd0 \uadf8\ub9ac\uace0, Encoder\uc640 Decoder \ubaa8\ub450\ub97c \ud65c\uc6a9\ud558\ub294 Encoder-decoder \uc544\ud0a4\ud14d\uccd0\ub85c \ub098\ub220 \ubc1c\uc804\ud558\uace0 \uc788\uc2b5\ub2c8\ub2e4.<\/p>\n<p><img src=\"https:\/\/devocean.sk.com\/editorImg\/2024\/2\/18\/44b5e7623da68c0d5be26a22239740d6ff6a7de73cd443b92f846cc3ab379307\" alt=\"image.png\" \/><\/p>\n<h2><strong>2. Encoder-Only<\/strong><\/h2>\n<p>Encoder-Only \uc544\ud0a4\ud14d\uccd0\uc758 \ub300\ud45c\uc8fc\uc790\ub294 2018\ub144 \uad6c\uae00\uc5d0\uc11c \ubc1c\ud45c\ud55c BERT(<a href=\"https:\/\/arxiv.org\/abs\/1810.04805?source=post_page\">\ub17c\ubb38<\/a>) \uc785\ub2c8\ub2e4. 
BERT stands for Bidirectional Encoder Representations from Transformers; as the name says, it is a model built on the Transformer's encoder.</p>
<p>BERT is trained with two objectives: predicting masked tokens in text (Masked Language Modeling, MLM) and predicting whether one sentence follows another (Next Sentence Prediction, NSP).</p>
<p>It exploits the Transformer encoder's ability to turn the input into an encoded representation, which is then applied in different ways depending on the purpose of the downstream task.</p>
<p><img src="https://devocean.sk.com/editorImg/2024/2/25/4885b94a1528851513ef0184b2eb311837c11c243f91df115b805270562ce792" alt="image.png" /></p>
<p>( Source: https://huggingface.co/blog/bert-101 )</p>
<p>Now, let's take a look at the BERT model using Hugging Face's transformers library.</p>
<p>Hugging Face models are built on a base AutoModel, and a model matching a given NLP task can be loaded as AutoModelFor{XX} (e.g., AutoModelForSequenceClassification).</p>
<p>To examine BERT's basic structure, we will first load a pretrained BERT model with AutoModel.</p>
<pre><code data-language="python">from transformers import AutoModel
model = AutoModel.from_pretrained("bert-base-uncased")</code></pre>
<p><img src="https://devocean.sk.com/editorImg/2024/2/25/876a2593eea85b6454038e535565075a9abd8a972911c64e7f2ff3c5a659d968" alt="image.png" /></p>
<p>The printed structure shows an embeddings layer that applies token embedding + positional embedding to the input sequence, followed by 12 multi-head attention layers, each paired with a linear (feed-forward) layer.</p>
<p>Next, to perform the masked-token prediction task, let's use AutoModelForMaskedLM from the transformers library and see which layers are attached after the AutoModel.</p>
<pre><code data-language="python">from transformers import AutoModelForMaskedLM
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")</code></pre>
<p><img src="https://devocean.sk.com/editorImg/2024/2/25/7f639b0c2aca852cebfe3b33527f9c1a205cb7db782b1bc2524125ebdb07e8fa" alt="image.png" /></p>
<p>Unlike the structure printed by AutoModel, this model passes the encoder output through an additional layer called BertOnlyMLMHead.</p>
<p>In other words, the model first builds embedding vectors (hidden states) that represent the input sentence well, and then uses them for the masked-token prediction task (MLM).</p>
<p>Unlike GPT, architectures like BERT compute bidirectional self-attention over both the left and right (preceding and following) context, so they are known to do well on tasks such as MLM that can only be solved by also understanding the context to the right.</p>
<p>( When BERT first appeared, it showed dominant performance on more than 11 NLP tasks, including sentiment analysis, sentence prediction, and summarization, so this does not mean it is only good at MLM. )</p>
<p>Encoder-only models currently available on Hugging Face include BERT's successors such as DistilBERT, RoBERTa, XLM, and ALBERT.</p>
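To see the MLM head in action, we can feed a masked sentence through the same bert-base-uncased checkpoint and decode the highest-scoring token at the [MASK] position. This is a minimal sketch; the example sentence and variable names are my own, not from the post:

```python
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")

# Encode a sentence containing one [MASK] token.
inputs = tokenizer("The capital of France is [MASK].", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, seq_len, vocab_size)

# Find the [MASK] position and take the argmax over the vocabulary.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted = tokenizer.decode(logits[0, mask_pos].argmax(dim=-1))
print(predicted)  # typically "paris"
```

The BertOnlyMLMHead mentioned above is exactly the part that maps the hidden state at the mask position to these vocabulary-sized logits.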
<h2><strong>3. Decoder-Only</strong></h2>
<p>GPT is the flagship decoder-only model; it powered the success of ChatGPT and, from some points of view, has become the most famous of all LLMs.</p>
<p>GPT stands for Generative Pre-trained Transformer and was first introduced in OpenAI's 2018 <a href="https://cdn.openai.com/research-covers/language-unsupervised/language_understanding_paper.pdf">paper</a> Improving Language Understanding by Generative Pre-Training.</p>
<p>Decoder-only architectures exploit the Transformer decoder's behavior of generating the next token from only the previous outputs and the current token, and are known to excel at next-word prediction.</p>
<p>In particular, GPT combined the Transformer decoder architecture with transfer learning so effectively that it performed well not only at next-word prediction but also on a variety of NLP tasks such as text classification.</p>
<p><img src="https://devocean.sk.com/editorImg/2024/2/25/9820f255b7a6d02259c110adb2e078982d877986eb67598889e27d62458b2ee6" alt="image.png" /></p>
<p>( Source: https://cdn.openai.com/research-covers/language-unsupervised/language_understanding_paper.pdf )</p>
<p>Compared with the original Transformer decoder layer, this structure goes straight from the masked multi-head attention layer into the feed-forward layer.</p>
<p>Since no input arrives from an encoder, you can think of it as a Transformer decoder with the cross-attention layer omitted.</p>
<p>Hugging Face provides GPT models as well.</p>
<p>As before, let's first inspect GPT's structure with AutoModel.</p>
<pre><code data-language="python">from transformers import AutoModel
model = AutoModel.from_pretrained("openai-gpt")</code></pre>
<p><img src="https://devocean.sk.com/editorImg/2024/2/25/d9ce7a1cdae95fb01f21bd790c55542e845c2fb0c1417c5056b8ba44a1065284" alt="image.png" /></p>
<p>Just like BERT, the input passes through embedding layers (token embedding + positional embedding) and then 12 attention layers that compute masked multi-head attention.</p>
<p>Unlike BERT, however, the feed-forward network is built with a Conv1D layer rather than a fully connected layer (a Linear layer in PyTorch terms).</p>
<p>An FC (fully connected) layer and this Conv1D layer are known to produce equivalent results.</p>
<p>However, with large-scale training data like GPT's, Conv1D is said to help improve model performance.</p>
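The equivalence between the two layer types is easy to verify: GPT's "Conv1D" stores its weight as (in_features, out_features) and computes x @ W + b, so an nn.Linear holding the transposed weight produces identical outputs. A minimal sketch, where GPTConv1D is my own re-implementation for illustration rather than the class from transformers:

```python
import torch
import torch.nn as nn

class GPTConv1D(nn.Module):
    """GPT-style 'Conv1D': an affine map with weight stored as (nx, nf)."""
    def __init__(self, nx, nf):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(nx, nf) * 0.02)
        self.bias = nn.Parameter(torch.zeros(nf))

    def forward(self, x):
        return x @ self.weight + self.bias

torch.manual_seed(0)
conv = GPTConv1D(768, 3072)   # GPT-1 feed-forward dimensions
linear = nn.Linear(768, 3072)
with torch.no_grad():
    linear.weight.copy_(conv.weight.T)  # nn.Linear stores weight as (out, in)
    linear.bias.copy_(conv.bias)

x = torch.randn(2, 5, 768)    # (batch, seq_len, hidden)
print(torch.allclose(conv(x), linear(x), atol=1e-5))  # True
```

So the choice between the two is an implementation detail of the checkpoint, not a difference in what the layer computes.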
<p>Now, let's use AutoModelForCausalLM for the text-generation task that decoder-based GPT models are known to perform well.</p>
<p>A conditional probability model that estimates the sequence of tokens appearing in a text given a starting prompt is commonly called an autoregressive model, or a causal language model, and Hugging Face provides sentence-generation models in the form {xx}ForCausalLM.</p>
<pre><code data-language="python">from transformers import AutoModelForCausalLM
model = AutoModelForCausalLM.from_pretrained("openai-gpt")</code></pre>
<p><img src="https://devocean.sk.com/editorImg/2024/2/25/e0f1ba81d0fa1a7beeef5761d4357563eb3125bab07f78653d0d236b7a355f8d" alt="image.png" /></p>
<p>As in the per-task heads proposed in the GPT paper, a Linear layer has been added as the final layer.</p>
<p>The out_features value of that added Linear layer will be the vocabulary size.</p>
<p>Hugging Face also provides decoder-only models such as GPT-2, CTRL, and GPT-Neo.</p>
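With the LM head attached, the model can actually generate text via generate(). A minimal sketch with greedy decoding on the same openai-gpt checkpoint; the prompt and the decoding settings are my own choices:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("openai-gpt")
model = AutoModelForCausalLM.from_pretrained("openai-gpt")

# Encode a starting prompt and let the model continue it token by token.
inputs = tokenizer("The history of natural language processing", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=20, do_sample=False)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

At each step the Linear head turns the last hidden state into vocabulary logits, the argmax token is appended to the input, and the loop repeats, which is exactly the causal-LM behavior described above.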
<h2><strong>4. Encoder-Decoder</strong></h2>
<p>After the success of BERT and GPT, building models from just the Transformer's encoder or decoder stack became commonplace, but for traditional seq2seq tasks such as translation the encoder-decoder architecture remains effective.</p>
<p>The most representative encoder-decoder model is T5 (<a href="https://arxiv.org/abs/1910.10683">paper</a>), released by Google in 2019.</p>
<p>T5 stands for Text-to-Text Transfer Transformer; it is a model that treats every NLP task as a text-to-text task.</p>
<p>As in the image below, NLP tasks such as translation, acceptability judgment, sentence similarity, and summarization are fed in as text that includes a task prefix, and the output comes back as text.</p>
<p>The biggest difference from BERT or GPT is that it uses the original Transformer architecture.</p>
<p><img src="https://devocean.sk.com/editorImg/2024/2/25/5aaa151d55303225dde4ac11c74a78bc046a07afe9ea2b0d15514cf65c5d3c35" alt="image.png" /></p>
<p>( Source: https://arxiv.org/pdf/1910.10683.pdf )</p>
<p>As with BERT and GPT, let's inspect the model structure using Hugging Face's AutoModel.</p>
<pre><code data-language="python">from transformers import AutoModel
model = AutoModel.from_pretrained("google-t5/t5-base")</code></pre>
<p><img src="https://devocean.sk.com/editorImg/2024/2/25/b3f4653930ee2be37a1572a5e7a485b92a550acabd6f957ed9ab0dcd21ea030d" alt="image.png" /></p>
<p><img src="https://devocean.sk.com/editorImg/2024/2/25/c55be28322976801f165bd3bd8c21039ecac6ce56a1df72877ee296e38a71cd8" alt="image.png" /></p>
<p>Because it uses the original Transformer architecture with both an encoder and a decoder, the printed model structure is longer than BERT's or GPT's.</p>
<p>One peculiarity is that, rather than simply repeating the multi-head attention layer 12 times, both the encoder and the decoder print a structure where the first block passes through a feed-forward layer once and the remaining 11 blocks repeat.</p>
<p>For T5, instead of a per-task AutoModelFor{xx} model, you call a model named T5ForConditionalGeneration and it performs the seq2seq task selected by the prefix.</p>
<p>Other encoder-decoder models available on Hugging Face include BART, M2M-100, and BigBird.</p>
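To exercise this prefix-driven behavior, T5ForConditionalGeneration can be called with a task prefix embedded in the input text. A minimal sketch using the translation prefix from the T5 paper; the example sentence is my own choice:

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("google-t5/t5-base")
model = T5ForConditionalGeneration.from_pretrained("google-t5/t5-base")

# The task is selected purely by the text prefix.
input_ids = tokenizer(
    "translate English to German: The house is wonderful.",
    return_tensors="pt",
).input_ids
output_ids = model.generate(input_ids, max_new_tokens=40)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))  # typically "Das Haus ist wunderbar."
```

Swapping the prefix for, say, "summarize: " would run summarization through exactly the same model and API, which is the whole point of the text-to-text framing.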
<h2><strong>5. Wrapping Up</strong></h2>
<p>Using the transformers library provided by Hugging Face, we have looked at a variety of Transformer-based models.</p>
<p>We covered representative models of the encoder-only, decoder-only, and encoder-decoder architectures; task-level performance is known to differ across these architectures.</p>
<p>So far, thanks to the success of ChatGPT, the GPT family of decoder-only architectures is producing the strongest results.</p>
<p>It will be worth watching whether the encoder and encoder-decoder architectures can rally and produce new results in the LLM space, or whether they will end up as models of history.</p>
<div>
<hr />
</div>
<p>Sources:</p>
<p>https://github.com/nlp-with-transformers/notebooks</p>
<p>Natural Language Processing with Transformers, O'Reilly Media</p>
<p>https://blog.tensorflow.org/2019/05/transformer-chatbot-tutorial-with-tensorflow-2.html</p>
<p>https://huggingface.co/blog/bert-101</p>
<p>https://openai.com/research/language-unsupervised</p>
<p>https://huggingface.co/docs/transformers/en/model_doc/openai-gpt</p>
<p>https://arxiv.org/pdf/1910.10683.pdf</p>
<p>This article was also published on Devocean, the SK Group developer community.</p>