
Commit 9d2dca9

Rebuild
1 parent 5a0ae8d commit 9d2dca9

224 files changed

Lines changed: 245569 additions & 243960 deletions


docs/_downloads/0e6615c5a7bc71e01ff3c51217ea00da/tensorqs_tutorial.ipynb

Lines changed: 1 addition & 1 deletion
@@ -15,7 +15,7 @@ (markdown cell source)

This cell is the Korean translation of the "Tensors" quickstart intro: navigation links (intro.html, quickstart_tutorial.html, data_tutorial.html, transforms_tutorial.html, buildmodel_tutorial.html, autogradqs_tutorial.html, optimization_tutorial.html, saveloadrun_tutorial.html) followed by text explaining that tensors are array/matrix-like data structures used to encode a model's inputs, outputs, and parameters; that they resemble NumPy's ndarray except that they can run on GPUs and other hardware accelerators; that tensors and NumPy arrays can often share the same underlying memory; and that tensors are optimized for automatic differentiation (covered in the Autograd chapter). The only change is a one-word Korean typo fix in the memory-sharing sentence, 복수 ("revenge") → 복사 ("copy"):

-... 동일한 내부(underly) 메모리를 공유할 수 있어 데이터를 복수할 필요가 없습니다. (`bridge-to-np-label` 참고)
+... 동일한 내부(underly) 메모리를 공유할 수 있어 데이터를 복사할 필요가 없습니다. (`bridge-to-np-label` 참고)

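The corrected sentence is about zero-copy sharing between tensors and NumPy arrays (the tutorial's `bridge-to-np-label` section covers `torch.from_numpy` and `Tensor.numpy()`). A minimal sketch of the same zero-copy idea using NumPy views alone, as an analogy that runs without torch installed:

```python
import numpy as np

# Two views of the same buffer: a mutation through one is visible through
# the other, analogous to torch.from_numpy(arr) sharing memory with arr
# instead of copying it.
a = np.ones(3)
b = a.reshape(3)                # a view, not a copy
b[0] = 5.0
print(a[0])                     # 5.0 — the change is visible through `a`
print(np.shares_memory(a, b))   # True
```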
docs/_downloads/3fb82dc8278b08d5e5dee31ec1c16170/tensorqs_tutorial.py

Lines changed: 1 addition & 1 deletion
@@ -16,7 +16,7 @@

A one-word Korean typo fix in the .py source of the same tutorial, 복수 ("revenge") → 복사 ("copy"):

 PyTorch에서는 텐서를 사용하여 모델의 입력(input)과 출력(output), 그리고 모델의 매개변수들을 부호화(encode)합니다.

 텐서는 GPU나 다른 하드웨어 가속기에서 실행할 수 있다는 점만 제외하면 `NumPy <https://numpy.org>`_ 의 ndarray와 유사합니다.
-실제로 텐서와 NumPy 배열(array)은 종종 동일한 내부(underly) 메모리를 공유할 수 있어 데이터를 복수할 필요가 없습니다. (:ref:`bridge-to-np-label` 참고)
+실제로 텐서와 NumPy 배열(array)은 종종 동일한 내부(underly) 메모리를 공유할 수 있어 데이터를 복사할 필요가 없습니다. (:ref:`bridge-to-np-label` 참고)
 텐서는 또한 (`Autograd <autogradqs_tutorial.html>`__ 장에서 살펴볼) 자동 미분(automatic differentiation)에 최적화되어 있습니다.
 ndarray에 익숙하다면 Tensor API를 바로 사용할 수 있을 것입니다. 아니라면, 아래 내용을 함께 보시죠!
 """

docs/_downloads/462f53ac0f7c6840743ad8655c43102c/torchtext_translation.py

Lines changed: 3 additions & 3 deletions
@@ -36,14 +36,14 @@

Two changes: the spaCy download commands are reverted to the shorthand model names (`en`/`de`, which resolve only in spaCy < 3.0; spaCy 3 removed shortcut links in favor of full names like `en_core_web_sm`), and the import gains the `Vocab` class alongside the `vocab` factory:

 #
 # ::
 #
-#    python -m spacy download en_core_web_sm
-#    python -m spacy download de_core_news_sm
+#    python -m spacy download en
+#    python -m spacy download de

 import torchtext
 import torch
 from torchtext.data.utils import get_tokenizer
 from collections import Counter
-from torchtext.vocab import vocab
+from torchtext.vocab import Vocab, vocab
 from torchtext.utils import download_from_url, extract_archive
 import io

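The file whose import line changed builds vocabularies from token counts and numericalizes sentences into id lists. A stdlib-only sketch of that pattern (the specials and index assignment mirror what torchtext's `vocab` factory does; the function names here are illustrative, not torchtext's API):

```python
from collections import Counter

def build_vocab(sentences, specials=('<unk>', '<pad>', '<bos>', '<eos>')):
    # Count tokens over the corpus, then assign indices: specials first,
    # remaining tokens in insertion order (torchtext's vocab() is similar).
    counter = Counter(tok for sent in sentences for tok in sent.split())
    stoi = {tok: i for i, tok in enumerate(specials)}
    for tok in counter:
        if tok not in stoi:
            stoi[tok] = len(stoi)
    return stoi

def numericalize(sentence, stoi):
    # Map tokens to ids, falling back to <unk> for out-of-vocabulary tokens.
    return [stoi.get(tok, stoi['<unk>']) for tok in sentence.split()]

stoi = build_vocab(['ein Haus', 'ein Hund'])
print(numericalize('ein Hund bellt', stoi))  # [4, 6, 0] — 'bellt' maps to <unk>
```

In the real tutorial the id lists are then wrapped in `torch.tensor(..., dtype=torch.long)`; the sketch stops at the id lists to stay dependency-free.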
docs/_downloads/6fbbb25a2ddfe5bf93b618f53cf7077e/torchtext_translation.ipynb

Lines changed: 2 additions & 2 deletions
@@ -22,7 +22,7 @@ (markdown cell source)

This cell is the Korean translation of the tutorial's "Data processing" section: torchtext provides tools well suited to building datasets for translation models; the example tokenizes raw text sentences, builds vocabularies, and numericalizes tokens into tensors. A note explains that tokenization here requires Spacy, which (unlike torchtext's `basic_english` tokenizer or other English tokenizers such as Moses) offers strong tokenization for languages other than English, as translation requires; the reader is told to install `spacy` via `pip` or `conda` and then download the English and German data for the Spacy tokenizer. The only change is in the download commands at the end of the cell:

-    python -m spacy download en_core_web_sm
-    python -m spacy download de_core_news_sm
+    python -m spacy download en
+    python -m spacy download de
@@ -33,7 +33,7 @@ (code cell source)

This code cell downloads and extracts the Multi30k train/val/test archives with `download_from_url` and `extract_archive`, builds German and English tokenizers via `get_tokenizer('spacy', language=...)`, builds vocabularies from token `Counter`s with the specials `<unk>`, `<pad>`, `<bos>`, `<eos>`, and converts each German/English sentence pair to `torch.long` tensors. The only change is the import line, matching the .py file above:

-from torchtext.vocab import vocab
+from torchtext.vocab import Vocab, vocab

docs/_downloads/d9398fce39ca80dc4bb8b8ea55b575a8/nn_tutorial.ipynb

Lines changed: 1 addition & 1 deletion
@@ -94,7 +94,7 @@ (markdown cell source)

This cell is the Korean translation of "Neural net from scratch (no torch.nn)": build a first model from PyTorch tensor operations alone (readers new to neural-net basics are pointed to course.fast.ai). PyTorch provides methods to create random or zero-filled tensors, used here for a simple linear model's weights and bias; telling PyTorch these tensors require gradients makes it record every operation on them, so gradients can be computed *automatically* during back-propagation. `requires_grad` is set **after** initialization because the initialization step should not be included in the gradient (a trailing `_` on a PyTorch method name means the operation is performed in-place), and a note adds that the weights are initialized with [Xavier initialisation](http://proceedings.mlr.press/v9/glorot10a/glorot10a.pdf), i.e. by multiplying by 1/sqrt(n). The only change is the transliteration of "method", 메서드 → 메소드, in two places.

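The edited cell describes initializing weights with Xavier scaling and only afterwards flagging them for gradient tracking. A NumPy-only sketch of the scaling step under assumed layer sizes (784 → 10, as in an MNIST-style linear model; the `requires_grad_` call is the PyTorch part and appears only as a comment):

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out = 784, 10

# Xavier-style initialisation: scale standard-normal weights by 1/sqrt(n_in)
# so pre-activations keep roughly unit variance through the layer.
weights = rng.standard_normal((n_in, n_out)) / np.sqrt(n_in)
bias = np.zeros(n_out)

# In the tutorial these would be torch tensors, with gradient tracking
# enabled only after the scaling, e.g. weights.requires_grad_()
# (the trailing `_` marks an in-place operation).
print(weights.std())  # ≈ 1/sqrt(784) ≈ 0.0357
```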