Path: blob/master/translate_cache/experiments/cifar10.ja.json
{1"<h1>CIFAR10 Experiment</h1>\n": "<h1>CIFAR10 \u5b9f\u9a13</h1>\n",2"<h2>Configurations</h2>\n<p>This extends from CIFAR 10 dataset configurations from <a href=\"https://github.com/labmlai/labml/tree/master/helpers\"><span translate=no>_^_0_^_</span></a> and <a href=\"mnist.html\"><span translate=no>_^_1_^_</span></a>.</p>\n": "<h2>\u30b3\u30f3\u30d5\u30a3\u30ae\u30e5\u30ec\u30fc\u30b7\u30e7\u30f3</h2>\n<p>\u3053\u308c\u306f\u3001\u304a\u3088\u3073\u306e CIFAR 10 \u30c7\u30fc\u30bf\u30bb\u30c3\u30c8\u69cb\u6210\u3092\u62e1\u5f35\u3057\u305f\u3082\u306e\u3067\u3059<a href=\"https://github.com/labmlai/labml/tree/master/helpers\"><span translate=no>_^_0_^_</span></a>\u3002<a href=\"mnist.html\"><span translate=no>_^_1_^_</span></a></p>\n",3"<h3>Augmented CIFAR 10 train dataset</h3>\n": "<h3>\u62e1\u5f35\u3055\u308c\u305f CIFAR 10 \u30c8\u30ec\u30a4\u30f3\u30c7\u30fc\u30bf\u30bb\u30c3\u30c8</h3>\n",4"<h3>Non-augmented CIFAR 10 validation dataset</h3>\n": "<h3>\u62e1\u5f35\u3055\u308c\u3066\u3044\u306a\u3044 CIFAR 10 \u691c\u8a3c\u30c7\u30fc\u30bf\u30bb\u30c3\u30c8</h3>\n",5"<h3>VGG model for CIFAR-10 classification</h3>\n": "<h3>CIFAR-10 \u5206\u985e\u7528\u306e VGG \u30e2\u30c7\u30eb</h3>\n",6"<p> </p>\n": "<p></p>\n",7"<p> Convolution and activation combined</p>\n": "<p>\u30b3\u30f3\u30dc\u30ea\u30e5\u30fc\u30b7\u30e7\u30f3\u3068\u30a2\u30af\u30c6\u30a3\u30d9\u30fc\u30b7\u30e7\u30f3\u306e\u7d44\u307f\u5408\u308f\u305b</p>\n",8"<p>5 <span translate=no>_^_0_^_</span> pooling layers will produce a output of size <span translate=no>_^_1_^_</span>. CIFAR 10 image size is <span translate=no>_^_2_^_</span> </p>\n": "<p><span translate=no>_^_0_^_</span><span translate=no>_^_1_^_</span>5\u3064\u306e\u30d7\u30fc\u30ea\u30f3\u30b0\u30ec\u30a4\u30e4\u30fc\u3067\u30b5\u30a4\u30ba\u306e\u51fa\u529b\u304c\u5f97\u3089\u308c\u307e\u3059\u3002CIFAR 10 \u306e\u753b\u50cf\u30b5\u30a4\u30ba\u306f <span translate=no>_^_2_^_</span></p>\n",9"<p>Convolution, Normalization and Activation layers </p>\n": "<p>\u30b3\u30f3\u30dc\u30ea\u30e5\u30fc\u30b7\u30e7\u30f3\u3001\u30ce\u30fc\u30de\u30e9\u30a4\u30bc\u30fc\u30b7\u30e7\u30f3\u3001\u30a2\u30af\u30c6\u30a3\u30d9\u30fc\u30b7\u30e7\u30f3\u30ec\u30a4\u30e4\u30fc</p>\n",10"<p>Create a sequential model with the layers </p>\n": "<p>\u30ec\u30a4\u30e4\u30fc\u3092\u542b\u3080\u30b7\u30fc\u30b1\u30f3\u30b7\u30e3\u30eb\u30e2\u30c7\u30eb\u306e\u4f5c\u6210</p>\n",11"<p>Final linear layer </p>\n": "<p>\u6700\u7d42\u7dda\u5f62\u30ec\u30a4\u30e4\u30fc</p>\n",12"<p>Final logits layer </p>\n": "<p>\u6700\u7d42\u30ed\u30b8\u30c3\u30c8\u30ec\u30a4\u30e4\u30fc</p>\n",13"<p>Max pooling at end of each block </p>\n": "<p>\u5404\u30d6\u30ed\u30c3\u30af\u7d42\u4e86\u6642\u306e\u6700\u5927\u30d7\u30fc\u30ea\u30f3\u30b0</p>\n",14"<p>Number of channels in each layer in each block </p>\n": "<p>\u5404\u30d6\u30ed\u30c3\u30af\u306e\u5404\u30ec\u30a4\u30e4\u30fc\u306e\u30c1\u30e3\u30f3\u30cd\u30eb\u6570</p>\n",15"<p>Pad and crop </p>\n": "<p>\u30d1\u30c3\u30c9\u3068\u30af\u30ed\u30c3\u30d7</p>\n",16"<p>RGB channels </p>\n": "<p>RGB \u30c1\u30e3\u30f3\u30cd\u30eb</p>\n",17"<p>Random horizontal flip </p>\n": "<p>\u30e9\u30f3\u30c0\u30e0\u6c34\u5e73\u53cd\u8ee2</p>\n",18"<p>Reshape for classification layer </p>\n": "<p>\u5206\u985e\u30ec\u30a4\u30e4\u30fc\u306e\u5f62\u72b6\u3092\u5909\u66f4</p>\n",19"<p>The VGG layers </p>\n": "<p>VGG \u30ec\u30a4\u30e4\u30fc</p>\n",20"<p>Use CIFAR10 dataset by default </p>\n": "<p>\u30c7\u30d5\u30a9\u30eb\u30c8\u3067 CIFAR10 
\u30c7\u30fc\u30bf\u30bb\u30c3\u30c8\u3092\u4f7f\u7528</p>\n",21"CIFAR10 Experiment": "CIFAR10 \u5b9f\u9a13",22"This is a reusable trainer for CIFAR10 dataset": "\u3053\u308c\u306fCIFAR10\u30c7\u30fc\u30bf\u30bb\u30c3\u30c8\u7528\u306e\u518d\u5229\u7528\u53ef\u80fd\u306a\u30c8\u30ec\u30fc\u30ca\u30fc\u3067\u3059"23}2425
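The cached strings above annotate a VGG-style CIFAR-10 classifier in the labml_nn experiments module: RGB input, per-block channel counts, convolution + normalization + activation layers, max pooling at the end of each block (five poolings take a 32x32 image down to 1x1), a reshape, and a final linear logits layer. As a rough illustration only, a minimal PyTorch sketch of the model those comments describe might look like the following; the class name, default block widths, and the BatchNorm/ReLU choices are assumptions, not code taken from the repository.

import torch
from torch import nn


class VGGSketch(nn.Module):
    """Illustrative VGG-style CIFAR-10 classifier; not the repo's exact implementation."""

    def __init__(self, blocks=((64, 64), (128, 128), (256, 256, 256),
                               (512, 512, 512), (512, 512, 512))):
        super().__init__()
        # 5 max-pooling layers reduce the 32x32 CIFAR-10 image to 1x1
        assert len(blocks) == 5
        layers = []
        # RGB channels
        in_channels = 3
        # Number of channels in each layer in each block
        for block in blocks:
            for channels in block:
                # Convolution, normalization and activation layers
                layers += [nn.Conv2d(in_channels, channels, kernel_size=3, padding=1),
                           nn.BatchNorm2d(channels),
                           nn.ReLU(inplace=True)]
                in_channels = channels
            # Max pooling at end of each block
            layers.append(nn.MaxPool2d(kernel_size=2, stride=2))
        # Create a sequential model with the layers
        self.layers = nn.Sequential(*layers)
        # Final linear layer producing the 10 class logits
        self.fc = nn.Linear(in_channels, 10)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.layers(x)
        # Reshape for classification layer
        x = x.view(x.shape[0], -1)
        # Final logits layer
        return self.fc(x)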