Path: blob/master/translate_cache/experiments/cifar10.zh.json
{
    "<h1>CIFAR10 Experiment</h1>\n": "<h1>CIFAR10 \u5b9e\u9a8c</h1>\n",
    "<h2>Configurations</h2>\n<p>This extends from CIFAR 10 dataset configurations from <a href=\"https://github.com/labmlai/labml/tree/master/helpers\"><span translate=no>_^_0_^_</span></a> and <a href=\"mnist.html\"><span translate=no>_^_1_^_</span></a>.</p>\n": "<h2>\u914d\u7f6e</h2>\n<p>\u8fd9\u662f\u4ece <a href=\"https://github.com/labmlai/labml/tree/master/helpers\"><span translate=no>_^_0_^_</span></a> \u548c <a href=\"mnist.html\"><span translate=no>_^_1_^_</span></a> \u7684 CIFAR 10 \u6570\u636e\u96c6\u914d\u7f6e\u6269\u5c55\u800c\u6765\u7684\u3002</p>\n",
    "<h3>Augmented CIFAR 10 train dataset</h3>\n": "<h3>\u589e\u5f3a\u7684 CIFAR 10 \u8bad\u7ec3\u6570\u636e\u96c6</h3>\n",
    "<h3>Non-augmented CIFAR 10 validation dataset</h3>\n": "<h3>\u975e\u589e\u5f3a\u7684 CIFAR 10 \u9a8c\u8bc1\u6570\u636e\u96c6</h3>\n",
    "<h3>VGG model for CIFAR-10 classification</h3>\n": "<h3>\u7528\u4e8e CIFAR-10 \u5206\u7c7b\u7684 VGG \u6a21\u578b</h3>\n",
    "<p> </p>\n": "<p></p>\n",
    "<p> Convolution and activation combined</p>\n": "<p>\u5377\u79ef\u548c\u6fc0\u6d3b\u76f8\u7ed3\u5408</p>\n",
    "<p>5 <span translate=no>_^_0_^_</span> pooling layers will produce a output of size <span translate=no>_^_1_^_</span>. CIFAR 10 image size is <span translate=no>_^_2_^_</span> </p>\n": "<p>5 \u4e2a <span translate=no>_^_0_^_</span> \u6c60\u5316\u5c42\u5c06\u751f\u6210\u5927\u5c0f\u4e3a <span translate=no>_^_1_^_</span> \u7684\u8f93\u51fa\u3002CIFAR 10 \u56fe\u50cf\u5927\u5c0f\u4e3a <span translate=no>_^_2_^_</span></p>\n",
    "<p>Convolution, Normalization and Activation layers </p>\n": "<p>\u5377\u79ef\u3001\u5f52\u4e00\u5316\u548c\u6fc0\u6d3b\u5c42</p>\n",
    "<p>Create a sequential model with the layers </p>\n": "<p>\u4f7f\u7528\u8fd9\u4e9b\u5c42\u521b\u5efa\u987a\u5e8f\u6a21\u578b</p>\n",
    "<p>Final linear layer </p>\n": "<p>\u6700\u540e\u7684\u7ebf\u6027\u5c42</p>\n",
    "<p>Final logits layer </p>\n": "<p>\u6700\u540e\u7684 logits \u5c42</p>\n",
    "<p>Max pooling at end of each block </p>\n": "<p>\u6bcf\u4e2a\u533a\u5757\u672b\u5c3e\u7684\u6700\u5927\u6c60\u5316</p>\n",
    "<p>Number of channels in each layer in each block </p>\n": "<p>\u6bcf\u4e2a\u533a\u5757\u4e2d\u6bcf\u5c42\u7684\u901a\u9053\u6570</p>\n",
    "<p>Pad and crop </p>\n": "<p>\u586b\u5145\u548c\u88c1\u526a</p>\n",
    "<p>RGB channels </p>\n": "<p>RGB \u901a\u9053</p>\n",
    "<p>Random horizontal flip </p>\n": "<p>\u968f\u673a\u6c34\u5e73\u7ffb\u8f6c</p>\n",
    "<p>Reshape for classification layer </p>\n": "<p>\u4e3a\u5206\u7c7b\u5c42\u8c03\u6574\u5f62\u72b6</p>\n",
    "<p>The VGG layers </p>\n": "<p>VGG \u5c42</p>\n",
    "<p>Use CIFAR10 dataset by default </p>\n": "<p>\u9ed8\u8ba4\u4f7f\u7528 CIFAR10 \u6570\u636e\u96c6</p>\n",
    "CIFAR10 Experiment": "CIFAR10 \u5b9e\u9a8c",
    "This is a reusable trainer for CIFAR10 dataset": "\u8fd9\u662f CIFAR10 \u6570\u636e\u96c6\u7684\u53ef\u91cd\u590d\u4f7f\u7528\u7684\u8bad\u7ec3\u5668"
}