GitHub Repository: labmlai/annotated_deep_learning_paper_implementations
Path: blob/master/translate_cache/sampling/top_k.ja.json
{
"<h1>Top-k Sampling</h1>\n<p>Here we first pick the top-k tokens from the distribution of logits, and then sample from them.</p>\n<p>Here&#x27;s an <a href=\"experiment.html\">experiment</a> that uses these sampling techniques.</p>\n": "<h1>\u30c8\u30c3\u30d7k\u30b5\u30f3\u30d7\u30ea\u30f3\u30b0</h1>\n<p>\u3053\u3053\u3067\u306f\u3001\u6700\u521d\u306b\u30ed\u30b8\u30c3\u30c8\u306e\u5206\u5e03\u304b\u3089\u4e0a\u4f4dk\u500b\u306e\u30c8\u30fc\u30af\u30f3\u3092\u9078\u629e\u3057\u3001\u6b21\u306b\u305d\u308c\u3089\u304b\u3089\u30b5\u30f3\u30d7\u30ea\u30f3\u30b0\u3057\u307e\u3059\u3002</p>\n<p>\u3053\u308c\u306f\u3001<a href=\"experiment.html\">\u3053\u308c\u3089\u306e\u30b5\u30f3\u30d7\u30ea\u30f3\u30b0\u624b\u6cd5\u3092\u4f7f\u7528\u3057\u305f\u5b9f\u9a13\u3067\u3059</a>\u3002</p>\n",
"<h2>Top-k Sampler</h2>\n": "<h2>\u30c8\u30c3\u30d7k\u30b5\u30f3\u30d7\u30e9\u30fc</h2>\n",
"<p> Sample from logits</p>\n": "<p>\u30ed\u30b8\u30c3\u30c8\u304b\u3089\u306e\u30b5\u30f3\u30d7\u30eb</p>\n",
"<p>New logits filled with <span translate=no>_^_0_^_</span>; i.e. zero probability </p>\n": "<p><span translate=no>_^_0_^_</span>\u65b0\u3057\u3044\u30ed\u30b8\u30c3\u30c8\u3092\u57cb\u3081\u308b\u3001\u3064\u307e\u308a\u78ba\u7387\u304c\u30bc\u30ed</p>\n",
"<p>Pick the largest <span translate=no>_^_0_^_</span> logits and their indices </p>\n": "<p><span translate=no>_^_0_^_</span>\u6700\u5927\u306e\u30ed\u30b8\u30c3\u30c8\u3068\u305d\u306e\u30a4\u30f3\u30c7\u30c3\u30af\u30b9\u3092\u9078\u629e\u3057\u3066\u304f\u3060\u3055\u3044</p>\n",
"<p>Sample from the top-k logits with the specified sampler. </p>\n": "<p>\u6307\u5b9a\u3055\u308c\u305f\u30b5\u30f3\u30d7\u30e9\u30fc\u3092\u4f7f\u7528\u3057\u3066\u3001\u4e0a\u304b\u3089k\u500b\u306e\u30ed\u30b8\u30c3\u30c8\u3092\u30b5\u30f3\u30d7\u30ea\u30f3\u30b0\u3057\u307e\u3059\u3002</p>\n",
"<p>Set the values of the top-k selected indices to actual logits. Logits of other tokens remain <span translate=no>_^_0_^_</span> </p>\n": "<p>\u9078\u629e\u3057\u305f\u4e0a\u4f4dk\u306e\u30a4\u30f3\u30c7\u30c3\u30af\u30b9\u306e\u5024\u3092\u5b9f\u969b\u306e\u30ed\u30b8\u30c3\u30c8\u306b\u8a2d\u5b9a\u3057\u307e\u3059\u3002\u4ed6\u306e\u30c8\u30fc\u30af\u30f3\u306e\u30ed\u30b8\u30c3\u30c8\u306f\u6b8b\u308a\u307e\u3059 <span translate=no>_^_0_^_</span></p>\n",
"<ul><li><span translate=no>_^_0_^_</span> is the number of tokens to pick </li>\n<li><span translate=no>_^_1_^_</span> is the sampler to use for the top-k tokens</li></ul>\n<p><span translate=no>_^_2_^_</span> can be any sampler that takes a logits tensor as input and returns a token tensor; e.g. <a href=\"temperature.html\">`TemperatureSampler&#x27;</a>.</p>\n": "<ul><li><span translate=no>_^_0_^_</span>\u9078\u629e\u3059\u308b\u30c8\u30fc\u30af\u30f3\u306e\u6570\u3067\u3059</li>\n<li><span translate=no>_^_1_^_</span>\u30c8\u30c3\u30d7k\u306e\u30c8\u30fc\u30af\u30f3\u306b\u4f7f\u7528\u3059\u308b\u30b5\u30f3\u30d7\u30e9\u30fc\u3067\u3059</li></ul>\n<p><span translate=no>_^_2_^_</span><a href=\"temperature.html\">\u30ed\u30b8\u30c3\u30c4\u30c6\u30f3\u30bd\u30eb\u3092\u5165\u529b\u3068\u3057\u3066\u53d7\u3051\u53d6\u308a\u3001\u30c8\u30fc\u30af\u30f3\u30c6\u30f3\u30bd\u30eb\u3092\u8fd4\u3059\u30b5\u30f3\u30d7\u30e9\u30fc\u306a\u3089\u3069\u308c\u3067\u3082\u304b\u307e\u3044\u307e\u305b\u3093\uff08\u4f8b\uff1a`TemperatureSampler'\uff09\u3002</a></p>\n",
"A PyTorch implementation of top-k sampling from language models.": "\u8a00\u8a9e\u30e2\u30c7\u30eb\u304b\u3089\u306e top-k \u30b5\u30f3\u30d7\u30ea\u30f3\u30b0\u306e PyTorch \u5b9f\u88c5\u3002",
"Top-k Sampling": "\u30c8\u30c3\u30d7k\u30b5\u30f3\u30d7\u30ea\u30f3\u30b0"
}
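The cached English source strings above describe the repository's top-k sampler: new logits are filled with -inf (zero probability), the largest k logits and their indices are picked, those indices are set back to their actual logit values, and the filtered logits are passed to a wrapped sampler. Below is a minimal, self-contained PyTorch sketch of that procedure; the class name `TopKSampler` and the `multinomial_sampler` helper are illustrative assumptions and may not match the repository's exact API.

```python
import torch


class TopKSampler:
    """Sample a token from the top-k logits using a wrapped sampler (sketch)."""

    def __init__(self, k: int, sampler):
        # `k` is the number of tokens to pick.
        # `sampler` is any callable that takes a logits tensor and returns
        # a token tensor, e.g. a temperature-based sampler.
        self.k = k
        self.sampler = sampler

    def __call__(self, logits: torch.Tensor) -> torch.Tensor:
        # New logits filled with -inf, i.e. zero probability after softmax.
        filtered = torch.full_like(logits, float('-inf'))
        # Pick the largest k logits and their indices.
        values, indices = torch.topk(logits, self.k, dim=-1)
        # Set the top-k selected indices to their actual logit values;
        # logits of all other tokens remain -inf.
        filtered.scatter_(-1, indices, values)
        # Sample from the top-k logits with the specified sampler.
        return self.sampler(filtered)


def multinomial_sampler(logits: torch.Tensor) -> torch.Tensor:
    # Hypothetical helper: softmax the logits and draw one token per row.
    probs = torch.softmax(logits, dim=-1)
    return torch.multinomial(probs, num_samples=1).squeeze(-1)


# Usage: keep the 10 most likely tokens, then sample among them.
top_k = TopKSampler(k=10, sampler=multinomial_sampler)
tokens = top_k(torch.randn(4, 50257))  # token indices, shape (4,)
```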