diff --git a/translate_cache/transformers/__init__.zh.json b/translate_cache/transformers/__init__.zh.json
index 935f463c..b9d47583 100644
--- a/translate_cache/transformers/__init__.zh.json
+++ b/translate_cache/transformers/__init__.zh.json
@@ -38,7 +38,7 @@
  "This is an implementation of the paper An Attention Free Transformer.\n": "\u8fd9\u662f\u8bba\u6587\u300a\u65e0\u6ce8\u610f\u529b\u53d8\u538b\u5668\u300b\u7684\u5b9e\u73b0\u3002\n",
  "This is an implementation of the paper Primer: Searching for Efficient Transformers for Language Modeling.\n": "\u8fd9\u662f\u8bba\u6587\u300a\u5165\u95e8\uff1a\u4e3a\u8bed\u8a00\u5efa\u6a21\u5bfb\u627e\u9ad8\u6548\u7684\u53d8\u6362\u5668\u300b\u7684\u5b9e\u73b0\u3002\n",
  "This is an implementation of the paper Hierarchical Transformers Are More Efficient Language Models\n": "\u8fd9\u662f\u8bba\u6587\u300a\u5206\u5c42\u53d8\u6362\u5668\u662f\u66f4\u6709\u6548\u7684\u8bed\u8a00\u6a21\u578b\u300b\u7684\u5b9e\u73b0\n",
- "This module contains PyTorch implementations and explanations of original transformer from paper Attention Is All You Need, and derivatives and enhancements of it.\n": "\u672c\u6a21\u5757\u5305\u542b PyTorch \u5b9e\u73b0\u548c\u8bba\u6587 Attronger Is All You Need \u4e2d\u5bf9\u539f\u521b\u53d8\u538b\u5668\u7684\u89e3\u91ca\uff0c\u4ee5\u53ca\u5b83\u7684\u884d\u751f\u54c1\u548c\u589e\u5f3a\u529f\u80fd\u3002\n",
+ "This module contains PyTorch implementations and explanations of original transformer from paper Attention Is All You Need, and derivatives and enhancements of it.\n": "\u672c\u6a21\u5757\u5305\u542b PyTorch \u5b9e\u73b0\u548c\u8bba\u6587 Attention Is All You Need \u4e2d\u5bf9\u539f\u521b\u53d8\u538b\u5668\u7684\u89e3\u91ca\uff0c\u4ee5\u53ca\u5b83\u7684\u884d\u751f\u54c1\u548c\u589e\u5f3a\u529f\u80fd\u3002\n",
 "\n": "\n",
 "This is a collection of PyTorch implementations/tutorials of transformers and related techniques.": "\u8fd9\u662f\u53d8\u538b\u5668\u548c\u76f8\u5173\u6280\u672f\u7684 PyTorch \u5b9e\u73b0/\u6559\u7a0b\u7684\u96c6\u5408\u3002",
 "Transformers": "\u53d8\u538b\u5668"