{
"<h1><a href=\"https://nn.labml.ai/graphs/gatv2/index.html\">Graph Attention Networks v2 (GATv2)</a></h1>\n<p>This is a <a href=\"https://pytorch.org\">PyTorch</a> implementation of the GATv2 operator from the paper <a href=\"https://arxiv.org/abs/2105.14491\">How Attentive are Graph Attention Networks?</a>.</p>\n<p>GATv2s work on graph data. A graph consists of nodes and edges connecting nodes. For example, in Cora dataset the nodes are research papers and the edges are citations that connect the papers.</p>\n<p>The GATv2 operator fixes the static attention problem of the standard GAT: since the linear layers in the standard GAT are applied right after each other, the ranking of attended nodes is unconditioned on the query node. In contrast, in GATv2, every node can attend to any other node.</p>\n<p>Here is <a href=\"https://nn.labml.ai/graphs/gatv2/experiment.html\">the training code</a> for training a two-layer GATv2 on Cora dataset. </p>\n": "<h1><a href=\"https://nn.labml.ai/graphs/gatv2/index.html\">Graph \u6ce8\u610f\u529b\u7f51\u7edc v2 (Gatv2)</a></h1>\n<p>\u8fd9\u662f <a href=\"https://pytorch.org\">PyTorch</a> \u5bf9 Gatv2 \u8fd0\u7b97\u7b26\u7684\u5b9e\u73b0\uff0c\u6458\u81ea\u300a<a href=\"https://arxiv.org/abs/2105.14491\">\u56fe\u6ce8\u610f\u529b\u7f51\u7edc\u6709\u591a\u4e13\u5fc3\uff1f</a>\u300b</p>\u3002\n<p>Gatv2 \u5904\u7406\u56fe\u8868\u6570\u636e\u3002\u56fe\u7531\u8282\u70b9\u548c\u8fde\u63a5\u8282\u70b9\u7684\u8fb9\u7ec4\u6210\u3002\u4f8b\u5982\uff0c\u5728 Cora \u6570\u636e\u96c6\u4e2d\uff0c\u8282\u70b9\u662f\u7814\u7a76\u8bba\u6587\uff0c\u8fb9\u7f18\u662f\u8fde\u63a5\u8bba\u6587\u7684\u5f15\u6587\u3002</p>\nG@@ <p>atv2 \u8fd0\u7b97\u7b26\u4fee\u590d\u4e86\u6807\u51c6 GAT \u7684\u9759\u6001\u6ce8\u610f\u529b\u95ee\u9898\uff1a\u7531\u4e8e\u6807\u51c6 GAT \u4e2d\u7684\u7ebf\u6027\u5c42\u662f\u7d27\u63a5\u5e94\u7528\u7684\uff0c\u56e0\u6b64\u6709\u4eba\u503c\u5b88\u8282\u70b9\u7684\u6392\u540d\u4e0d\u53d7\u67e5\u8be2\u8282\u70b9\u7684\u9650\u5236\u3002\u76f8\u6bd4\u4e4b\u4e0b\uff0c\u5728 Gatv2 \u4e2d\uff0c\u6bcf\u4e2a\u8282\u70b9\u90fd\u53ef\u4ee5\u7ba1\u7406\u4efb\u4f55\u5176\u4ed6\u8282\u70b9\u3002</p>\n<p>\u4ee5\u4e0b\u662f<a href=\"https://nn.labml.ai/graphs/gatv2/experiment.html\">\u5728 Cora \u6570\u636e\u96c6\u4e0a\u8bad\u7ec3\u53cc\u5c42 Gatv2 \u7684\u8bad\u7ec3\u4ee3\u7801</a>\u3002</p>\n",
"Graph Attention Networks v2 (GATv2)": "Graph \u6ce8\u610f\u529b\u7f51\u7edc v2 (GATv2)"
}