HyperAI超神经
Paraphrase Identification on Quora Question Pairs

Evaluation metric: Accuracy

Results: performance of each model on this benchmark.
| Model | Accuracy | Paper | Repository |
|---|---|---|---|
| MwAN | 89.12 | Multiway Attention Networks for Modeling Sentence Pairs | - |
| XLNet-Large (ensemble) | 90.3 | XLNet: Generalized Autoregressive Pretraining for Language Understanding | - |
| RoBERTa-large 355M + Entailment as Few-shot Learner | - | Entailment as Few-Shot Learner | - |
| ERNIE | - | ERNIE: Enhanced Language Representation with Informative Entities | - |
| ASA + BERT-base | - | Adversarial Self-Attention for Language Understanding | - |
| TRANS-BLSTM | 88.28 | TRANS-BLSTM: Transformer with Bidirectional LSTM for Language Understanding | - |
| RealFormer | 91.34 | RealFormer: Transformer Likes Residual Attention | - |
| SplitEE-S | - | SplitEE: Early Exit in Deep Neural Networks with Split Computing | - |
| SMART-BERT | - | SMART: Robust and Efficient Fine-Tuning for Pre-trained Natural Language Models through Principled Regularized Optimization | - |
| MT-DNN | 89.6 | Multi-Task Deep Neural Networks for Natural Language Understanding | - |
| GenSen | 87.01 | Learning General Purpose Distributed Sentence Representations via Large Scale Multi-task Learning | - |
| ASA + RoBERTa | - | Adversarial Self-Attention for Language Understanding | - |
| DIIN | 89.06 | Natural Language Inference over Interaction Space | - |
| FNet-Large | - | FNet: Mixing Tokens with Fourier Transforms | - |
| Random | 80 | Self-Explaining Structures Improve NLP Models | - |
| StructBERTRoBERTa ensemble | 90.7 | StructBERT: Incorporating Language Structures into Pre-training for Deep Language Understanding | - |
| BERT-Base | - | Intrinsic Dimensionality Explains the Effectiveness of Language Model Fine-Tuning | - |
| BiMPM | 88.17 | Bilateral Multi-Perspective Matching for Natural Language Sentences | - |
| FreeLB | 74.8 | SMART: Robust and Efficient Fine-Tuning for Pre-trained Natural Language Models through Principled Regularized Optimization | - |
| BERT-LARGE | - | BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding | - |
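The accuracy figures above are the fraction of question pairs for which the model's duplicate/not-duplicate prediction matches the gold label. A minimal sketch of that metric (the function name and sample data here are illustrative, not from the benchmark):

```python
def accuracy(predictions, labels):
    """Fraction of question pairs whose predicted label matches the gold label."""
    assert len(predictions) == len(labels) and len(labels) > 0
    correct = sum(p == g for p, g in zip(predictions, labels))
    return correct / len(labels)

# Toy example: 1 = duplicate (paraphrase), 0 = not duplicate.
preds = [1, 0, 1, 1]
gold = [1, 0, 0, 1]
print(accuracy(preds, gold))  # 3 of 4 pairs correct -> 0.75
```

A score of 89.12 for MwAN, for instance, corresponds to this fraction expressed as a percentage over the test set.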