HyperAI超神经
Neural Architecture Search on NAS-Bench-201
Evaluation metrics: Accuracy (Test), Accuracy (Val), Search time (s)

Results: performance of each model on this benchmark.

| Model | Accuracy (Test) | Accuracy (Val) | Search time (s) | Paper Title | Repository |
|---|---|---|---|---|---|
| DiNAS | 45.41 | 46.66 | 15.36 | Multi-conditioned Graph Diffusion for Neural Architecture Search | - |
| ParZC | 46.34 | 46.37 | - | ParZC: Parametric Zero-Cost Proxies for Efficient NAS | - |
| AG-Net | 46.42 | 46.73 | - | Learning Where To Look -- Generative NAS is Surprisingly Efficient | - |
| GDAS | 41.71 | - | 28926 | Searching for A Robust Neural Architecture in Four GPU Hours | - |
| α-DARTS | 46.34 | 46.17 | - | $α$ DARTS Once More: Enhancing Differentiable Architecture Search by Masked Image Modeling | - |
| EPE-NAS (N=100) | 38.80 | - | 20.5 | EPE-NAS: Efficient Performance Estimation Without Training for Neural Architecture Search | - |
| NASBOT | 46.37 | - | 75600 | Neural Architecture Search with Bayesian Optimisation and Optimal Transport | - |
| DARTS- | 45.12 | 44.87 | - | DARTS-: Robustly Stepping out of Performance Collapse Without Indicators | - |
| arch2vec | 46.27 | - | - | Does Unsupervised Architecture Representation Learning Help Neural Architecture Search? | - |
| Shapley-NAS | 46.85 | 46.57 | - | Shapley-NAS: Discovering Operation Contribution for Neural Architecture Search | - |
| GEA | 46.04 | - | - | Guided Evolutionary Neural Architecture Search With Efficient Performance Estimation | - |
| iDARTS | 40.89 | 40.38 | - | iDARTS: Differentiable Architecture Search with Stochastic Implicit Gradients | - |
| PNAS + | - | 44.75 | - | Progressive Neural Architecture Search | - |
| DARTS-SaBN | 45.85 | - | - | Sandwich Batch Normalization: A Drop-In Replacement for Feature Distribution Heterogeneity | - |
| DARTS (second order) | 16.43 | - | 29902 | DARTS: Differentiable Architecture Search | - |
| Λ-DARTS | 46.34 | 46.37 | - | $Λ$-DARTS: Mitigating Performance Collapse by Harmonizing Operation Selection among Cells | - |
| Local search | 46.38 | - | 151200 | Exploring the Loss Landscape in Neural Architecture Search | - |
| CATCH-meta | - | 46.07 | 18000 | CATCH: Context-based Meta Reinforcement Learning for Transferrable Architecture Search | - |
| GAEA DARTS (ERM) | 46.36 | - | - | Geometry-Aware Gradient Algorithms for Neural Architecture Search | - |
| NAS without training (N=10) | 38.33 | - | 1.7 | Neural Architecture Search without Training | - |
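Leaderboards like the one above are easy to re-rank locally once the numbers are extracted. Below is a minimal, self-contained sketch (plain Python, no external libraries) that ranks a hand-copied subset of the rows above by their test accuracy; only rows that list a numeric test accuracy are included, and the dictionary name `results` is illustrative, not part of any HyperAI API.

```python
# Subset of NAS-Bench-201 leaderboard rows from the table above,
# keyed by model name, value = Accuracy (Test) in percent.
results = {
    "Shapley-NAS": 46.85,
    "AG-Net": 46.42,
    "Local search": 46.38,
    "NASBOT": 46.37,
    "GAEA DARTS (ERM)": 46.36,
    "DiNAS": 45.41,
    "DARTS (second order)": 16.43,
}

# Rank models by test accuracy, best first.
ranking = sorted(results.items(), key=lambda kv: kv[1], reverse=True)

for rank, (model, acc) in enumerate(ranking, start=1):
    print(f"{rank:2d}. {model:25s} {acc:.2f}%")
```

Sorting on the extracted column makes the spread obvious: the top methods cluster within about half a percentage point, while DARTS (second order) collapses far below the rest, which is the failure mode several of the listed papers (DARTS-, Λ-DARTS, Shapley-NAS) set out to fix.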
Showing 20 of 49 leaderboard entries.