⚰️ Papers With No Code ⚰️
"Where code promises go to die."
Total Tracked: 369.6K
Code Unavailable: 303.9K
Twilight: 26.4K
Ethereal: 3.8K
Code Available: 35.5K
Survival Rate: 9.6%
⚱️ Freshly Buried
👻 Ghosted: Morality in AI. A plea to embed morality in LLM architectures and frameworks
👻 Ghosted: Diffusion Models are Molecular Dynamics Simulators
👻 Ghosted: Interfacial and bulk switching MoS2 memristors for an all-2D reservoir computing framework
⏳ Grace Period: Transfer Learning of Linear Regression with Multiple Pretrained Models: Benefiting from More Pretrained Models via Overparameterization Debiasing
⏳ Grace Period: Steering diffusion models with quadratic rewards: a fine-grained analysis
👻 Ghosted: MuISQA: Multi-Intent Retrieval-Augmented Generation for Scientific Question Answering
👻 Ghosted: E$^3$-Pruner: Towards Efficient, Economical, and Effective Layer Pruning for Large Language Models
👻 Ghosted: Analog Physical Systems Can Exhibit Double Descent
👻 Ghosted: Enhancing Quranic Learning: A Multimodal Deep Learning Approach for Arabic Phoneme Recognition
👻 Ghosted: Learning Tractable Distributions Of Language Model Continuations
👻 Ghosted: Motion Transfer-Enhanced StyleGAN for Generating Diverse Macaque Facial Expressions
💀 Hall of Shame
Most cited papers with no code. The bigger they are, the harder they fall.
👻 Ghosted: Language Models are Few-Shot Learners
👻 Ghosted: PyTorch: An Imperative Style, High-Performance Deep Learning Library
👻 Ghosted: XGBoost: A Scalable Tree Boosting System
👻 Ghosted: Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift
👻 Ghosted: You Only Look Once: Unified, Real-Time Object Detection
👻 Ghosted: Semi-Supervised Classification with Graph Convolutional Networks
🌅 The Twilight Zone
Pioneers from the pre-code-sharing era. Too influential to shame, too old to expect a repo.
🌅 Old Age: Deep Residual Learning for Image Recognition
🌅 Old Age: Attention Is All You Need
🌅 Old Age: BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
🌅 Old Age: Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks
🌅 Old Age: SSD: Single Shot MultiBox Detector
🌅 Old Age: Squeeze-and-Excitation Networks
🌅 The Resurrection Corner
Papers that finally released their code. There is hope! 🎉