Ilya 30u30
A Folder from Cosmic
The Annotated Transformer
The First Law of Complexodynamics
The Unreasonable Effectiveness of RNNs
Understanding LSTM Networks
Recurrent Neural Network Regularization
Keeping Neural Networks Simple by Minimizing the Description Length of the Weights
Pointer Networks
ImageNet Classification with Deep CNNs
Order Matters: Sequence to sequence for sets
GPipe: Efficient Training of Giant Neural Networks using Pipeline Parallelism
Deep Residual Learning for Image Recognition
Multi-Scale Context Aggregation by Dilated Convolutions
Neural Quantum Chemistry
Attention Is All You Need
Neural Machine Translation by Jointly Learning to Align and Translate
Identity Mappings in Deep Residual Networks
A Simple NN Module for Relational Reasoning
Variational Lossy Autoencoder
Relational RNNs
Quantifying the Rise and Fall of Complexity in Closed Systems: The Coffee Automaton
Neural Turing Machines
Deep Speech 2: End-to-End Speech Recognition in English and Mandarin
Scaling Laws for Neural LMs
A Tutorial Introduction to the Minimum Description Length Principle
Machine Super Intelligence Dissertation
Page 434 onwards: Kolmogorov Complexity
CS231n Convolutional Neural Networks for Visual Recognition