Published April 24, 2026 at 4:00 AM
Explicit Dropout: Deterministic Regularization for Transformer Architectures
Publisher summary (verbatim)
arXiv:2604.20505v1 (Announce Type: new)

Abstract: Dropout is a widely used regularization technique in deep learning, but its effects are typically realized through stochastic masking rather than explicit optimization objectives. We propose a deterministic formulation that expresses dropout as an addi
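The announced abstract is truncated before the paper's objective is stated, so the exact formulation it proposes is not recoverable from this summary. As a rough illustration of the general idea it describes, the sketch below contrasts the usual stochastic dropout loss with a deterministic additive penalty that equals its expectation in the linear-regression case, a classical correspondence due to Wager et al. (2013). This is an assumption-laden stand-in, not the paper's method; the function names and rates are illustrative.

```python
# Hedged sketch: the paper's actual objective is unknown (the abstract is cut
# off). This shows the classical linear-model case, where the expected dropout
# loss equals the plain squared error plus an explicit additive penalty
# (Wager et al., 2013). All names and values here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(128, 16))    # inputs
y = rng.normal(size=128)          # targets
w = rng.normal(size=16) * 0.1     # linear weights
p = 0.5                           # dropout probability

def stochastic_dropout_loss(w, X, y, drop_p, rng):
    """One Monte Carlo sample of the usual dropout objective:
    randomly zero inputs, rescale survivors by 1/(1-p) (inverted dropout)."""
    mask = rng.random(X.shape) >= drop_p
    X_masked = X * mask / (1.0 - drop_p)
    return np.mean((X_masked @ w - y) ** 2)

def deterministic_dropout_loss(w, X, y, drop_p):
    """Closed-form expectation of the dropout objective for a linear model:
    squared error plus the additive data-dependent L2-style penalty
    (p / (1-p)) * sum_j mean_i(x_ij^2) * w_j^2."""
    mse = np.mean((X @ w - y) ** 2)
    second_moments = np.mean(X ** 2, axis=0)
    penalty = (drop_p / (1.0 - drop_p)) * np.sum(second_moments * w ** 2)
    return mse + penalty

# The Monte Carlo average converges to the deterministic objective.
samples = [stochastic_dropout_loss(w, X, y, p, rng) for _ in range(20000)]
print(np.mean(samples))                        # ~ equal to the line below
print(deterministic_dropout_loss(w, X, y, p))
```

Averaging many stochastic masks approaches the closed-form value, which is the sense in which dropout's regularizing effect can be written as an explicit additive term rather than realized through random masking.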
Originally published on arXiv.