
LAuReL: Learned Augmented Residual Layer

Bibliographic Details
Published in: arXiv.org, 2024-11
Main Authors: Menghani, Gaurav; Kumar, Ravi; Kumar, Sanjiv
Format: Article
Language: English
Description
Summary: One of the core pillars of efficient deep learning is architectural improvement, such as the residual/skip connection, which has led to significantly better model convergence and quality. Since its introduction, the residual connection has become ubiquitous not just in convolutional neural networks but also in transformer-based architectures, the backbone of LLMs. In this paper we introduce the Learned Augmented Residual Layer (LAuReL), a novel generalization of the canonical residual connection, designed as an in-situ replacement for it while outperforming it on both model quality and footprint metrics. Our experiments show that LAuReL can boost performance for both vision and language models. For example, on the ResNet-50/ImageNet-1K task, it achieves 60% of the gains from adding an extra layer while adding only 0.003% more parameters, and matches those gains while adding 2.6× fewer parameters.
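To make the abstract's claim concrete: a canonical residual block computes x_{i+1} = f(x_i) + x_i, and LAuReL generalizes this with a small number of learned parameters. The sketch below is a rough illustration only, not the paper's code; the class name, the low-rank parameterization, and the zero-initialization are assumptions made here for the example (see the paper for the actual variants). It scales the layer output with a learned scalar and augments the identity skip path with a learned low-rank map, which keeps the added parameter count negligible, consistent with the 0.003% figure quoted above.

```python
import torch
import torch.nn as nn

class LearnedAugmentedResidual(nn.Module):
    """Illustrative LAuReL-style block (an assumption, not the paper's code).

    Canonical residual:   x_{i+1} = f(x_i) + x_i
    Form sketched here:   x_{i+1} = alpha * f(x_i) + x_i + B A x_i,
    where alpha is a learned scalar and A, B are learned low-rank factors,
    so the parameter overhead stays tiny.
    """

    def __init__(self, f: nn.Module, dim: int, rank: int = 4):
        super().__init__()
        self.f = f                                 # wrapped layer, e.g. an MLP or attention block
        self.alpha = nn.Parameter(torch.ones(1))   # learned scale on the layer output
        self.lora_a = nn.Parameter(torch.randn(rank, dim) * 0.01)  # low-rank factor A
        self.lora_b = nn.Parameter(torch.zeros(dim, rank))         # factor B, zero-initialized

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Low-rank augmentation of the skip path: B (A x)
        correction = (x @ self.lora_a.T) @ self.lora_b.T
        return self.alpha * self.f(x) + x + correction


# Drop-in usage wherever `f(x) + x` previously appeared:
block = LearnedAugmentedResidual(nn.Linear(64, 64), dim=64, rank=4)
y = block(torch.randn(2, 64))
```

Zero-initializing one low-rank factor makes the block exactly equal to the plain residual connection at initialization, so under these assumptions it can be swapped into an existing architecture without changing its starting behavior.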
ISSN: 2331-8422