Article Review - How far can we scale up? Deep Learning's Diminishing Returns (Video)

Deep Learning has achieved impressive results in recent years, not least due to the massive increases in computational power and data that have gone into these models. Scaling up currently promises to be a reliable way to create more performant systems, but how far can we go? This article explores the limits of exponential scaling in AI, and what people are doing to get around this problem.

0:00 - Intro & Overview
1:00 - Deep Learning at its limits
3:10 - The cost of overparameterization
5:40 - Extrapolating power usage and CO2 emissions
10:45 - We cannot just continue scaling up
13:25 - Current solution attempts
15:25 - Aside: ImageNet V2
17:50 - Are symbolic methods the way out?
Image by Ralf Vetterle from Pixabay: