Importance sampling in object detection at Zenseact
Information
Authors: Emma Lindberg, Mathias Johansson
Expected completion: 2022-06
Supervisor: Erik Werner
Supervisor's company/institution: Zenseact AB
Subject reader: Per Mattsson
Other: -
Presentations
Presentation by Emma Lindberg
Presentation time: 2022-06-08 16:15
Presentation by Mathias Johansson
Presentation time: 2022-06-08 17:15
Opponents: Vilhelm Söderström, Kasper Knudsen
Abstract
Available computing resources play a large part in enabling the training of modern deep neural networks for complex computer vision tasks. Using this computational power more efficiently is therefore important for enterprises that want to improve their networks rapidly.
During the first few training iterations over the dataset, most samples produce substantial gradients and the network improves quickly. At later stages, most of the training time is instead spent on samples that the network already handles properly and that produce only tiny gradient updates. To make neural network training more efficient, researchers have used methods that give more attention to the samples that still produce relatively large gradient updates. These methods are called "importance sampling". When used, importance sampling reduces the variance of the sampled gradients and concentrates training on the more informative examples.
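As an illustration only (not the exact method used in the thesis), a minimal sketch of loss-based importance sampling might look as follows: per-sample scores are kept proportional to the most recent loss, batches are drawn from that distribution, and the resulting bias is corrected with importance weights 1/(N·p_i). The toy dataset, model, and hyperparameters are placeholders.

```python
# Hypothetical sketch of loss-based importance sampling with bias correction.
import torch
import torch.nn as nn

N, D, C = 1000, 20, 5                      # samples, features, classes (toy data)
X = torch.randn(N, D)
y = torch.randint(0, C, (N,))

model = nn.Sequential(nn.Linear(D, 64), nn.ReLU(), nn.Linear(64, C))
opt = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.CrossEntropyLoss(reduction="none")

scores = torch.ones(N)                     # per-sample scores, refreshed from losses
batch_size, smoothing = 64, 0.1            # smoothing keeps every sample reachable

for step in range(200):
    probs = scores / scores.sum()
    probs = (1 - smoothing) * probs + smoothing / N
    idx = torch.multinomial(probs, batch_size, replacement=True)

    losses = criterion(model(X[idx]), y[idx])   # per-sample losses
    weights = 1.0 / (N * probs[idx])            # importance weights for unbiasedness
    loss = (weights * losses).mean()

    opt.zero_grad()
    loss.backward()
    opt.step()

    scores[idx] = losses.detach()          # samples with large losses get sampled more
```

Compared with uniform sampling, this kind of scheme spends more of each epoch on the examples that still drive learning, which is the effect the thesis evaluates.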
This thesis contributes to the study of importance sampling by investigating its effectiveness in different contexts. Compared to other studies, we examine image classification more extensively by exploring different network architectures over a wide range of parameter counts. Like earlier studies, we apply several importance sampling strategies across several datasets. While most previous research applies importance sampling to image classification, our work aims to generalize the results by also applying it to object detection problems.
Our experiments on image classification tasks suggest that importance sampling can speed up the training of deep neural networks. When performance at convergence is the key metric, our importance sampling methods show mixed results. For the object detection tasks, preliminary experiments have been conducted; however, the findings do not yet provide enough data to conclusively demonstrate the effectiveness of importance sampling in object detection.