ICPR 2010, IEEE

Fast Training of Object Detection Using Stochastic Gradient Descent

Training datasets for object detection are typically very large, and Support Vector Machine (SVM) implementations are computationally complex. In contrast to these techniques, we use Stochastic Gradient Descent (SGD) algorithms, which use only a single new training sample in each iteration and process samples in a stream-like fashion. We have incorporated SGD optimization into an object detection framework. The object detection problem is typically highly asymmetric, because the variation in object appearance is limited compared to that of the background. Incorporating SGD speeds up the optimization process significantly, requiring only a single iteration over the training set to obtain results comparable to state-of-the-art SVM techniques. SGD optimization scales linearly in time, and the obtained speedup in computation time is two to three orders of magnitude. We show that by considering only part of the total training set, SGD converges quickly to the overall optimum.
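
The single-pass, one-sample-per-iteration update that the abstract describes can be illustrated with a standard SGD solver for a linear SVM with hinge loss. The sketch below is a minimal illustration under assumed settings, not the authors' implementation: the Pegasos-style decaying step size, the regularization constant lam, the function name sgd_linear_svm, and the toy data are all illustrative choices.

import numpy as np

def sgd_linear_svm(X, y, lam=1e-4):
    """One stream-like pass of SGD over (X, y); labels y must be in {-1, +1}."""
    n, d = X.shape
    w = np.zeros(d)
    for t, (x, label) in enumerate(zip(X, y), start=1):
        eta = 1.0 / (lam * t)          # decaying step size (Pegasos-style; assumed)
        margin = label * np.dot(w, x)  # evaluate the hinge condition at the current w
        w *= 1.0 - eta * lam           # shrinkage from the L2 regularizer
        if margin < 1.0:               # hinge loss active: take a gradient step
            w += eta * label * x
    return w

# Toy usage: two Gaussian classes, shuffled so the single pass sees a mixed stream.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(+1.0, 1.0, (500, 10)), rng.normal(-1.0, 1.0, (500, 10))])
y = np.array([1] * 500 + [-1] * 500)
perm = rng.permutation(len(y))
w = sgd_linear_svm(X[perm], y[perm])
print("training accuracy:", np.mean(np.sign(X @ w) == y))

On such well-separated toy data a single pass typically reaches a near-optimal separator, which mirrors, at a much smaller scale, the abstract's claim that one iteration over the training set suffices.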
Added: 29 Sep 2010
Updated: 29 Sep 2010
Type: Conference
Year: 2010
Where: ICPR
Authors: Rob Wijnhoven, Peter H. N. De With