Abstract
Convolutional neural networks (CNNs) were inspired by biology. They are hierarchical neural
networks whose convolutional layers alternate with subsampling layers, reminiscent of simple
and complex cells in the primary visual cortex [Fuk86a]. In recent years, CNNs have emerged
as a powerful machine learning model and have achieved the best results on many object
recognition benchmarks [ZF13, HSK+12, LCY14, CMMS12].
In this dissertation, we introduce two new proposals for convolutional neural networks. The
first is a method for combining the output probabilities of CNNs, which we call the Weighted
Convolutional Neural Network Ensemble. Each network has an associated weight, so that
networks with better performance have greater influence when classifying a pattern than
networks that performed worse. This approach produces better results than the common
method of combining networks by simply averaging their output probabilities to make
predictions.
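To make the combination rule concrete, here is a minimal sketch of a weighted probability
ensemble. The function name `weighted_ensemble_predict` and the use of validation accuracy
as the per-network weight are illustrative assumptions, not the dissertation's specification
of how the weights are obtained.

```python
import numpy as np

def weighted_ensemble_predict(prob_list, weights):
    """Combine per-network class probabilities using per-network weights.

    prob_list : list of (n_samples, n_classes) arrays, one per CNN
    weights   : sequence of non-negative weights, one per CNN
    """
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()        # normalize so the weights sum to 1
    stacked = np.stack(prob_list)            # (n_networks, n_samples, n_classes)
    combined = np.tensordot(weights, stacked, axes=1)  # weighted average over networks
    return combined.argmax(axis=1)           # predicted class per sample

if __name__ == "__main__":
    # toy demo: three "networks", 4 samples, 3 classes
    rng = np.random.default_rng(0)
    probs = [rng.dirichlet(np.ones(3), size=4) for _ in range(3)]
    # hypothetical weights, e.g. each network's validation accuracy
    print(weighted_ensemble_predict(probs, weights=[0.92, 0.88, 0.95]))
```

Plain averaging is the special case in which all weights are equal; the weighting simply
biases the vote toward the stronger networks.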
The second, which we call DropAll, is a generalization of two well-known methods for
regularizing the fully-connected layers of convolutional neural networks, DropOut [HSK+12]
and DropConnect [WZZ+13]. Applying these methods amounts to sub-sampling a neural network
by dropping units. When training with DropOut, a randomly selected subset of a layer's
output activations is dropped; when training with DropConnect, a randomly selected subset
of weights is dropped. With DropAll we can perform both methods simultaneously. We show
the validity of our proposals by reducing the classification error on a common image
classification benchmark.
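As a rough illustration of the DropAll idea, the sketch below applies a DropConnect mask to
the weights and a DropOut mask to the resulting activations within a single fully-connected
forward pass at training time. The keep probabilities, the ReLU nonlinearity, and the name
`dropall_forward` are assumptions made for illustration; test-time rescaling or mask
averaging is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropall_forward(x, W, b, p_out=0.5, p_conn=0.5):
    """Training-time forward pass of a fully-connected layer with DropAll:
    drop a random subset of weights (DropConnect) and a random subset of
    output activations (DropOut) in the same pass."""
    conn_mask = rng.random(W.shape) > p_conn   # keep each weight with prob 1 - p_conn
    h = x @ (W * conn_mask) + b                # DropConnect: forward with masked weights
    out_mask = rng.random(h.shape) > p_out     # keep each activation with prob 1 - p_out
    return np.maximum(h * out_mask, 0.0)       # DropOut on activations, then ReLU

if __name__ == "__main__":
    x = rng.random((2, 5))   # 2 samples, 5 inputs
    W = rng.random((5, 3))   # 5 inputs, 3 units
    b = np.zeros(3)
    print(dropall_forward(x, W, b))
```

Setting p_conn = 0 recovers plain DropOut and p_out = 0 recovers plain DropConnect, which
is the sense in which DropAll generalizes both.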
Keywords
Convolutional Neural Networks; Network Ensemble; Object Recognition; Regularization