Universal pooling – A new pooling method for convolutional neural networks

Junhyuk Hyun, Hongje Seong, Euntai Kim

Research output: Contribution to journal › Article › peer-review

21 Scopus citations

Abstract

Pooling is one of the key elements in a convolutional neural network. It reduces the feature map size, thereby enabling training with a limited amount of computation. The most common pooling methods are average pooling, max pooling, and stride pooling. These common pooling methods, however, have the disadvantage that they can perform only specified, fixed pooling functions and thus have limited expressive power. In this paper, we propose a new pooling method named universal pooling (UP). UP performs different pooling functions depending on the training samples. UP is a general pooling method and includes the previous common pooling methods as special cases. The structure of UP is inspired by attention methods, and UP can in fact be considered a channel-wise local spatial attention module; it is quite different from attention-based feature reduction methods. We insert UP into a couple of popular networks and apply the networks to benchmark sets in two applications, namely image recognition and semantic segmentation. The experimental results show that complex pooling functions are learned by the proposed UP and that UP achieves better performance than the previous pooling methods.
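The abstract's claim that UP includes the common pooling methods as special cases can be illustrated with a minimal sketch. In the paper, the pooling weights are produced by a trained attention module; here, purely for illustration, the weights come from a temperature-scaled softmax over the window values, which interpolates between average pooling (temperature parameter `beta = 0`, uniform weights) and max pooling (`beta` large, weight concentrated on the maximum). The function name and parameter are hypothetical, not from the paper.

```python
import numpy as np

def generalized_pool(window, beta):
    """Weighted sum over a pooling window.

    Illustrative stand-in for a learned pooling function:
    the weights are softmax(beta * window), so
      beta = 0      -> uniform weights    -> average pooling
      beta -> +inf  -> one-hot on the max -> max pooling
    In UP the weights would instead be produced by a trained
    channel-wise spatial attention module.
    """
    logits = beta * window
    logits = logits - logits.max()   # subtract max for numerical stability
    w = np.exp(logits)
    w = w / w.sum()                  # normalized pooling weights
    return float((w * window).sum())

window = np.array([1.0, 2.0, 3.0, 6.0])
avg_like = generalized_pool(window, beta=0.0)   # equals window.mean()
max_like = generalized_pool(window, beta=1e3)   # approaches window.max()
```

In a real module the scalar `beta` would be replaced by learned, input-dependent attention logits per channel and spatial window, which is what lets the network learn pooling functions beyond these two fixed extremes.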

Original language: English
Article number: 115084
Journal: Expert Systems with Applications
Volume: 180
DOIs
State: Published - 15 Oct 2021

Keywords

  • Attention methods
  • Average pooling
  • Convolutional neural network
  • Max pooling
  • Stride pooling
  • Universal pooling

