Dependence representation learning with convolutional neural networks and 2D histograms

Taejun Kim, Han-joon Kim

Research output: Contribution to journal › Article › peer-review

Abstract

Researchers frequently use visualizations such as scatter plots to understand how random variables relate to each other, because a single image conveys numerous pieces of information. Dependence measures have been widely used to detect dependencies automatically, but these measures capture only a few aspects of a dependency, such as its strength and direction. Motivated by advances in the application of deep learning to vision, we believe that convolutional neural networks (CNNs) can learn to understand dependencies by analyzing visualizations, as humans do. In this paper, we propose a method that uses CNNs to extract dependency representations from 2D histograms. We conducted three kinds of experiments and found that CNNs can learn from these visual representations. In the first experiment, we used a synthetic dataset to show that CNNs can perfectly classify eight types of dependency. We then showed that CNNs can predict correlations from 2D histograms of real datasets, and we visualized the learned dependency representation space. Finally, we applied our method to feature generation and demonstrated that it outperforms the AutoLearn feature generation algorithm in average classification accuracy while producing half as many features.
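
The core pipeline the abstract describes, binning a variable pair into a 2D histogram and feeding it to a CNN as a one-channel image, can be sketched in a few lines. The PyTorch sketch below is illustrative only: the 32-bin resolution, the layer sizes, and the untrained eight-way classifier head are assumptions made for demonstration, not the paper's actual architecture.

    # Minimal sketch: 2D histogram of a variable pair -> one-channel image -> CNN logits.
    # Bin count, architecture, and class count are illustrative assumptions.
    import numpy as np
    import torch
    import torch.nn as nn

    BINS = 32  # assumed histogram resolution

    def pair_to_histogram(x, y, bins=BINS):
        """Turn one variable pair into a normalized 2D-histogram 'image'."""
        h, _, _ = np.histogram2d(x, y, bins=bins, density=True)
        h = h / (h.max() + 1e-8)  # scale to [0, 1] like pixel intensities
        return torch.tensor(h, dtype=torch.float32).unsqueeze(0)  # (1, bins, bins)

    class DependencyCNN(nn.Module):
        """Tiny CNN mapping a 2D-histogram image to dependency-type logits."""
        def __init__(self, n_classes=8):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),  # 32x32 -> 16x16
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),  # 16x16 -> 8x8
            )
            self.classifier = nn.Linear(32 * 8 * 8, n_classes)

        def forward(self, x):
            return self.classifier(self.features(x).flatten(1))

    # Usage: histogram a synthetic quadratic relationship and score it.
    rng = np.random.default_rng(0)
    x = rng.normal(size=5000)
    y = x ** 2 + rng.normal(scale=0.1, size=5000)
    hist = pair_to_histogram(x, y).unsqueeze(0)  # add batch dim -> (1, 1, 32, 32)
    logits = DependencyCNN()(hist)
    print(logits.shape)  # torch.Size([1, 8])

In practice, one histogram would be generated per feature pair and the network trained on labeled pairs; treating the normalized bin counts as pixel intensities is what lets a standard vision architecture apply unchanged.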

Original language: English
Article number: 955
Journal: Applied Sciences (Switzerland)
Volume: 10
Issue number: 3
DOI:
State: Published - 1 Feb 2020

Keywords

  • Convolutional neural network
  • Deep learning
  • Dependence
  • Feature engineering
  • Histogram
  • Machine learning

