Learning Dependence Representations with CNNs

Taejun Kim, Han Joon Kim

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Correlation coefficients are widely used in statistics and machine learning to measure statistical dependence. However, because nonlinear dependence comes in many forms, distinguishing among those forms with a single correlation coefficient is very challenging. In this paper, we present a new approach that captures dependence from 2D histograms and has the potential to learn task-specific representations of dependence. With representations learned on 427 datasets, our models predict Pearson's correlation coefficient and the distance correlation coefficient almost perfectly. Furthermore, in terms of computation speed for distance correlation, the proposed method is faster than Huo's method [1] when the sample size is large.
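To make the idea concrete, the sketch below (not the authors' code) shows one plausible way to turn a paired sample into a normalized 2D histogram "image" and regress a dependence coefficient with a small CNN. The bin count, network architecture, and training target (e.g., Pearson's r or distance correlation computed on simulated data) are all assumptions for illustration.

# Minimal sketch, assuming a histogram-to-CNN regression setup (not the paper's exact model).
import numpy as np
import torch
import torch.nn as nn

def to_histogram(x, y, bins=32):
    """Bin a paired sample (x_i, y_i) into a normalized 2D histogram."""
    h, _, _ = np.histogram2d(x, y, bins=bins, density=True)
    h = h / (h.max() + 1e-8)  # scale to [0, 1]
    return torch.tensor(h, dtype=torch.float32).unsqueeze(0)  # shape (1, bins, bins)

class DependenceCNN(nn.Module):
    """Small CNN mapping a histogram image to a scalar coefficient estimate."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 1)

    def forward(self, x):  # x: (batch, 1, bins, bins)
        z = self.features(x).flatten(1)
        return self.head(z).squeeze(-1)

# Usage: train with MSE against coefficients computed on simulated datasets,
# then predict directly from the histogram of a new sample.
x = np.random.randn(1000)
y = 0.7 * x + 0.3 * np.random.randn(1000)  # linearly dependent pair
model = DependenceCNN()
estimate = model(to_histogram(x, y).unsqueeze(0))  # untrained output, shape (1,)

Once trained, such a model only needs one histogram pass per dataset, which is one way a learned estimator could beat a direct distance-correlation computation for large sample sizes.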

Original language: English
Title of host publication: 2018 IEEE Conference on Big Data and Analytics, ICBDA 2018
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 87-92
Number of pages: 6
ISBN (Electronic): 9781538671283
DOIs
State: Published - 2 Jul 2018
Event: 2018 IEEE Conference on Big Data and Analytics, ICBDA 2018 - Langkawi, Kedah, Malaysia
Duration: 21 Nov 2018 - 22 Nov 2018

Publication series

Name: 2018 IEEE Conference on Big Data and Analytics, ICBDA 2018

Conference

Conference: 2018 IEEE Conference on Big Data and Analytics, ICBDA 2018
Country/Territory: Malaysia
City: Langkawi, Kedah
Period: 21/11/18 - 22/11/18

Keywords

  • Convolutional Neural Networks
  • Distance Correlation
  • Learning Representations
  • Statistical Dependence
