Seminar
TITLE Scale-invariant representation of machine learning
KIAS AUTHORS Jo, Junghyo
JOURNAL PHYSICAL REVIEW E, 2022
ABSTRACT The success of machine learning has resulted from its structured representation of data. Similar data have close internal representations, as compressed codes for classification or emergent labels for clustering. We observe that the frequency of internal codes or labels follows power laws in both supervised and unsupervised learning models. This scale-invariant distribution implies that machine learning largely compresses frequent, typical data while differentiating many atypical data as outliers. In this study, we derive the process by which these power laws can naturally arise in machine learning. In terms of information theory, the scale-invariant representation corresponds to a maximally uncertain data grouping among the possible representations that guarantee a given learning accuracy.
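The central observation above — that the frequency of internal codes or labels follows a power law — can be checked empirically by ranking label frequencies and fitting a line on log-log axes. The sketch below is illustrative only, not the paper's method: instead of training a model, it draws synthetic labels from a Zipf (power-law) distribution as a stand-in for learned codes, then estimates the rank-frequency exponent.

```python
# Illustrative sketch (assumption: synthetic Zipf-distributed labels
# stand in for the internal codes/cluster labels a trained model
# would produce; this is not the paper's derivation).
import numpy as np

rng = np.random.default_rng(0)

# Sample 50,000 "labels" from a power-law (Zipf) distribution.
labels = rng.zipf(a=2.0, size=50_000)

# Rank-frequency statistics: sort label frequencies in descending order.
_, counts = np.unique(labels, return_counts=True)
freq = np.sort(counts)[::-1] / counts.sum()
rank = np.arange(1, freq.size + 1)

# A straight line in log-log space indicates scale invariance;
# the least-squares slope estimates the power-law exponent.
slope, _ = np.polyfit(np.log(rank), np.log(freq), 1)
print(f"estimated rank-frequency exponent: {slope:.2f}")
```

On real models, `labels` would instead hold the compressed codes (supervised) or cluster assignments (unsupervised) collected over a dataset; a markedly negative, roughly constant log-log slope is the scale-invariant signature the abstract describes.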