Maximum auto-mutual-information factor analysis

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review


Abstract

Based on mutual information, an information-theoretical measure derived from entropy and the Kullback-Leibler divergence, an alternative to maximum autocorrelation factor analysis is sketched.
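The abstract's core quantity can be illustrated with a short sketch. Mutual information I(X;Y) is the Kullback-Leibler divergence between the joint distribution p(x,y) and the product of the marginals p(x)p(y); applying it between a series and a lagged copy of itself gives the "auto-mutual-information" that replaces autocorrelation in the method the abstract outlines. The histogram estimator and function names below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Estimate I(X;Y) in nats from a 2-D histogram.

    I(X;Y) = sum_{x,y} p(x,y) log[ p(x,y) / (p(x) p(y)) ],
    i.e. the Kullback-Leibler divergence between the joint
    distribution and the product of the marginals.
    """
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                  # joint probabilities
    px = pxy.sum(axis=1, keepdims=True)        # marginal of x (column)
    py = pxy.sum(axis=0, keepdims=True)        # marginal of y (row)
    nz = pxy > 0                               # avoid log(0) terms
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def auto_mutual_information(x, lag, bins=16):
    """Mutual information between a series and its lag-shifted copy,
    the analogue of the lag-`lag` autocorrelation."""
    return mutual_information(x[:-lag], x[lag:], bins=bins)

# Toy signal: a slow sinusoid plus noise is strongly dependent on
# its lag-1 copy, so its auto-mutual-information is well above the
# near-zero value obtained for independent noise.
rng = np.random.default_rng(0)
t = np.arange(10000)
x = np.sin(0.05 * t) + 0.1 * rng.standard_normal(t.size)
ami = auto_mutual_information(x, lag=1)
```

Unlike autocorrelation, this measure also registers nonlinear dependence between the series and its shifted copy, which is the motivation for using it in place of the correlation in maximum autocorrelation factor analysis.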
Original language: English
Title of host publication: Proceedings Volume 10427, Image and Signal Processing for Remote Sensing XXIII
Number of pages: 2
Volume: 10427
Publisher: SPIE - International Society for Optical Engineering
Publication date: 2017
DOIs
Publication status: Published - 2017
Event: SPIE Remote Sensing 2017 - DoubleTree Hilton Hotel & Conference Centre, Warsaw, Poland
Duration: 11 Sep 2017 - 14 Sep 2017
http://spie.org/spieremotesensing

Conference

Conference: SPIE Remote Sensing 2017
Location: DoubleTree Hilton Hotel & Conference Centre
Country: Poland
City: Warsaw
Period: 11/09/2017 - 14/09/2017
Internet address: http://spie.org/spieremotesensing
Series: Proceedings of SPIE, the International Society for Optical Engineering
Volume: 104270
ISSN: 0277-786X

