Information Theoretic Signal Processing and Its Applications [Bookshelf]

Research output: Contribution to journal › Review article › peer-review

Abstract

The roots of information theory are almost 100 years old and include early works by Fisher [1], Hartley [2], and others. According to a history of information theory [3], motivated to understand how to draw information from experiments, Fisher [1] stated 'the nature and degree of the uncertainty [must] be capable of rigorous expression.' Subsequently, he defined statistical information as the reciprocal of the variance of a statistical sample. However, it was Shannon's work [4] that laid the mathematical foundations of information theory and revolutionized communications. Shannon developed two fundamental bounds, one on data compression and the other on transmission rate. He proved that even in the presence of noise, an arbitrarily small probability of error may be achieved as long as the transmission rate is below a quantity he defined as channel capacity.
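To make the two quantities named in the abstract concrete, here is a brief illustrative sketch in standard textbook notation; it is not drawn from the reviewed book. Fisher's statistical information for an efficient (minimum-variance unbiased) estimator is the reciprocal of that estimator's variance, and Shannon's channel capacity is the largest mutual information between channel input and output, with the additive white Gaussian noise (AWGN) channel as the familiar special case.

% Illustrative formulas only; notation follows common textbook conventions.
% Fisher's notion of information: for an efficient estimator \hat{\theta} of a parameter \theta,
% the information is the reciprocal of the estimator's variance.
\[
  I(\theta) \;=\; \frac{1}{\operatorname{Var}(\hat{\theta})}
\]
% Shannon's channel capacity: the maximum mutual information between input X and output Y
% over all input distributions p(x); reliable communication (arbitrarily small error probability)
% is possible at any rate R < C. For the AWGN channel with signal power P and noise power N:
\[
  C \;=\; \max_{p(x)} I(X;Y),
  \qquad
  C_{\mathrm{AWGN}} \;=\; \tfrac{1}{2}\log_{2}\!\left(1 + \frac{P}{N}\right)\ \text{bits per channel use.}
\]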

Original language: English (US)
Pages (from-to): 97-99 and 109
Journal: IEEE Control Systems
Volume: 43
Issue number: 2
State: Published - Apr 1 2023

All Science Journal Classification (ASJC) codes

  • Control and Systems Engineering
  • Modeling and Simulation
  • Electrical and Electronic Engineering
