Abstract
The roots of information theory reach back almost 100 years, to early work by Fisher [1], Hartley [2], and others. According to a history of information theory [3], Fisher [1], motivated to understand how to draw information from experiments, stated that 'the nature and degree of the uncertainty [must] be capable of rigorous expression.' He subsequently defined statistical information as the reciprocal of the variance of a statistical sample. It was Shannon's work [4], however, that laid the mathematical foundations of information theory and revolutionized communications. Shannon developed two fundamental bounds, one on data compression and the other on transmission rate. He proved that, even in the presence of noise, an arbitrarily small probability of error can be achieved as long as the transmission rate is below a quantity he defined as the channel capacity.
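The abstract states Shannon's two bounds only in words. As a minimal illustrative sketch, not drawn from the article itself, the standard textbook forms are: source entropy H(X) lower-bounds lossless compression, and for the binary symmetric channel the capacity is C = 1 - H2(p), below which arbitrarily reliable transmission is possible:

```python
import math

def entropy(pmf):
    """Shannon entropy H(X) in bits: the lower bound on lossless
    compression, in expected bits per source symbol."""
    return -sum(p * math.log2(p) for p in pmf if p > 0)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p:
    C = 1 - H2(p) bits per channel use. Rates below C admit an arbitrarily
    small probability of error; rates above C do not."""
    return 1.0 - entropy([p, 1.0 - p])

# A biased binary source with P(1) = 0.9 compresses to ~0.469 bits/symbol;
# a BSC with 10% crossover supports reliable rates up to ~0.531 bits/use.
print(f"H = {entropy([0.9, 0.1]):.3f} bits/symbol")
print(f"C = {bsc_capacity(0.1):.3f} bits/use")
```

These are standard results recalled here for context; the channel-capacity formula instantiated is the binary-symmetric special case, not a formula quoted from the article.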
| Original language | English (US) |
|---|---|
| Pages (from-to) | 97-99 and 109 |
| Journal | IEEE Control Systems |
| Volume | 43 |
| Issue number | 2 |
| State | Published - Apr 1 2023 |
All Science Journal Classification (ASJC) codes
- Control and Systems Engineering
- Modeling and Simulation
- Electrical and Electronic Engineering