The Information: A History, a Theory, a Flood Summary & Study Guide Description
The Information: A History, a Theory, a Flood Summary & Study Guide includes comprehensive information and analysis to help you understand the book.
This detailed literature summary also contains Topics for Discussion and a Free Quiz on The Information: A History, a Theory, a Flood by James Gleick.
"The Information" is an examination of the history of information theory as well as an essay on how computers and the internet have changed the way in which people interact with and approach information. Information now floods our society, Gleick explains, requiring us to filter and search it to find what we want to know.
Gleick traces advances in information technology from the two-tone drums used by sub-Saharan Africans to communicate over long distances through the development of the telegraph, telephone, and internet. He also follows the development of the alphabet to represent language and form words, and the transformation of the alphabet into codes that could be sent along telegraph lines.
Alongside these technological developments, the concept of information as a measurable quantity also developed, beginning with the early attempts of Charles Babbage to construct a mechanical machine that would solve mathematical equations. Gleick describes the theories of the American mathematician Claude Shannon, who was among the first thinkers to propose a way to look at information as something apart from the meaning of a message. Gleick finds the influence of Shannon's information theory extending through to modern computing methods.
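Shannon's central idea, that information can be measured independently of meaning, is commonly illustrated with the entropy of a probability distribution. The short Python sketch below is an illustration of that idea under standard definitions, not an excerpt from Gleick's book:

```python
import math

def shannon_entropy(probabilities):
    """Average information content, in bits, of a source emitting
    symbols with the given probabilities (Shannon's H)."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin flip carries exactly 1 bit of information.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A heavily biased coin is more predictable, so each flip carries less information.
print(shannon_entropy([0.9, 0.1]))   # about 0.469
```

The measure depends only on the probabilities of the symbols, not on what the symbols mean, which is exactly the separation of information from meaning that Gleick credits to Shannon.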
The concept was so powerful, Gleick explains, that scientists in psychology, biology, and physics adopted it as a model in their own work. The brain could be imagined as an immense series of binary switches that affected thought and behavior. The human genome was really a code for the construction of an organism and could potentially be deciphered using information theory. In physics, the theory could be applied to infer the states of quantum particles and to examine the nature of space and time.
Gleick also looks at the relationship between language and mathematics, exploring the ideas of thinkers such as Leibniz, who imagined that complex thoughts could be represented by symbols and calculated. George Boole expanded on this idea and introduced a system of logic that is now used in computer programming. Bertrand Russell attempted to construct a theory of numbers from the ground up, only to find it plagued by certain paradoxes. Kurt Gödel proved that such paradoxes are unavoidable.
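The system Boole introduced reduces reasoning to combinations of true and false values, which is how it survives in programming today. A minimal sketch, in Python rather than Boole's own notation, of the three basic operations he formalized:

```python
# Boolean logic in a programming language: every proposition is either
# True or False, and compound statements are built with AND, OR, and NOT.
for a in (False, True):
    for b in (False, True):
        print(f"a={a!s:5} b={b!s:5}  AND={a and b!s:5}  OR={a or b!s:5}  NOT a={(not a)!s:5}")
```

Printing the full truth table shows that the value of any compound expression is fixed entirely by the values of its parts, the property that later made Boolean logic a natural fit for switching circuits and computers.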
After completing his discussion of the history of information technology and theory, Gleick turns to the modern flood of information that has resulted. He looks at the rise of Wikipedia, a collaborative encyclopedia that exists only online, and at the ways in which modern people cope with the glut of available information. Everything is being saved somewhere. It might seem that there is now so much information that finding anything useful or true is hopeless, but Gleick is optimistic that we will find new ways to search and filter, and continue to learn and create.