
Siri, meet the family


The UK cover is more interesting than the US cover, which is, somewhat appropriately, covered with the repeated words "The Information."

James Gleick nearly won a Pulitzer Prize for his biography of Isaac Newton, and another for his biography of Richard Feynman, the colorful physicist who pioneered quantum electrodynamics and anticipated nanotechnology. His best-selling book (more than a million copies sold) is the step-by-step, scientist-by-scientist, idea-by-idea story of chaos theory, Chaos: Making a New Science.

Gleick’s 2011 book is called The Information. It begins with the European discovery of African talking drums in the 1840s, a percussive idea he eventually connects to Samuel Morse’s dots-and-dashes telegraph code, and then we’re off on a long tale not unlike the best of James Burke’s TV series, Connections. Gleick takes us through the development of letters and alphabets, numbers and mathematics, numerical tables and algorithms, dictionaries and encyclopedias. These stories, and their many tangents, set us up for Charles Babbage, whose boredom with the Cambridge mathematics curriculum leads to an early, impossible-to-build, 25,000-piece machine, awesome in its analog, mechanical, Victorian design. This, in turn, leads to the further development of the telegraph, now caught up in a new conception called a “network” that by then connected much of France.

By the early 20th century, MIT becomes one of several institutions concerned with training electrical engineers (then a new discipline) and with building machinery to solve second-order differential equations (“rates of change within rates of change: from position to velocity to acceleration”). This, plus the logic embodied in the relay switches of telegraph networks, provides MIT graduate student Claude Shannon with his thesis idea: expressing logical operations as electrical circuits. Shannon’s path leads to Bell Labs, where he works on the “transmission of intelligence.” By 1936, a 22-year-old Cambridge graduate named Alan Turing had begun thinking about a machine that could compute.
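To make “rates of change within rates of change” concrete, here is a minimal sketch in Python (my illustration, not anything from the book) of the step-by-step accumulation those differential-equation machines mechanized: acceleration accumulates into velocity, and velocity accumulates into position.

```python
def integrate(position, velocity, acceleration, dt, steps):
    """Step a second-order equation forward in time: acceleration feeds
    velocity, and velocity feeds position."""
    for _ in range(steps):
        velocity += acceleration * dt  # velocity: running sum of acceleration
        position += velocity * dt      # position: running sum of velocity
    return position, velocity

# Drop a ball from 100 m under constant gravity (-9.8 m/s^2) for 2 seconds.
# The analytic answer is 100 - 0.5 * 9.8 * 2**2 = 80.4 m.
pos, vel = integrate(position=100.0, velocity=0.0, acceleration=-9.8,
                     dt=0.001, steps=2000)
print(f"position ≈ {pos:.1f} m, velocity ≈ {vel:.1f} m/s")
```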

Well, that’s about half the book. Now things become more complex, harder to follow, and dull for all but the most interested reader. The interweaving connects DNA and memes (and, inevitably, memetics, the study of memes), cybernetics and randomness, the quantification of information, and Jorge Luis Borges’ 1941 conception of an ultimate library with “all books, in all languages, books of apology and prophecy, the gospel and commentary on that gospel, and commentary upon the commentary upon the gospel…”
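That “quantification of information” is Shannon’s entropy, H = −Σ pᵢ log₂ pᵢ: the average surprise per symbol, measured in bits. A toy sketch (again mine, not the book’s) shows the intuition that repetition carries no information while variety carries more:

```python
from collections import Counter
from math import log2

def entropy_bits(text):
    """Average information per character, in bits (Shannon's H)."""
    counts = Counter(text)
    total = len(text)
    return sum(-(n / total) * log2(n / total) for n in counts.values())

print(entropy_bits("aaaaaaaa"))  # 0.0 bits: pure repetition, no surprise
print(entropy_bits("abcdefgh"))  # 3.0 bits: eight equally likely symbols
```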

Eventually, CD-ROMs become obsolete (too much information, too little space), and we arrive at Wikipedia and the whole of the Internet. In the global googleplex, the term “information overload” becomes inadequate. And yet, Gleick promises, it is not the quantity that matters but the meaning. After 420 pages of historical text, I’m still wondering what it all means, and whether the purpose is mere conveyance as opposed to deeper meaning and its hoped-for result, understanding.


Filed under: Books & Print, Learning, Trends Tagged: connection, information theory, institution, Isaac Newton, James Gleick, knowledge, library, MIT, Wikipedia
