Books to Get You Thinking


Tucked away in the Technology section of the New York Times was an article that caught my attention the other day. Computer scientists at Google had built an intricate software system with the extraordinary capability of recognizing faces. The unique feature of this software was that it taught itself to learn, a rudimentary prototype of how the human brain works: navigating through vast amounts of data, it extracted the key knowledge components and used them to make inferences. In 2011, IBM introduced a powerful computer, Watson, which triumphed in a game of Jeopardy against two all-time champions of the game. Increases in computing power by several orders of magnitude, along with ever more sophisticated algorithms, have led to giant strides in search, pattern recognition and artificial intelligence. In a short span of fifty years, computers and processors have pervaded almost every sector of business. They are at the core of communication networks, trading and banking operations, manufacturing and supply chain infrastructures, health care delivery, aviation and transport systems, national defense, education and research.
The pace of innovation in computer technology has been truly remarkable, and new system designs continue to open new frontiers and expand the realm of human experience. Here is a selection of books that will help readers explore the history of computing, glimpse some key concepts in computer science and networking, and sample some of the exciting applications that are extending the boundaries of scientific innovation in the 21st century.
In this compelling book, Turing's Cathedral, George Dyson provides a fascinating account of the origins of digital computing and the birth of software, and of how profoundly they have changed the world around us. He traces the history back to the British mathematician Alan Turing, who laid the mathematical foundation on which modern computers are based. Groundbreaking research continued at the Institute for Advanced Study in Princeton, where the initial work focused on the numerical solution of the complex equations governing thermonuclear explosions and on the simulation of replication processes in biology; the core ideas, however, led to the development of powerful computers and new methods for solving problems in physics, chemistry, engineering and business. The book is interspersed with captivating illustrations and rich in historical references to scientists such as John von Neumann, who shaped significant breakthroughs in the aftermath of World War II. The growth of Princeton as a center of scientific research and technological innovation is brilliantly chronicled.

In Nine Algorithms That Changed the Future, John MacCormick provides a lucid insight into the theoretical ideas underlying the discipline of computer science; this absorbing book is equally engaging to general readers and to those with a more advanced technical background. The author highlights the organization and logical structure of the computations behind such mainstream tasks as database queries and Google searches, web-based commerce, electronic financial transactions, and the transmission of large data sets over networks. The focus of the book is on explaining the fundamental constructs that make the efficient execution of these complex operations possible; in each algorithm, a sequence of mathematical operations is used to achieve a specific result. PageRank, data compression, public-key cryptography, search engine indexing, error-correcting codes, pattern recognition and relational databases are all explained in clear, simple terms by MacCormick, who succeeds in creating a sense of wonder and a desire to dig deeper into the fascinating world of computer science.
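For readers curious what such an algorithm looks like in practice, here is a minimal sketch of the PageRank idea MacCormick describes: a page's importance is, roughly, the chance that a "random surfer" following links ends up on it. The four-page link graph and the function below are illustrative assumptions, not taken from the book.

```python
# Toy PageRank via power iteration. "links" maps each page to the
# pages it links to; rank flows from a page to its link targets.

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}        # start with equal rank
    for _ in range(iterations):
        # every page gets a small "teleport" share, plus shares
        # passed along by the pages that link to it
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            for target in outlinks:
                new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

# Four hypothetical pages; most links point toward "home".
graph = {
    "home": ["about"],
    "about": ["home"],
    "blog": ["home", "about"],
    "contact": ["home"],
}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # the most heavily linked-to page
```

The damping factor models a surfer who occasionally jumps to a random page instead of following a link, which keeps rank from getting trapped in closed loops of pages.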
Over the last 40 years, computers have penetrated every facet of our lives, enabling electronic commerce, the processing of bank transactions, weather prediction and space exploration. They have become faster, more powerful and far more portable. What, however, is the future of computing? Will computers be able to think, sift through data and provide answers? Could computers be programmed to interact with humans? Could they be made to understand human language with all its nuances and contextual references? In 2011, IBM scientists answered these questions by designing a question-answering computer named Watson that could play the game of Jeopardy and win against formidable opponents. Readers get an up-close look at the years of deep research and innovation that went into the creation of Watson. The project signals the beginning of a new frontier in artificial intelligence and cognitive computing that has opened up a whole range of possibilities in diverse areas; for example, a research collaboration between Columbia University and the University of Maryland is focused on a project in which Watson will aid physicians in making medical diagnoses. Imagine a doctor who is aware of every case ever published in a journal and recalls them all instantly!
Andrew Blum, a writer for Wired magazine, provides interesting insights into how millions of computers are connected through the Internet. The focus of Tubes is on the physical infrastructure (the "tubes") that supports this connectivity and on how that infrastructure is built in a modular way from switches, routers, fiber optics and other data-center equipment. Blum tells a fascinating story of how Internet networking started, how data moves through cables stretching across continents and under vast oceans, and how hub-and-spoke architectures extend the reach of the Internet to remote places. He also discusses the logical layers (software protocols) that enable the network to function with minimal supervision or arbitration, and offers a view of how Internet giants such as Facebook are harnessing new technologies to ensure that the connectivity, bandwidth and data delivery through this network of tubes keep pace with our need to communicate, collaborate and innovate.

- Nita Mathur
