Learning computer science and programming can – and should! – be fun. It’s time to explore some interesting tech-related books. They aren’t textbooks, so you’ll combine pleasure with learning as you read them.
The author of this book is a programmer, a musician, a futurist, and the “father of virtual reality.” The genre is hard to pin down: childhood memories are mixed with philosophy, reflections on the human brain, and, of course, virtual reality itself. Jaron tells his story from the very beginning, recalling the loss of his mother, his path as a scientist, and all his endeavors – and the book grips you and holds you until the last page.
Once, when Jaron was a kid, the local telephone network failed. He picked up the phone and heard hundreds of voices. Some were closer, some farther away, but they all seemed connected in one virtual space. Anonymity made it easy to talk to strangers, so the kids started chatting. Yet the next day at school, nobody mentioned it. Jaron looked at the kids around him, trying to guess who had been on the line. It was the first time he started thinking about technologies capable of bringing people closer.
If you wonder where virtual reality comes from, what its real value is, and how tech scientists think, this book is a must-read for you.
We know Walter Isaacson because he's the author of the famous Steve Jobs biography. But this book is worth your attention, too. Isaacson tells the stories of the people who contributed to computer science. For example, you can find Ada, Countess of Lovelace, in the book. She was Lord Byron's daughter but also a pioneer in computer programming. Coding goes back to the 1840s; did you know that?
There's also the story of Vannevar Bush, who came up with the idea of the differential analyzer (an analog electromechanical computer) in the 1930s. Alan Turing and his work on deciphering German codes during WWII, the invention of the transistor at Bell Labs in 1947, the first women in computer science, the launch of Sputnik, the birth of hypertext, and then the Internet, and many other topics are covered in this book. It's inspirational, sometimes eye-opening, and 100% necessary for anyone who works in the tech sphere.
Elon Musk and Bill Gates recommend this book, so it’s definitely worth your time. The author is a philosopher with a background in theoretical physics (nice combo!) and the founding director of the Future of Humanity Institute at Oxford University. What will happen when real artificial intelligence appears? Should we be happy about our progress in this field or be afraid?
The author looks at different scenarios and analyzes them. Many studies show that the “smarter” artificial intelligence gets, the safer it is. But what if that’s just a trick to fool humanity and make us work faster? What if, once AI becomes extremely powerful, it attacks us? After all, AI may have totally different principles and values compared to people. Or it may have no values at all.
This book is thrilling because you read about things that may happen quite soon and have an enormous impact on the world as we know it.
Every day we face problems that can be solved with the help of algorithms – finding an apartment to rent or dating, for example. Both of these life tasks belong to a class of math problems called "optimal stopping." If you know the most efficient way to act, you can save precious time, money, and effort. For instance, when searching for a place to live, the authors recommend spending the first 37% of your search time (say, 11 days out of a month) looking at apartments without making a decision. After that calibration phase, you should rent the first apartment that seems better than every place you've seen so far. That's how you minimize the time spent and maximize the value you get.
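To make the rule concrete, here is a minimal simulation sketch (not from the book; the option count, the random scoring, and the function name are assumptions chosen for illustration). It looks at the first 37% of options without committing, then takes the first one that beats everything seen during the calibration phase:

```python
import random

def optimal_stopping_trial(n_options=30, look_fraction=0.37):
    """One simulated apartment search: look at the first ~37% of options
    without committing, then take the first one better than everything seen."""
    scores = [random.random() for _ in range(n_options)]  # hidden quality of each apartment
    cutoff = int(n_options * look_fraction)               # how many to view without renting
    best_seen = max(scores[:cutoff], default=float("-inf"))
    for score in scores[cutoff:]:
        if score > best_seen:                 # first option that beats the calibration phase
            return score == max(scores)       # True if it was actually the best overall
    return scores[-1] == max(scores)          # ran out of options: forced to take the last one

# Estimate how often the rule lands the single best apartment.
trials = 100_000
wins = sum(optimal_stopping_trial() for _ in range(trials))
print(f"Best apartment found in {wins / trials:.1%} of searches")  # roughly 37%
```

Run enough trials and the rule finds the single best option roughly 37% of the time – which is about the best any strategy can guarantee when you can't go back to an apartment you've already passed on.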
It's just one example of the problems described in this book. The authors help the reader apply computer science principles to daily life, from finding a free parking spot to dealing with the mess at home. If you enjoy practical tips with a touch of science, this book is right for you.
Being a programmer is not only about code. It’s also about understanding where the world of technology is heading and what consequences our actions will have. According to the author, we are now witnessing the dawn of the “fourth age.”
The first age began when people discovered fire and started cooking food with it. The primary technology of the second age was agriculture: it gave humans more food, so they started building cities and settling in them. The third age began 5,000 years ago with the invention of written language. And until recently, we’ve been living in the third age, despite all the technologies we’ve created, from the steam engine to power plants.
The most important invention of the third age was the computer. As Reese says, computing is the heartbeat of the universe. Computers opened the door to the fourth age for humanity. They helped us create artificial intelligence and robots, and these technologies will change our lives. How exactly? You should read the book to find out.
"What does this book have to do with computer science?" you may ask. But the truth is programmers need critical thinking. Also, they need to know how to interpret data correctly and make conclusions.
We all share a common problem: we are systematically wrong when estimating the state of the world. The authors suggest a simple multiple-choice test to illustrate this idea. For example, one of the questions is about poverty: how has the share of the population living in poverty changed over the last 20 years – almost doubled, stayed roughly the same, or almost halved? Only 7% of people answered the question correctly (almost halved).
As it turns out, we think the world is much worse than it really is. This happens to teachers, scientists, journalists, and even Nobel Prize winners. The authors explain why that is and what to do about it.
This book is fresh from the oven, but it’s already been shortlisted for the Financial Times Business Book of the Year award and has received excellent reviews in The New York Times and other media.
Computer chips play a central part in the whole tech sphere. Almost everything depends on them: smartphones, cars, even the stock market. Chips determine which country holds the greatest military, economic, and geopolitical power, so it's no wonder a real war is breaking out over them. The USA has been able to dominate the global arena because it controlled advances in computer chips. But now the situation is changing: China, for example, is trying to become a major player and is flooding the field with money – the country spends more on importing chips than on any other product, including oil.
Chris Miller, an economic historian, tells the story of computer chip development, explains the current state of the market, and speculates about its future.
What do you think about this set of books? Share your opinion on them or recommend your favorites!