# Chris Bernhardt

# info

I am a professor emeritus of mathematics at Fairfield University. Originally from England, I came to the United States after completing a PhD in mathematics from the University of Warwick.

My books are on the theory of computation and information, areas that encompass mathematics, computer science, and physics. They contain many beautiful and counterintuitive ideas. My aim is to introduce non-specialists to some of the most important of these ideas by keeping everything as simple as possible.

# Beautiful Math

A concise, accessible, and elegant approach to mathematics that not only illustrates concepts but also conveys the surprising nature of the digital information age.

Most of us know something about the grand theories of physics that transformed our views of the universe at the start of the twentieth century: quantum mechanics and general relativity. But we are much less familiar with the brilliant theories that make up the backbone of the digital revolution. In Beautiful Math, Chris Bernhardt explores the mathematics at the very heart of the information age. He asks questions such as: What is information? What advantages does digital information have over analog? How do we convert analog signals into digital ones? What is an algorithm? What is a universal computer? And how can a machine learn?

The four major themes of Beautiful Math are information, communication, computation, and learning. Bernhardt typically starts with a simple mathematical model of an important concept, then reveals a deep underlying structure connecting concepts from what, at first, appear to be unrelated areas. His goal is to present the concepts using the least amount of mathematics, but nothing is oversimplified. Along the way, Bernhardt also discusses alphabets, the telegraph, and the analog revolution; information theory; redundancy and compression; errors and noise; encryption; how analog information is converted into digital information; algorithms; and, finally, neural networks. Historical anecdotes are included to give a sense of the technology at that time, its impact, and the problems that needed to be solved.
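One of the themes mentioned above, using redundancy to detect errors caused by noise, can be illustrated with the classic parity bit. Here is a minimal sketch in Python (the function names are my own, and this is the simplest possible scheme, offered as an illustration rather than a particular code from the book):

```python
def add_parity(bits):
    """Append an even-parity bit so the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def check_parity(bits):
    """Return True if the word passes the parity check (even number of 1s)."""
    return sum(bits) % 2 == 0

word = add_parity([1, 0, 1, 1])
print(check_parity(word))  # True: word arrives intact
word[2] ^= 1               # flip one bit to simulate channel noise
print(check_parity(word))  # False: the single-bit error is detected
```

The single extra bit is pure redundancy, yet it catches any single-bit error; it cannot, however, say which bit flipped or catch two simultaneous flips, which is where richer error-correcting codes come in.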

Taking its readers by the hand, regardless of their math background, Beautiful Math is a fascinating journey through the mathematical ideas that undergird our everyday digital interactions.

# Quantum Computing for Everyone

An accessible introduction to an exciting new area in computation, explaining such topics as qubits, entanglement, and quantum teleportation for the general reader.

Quantum computing is a beautiful fusion of quantum physics and computer science, incorporating some of the most stunning ideas from twentieth-century physics into an entirely new way of thinking about computation. In this book, Chris Bernhardt offers an introduction to quantum computing that is accessible to anyone who is comfortable with high school mathematics. He explains qubits, entanglement, quantum teleportation, quantum algorithms, and other quantum-related topics as clearly as possible for the general reader. Bernhardt, a mathematician himself, simplifies the mathematics as much as he can and provides elementary examples that illustrate both how the math works and what it means.

Bernhardt introduces the basic unit of quantum computing, the qubit, and explains how the qubit can be measured; discusses entanglement—which, he says, is easier to describe mathematically than verbally—and what it means when two qubits are entangled (citing Einstein's characterization of what happens when the measurement of one entangled qubit affects the second as “spooky action at a distance”); and introduces quantum cryptography. He recaps standard topics in classical computing—bits, gates, and logic—and describes Edward Fredkin's ingenious billiard ball computer. He defines quantum gates, considers the speed of quantum algorithms, and describes the building of quantum computers. By the end of the book, readers understand that quantum computing and classical computing are not two distinct disciplines, and that quantum computing is the fundamental form of computing. The basic unit of computation is the qubit, not the bit.
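The measurement rule described above, where a qubit with amplitudes α and β yields outcome 0 with probability |α|² and outcome 1 with probability |β|², is easy to simulate. A minimal sketch in Python (the function name and trial count are my own choices, not from the book):

```python
import math
import random

def measure(alpha, beta, trials=100_000):
    """Repeatedly measure a qubit with amplitudes (alpha, beta).

    Outcome 0 occurs with probability |alpha|^2 and outcome 1 with
    probability |beta|^2 (for a valid qubit these sum to 1).
    Returns the observed frequencies of outcomes 0 and 1.
    """
    p0 = abs(alpha) ** 2
    zeros = sum(1 for _ in range(trials) if random.random() < p0)
    return zeros / trials, (trials - zeros) / trials

# An equal superposition: each outcome occurs about half the time.
f0, f1 = measure(1 / math.sqrt(2), 1 / math.sqrt(2))
print(f0, f1)
```

Note that the classical bit falls out as a special case: with amplitudes (1, 0) or (0, 1) the measurement is certain, which is one way of seeing the book's point that the qubit, not the bit, is the fundamental unit.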

# Turing's Vision

An accessible and fascinating exploration of how Alan Turing's mathematical theory gave rise to modern computer science and its applications, from desktops to cell phones.

In 1936, when he was just 24 years old, Alan Turing wrote a remarkable paper in which he outlined the theory of computation, laying out the ideas that underlie all modern computers. This groundbreaking and powerful theory now forms the basis of computer science. In Turing’s Vision, Chris Bernhardt explains the theory for the general reader, beginning with its foundations and systematically building to its surprising conclusions. He also views Turing’s theory in the context of mathematical history, other views of computation (including those of Alonzo Church), Turing’s later work, and the birth of the modern computer.

Turing wanted to show that there were problems beyond any computer's ability to solve; in particular, he wanted to find a decision problem that he could prove was undecidable. To explain Turing's ideas, Bernhardt examines three well-known decision problems to explore the concept of undecidability; investigates theoretical computing machines, including Turing machines; explains universal machines; and proves that certain problems are undecidable, including Turing's problem concerning computable numbers.
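Turing machines themselves are simple enough to simulate in a few lines. A minimal sketch in Python (the rule format and the bit-flipping example machine are my own illustration, not taken from the book):

```python
def run_tm(rules, tape, state="start", pos=0, max_steps=1000):
    """Run a Turing machine.

    rules maps (state, symbol) to (symbol_to_write, move, next_state),
    where move is -1 (left) or +1 (right). "_" is the blank symbol and
    the machine stops when it enters the state "halt".
    """
    cells = dict(enumerate(tape))
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(pos, "_")
        write, move, state = rules[(state, symbol)]
        cells[pos] = write
        pos += move
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# A machine that sweeps right, flipping every bit, and halts at the blank.
flip = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
    ("start", "_"): ("_", +1, "halt"),
}
print(run_tm(flip, "10110"))  # -> 01001
```

The `max_steps` bound is there for safety: as the undecidability results above show, there is no general procedure to decide in advance whether an arbitrary machine will ever halt.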

# Errata

I have a list of errata here. I hope these will be corrected in subsequent printings.

# Fujio Yamamoto's blog

Fujio Yamamoto has a blog with explanations and apps he has developed to illustrate topics covered in the book. In particular, there are apps illustrating the Ekert protocol and the polarized-filter experiment.

# Short intro to the IBM quantum computer

IBM has put a few quantum computers in the cloud that anyone can play with for free. I have written a very brief introduction for the absolute beginner: A Taste of Quantum Computing.

# Articles

I believe that quantum computing will be taught in schools soon. Here's an article from The Conversation: In the future, everyone might use quantum computers.

I am not sure that ten minutes is enough to get much of an understanding of quantum computing, but here is my attempt: Quantum computing in ten minutes.