Chris Bernhardt


I am a professor emeritus of mathematics at Fairfield University. Originally from England, I came to the United States after completing a PhD in mathematics at the University of Warwick.

My books are on the Theory of Computation — an area that encompasses mathematics, computer science, and physics. It contains many beautiful and counterintuitive ideas, and my aim is to introduce non-specialists to the most important of them by keeping everything as simple as possible.


Quantum computing is a beautiful fusion of quantum physics and computer science, incorporating some of the most stunning ideas from twentieth-century physics into an entirely new way of thinking about computation. In this book, Chris Bernhardt offers an introduction to quantum computing that is accessible to anyone who is comfortable with high school mathematics. He explains qubits, entanglement, quantum teleportation, quantum algorithms, and other quantum-related topics as clearly as possible for the general reader.



In 1936, when he was just twenty-four years old, Alan Turing wrote a remarkable paper in which he outlined the theory of computation, laying out the ideas that underlie all modern computers. This groundbreaking and powerful theory now forms the basis of computer science. In Turing's Vision, Chris Bernhardt explains the theory, Turing's most important contribution, for the general reader.


Errata:

I have a list of errata here. I hope these will be corrected in subsequent printings.

Short intro to IBM quantum computers:

IBM has put a few quantum computers in the cloud that anyone can play with for free. I have written a very brief introduction for the absolute beginner: A Taste of Quantum Computing
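For readers curious about what a quantum program computes under the hood, here is a minimal sketch in plain Python. It is not IBM's interface — just an illustration of the linear algebra that a two-qubit circuit acts out, building the entangled Bell state from the Hadamard and CNOT gates.

```python
import math

# A qubit state is a pair of complex amplitudes; two qubits need four.
# Amplitude index i encodes the qubits in binary (qubit 0 = low bit).

def hadamard(state, target, n):
    """Apply a Hadamard gate to qubit `target` of an n-qubit state."""
    s = 1 / math.sqrt(2)
    new = state[:]
    for i in range(2 ** n):
        if not (i >> target) & 1:       # visit each amplitude pair once
            j = i | (1 << target)
            a, b = state[i], state[j]
            new[i] = s * (a + b)
            new[j] = s * (a - b)
    return new

def cnot(state, control, target, n):
    """Flip qubit `target` wherever qubit `control` is 1."""
    new = state[:]
    for i in range(2 ** n):
        if (i >> control) & 1:
            j = i ^ (1 << target)
            if i < j:                   # swap each amplitude pair once
                new[i], new[j] = state[j], state[i]
    return new

# Start in |00>, apply H to qubit 0, then CNOT from qubit 0 to qubit 1:
state = [1.0, 0.0, 0.0, 0.0]
state = hadamard(state, 0, 2)
state = cnot(state, 0, 1, 2)
# The result is the Bell state (|00> + |11>)/sqrt(2):
print([round(a, 3) for a in state])  # [0.707, 0.0, 0.0, 0.707]
```

Measuring this state gives 00 or 11, each with probability 1/2, and never 01 or 10 — the correlation that makes entanglement so striking.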

Articles:

I believe that quantum computing will be taught in schools soon. Here's an article from The Conversation: In the future, everyone might use quantum computers

I am not sure that ten minutes is enough to get much of an understanding of quantum computing, but here is my attempt: Quantum computing in ten minutes