I worked at one of the quantum computing co's on their compiler stack (so pretty much pure classical compute stuff), but in order to have even a baseline understanding of the computations and programming using qubits, I had to first get a better intuition for the underlying quantum mechanics at play. This was a great introduction to the physics underpinning the computations:
https://www.youtube.com/watch?v=lZ3bPUKo5zc&list=PLUl4u3cNGP...
It's long, and the subject matter is intimidating at times, but watch, re-watch, then go deep by finding papers on subjects like superposition and entanglement, which are the key quantum phenomena that unlock quantum computing.
It also helps to understand a bit about how the various qubit modalities are physically operated and controlled (e.g. how a program turns into qubit rotations, readouts, and other instruction executions). Some are superconducting chips driven by electromagnetic (microwave) pulses, some trap an ion or atom and use lasers to change its state, and some are photonic chips routing light through gates - among a handful of other modalities in industry and academia.
IBM's Qiskit platform may still have tooling, simulators, and visualizers that help you write a program and step through the operations on the qubit(s) managed by the program:
https://www.ibm.com/quantum/qiskit
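For instance, here's a minimal sketch (assuming a recent Qiskit release with the qiskit-aer simulator installed; certainly not the only way to do it) of the kind of program you'd write and step through - it prepares a Bell state, i.e. superposition plus entanglement, and samples it on a classical simulator:

    from qiskit import QuantumCircuit
    from qiskit_aer import AerSimulator

    qc = QuantumCircuit(2, 2)
    qc.h(0)                     # Hadamard: equal superposition of |0> and |1> on qubit 0
    qc.cx(0, 1)                 # CNOT: entangles qubit 1 with qubit 0 -> (|00> + |11>)/sqrt(2)
    qc.measure([0, 1], [0, 1])  # read both qubits out into classical bits

    counts = AerSimulator().run(qc, shots=1000).result().get_counts()
    print(counts)               # roughly {'00': ~500, '11': ~500}; '01' and '10' never appear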
It does! They also still have all their summer schools up that you can go through step by step. That said, I must plug Strawberry Fields, as I believe photonic integrated systems really are the better option.
1) Generally, the two models of QC are the digital/circuit model (analogous to digital logic gates, with some caveats such as reversibility of operations and the no-cloning theorem) and analog computation (tuning the parameters of a continuous-time quantum system in your lab such that the system produces useful output).
2) The physics/architecture/organization depends heavily on the type of computer being discussed. In classical computing, one "type" of computer has won the arms race. This is not yet the case for quantum computers; there are several different physical platforms through which people are trying to realize computation: trapped ions, superconducting qubits, photonics, quantum dots, neutral atoms, etc.
3) There are several ways to simulate quantum computation on classical hardware. Perhaps the most common is something like IBM's Qiskit, where you keep track of the degrees of freedom of the quantum computer throughout the computation and apply quantum logic gates in circuits (a bare-numpy sketch of that bookkeeping follows this list). Another, more complicated method is tensor network simulation, which gives efficient classical simulation of a restricted subset of quantum states.
4) In terms of research, one particularly interesting (although I'm biased by working in the field) application is quantum algorithms for nuclear/high energy physics. Classical methods (Lattice QCD) suffer from extreme computational drawbacks (factorial scaling in the number of quarks, NP-Hard Monte Carlo sign problems), and one potential way around this is using quantum computers to simulate nuclear systems instead of classical computers ("The best model of a cat is another cat, the best model of a quantum system is another quantum system")
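To make (3) concrete, here is a bare-numpy sketch of the bookkeeping a circuit simulator does (conventions and gate ordering here are mine, not any particular library's API): the full state of n qubits is a length-2^n complex vector, and every gate is a unitary matrix applied to it.

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard on a single qubit
    I = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])                 # control = qubit 0, target = qubit 1

    state = np.zeros(4)
    state[0] = 1.0                                  # start in |00>
    state = np.kron(H, I) @ state                   # Hadamard on qubit 0
    state = CNOT @ state                            # entangle the pair
    print(np.round(state, 3))                       # [0.707 0. 0. 0.707] -> Bell state
    print(np.abs(state) ** 2)                       # measurement probabilities: 1/2, 0, 0, 1/2

The vector doubles in size with every qubit you add, which is exactly why brute-force classical simulation runs out of memory and why restricted-but-efficient methods like tensor networks are interesting.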
If you're interested in learning more about QC, I would highly recommend looking at Nielsen and Chuang's "Quantum Computation and Quantum Information"; it's essentially the standard primer on the world of quantum computation.
The classic text is Nielsen and Chuang's "Quantum Computation and Quantum Information" [0]. Whatever else you choose to supplement this book with, it is worth having in your library.
[0] https://a.co/d/aPsexRB
Nielsen and Chuang has the clearest exposition of quantum mechanics I've seen anywhere. Last year I was trying to learn quantum mechanics, not necessarily quantum computation, just out of a general interest in theoretical physics. I started with physics textbooks (Griffiths and Shankar) but it only really "clicked" for me when I read the first few chapters of Nielsen and Chuang.
For zero prerequisites, I'd do "Quantum Computing for Computer Scientists" by Yanofsky. That is a nice base.
I have "Essential Mathematics for Quantum Computing" by Woody and "Non-Standard Computation" by Gramß, et al. Both were worth reading, but assumed a bit of background with "foundations of computation."
https://en.wikipedia.org/wiki/Quantum_Computing_Since_Democr...
Quantum Computation and Quantum Information, by Nielsen and Chuang
Standard textbook: Isaac Chuang and Michael Nielsen, "Quantum Computation and Quantum Information"
More mathy: A. Yu. Kitaev, A. H. Shen, M. N. Vyalyi, "Classical and Quantum Computation"
A killer app: Peter Shor, "Polynomial-Time Algorithms for Prime Factorization and Discrete Logarithms on a Quantum Computer" (the classical half of its factoring-to-period-finding reduction is sketched after this list)
Some course notes: https://math.mit.edu/~shor/435-LN/
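To see why that Shor paper is a killer app, here is a hedged sketch of its classical half: factoring N reduces to finding the period r of a^x mod N. The period finder below is brute force - that is the one step a quantum computer accelerates (via the quantum Fourier transform); the rest is ordinary number theory.

    from math import gcd
    from random import randrange

    def find_period(a, N):
        # classical stand-in for the quantum order-finding subroutine
        x, r = a % N, 1
        while x != 1:
            x = (x * a) % N
            r += 1
        return r

    def factor(N):
        # N should be an odd composite that is not a prime power
        while True:
            a = randrange(2, N)
            d = gcd(a, N)
            if d > 1:
                return d, N // d                    # lucky guess already shares a factor
            r = find_period(a, N)
            if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
                p = gcd(pow(a, r // 2) - 1, N)
                if 1 < p < N:
                    return p, N // p

    print(factor(15))                               # e.g. (3, 5)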
QC Researcher here!
1/ Digital and analog - where digital equals qubits and analog equals photonics, diamonds, or a range of other bit replacements.
2/ Qubits and gates are the building blocks and operations in digital. Photons, diamonds, electrons, and so on are the bits in analog; you can encode information in any of these in various ways.
3/ Strawberry Fields for analog QC, and IBM's Qiskit for digital (a minimal Strawberry Fields sketch is below).
I work on photonic integrated circuits, adapting them to remove physical limitations on capacity such as heat and information loss.
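If you want to poke at the continuous-variable/photonic side, here is a minimal Strawberry Fields sketch (assuming the strawberryfields package is installed; the specific gates and parameter values are just illustrative): squeeze one mode, interfere it with a second mode on a beamsplitter, and count photons.

    import strawberryfields as sf
    from strawberryfields import ops

    prog = sf.Program(2)                         # two photonic modes
    with prog.context as q:
        ops.Sgate(0.5) | q[0]                    # squeeze mode 0
        ops.BSgate(0.43, 0.1) | (q[0], q[1])     # beamsplitter mixes the two modes
        ops.MeasureFock() | q                    # photon-number measurement on both modes

    eng = sf.Engine("fock", backend_options={"cutoff_dim": 5})
    result = eng.run(prog)
    print(result.samples)                        # e.g. [[1 0]] photon counts per mode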