Thursday, December 31, 2015

Google's Quantum Computer: 100 Million Times Faster Than a Normal Computer

GOOGLE IS UPGRADING its quantum computer. Known as the D-Wave, Google’s machine is making the leap from 512 qubits—the fundamental building blocks of a quantum computer—to more than 1,000 qubits. And according to the company that built the system, this leap doesn’t require a significant increase in power, something that could augur well for the progress of quantum machines.


Together with NASA and the Universities Space Research Association, or USRA, Google operates its quantum machine at the NASA Ames Research Center, not far from its Mountain View, California, headquarters.
Though the D-Wave machine is less powerful than many scientists hope quantum computers will one day be, the leap to 1000 qubits represents an exponential improvement in what the machine is capable of. What is it capable of? Google and its partners are still trying to figure that out. But Google has said it’s confident there are situations where the D-Wave can outperform today’s non-quantum machines, and scientists at the University of Southern California have published research suggesting that the D-Wave exhibits behavior beyond classical physics.
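The "exponential improvement" has a concrete meaning: an n-qubit register spans a state space of 2^n basis states, so moving from 512 to 1,000 qubits does far more than double the machine's capacity. A quick back-of-envelope check (illustrative Python, not anything Google published):

```python
# State-space size of an n-qubit register is 2**n basis states.
old_states = 2**512    # 512-qubit machine
new_states = 2**1000   # 1,000-qubit machine

# The upgrade multiplies the state space by 2**488, not by ~2x.
ratio = new_states // old_states

print(ratio == 2**488)       # True
print(len(str(new_states)))  # 302 -- a number with 302 decimal digits
```

This is why qubit count, rather than clock speed, is the headline figure for quantum hardware.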
At a time when today’s computer chip makers are struggling to get more performance out of the same power envelope, the D-Wave goes against the trend.
What happened:
  • After running a series of tests comparing their D-Wave quantum computer (which uses quantum bits, or "qubits") against a conventional single-core computer (which uses ordinary bits), Google concluded that the operation run on the quantum computer was 100 million times faster.
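For context, the standard classical baseline in this kind of benchmark is simulated annealing: repeatedly flip bits, always accepting improvements and occasionally accepting worse states to escape local minima. The sketch below is a minimal, illustrative version on a toy objective—the function names, parameters, and problem are assumptions for illustration, not Google's actual benchmark code:

```python
import math
import random

def simulated_annealing(energy, n_bits, steps=10_000, t0=2.0, seed=0):
    """Minimize energy(state) over bit-strings via random single-bit flips,
    accepting uphill moves with Boltzmann probability exp(-dE/T)."""
    rng = random.Random(seed)
    state = [rng.randint(0, 1) for _ in range(n_bits)]
    e = energy(state)
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-9  # linear cooling schedule
        i = rng.randrange(n_bits)
        state[i] ^= 1                       # propose: flip one bit
        e_new = energy(state)
        if e_new <= e or rng.random() < math.exp(-(e_new - e) / t):
            e = e_new                       # accept the move
        else:
            state[i] ^= 1                   # reject: flip the bit back
    return state, e

# Toy objective: count disagreements between neighboring bits
# (minimized by the all-0 or all-1 string).
def energy(state):
    return sum(a != b for a, b in zip(state, state[1:]))

best, e = simulated_annealing(energy, n_bits=20)
print(best, e)
```

A quantum annealer attacks the same kind of minimization problem in hardware, which is what makes a head-to-head timing comparison meaningful.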
Why it's important:
  • As Moore's Law (which states that processor speeds, or overall processing power for computers, will double every two years) continues to power exponential improvements in computing power and capacity, advances in quantum computing and other new computing mechanisms will help overcome current limitations and accelerate progress even further. These advances will be critical as we develop more advanced artificial intelligence systems, virtual simulations, and solutions to large optimization problems.
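The parenthetical definition implies a simple growth formula: doubling every two years means a factor of 2^(years/2). A quick check (illustrative Python; the function name is mine, not from any library):

```python
def moores_law_factor(years, doubling_period=2):
    """Projected growth factor if capacity doubles every `doubling_period` years."""
    return 2 ** (years / doubling_period)

print(moores_law_factor(10))  # 32.0   -> 32x in a decade
print(moores_law_factor(20))  # 1024.0 -> ~1000x in twenty years
```

Set against that pace, a one-off factor of 100 million for a single operation shows why quantum hardware draws so much attention.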
