IBM’s latest trick: Turning noisy quantum bits into machine learning magic
IBM has figured out how to run machine learning algorithms in quantum feature spaces despite noisy qubits. Eureka-cadabra! The age of quantum algorithms is upon us.

A team of IBM researchers created a pair of quantum classification algorithms and experimentally implemented them on a hybrid system pairing a two-qubit superconducting quantum processor with a classical computer. In essence, they demonstrated that quantum computers may provide advantages in machine learning that classical computers alone cannot. According to the researchers’ paper:

Here we propose and experimentally implement two quantum algorithms on a superconducting processor. A key component in both methods is the…
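The core idea behind the paper's quantum feature spaces can be sketched classically: embed each data point into a quantum state |φ(x)⟩ via a feature-map circuit, estimate the kernel k(x, z) = |⟨φ(z)|φ(x)⟩|², and hand that kernel to a classical classifier. The minimal NumPy simulation below uses a toy two-qubit ZZ-style feature map and a kernel nearest-centroid rule; it is an illustrative simplification under assumed circuit and classifier choices, not IBM's exact method.

```python
# Hedged sketch: simulate a 2-qubit feature-map statevector exactly,
# compute the quantum kernel, and classify with a simple kernel rule.
# The feature map and classifier here are assumptions for illustration.
import numpy as np

def feature_state(x):
    """Statevector |phi(x)> for a toy 2-qubit ZZ feature map.

    Start from |00>, apply H on both qubits (uniform amplitudes), then a
    diagonal phase exp(i*(x1*Z1 + x2*Z2 + x1*x2*Z1Z2)) in the computational
    basis.
    """
    x1, x2 = x
    amps = np.empty(4, dtype=complex)
    for b in range(4):                            # basis states 00, 01, 10, 11
        z1 = 1.0 if (b >> 1) & 1 == 0 else -1.0  # Z eigenvalue of qubit 1
        z2 = 1.0 if b & 1 == 0 else -1.0         # Z eigenvalue of qubit 2
        phase = x1 * z1 + x2 * z2 + x1 * x2 * z1 * z2
        amps[b] = 0.5 * np.exp(1j * phase)       # |amp| = 1/2 from H x H
    return amps

def kernel(x, z):
    """Quantum kernel: overlap probability |<phi(z)|phi(x)>|^2."""
    return abs(np.vdot(feature_state(z), feature_state(x))) ** 2

def classify(x, train_X, train_y):
    """Kernel nearest-centroid: compare mean similarity to each class."""
    pos = np.mean([kernel(x, xi) for xi, yi in zip(train_X, train_y) if yi > 0])
    neg = np.mean([kernel(x, xi) for xi, yi in zip(train_X, train_y) if yi < 0])
    return 1 if pos >= neg else -1
```

On real hardware the overlap would not be read off a statevector: it would be estimated by running the circuit U†(z)U(x) on |00⟩ and counting how often the all-zeros outcome is measured. The simulation above computes the same quantity exactly.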

This story continues at The Next Web

Tags: Startups, Science, Tech, Artificial Intelligence, IBM, Insider

Source:  http://feedproxy.google.com/~r/TheNextWeb/~3/kTy8lQni6DI/



Related:
July 24, 2018 at 3:13 PM Scientists determine ‘shooter bias’ extends to black robots
May 7, 2018 at 11:30 AM Microsoft Excel gets custom JavaScript Functions and Power BI visualizations
April 8, 2018 at 2:50 PM Goldman Sachs made a big hire from Amazon to lead its artificial-intelligence efforts (GS, AMZN)