Following the 4th meeting of the Atos Quantum Scientific Council, held at its headquarters, Atos announces unprecedented simulation features in quantum computing.
Atos QLM: simulating real qubits for increased efficiency
Researchers at the Atos Quantum Laboratory have successfully modeled ‘quantum noise’, making simulation more realistic than ever before and bringing it closer to fulfilling researchers’ requirements.
The Atos Quantum Learning Machine (QLM) now has advanced quantum hardware modeling capabilities such as physics-based realistic qubit noise simulation and optimization of quantum software for real quantum processors.
Thanks to this innovation, Atos QLM users can now optimize their quantum algorithms for any targeted quantum hardware. The Atos Quantum Scientific Council has recognized this major step as a breakthrough in quantum computing research.
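The release does not detail the QLM's noise models, but the general idea behind physics-based qubit noise simulation can be illustrated with a standard single-qubit depolarizing channel acting on a density matrix. The function and parameter names below are purely illustrative and are not the Atos QLM API:

```python
import numpy as np

def depolarizing_channel(rho, p):
    """Apply a single-qubit depolarizing channel with error probability p.

    With probability (1 - p) the state rho is left untouched; with
    probability p it is replaced by the maximally mixed state I/2.
    """
    identity = np.eye(2, dtype=complex)
    return (1 - p) * rho + p * identity / 2

# Density matrix for the pure state |0><0|
rho0 = np.array([[1, 0], [0, 0]], dtype=complex)

# After 10% depolarizing noise, the probability of still
# measuring |0> drops from 1.0 to 0.9 + 0.1/2 = 0.95
noisy = depolarizing_channel(rho0, 0.1)
print(noisy[0, 0].real)  # 0.95
```

Noise channels of this kind (depolarization, dephasing, amplitude damping) are what make a simulated qubit behave like a real, imperfect device rather than an idealized mathematical one, which is why modeling them matters for optimizing algorithms against target hardware.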
Atos Chairman and CEO Thierry Breton surrounded by the Atos Quantum Scientific Council members on April 6: Nobel Prize laureate in Physics Serge Haroche, Daniel Estève, Alain Aspect, David DiVincenzo, Artur Ekert (Fields Medal laureate Cédric Villani, member of the Atos Quantum Scientific Council, excused) and the members of the managing team of Atos Quantum, next to an Atos QLM.
Following the meeting of the Atos Quantum Scientific Council, Thierry Breton, Chairman and CEO of Atos, said:
“We are thrilled by the remarkable progress that the Atos Quantum program has delivered to date. Since the launch of the Atos QLM in July 2017, the machine has benefited from continuous innovations by the Atos Quantum teams. By successfully modeling quantum noise, researchers at the Atos Quantum Laboratory are now bringing simulation to a whole new dimension, ever closer to reality. This R&D breakthrough, supported by a world-class Scientific Council, paves the way for researchers around the globe, enabling them today to test certain algorithms that will provide future quantum computers with their full capabilities, notably in artificial intelligence-related applications.”