Media Search:

The Route to Robust Quantum Computing: Interview with Shruti Puri – The New York Academy of Sciences

Quantum computing is a radically new way to store and process information based on the principles of quantum mechanics. While conventional computers store information in binary bits that are either 0s or 1s, quantum computers store information in quantum bits, or qubits. A qubit can be both 0 and 1 at the same time, and a collection of qubits together can hold many different values simultaneously.
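To make "both 0 and 1 at the same time" concrete, here is a minimal sketch using IBM's open-source Qiskit toolkit (which appears later in this roundup); the simulator setup and shot count are our illustrative choices, not part of the interview.

```python
# A minimal sketch of superposition using Qiskit and its Aer simulator
# (pip install qiskit qiskit-aer).
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

qc = QuantumCircuit(1, 1)
qc.h(0)           # Hadamard gate: puts the qubit into an equal blend of 0 and 1
qc.measure(0, 0)  # measuring collapses it to a definite 0 or 1

sim = AerSimulator()
counts = sim.run(transpile(qc, sim), shots=1000).result().get_counts()
print(counts)     # roughly {'0': 500, '1': 500}: each shot lands on 0 or 1 at random
```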

Everyone agrees on the huge computational power this technology may bring about, but why are we still not there yet? To understand the challenges in this field and its potential solutions, we recently interviewed Shruti Puri, PhD, who works at the frontier of this exciting field. Puri is an Assistant Professor in the Department of Applied Physics at Yale University, and a Physical Sciences & Engineering Finalist of the 2020 Blavatnik Regional Awards for Young Scientists, recognized for her remarkable theoretical discoveries in quantum error correction that may pave the way for robust quantum computing technologies.

What is the main challenge you are addressing in quantum computing?

Thanks to recent advances in research and development, small to mid-sized quantum computers are already being made available by big companies. But these quantum computers have not yet been able to run practical applications such as drug and materials discovery. The reason is that today's quantum computers are extremely fragile: even very small amounts of noise from the working environment can quickly destroy the delicate quantum states. Since it is almost impossible to completely isolate the quantum states from the environment, we need a way to correct quantum states before they are destroyed.

At first glance, quantum error correction seems impossible. Due to the measurement principle of quantum mechanics, we cannot directly probe a quantum state to check whether it contains an error, because such a measurement would destroy the quantum state itself.

Fortunately, in the 1990s, researchers found indirect ways to faithfully detect and correct errors in quantum states. These methods, however, come at the cost of large resource overheads. If one qubit is affected by noise, we have to use at least five additional qubits to correct the error, and the more errors we want to correct, the more additional qubits we consume. A lot of research effort, including my own, is devoted to improving quantum error correction techniques.
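As a rough illustration of that overhead, the sketch below uses the simplest textbook scheme: a three-qubit repetition code that corrects a single bit-flip (protecting against arbitrary single-qubit errors takes at least five qubits, as Puri notes). The circuit is our own toy example, not code from her research.

```python
# Toy three-qubit bit-flip repetition code: one logical qubit encoded into
# three physical qubits so a single bit-flip can be outvoted.
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

qc = QuantumCircuit(3, 3)
qc.h(0)          # prepare an arbitrary logical state on the data qubit
qc.cx(0, 1)      # spread its basis information onto two redundant qubits
qc.cx(0, 2)
qc.x(0)          # simulate noise: a bit-flip error strikes the data qubit
qc.cx(0, 1)      # decoding: compare the data qubit against each copy...
qc.cx(0, 2)
qc.ccx(1, 2, 0)  # ...and flip it back when both comparisons disagree
qc.measure(range(3), range(3))

sim = AerSimulator()
print(sim.run(transpile(qc, sim), shots=1000).result().get_counts())
# The rightmost bit (the data qubit) is restored to its original 50/50 statistics.
```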

What is your discovery? How will this discovery help solve the challenge you mention above?

In recent years, I have been interested in new qubit designs that have some built-in protection against noise. In particular, I developed the Kerr-cat qubit, in which one type of quantum error is automatically suppressed by design. This reduces the total number of quantum errors by half! So, quantum computers that adopt the Kerr-cat require far fewer physical qubits for error correction than those built from ordinary qubits.

The Kerr-cat is not the only qubit with this property, but what makes it special is that the protection can be maintained while a user modifies the quantum state in certain non-trivial ways. By comparison, for ordinary qubits, the act of modifying the state automatically destroys the protection. Since its discovery, the Kerr-cat has generated a lot of interest in the community and opened up a new direction for quantum error correction.

As a theoretician, do you collaborate with experimentalists? How are these synergized efforts helping you?

Yes, I do collaborate quite closely with experimentalists. The synergy between experiments and theory is crucial for solving the practical challenges facing quantum information science. Sometimes an experimental observation or breakthrough will provide a new tool for a theorist with which they can explore or model new quantum effects. Other times, a new theoretical prediction will drive experimental progress.

At Yale, I have the privilege of working next to the theoretical group of Steve Girvin and the experimental groups of Michel Devoret and Rob Schoelkopf, who are world leaders in superconducting quantum information processing. The theoretical development of the Kerr-cat qubit was actually the result of trying to undo a bug in an experiment. Members of Michel's group also contributed to the development of this theory. What is more, Michel's group was the first to experimentally demonstrate the Kerr-cat qubit. It was an amazing feeling to see this theory come to life in the lab!

Are there any other experimental developments that you are excited about?

I am very excited about a new generation of qubits being developed in several other academic groups, all of which have some inherent protection against noise. The Kerr-cat is one of them, along with the Gottesman-Kitaev-Preskill (GKP) qubit, cat codes, binomial codes, the 0-π qubit, etc. Several of these designs were developed by theorists in the early 2000s and were not considered practical at the time. But with experimental progress, they have now been demonstrated and are serious contenders for practical quantum information processing. In the coming years, the field of quantum error correction is going to be strongly influenced by the capabilities enabled by these new qubit designs. So, I really look forward to seeing how the experiments progress.

Interested in the latest experimental developments in quantum computer design and architecture? Register for the webinar Scaling up: New Advances in Building Quantum Computers, hosted by the New York Academy of Sciences on April 7. Featured speakers of this webinar include Andrew Houck, PhD, Professor of Electrical Engineering at Princeton University and Deputy Director of the Co-design Center for Quantum Advantage, and Christopher Monroe, PhD, Professor of Electrical and Computer Engineering and Physics at Duke University and Director of the Duke Quantum Center.

Read the original post:
The Route to Robust Quantum Computing: Interview with Shruti Puri - The New York Academy of Sciences

Texas Warned Us What Quantum Computers Will Do To The Power Grid – Forbes

Millions without power; stores and banks shut down; vital services running on emergency generators, if at all; lines of hapless people waiting for food and water. The experience that the state of Texas underwent this February was only a preview of what we could all face in the event of a major cyberattack on our ever-vulnerable energy grid.

In the case of an attack by a future quantum computer, with its unprecedented power to break existing encryption systems, the shutdown could be the most catastrophic disaster our country has ever experienced. Using data supplied by the global econometrics firm Oxford Economics, our researchers at Hudson Institute's Quantum Alliance Initiative have been working on a quantitative study of a future quantum cyberattack on the grid. Our preliminary data shows that protecting our power networks needs to be an urgent national priority.

Experts have been warning us for years about how vulnerable the national power grid is to attacks by malicious actors like Russia, China, and Iran. The Department of Energy has a major task force, the North American Energy Resiliency Model (NAERM), looking into how to protect our energy grid not only from natural disasters but also from terrorism and cyber assaults.

But a quantum computer attack would be far more protracted and far worse in its effects. Indeed, the smarter the grid is, with more supervision and control by computers, the more vulnerable it would be.

This is because a large-scale quantum computer of the future will be able to break any encryption system currently protecting the Supervisory Control and Data Acquisition (SCADA) computers that oversee the power grid. The structural design of a standard SCADA industrial control system relies on Remote Terminal Units (RTUs) and Programmable Logic Controllers (PLCs). These are the microprocessors that communicate and interact with field devices such as valves and pumps, and with the Human Machine Interface (HMI) software that presents information to an operator about the state of an ongoing process. That communication data is routed from the processors to the SCADA computers, where software displays and interprets it, allowing operators to analyze and react to system events.
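To make that data flow concrete, here is a toy model of the RTU-to-supervisory pipeline; every class, device name, and reading in it is a hypothetical stand-in of our own, not a real SCADA interface.

```python
# Toy model of the pipeline described above: field readings flow from an
# RTU-style unit up to supervisory software for display.
from dataclasses import dataclass

@dataclass
class FieldReading:
    device: str   # e.g., a valve or pump sensor
    value: float
    unit: str

class RemoteTerminalUnit:
    """Collects readings from field devices and forwards them upstream."""
    def __init__(self, readings):
        self.readings = readings

    def poll(self):
        return list(self.readings)

def supervisory_display(rtus):
    """Stand-in for the SCADA/HMI layer: interpret and present incoming data."""
    for rtu in rtus:
        for r in rtu.poll():
            print(f"{r.device}: {r.value} {r.unit}")

rtu = RemoteTerminalUnit([
    FieldReading("pump-7 flow", 42.0, "L/s"),
    FieldReading("valve-3 pressure", 5.1, "bar"),
])
supervisory_display([rtu])  # an attacker with operator credentials sees exactly this view
```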

The danger is that a quantum computer will be able to gain access to these major nerve centers of the grid as if the attacker were a bona fide operator. This will allow the attacker to spread malware undetected throughout the grid, which will severely hinder response and recovery for weeks or months.

The notion of resilience in the nation's power grid becomes obsolete. And instead of triggering a complete shutdown, a quantum intrusion can lead to sudden, inexplicable power losses and power surges that melt down transformers and render entire power plants inoperable.

In short, the damage will be similar to that of the Electro-Magnetic Pulse (EMP) attack terrorism experts have feared for years, but stealthier, more unpredictable, and more protracted.

Even if the nation's nuclear power plants are insulated from such an attack, the economic costs would be catastrophic.

How bad could the damage be? Our study indicates the direct economic cost of such a quantum-led electricity outage would be over $8.6 trillion, with a disruptive impact extending over six fiscal quarters. Everything from financial markets to manufacturing and healthcare would be disrupted for weeks or even months. Looking at the cost in terms of GDP at Risk, i.e., the integrated difference between the forecast GDP growth for the economy and the estimated GDP growth under the attack scenario, we have found that the total economic loss could extend over eight years or more at a cost of more than $20 trillion, roughly equivalent to the loss of an entire year's output for the U.S. economy.
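The study's actual model is far more sophisticated, but the "GDP at Risk" metric defined above reduces to simple arithmetic; in the back-of-envelope sketch below, the quarterly figures are invented for illustration and are not the study's data.

```python
# Back-of-envelope "GDP at Risk": the integrated (summed) difference between
# baseline GDP forecasts and GDP under the attack scenario.
baseline_gdp = [5.30, 5.35, 5.41, 5.46, 5.52, 5.57]  # $ trillions per quarter (invented)
scenario_gdp = [5.30, 3.90, 4.20, 4.60, 4.95, 5.25]  # same quarters, under attack (invented)

gdp_at_risk = sum(b - s for b, s in zip(baseline_gdp, scenario_gdp))
print(f"GDP at Risk over {len(baseline_gdp)} quarters: ${gdp_at_risk:.2f} trillion")
```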

These numbers do not include the impact on Canada's economy, which is part of the North American Power Grid, or the global impact of a U.S. economy in a powerless free fall. Ironically, if Texas ignores the advice of Bill Gates and others that it join the national grid, it could be the one part of the country to emerge from such a disaster relatively unscathed.

What are the steps necessary to avoid such a scenario, and the devastating economic loss such an attack would entail?

First, we need to incentivize power companies to speed up protections for SCADA systems against conventional cyberattacks on the grid as well as future quantum ones. This means moving forward with deploying post-quantum cryptography, i.e., encryption based on algorithms that will withstand quantum intrusion, and quantum cryptography, i.e., encryption using quantum random number generation for its keys, to secure networks from hackers.
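To illustrate the second ingredient, keys seeded from quantum randomness, here is a minimal sketch that generates random bits by measuring qubits in superposition with Qiskit's simulator. This is our illustration only; production quantum random number generators use certified hardware, not a local simulator.

```python
# Minimal quantum random bit generator: a qubit in equal superposition yields
# a fundamentally unpredictable bit when measured.
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

def quantum_random_bits(n_bits):
    qc = QuantumCircuit(1, 1)
    qc.h(0)
    qc.measure(0, 0)
    sim = AerSimulator()
    job = sim.run(transpile(qc, sim), shots=n_bits, memory=True)
    return [int(b) for b in job.result().get_memory()]

key_bits = quantum_random_bits(128)  # e.g., seed material for a 128-bit key
print("".join(map(str, key_bits)))
```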

Second, we need to develop a national strategic reserve of Large Power Transformers (LPTs) that can be deployed in case of cyberattacks that specifically target LPTs, the essential sinews of the North American Power Grid.

Third, we need closer cooperation with Canada on that shared grid, in order to mitigate the risks of attack, whether conventional today or quantum-based tomorrow, as well as the damage done by natural disasters, including those driven by climate change.

In the end, avoiding a Texas-like national shutdown of our power grid will be a matter of spending billions to offset the risk of losing trillions. That's not a bad bargain when our entire economy, and economies around the world, are at risk.

Here is the original post:
Texas Warned Us What Quantum Computers Will Do To The Power Grid - Forbes

Crucial Milestone for Scalable Quantum Technology: 2D Array of Semiconductor Qubits That Functions as a Quantum Processor – SciTechDaily

Schematic of the four-qubit quantum processor made using semiconductor manufacturing technology. Credit: Nico Hendrickx (QuTech)

The heart of any computer, its central processing unit, is built using semiconductor technology, which is capable of putting billions of transistors onto a single chip. Now, researchers from the group of Menno Veldhorst at QuTech, a collaboration between TU Delft and TNO, have shown that this technology can be used to build a two-dimensional array of qubits to function as a quantum processor. Their work, a crucial milestone for scalable quantum technology, was published today (March 24, 2021) in Nature.

Quantum computers have the potential to solve problems that are impossible to address with classical computers. Whereas current quantum devices hold tens of qubits (the basic building block of quantum technology), a future universal quantum computer capable of running any quantum algorithm will likely consist of millions to billions of qubits. Quantum dot qubits hold the promise of a scalable approach, as they can be defined using standard semiconductor manufacturing techniques. Veldhorst: "By putting four such qubits in a two-by-two grid, demonstrating universal control over all qubits, and operating a quantum circuit that entangles all qubits, we have made an important step forward in realizing a scalable approach for quantum computation."
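For a flavor of what "a quantum circuit that entangles all qubits" looks like at the gate level, here is a generic four-qubit entangling (GHZ) circuit in Qiskit. This is an illustrative sketch of ours; the germanium processor in the paper is programmed through its own hardware-level control, not this software stack.

```python
# Generic four-qubit entangling circuit (a GHZ state) in Qiskit.
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

qc = QuantumCircuit(4, 4)
qc.h(0)        # superpose the first qubit
qc.cx(0, 1)    # a chain of CNOTs entangles all four qubits...
qc.cx(1, 2)
qc.cx(2, 3)    # ...into (|0000> + |1111>)/sqrt(2)
qc.measure(range(4), range(4))

sim = AerSimulator()
print(sim.run(transpile(qc, sim), shots=1000).result().get_counts())
# Only '0000' and '1111' appear: the four qubits are perfectly correlated.
```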

Electrons trapped in quantum dots, semiconductor structures only a few tens of nanometres in size, have been studied for more than two decades as a platform for quantum information. Despite all the promise, scaling beyond two-qubit logic has remained elusive. To break this barrier, the groups of Menno Veldhorst and Giordano Scappucci decided to take an entirely different approach and started to work with holes (i.e. missing electrons) in germanium. Using this approach, the same electrodes needed to define the qubits could also be used to control and entangle them. "No large additional structures have to be added next to each qubit, such that our qubits are almost identical to the transistors in a computer chip," says Nico Hendrickx, graduate student in the group of Menno Veldhorst and first author of the article. "Furthermore, we have obtained excellent control and can couple qubits at will, allowing us to program one, two, three, and four-qubit gates, promising highly compact quantum circuits."

Menno Veldhorst and Nico Hendrickx standing next to the setup hosting the germanium quantum processor. Credit: Marieke de Lorijn (QuTech)

After successfully creating its first germanium quantum dot qubit in 2019, the team has doubled the number of qubits on its chips every year. "Four qubits by no means makes a universal quantum computer, of course," Veldhorst says. "But by putting the qubits in a two-by-two grid we now know how to control and couple qubits along different directions. Any realistic architecture for integrating large numbers of qubits requires them to be interconnected along two dimensions."

Demonstrating four-qubit logic in germanium defines the state of the art for the field of quantum dots and marks an important step toward dense, and extended, two-dimensional semiconductor qubit grids. Beyond its compatibility with advanced semiconductor manufacturing, germanium is also a highly versatile material: it has exciting physical properties, such as spin-orbit coupling, and it can make contact with materials like superconductors. Germanium is therefore considered an excellent platform for several quantum technologies. Veldhorst: "Now that we know how to manufacture germanium and operate an array of qubits, the germanium quantum information route can truly begin."

Reference: "A four-qubit germanium quantum processor" by Nico W. Hendrickx, William I. L. Lawrie, Maximilian Russ, Floor van Riggelen, Sander L. de Snoo, Raymond N. Schouten, Amir Sammak, Giordano Scappucci and Menno Veldhorst, 24 March 2021, Nature. DOI: 10.1038/s41586-021-03332-6

Funding: The research is supported by NWO, the Dutch Research Council.

See more here:
Crucial Milestone for Scalable Quantum Technology: 2D Array of Semiconductor Qubits That Functions as a Quantum Processor - SciTechDaily

Quantum Computing and IBM i – IT Jungle

March 24, 2021, by Alex Woodie

At first glance, IBM i servers and quantum computers appear to be worlds apart. But considering the rapid advance of quantum computing today, and the midrange server's place in a long line of advances in business computing, they may not be as far removed as one might think.

That's the general conclusion one could draw from listening to Jack Woehr and Jesse Gorzinski discuss the topic of quantum computing during Woehr's presentation, "From Hamilton to Hollerith: What's the Use of Quantum Computers?" at last week's IBM i Futures Conference, which was sponsored by COMMON.

In addition to writing IBM i code and working with open source software on the platform, Woehr, previously an editor at the now-defunct Dr. Dobb's Journal, is active in the quantum computing community. That activity, plus his 40 years of experience as a programmer, gives him a unique perspective on how the future lines of quantum computing and IBM i may intersect.

In Woehr's view, quantum computing is something that younger developers should keep an eye on. The technology is not necessarily ready for mainstream adoption today, but it's moving so quickly and showing such promise that ignoring it would be a mistake, he said.

"Like so many other things, it's where the world is going, and if you want to stay competitive, you're going to have to deal with this," Woehr said. "And if you're young, [quantum computing is] going to be there before you retire."

The IBM Q System One.

Just as it took organizations some time in the 1940s and 1950s to accept that digital binary computers were the future and give up their punch card systems, there will be a period of transition between today's digital binary computers and the quantum computers of tomorrow, Woehr predicted.

"When digital binary computers first came in, they were attaching to punch card machines and saying, look what we can do. And they'd say, well, we can already do that. Why would you want to buy this expensive machine to do that?" Woehr said. "Well, we know the answer to that now. But it wasn't as obvious from 1946 to 1953 as it is now."

It's hard to overstate the effect modern computers have had on our lives. Many aspects of how we work and play have been digitized, and that digitization has only accelerated during COVID-19. The most valuable companies in the world are technology companies (although some would call them data companies).

We have built all this technology on a platform of digital binary computing, which has Boolean algebra as its foundation. "Everything we're doing today electronically is these three operators, AND, OR, and NOT, which is all that digital binary computers actually do," Woehr said. "It's had this tremendous effect on our world. But this was again not obvious to the people who had become very adept at operating the paper punch card machines."
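Woehr's claim is easy to verify in miniature: the sketch below (our example, not from the talk) builds XOR, and from it a one-bit half-adder, using nothing but AND, OR, and NOT.

```python
# Everything a binary computer does reduces to AND, OR, and NOT.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

def XOR(a, b):
    # a XOR b == (a OR b) AND NOT (a AND b)
    return AND(OR(a, b), NOT(AND(a, b)))

def half_adder(a, b):
    return XOR(a, b), AND(a, b)  # (sum bit, carry bit)

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> carry {c}, sum {s}")
```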

Quantum computing promises to fundamentally transform how we calculate, how we program, and how we develop applications. Instead of two bits and three basic operators, quantum computing brings a much more capable mathematical underpinning that will unlock new capabilities, Woehr said.

"Quantum computing is multi-dimensional compared to [digital binary computers] because it's not based on Boolean algebra," Woehr said. "It's based on linear algebra: matrices multiplied [by] vectors, and the matrices and vectors are matrices and vectors of complex numbers."

"In digital binary computing, only amplitude factors in, giving us ones and zeros. Well, quantum computing makes use of amplitude and phase, for a start, and there's a lot of other things that are different about them. They're digital binary bits, but it's more multi-dimensional than the way we compute now. And it's likely to transform our world in ways that we cannot imagine."
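That description translates directly into code. The sketch below (our illustration with NumPy) applies the Hadamard gate, a 2x2 complex matrix, to a one-qubit state vector and reads off both amplitude and phase.

```python
# Quantum computing as linear algebra: a gate is a complex matrix, a state is
# a complex vector, and applying the gate is a matrix-vector product.
import numpy as np

H = np.array([[1,  1],
              [1, -1]]) / np.sqrt(2)   # the Hadamard gate

ket1 = np.array([0, 1], dtype=complex)  # the state |1>
state = H @ ket1                        # -> (|0> - |1>) / sqrt(2)

for basis, amp in zip(("|0>", "|1>"), state):
    print(f"{basis}: amplitude {abs(amp):.3f}, phase {np.angle(amp):.3f} rad")
# Both amplitudes are 0.707, but the |1> term carries phase pi: information
# a classical bit, which has amplitude only, cannot represent.
```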

The miracle of digital binary computers allows IBM i developer Jack Woehr to appear to be in Hawaii (he actually lives in Colorado).

IBM, Google, and Microsoft are arguably the leaders in developing quantum computers today, but many other companies around the world are making a play, with a variety of designs, some of which will go further than others. It's hard to tell who the leaders will be in the near future because the field is so new and moving so quickly, Woehr said. "We're in a caucus race with quantum computing," he said, referencing the tumultuous footrace that took place in Alice in Wonderland.

In fact, there is a technical term for the raucous quantum din: Noisy Intermediate-Scale Quantum, or NISQ. "What that means is, it sort of works, but it's hard to get the right answer from it. You have to really look at what it's saying," Woehr said.

One of the problems is that quantum computers are not very good at holding onto their state. Getting materials into the quantum state, and keeping them there, is currently a work in progress. That presents a problem when trying to get quantum computers to do useful work, such as solving an optimization problem (which is one class of applications that quantum computers excel at).

There is plenty of work to do in quantum computing, and that work is moving extremely fast. It's unclear exactly when some of these problems will be solved, and when quantum computers will be practical for adoption by businesses. But one thing nobody doubts anymore is that quantum computing actually works.

But that wasn't the case 11 years ago.

"There was some doubt in 2010 if this was real or not," Woehr said. "Even in scientific circles there were doubts whether this was real. But it does work, and we know it works now."

The main benefit is a time advantage, he said. With their richer space of operators and states, quantum computers solve some problems significantly faster than traditional binary computers.

Optimization problems are one of the most promising areas for quantum computers, Woehr said. For example, some companies put a lot of time into calculating how much ore they're likely to remove from a mine. They consider the placement of ore in the mine, along with variables like fuel and labor costs, the weather, and market prices.

"You have these huge, huge optimization problems that have many, many variables, and they put them on supercomputers and run for weeks," Woehr said. "Quantum computing happens to be very good at optimization."

This is where the futures of quantum computing and IBM i may intersect. Today, developers program quantum computers using open source frameworks like Qiskit (pronounced "kiss-kit"), a project that IBM backs. Woehr and Gorzinski are both active in the Qiskit community. During their chat, Woehr demonstrated how an optimization problem could be solved on the IBM Q computer using Grover's Algorithm developed in Qiskit.

The problem they were solving, finding the optimal combination of ingredients to brew a batch of beer, could be extended to many industries and use cases. Grover's can be used to solve the types of application problems that IBM i folks are familiar with, Woehr said.

"Suddenly everyone here who's listening in will realize this could be any problem," he said. "Grover's will work on any kind of problem, any kind of problem where we have multiple variables and some combination of their state is a valid solution and some combinations are not."
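Grover's algorithm itself can be sketched compactly. The two-qubit toy below is our own example, not the conference demo: the oracle marks one of four candidate "ingredient combinations" as valid, and a single Grover iteration amplifies it; real problems need larger oracles that encode the actual constraints.

```python
# Minimal two-qubit Grover search: the oracle phase-marks |11> as the one
# valid combination out of four, and one Grover iteration amplifies it.
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

qc = QuantumCircuit(2, 2)
qc.h([0, 1])    # equal superposition over all four candidates

qc.cz(0, 1)     # oracle: flip the phase of the marked state |11>

qc.h([0, 1])    # diffuser: reflect all amplitudes about their mean...
qc.x([0, 1])
qc.cz(0, 1)
qc.x([0, 1])
qc.h([0, 1])    # ...which boosts the marked state's probability to ~1

qc.measure([0, 1], [0, 1])
sim = AerSimulator()
print(sim.run(transpile(qc, sim), shots=1000).result().get_counts())
# Essentially every shot returns '11', the marked combination.
```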

Gorzinski agreed. "I think it comes back to this notion that there's a tidal wave coming, but there are real applied use cases out there for a lot of industries, especially the industries that the IBM i listeners today are probably part of," he said. "It's a competitive scenario in the future. People are going to want to adopt this technology, in my opinion."

Quantum computing may not be mainstream in 10 or even 20 years. But the pace of advances in the field is quickening, Woehr says, and folks who are starting their computing careers today would be wise to keep an eye on how the field develops.

"The reason to look at it now is just to orient yourself," he says. "It may be a while before you see it in your organization. But it's coming. If you are in your 20s now, it's certainly going to be there before you're my age, before you're ready to retire."

More:
Quantum Computing and IBM i - IT Jungle

Japan's first leading-edge quantum computer to be installed this year – The Mainichi

This photo shows IBM Corp.'s quantum computer that will be installed at Kawasaki Business Incubation Center in Kawasaki, Kanagawa Prefecture. (Photo courtesy of IBM Japan Ltd.)

TOKYO -- Japan will be getting its first leading-edge quantum computer this year.

IBM Japan Ltd. announced on March 23 that the computer made by its U.S. parent IBM Corp. will be installed at Kawasaki Business Incubation Center (KBIC) in the city of Kawasaki, Kanagawa Prefecture, just south of Tokyo. It will be in place within a few months and will be in operation by the end of the year. The University of Tokyo, which holds exclusive access rights, will seek to put the machine to practical use in cooperation with companies through a dedicated consortium.

Quantum computers use quanta -- such as light -- which have the characteristics of both waves and particles, and can perform many calculations simultaneously using a completely different process from conventional computers. They are expected to be used for purposes including developing new drugs and materials, and managing assets. Japan's first machine will be a "gate-model quantum computer," a type with theoretically very broad applications. IBM and Google LLC are both developing this type of computer.

The University of Tokyo signed a partnership with IBM Japan in December 2019, and established the Quantum Innovation Initiative Consortium in July 2020 to put quantum computers to practical use through the cooperation of government, industry and academia. The two universities and 12 companies that make up the consortium include Keio University, Toshiba Corp., Mitsubishi Chemical Holdings Corp. and Mitsubishi UFJ Financial Group Inc. Consortium members will be able to access the quantum computer in Kawasaki through the cloud.

IBM Corp. currently has more than 30 quantum computers in New York, and at least 140 companies and universities around the world access them through the cloud. Many members of the Japanese consortium have also used the New York machines, but they must compete for time on the systems with users around the world, which limits their access. Once the quantum computer is installed in Japan, consortium members will be able to use it for their research for longer stretches.

Hiroaki Aihara, the consortium's project leader and vice president of the University of Tokyo, said, "It's overwhelmingly advantageous to be able to get a lot of time on a cutting-edge computer. We want to develop quantum computer apps through industry-academia cooperation and accelerate the technology's use." Outside Japan, another quantum computer is set to enter operation in Germany in 2021.

KBIC is a research and development office space equipped with labs for start-ups. IBM Japan also uses the facility as a research center.

(Japanese original by Mayumi Nobuta, Science & Environment News Department)

Read the original post:
Japan's first leading-edge quantum computer to be installed this year - The Mainichi