Archive for the ‘Quantum Computer’ Category

Sanctuary Cognitive Systems Closes C$75.5 Million (US$58.5 Million) Series A Funding, Accelerating its Mission to Create the World’s First Human-Like…

Sanctuary is focused on creating the world's first human-like intelligence in general-purpose robots.

The strategic industry investors reflect applications for human-like intelligence in general-purpose robots across a wide range of industry verticals and tasks. Using breakthrough technology in artificial intelligence (AI), cognition, and robotics, Sanctuary will improve the quality of the work experience, assist humans with difficult or dangerous tasks, create new jobs, bring new opportunities to those who might be less capable of physical work, and reduce the impact of labour shortages around the world.

Many companies are developing special-purpose robots and AI to address singular tasks or activities. In contrast, the Sanctuary team is taking a much more general-purpose approach to both hardware and software, drawing from fields including AI, cognition, computer vision, machine learning, theoretical physics, and quantum computing. Sanctuary general-purpose robots are similar to a person in size and shape because the world is designed for people. The robot's cognitive architecture is also designed to mimic the different subsystems in a person's brain. This approach defines the scope of the work to something that can be broken down into manageable and achievable pieces.

Sanctuary also announced today that the first female private space explorer, Anousheh Ansari, and former astronaut and commander of the International Space Station, Chris Hadfield, joined the Sanctuary Advisory Board. In the not-too-distant future, Sanctuary technology will help people explore, settle, and prosper in outer space. But first, Sanctuary robots will train alongside people here on Earth, helping them work more safely, efficiently, and sustainably.

"With unfilled vacancies, workplace safety considerations, increasing employee turnover, worldwide aging populations, and declining workplace participation, one thing is clear: many labour-related challenges are outside the scope of current specialized AI and robotics technology," said Geordie Rose, Co-Founder and CEO of Sanctuary. "We are addressing a systemic problem across the global economy. I am excited about the group of industry partners and investors we assembled. With interest from customers representing a dozen different industry verticals, we are working hard to make work safer, more accessible, and ultimately more productive."

"At Verizon Ventures, we believe that breakthrough ideas need extraordinary support and execution," said Michelle McCarthy, Managing Director, Verizon Ventures. "Sanctuary's novel approach and progress set them apart, as they look to build transformative innovations of tomorrow with the potential to reshape the future of work."

"Our investment in Sanctuary is in complete alignment with our mission at Evok Innovations to protect the environment and strengthen the economy," said Marty Reed, Partner, Evok Innovations. "We believe that Sanctuary has the fastest, lowest-cost, and most commercially viable path to building human-like intelligence in machines. It's been exciting to participate in this funding round, and I look forward to what's in store for the future."

"Globally-leading networks like Bell's pure fibre broadband and Bell 5G enable emerging technologies like Sanctuary's to flourish," said Mirko Bibic, President and CEO, BCE Inc. and Bell Canada. "Bell is proud to help support the work Sanctuary is doing and the opportunities it will bring for Canadian industry in the years to come."

Interested parties can learn more at the official Sanctuary website: http://www.sanctuary.ai.

About Sanctuary

Founded in 2018 by Geordie Rose, Suzanne Gildert, Olivia Norton, and Ajay Agrawal, Sanctuary is a Vancouver, Canada-based company. Sanctuary is on a mission to create the world's first human-like intelligence in general-purpose robots that will help us work more safely, efficiently, and sustainably, and, in the not-too-distant future, help us explore, settle, and prosper in outer space.

Members of the Sanctuary team founded D-Wave (a pioneer in the quantum computing industry), Kindred (first use of reinforcement learning in a production robot), and the Creative Destruction Lab (which pioneered a revolutionary method for the commercialization of science for the betterment of humankind). The team has experience launching market-defining innovations rooted in previously unsolved and deep scientific problems.

SOURCE Sanctuary Cognitive Systems Corporation

Read more:
Sanctuary Cognitive Systems Closes C$75.5 Million (US$58.5 Million) Series A Funding, Accelerating its Mission to Create the World's First Human-Like...

How a quantum computer could break 2048-bit RSA encryption …

That raises an important question: when will quantum computers be powerful enough to do this? After that date, any information protected by this form of encryption becomes insecure.

So computer scientists have attempted to calculate the resources such a quantum computer might need and then work out how long it will be until such a machine can be built. And the answer has always been decades.

Today, that thinking needs to be revised thanks to the work of Craig Gidney at Google in Santa Barbara and Martin Ekerå at the KTH Royal Institute of Technology in Stockholm, Sweden. The pair have found a more efficient way for quantum computers to perform the code-breaking calculations, reducing the resources they require by orders of magnitude.

Consequently, these machines are significantly closer to reality than anyone suspected. The result will make uncomfortable reading for governments, military and security organizations, banks, and anyone else who needs to secure data for 25 years or longer.

First, some background. Back in 1994, the American mathematician Peter Shor discovered a quantum algorithm that outperformed its classical equivalent. Shor's algorithm factors large numbers and is the crucial element in the process for cracking trapdoor-based codes.

Trapdoor functions are based on the process of multiplication, which is easy to perform in one direction but much harder to do in reverse. For example, it is trivial to multiply two numbers together: 593 times 829 is 491,597. But it is hard to start with the number 491,597 and work out which two prime numbers must be multiplied to produce it.
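The asymmetry described above can be seen directly in code: multiplying the two primes from the example is instant, while recovering them by naive trial division already takes noticeably more work, and the gap grows rapidly with the size of the numbers. A minimal Python sketch:

```python
# Multiplying two primes is trivial; undoing the multiplication is the hard
# direction, which is what makes it a usable trapdoor.
def multiply(p, q):
    return p * q

def factor(n):
    """Naive trial division: O(sqrt(n)) work, utterly infeasible at 2048 bits."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # n is prime

print(multiply(593, 829))  # 491597, computed instantly
print(factor(491597))      # (593, 829), already noticeably more work
```

Real RSA moduli are 617 decimal digits long, far beyond any trial-division search; Shor's algorithm is what changes that calculus.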

And it becomes increasingly difficult as the numbers get larger. Indeed, computer scientists consider it practically impossible for a classical computer to factor numbers that are longer than 2048 bits, which is the basis of the most commonly used form of RSA encryption.

Shor showed that a sufficiently powerful quantum computer could do this with ease, a result that sent shock waves through the security industry.

And since then, quantum computers have been increasing in power. In 2012, physicists used a four-qubit quantum computer to factor 143. Then in 2014 they used a similar device to factor 56,153.

It's easy to imagine that at this rate of progress, quantum computers should soon be able to outperform the best classical ones.

Not so. It turns out that quantum factoring is much harder in practice than might otherwise be expected. The reason is that noise becomes a significant problem for large quantum computers. And the best way currently to tackle noise is to use error-correcting codes that require significant extra qubits themselves.

Taking this into account dramatically increases the resources required to factor 2048-bit numbers. In 2015, researchers estimated that a quantum computer would need a billion qubits to do the job reliably. That's significantly more than the 70 qubits in today's state-of-the-art quantum computers.

On that basis, security experts might well have been able to justify the idea that it would be decades before messages with 2048-bit RSA encryption could be broken by a quantum computer.

Now Gidney and Ekerå have shown how a quantum computer could do the calculation with just 20 million qubits. Indeed, they show that such a device would take just eight hours to complete the calculation. "[As a result], the worst case estimate of how many qubits will be needed to factor 2048 bit RSA integers has dropped nearly two orders of magnitude," they say.

Their method focuses on a more efficient way to perform a mathematical process called modular exponentiation. This is the process of finding the remainder when a number is raised to a certain power and then divided by another number.

This process is the most computationally expensive operation in Shor's algorithm. But Gidney and Ekerå have found various ways to optimize it, significantly reducing the resources needed to run the algorithm.
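Modular exponentiation itself is easy to state in code. Python's built-in three-argument `pow` computes it efficiently on a classical machine; the sketch below spells out the repeated-squaring loop so the structure of the operation is visible. (In Shor's algorithm this same operation must be carried out on quantum hardware, which is where most of the qubit cost arises.)

```python
# Modular exponentiation: the remainder when base**exp is divided by mod,
# computed by repeated squaring so we never hold the huge number base**exp.
def mod_exp(base, exp, mod):
    result = 1
    base %= mod
    while exp > 0:
        if exp & 1:                      # fold in this bit of the exponent
            result = (result * base) % mod
        base = (base * base) % mod       # square for the next bit
        exp >>= 1
    return result

print(mod_exp(7, 128, 15))  # 1, same as Python's built-in pow(7, 128, 15)
```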

That's interesting work that should have important implications for anyone storing information for the future. A 20-million-qubit quantum computer certainly seems a distant dream today. But the question these experts should be asking themselves is whether such a device could be possible within the 25 years they want to secure the information. If they think it is, then they need a new form of encryption.

Indeed, security experts have developed post-quantum codes that even a quantum computer will not be able to crack. So it is already possible to safeguard data today against future attack by quantum computers. But these codes are not yet used as standard.

For ordinary people, there is little risk. Most people use 2048-bit encryption, or something similar, for tasks like sending credit card details over the internet. If these transactions are recorded today and broken in 25 years, little will be lost.

But for governments, there is more at stake. The messages they send today, between embassies or the military, for example, may well be significant in 20 years and so worth keeping secret. If such messages are still being sent via 2048-bit RSA encryption, or something similar, then these organizations should start worrying, quickly.

Ref: arxiv.org/abs/1905.09749 : How To Factor 2048 Bit RSA Integers In 8 Hours Using 20 Million Noisy Qubits

Read more here:
How a quantum computer could break 2048-bit RSA encryption ...

Who's winning the quantum computing race? China and the U.S. are neck and neck – The Business of Business

If you can barely wrap your head around the basic functioning of a PC (as is my situation), then prepare to be even more confused by quantum computers. Luckily, you don't need a physics doctorate to get how they will likely change the world.

Quantum computing applies the properties of quantum (also known as subatomic) particles to solve computation problems more efficiently than traditional computers. The extra computational power it unleashes could be a game changer for companies running simulations of biochemical reactions for drug discovery or modeling scenarios for investment portfolios. Meanwhile, quantum communication allows for the transmission of classified information in a virtually unhackable way. It can also be weaponized to decrypt messages transmitted with traditional encryption methods.

The discipline of quantum computing has existed as an offshoot of quantum theory since the 1980s. Superposition, or the ability of a quantum particle to take multiple values at once, underpins the functioning of a quantum computer. Unlike traditional computers that run on bits that take the value of either 0 or 1, quantum computers use qubits (short for quantum bits), which can be equal to both values at once. This makes them exponentially more powerful than computing devices that run on regular bits for certain problems, capable of solving challenges that are out of reach even for the mightiest supercomputer. Given the possibilities, it's not hard to appreciate why countries are in a race to conquer this new technology. The competition is especially heated between the U.S. and China. We looked at the state of research in the field and its industry applications across both nations to see who may have an edge in reaching quantum supremacy.
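The sense in which qubits outpace bits can be made concrete: describing an n-qubit state classically requires 2**n complex amplitudes, so the classical description doubles with every added qubit. A quick illustration:

```python
# An n-qubit register is described by 2**n complex amplitudes.
# Each added qubit doubles the size of that description, which is why
# exactly simulating even ~50 qubits is beyond classical reach.
for n in (1, 2, 10, 50):
    print(f"{n} qubits -> {2 ** n} amplitudes")
```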

While a number of governments and private sector entities around the world are pushing forward quantum computing innovation, the U.S. and China have emerged as leaders in the field. Both countries are home to some of the foremost research institutions in quantum technology and have produced remarkable achievements in the field.

It is difficult to identify a clear winner, since each country has its own strong suits and weak points. We looked at a number of metrics to compare the state of quantum computing research in the U.S. and China: the number of patents granted, the development of the private-sector ecosystem, and the level of government support for the quantum computing ecosystem.

The U.S. is winning the patent game, with over a thousand quantum computing patents granted within its borders in the ten years preceding 2021. It is important to note that some of these patents have been filed by entities located outside the U.S. But the high number of granted patents within the U.S. is an indication that the U.S. is an attractive market for quantum innovation.

In the same time frame, the Chinese patent office granted fewer than 400 quantum computing patents. China has, however, surpassed the U.S. in the total number of international patents filed in 2019 and has maintained that lead into 2020.

The advantage of the U.S. lies in its rich private-sector quantum computing research and development ecosystem, including Google, Microsoft, IBM, and 12 other public companies. It is also home to 78 quantum computing startups, according to Tracxn, a website that tracks companies by industry. The U.S. leads the world in quantum computing private equity, with 110 deals closed between 2016 and 2021, compared with China's 30.

In 2019, Google was first to achieve so-called quantum supremacy, or the creation of a quantum computer that is able to solve specific problems faster than a classical computer. The landmark achievement has since been surpassed by researchers from the University of Science and Technology of China (USTC), who claim to have built two quantum computers that are more powerful than Google's.

China is ahead of the game in quantum communications, with considerable victories in the domain. In 2016, it became the first country to launch a quantum satellite that enables extremely secure transmission of information without traditional encryption methods. It followed up on this achievement with the establishment of an integrated quantum communication network in 2019, connecting two satellites to fiber optic cables on the ground.

China's success in quantum technology innovation can be credited to strong government support for basic research in the field. This is manifested by the fact that the universities responsible for China's strides in quantum technology are all public. In its 13th five-year plan, released in 2016, the Chinese government named quantum communication and computing among the priority areas for the government innovation strategy. The next year, it announced a $10 billion investment in a National Laboratory for Quantum Information Sciences.

Meanwhile, the U.S. government has been playing catch-up with the passage of the National Quantum Initiative in 2018. The congressional act established a national strategy for quantum computing and a framework for government agencies to support research efforts, in addition to earmarking $1.2 billion in funding. The Department of Energy is using $625 million of the earmarked cash to finance five quantum information research hubs composed of national labs, universities, and companies.

While the U.S. may lag behind China in terms of total government spending on quantum computing research, it has the advantage of counting a number of quantum-forward nations among its allies. It has so far made steps towards quantum collaboration with the UK, Japan, and Australia.

Originally posted here:
Who's winning the quantum computing race? China and the U.S. are neck and neck - The Business of Business

Is our universe a holographic projection? Scientists are using black holes and quantum computing to find out. – Space.com

What happens inside a black hole, and how does whatever goes on inside relate to the universe outside it? It's a mystery that has evaded scientists for decades. It's also a mystery that could redefine how we think about the nature of the universe.

In a paper published in the journal PRX Quantum on Feb. 10, physicists applied quantum computers and machine learning, the computational technology that powers facial recognition software and natural language processing, to test a possible connection.

Black holes play by Einstein's theory of general relativity, which governs the universe on large scales. Tiny particles outside of black holes play by the rules of the Standard Model of particle physics, which outlines the universe at the absolute tiniest scales.

There's little overlap and a lot of disagreement between the two systems. "Connecting the two different theories is a longstanding issue in physics, something people have been trying to do since the last century," Enrico Rinaldi, a physicist at the University of Michigan and RIKEN in Japan, and first author of the paper, said in a statement.

One hypothesis that might connect the two theories is an idea that the motions of particles in a two-dimensional plane above the black hole reflect the three-dimensional motions of the black hole, almost like a holographic projection. It's a concept called holographic duality.

That's the concept researchers are hoping to test. First, use a quantum computer to simulate particles that represent a projection of a black hole, then use machine learning to analyze how the particles interact with each other. The researchers hope the process will offer them insight into how both the particles and the black hole work.

"We hope that by understanding the properties of this particle theory through the numerical experiments, we understand something about gravity," Rinaldi said in the statement.

That's important, because holographic duality could go beyond three dimensions. Some scientists even suggest that the universe could be a projection of something with even more dimensions.

That sounds difficult, but this method could be the path to learning more.

Read more:
Is our universe a holographic projection? Scientists are using black holes and quantum computing to find out. - Space.com

Building The Bridge To The Quantum Future With Hybrid Systems – The Next Platform

While there is a lot of hype, there is no question that quantum computers are going to revolutionize computing. But we are still in the early stages of exploring quantum development, and truly useful quantum systems are still years away. That does not mean that quantum lacks opportunities, however, and companies such as Dell and quantum startup IonQ are exploring the possibilities of hybrid systems that combine classical computer systems with quantum hardware.

IBM currently holds the record for the world's largest superconducting quantum computer, with its Eagle processor, announced last November, packing in 127 quantum bits (qubits). But many experts believe that machines with many more qubits will be necessary in order to improve on the unreliability of current hardware.

"Superconducting gate speeds are very fast, but you're going to need potentially 10,000 or 100,000 or even a million physical qubits to represent one logical qubit to do the necessary error correction because of low quality," said Matt Keesan, IonQ's vice president for product development.

Keesan, speaking at an HPC community event hosted by Dell, said that today's quantum systems suffer greatly from noise, and so we are currently in the noisy intermediate-scale quantum (NISQ) computer era, unable yet to fully realize the power of quantum computers because of that need for many more qubits to run fully fault-tolerant quantum computers.

This NISQ era is projected to last for at least the next five years, until quantum systems have developed enough to be able to support qubits in the thousands.

In the meantime, researchers can still make advances by pairing current quantum systems with traditional classical computers, in a way that Keesan compares with adding a GPU to a server.

"It turns out the quantum computer by itself isn't enough," he declared. "Just like a GPU is more useful when paired with a regular CPU, the quantum processing unit, or QPU, is more useful today when paired with a classical computer."

Keesan cited some examples of problems that are amenable to this treatment. One, the Variational Quantum Eigensolver (VQE) algorithm, is used to estimate the ground state energy of small molecules. Here, the optimiser runs on a classical computer while the evaluation of that output happens in the quantum computer, and they work together back and forth iteratively.

Another, the quantum approximate optimisation algorithm (QAOA) can find approximate solutions to combinatorial optimization problems by pairing a classical pre-processor with a quantum computer. Quantum circuits can also be used as machine learning models, with the quantum circuit parameters being updated by the classical computer system and evaluated using quantum methods.

More explanation of this is available on IonQ's blog, but the trick with these hybrid applications apparently lies in finding the right control points that allow the quantum and classical portions of the algorithms to effectively interact. VQE does this by creating a single quantum circuit with certain parameterized components, then using the classical optimisation algorithm to vary these parameters until the desired outcome is reached.
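The back-and-forth loop described above can be sketched in a few lines. This is an illustrative Python sketch, not IonQ's or Dell's actual code: the quantum evaluation step is replaced by a stand-in classical function, `evaluate_energy`, and the classical side runs a simple finite-difference gradient descent over the single circuit parameter.

```python
import math

def evaluate_energy(theta):
    # Stand-in for the expectation value measured on the quantum computer;
    # in a real hybrid system this call would dispatch a parameterized
    # circuit to the QPU and return the measured energy.
    return 1.0 - math.cos(theta)   # toy objective with minimum 0.0 at theta = 0

def vqe_loop(theta, lr=0.1, steps=200, eps=1e-6):
    # Classical optimiser: estimate the gradient by finite differences on the
    # quantum-evaluated objective, then step, iterating back and forth.
    for _ in range(steps):
        grad = (evaluate_energy(theta + eps) - evaluate_energy(theta - eps)) / (2 * eps)
        theta -= lr * grad
    return theta, evaluate_energy(theta)

theta, energy = vqe_loop(2.0)
print(round(energy, 4))  # converges toward the ground-state value 0.0
```

In a production setting each `evaluate_energy` call is a round trip to quantum hardware, which is exactly why the article's point about co-locating the classical and quantum systems matters.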

But this iterative process could easily be very slow, such that a VQE run might take weeks to execute round-robin between a classical computer and a quantum computer, according to Keesan, unless the quantum and classical systems are somehow co-located. This is what Dell and IonQ have now demonstrated, with an IonQ quantum system integrated with a Dell server cluster in order to run a hybrid workload.

This integration is perhaps easier with IonQ's quantum systems because of the pathway it has taken to developing its quantum technology. Whereas some in the quantum industry use superconductivity and need the qubits to be encased in a bulky specialised refrigeration unit, IonQ's approach works at room temperature. It uses trapped ions for its qubits, suspended in a vacuum and manipulated using a laser beam, which enables it to be relatively compact.

"We have announced publicly, we're driving towards fully rack-mounted systems. And it's important to note that systems on the cloud today, at least in our case, are room temperature systems, where the isolation is happening in a vacuum chamber about the size of a deck of cards," Keesan explained.

Power requirements for IonQ's quantum processors are also claimed to be relatively low, with a total consumption in kilowatts. "So it's very conceivable to put it into a commercial datacentre, with room temperature technology like we're using now," Keesan added.

For organisations that might be wondering how to even get started in their quantum journey, Ken Durazzo, Dell's vice president of technology research and innovation, shared what the company had learned from its quantum exploration.

One of the key ways Dell found to get started with quantum is by using simulated quantum systems, which Durazzo refers to as virtual QPUs, or vQPUs, allowing developers and engineers to gain hands-on experience and become familiar with using quantum systems.

"Some of the key learnings that we identified there were, how do we skill or reskill or upskill people to quickly bridge the gap between the known and the unknown in terms of quantum? Quantum computation is dramatically different from classical computation, and getting people hands-on experience there is a bit of a hurdle. And that hands-on experimentation helps get people over the hurdle pretty quickly," Durazzo explained.

Also vital is identifying potential use cases, and Durazzo said that narrowing those down to smaller, action-oriented activities is key to really understanding where a user might find a benefit from quantum computation, and therefore where to place the biggest bets in terms of solving these types of issues.

Dell also decided that bringing into operation a hybrid classical-quantum system would best suit its purposes, one in which it would be possible to move workloads between virtual and physical QPUs to provide a simple path from experimentation to production.

"All of those learning activities enabled us to build a full-stack suite of things that provided us the tools that allowed us to be able to integrate seamlessly with that hybrid classical-quantum system," Durazzo said.

In Dell's view of a hybrid classical-quantum computer, the processing capabilities comprise both virtual QPU servers and real QPUs that deliver quantum processing capability. This arrangement provides the user with the ability to simulate or run experiments on the virtual QPUs, which will then allow them to identify where there may be opportunities or complex problems to be solved on the real QPU side.

"One area that we have focused on there is the ability to provide a seamless experience that allows you to develop an application inside of one framework, Qiskit for example, and run that in a virtual QPU or a real QPU just by modifying a flag, without having to modify the application, without having to change the parameters associated with the application," Durazzo explained.
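The "modify a flag" pattern Durazzo describes might look like the following sketch. The class and function names here are hypothetical illustrations, not Dell's or Qiskit's actual API; the point is simply that the same circuit submission is routed to a simulator or to hardware by one setting.

```python
# Hypothetical backends standing in for a virtual QPU (simulator) and a real QPU.
class SimulatorBackend:
    def run(self, circuit):
        return f"simulated result for {circuit}"

class HardwareBackend:
    def run(self, circuit):
        return f"hardware result for {circuit}"

def run_circuit(circuit, use_real_qpu=False):
    # The application code is identical either way; only the flag changes
    # which backend receives the circuit.
    backend = HardwareBackend() if use_real_qpu else SimulatorBackend()
    return backend.run(circuit)

print(run_circuit("bell_pair"))                     # virtual QPU path
print(run_circuit("bell_pair", use_real_qpu=True))  # real QPU path
```

Keeping the application unchanged across both paths is what gives the "simple path from experimentation to production" the article describes.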

Sonika Johri, IonQ's lead quantum applications researcher, gave a demonstration of a hybrid classical-quantum generative learning application. This was trained by sampling the output of a parametrized quantum circuit, which is run on a quantum computer, and updating the parameters using a classical optimisation technique. This was run on both a quantum simulator, a virtual QPU, and a real quantum computer.

That example application was run using just four qubits, and Johri disclosed that the simulator is actually faster than the quantum computer at that level.

"But when you go from 4 to 40 qubits, the amount of time and the amount of memory the simulator needs will increase exponentially with the number of qubits, but for the quantum computer, it is only going to increase linearly. So at four qubits the simulator is faster than the quantum computer, but if you scale up that same example to, say, 30 to 40 qubits, the quantum computer is going to be exponentially faster," she explained.
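Johri's scaling point can be checked with simple arithmetic: a statevector simulator stores 2**n complex amplitudes, each taking 16 bytes at double precision, so memory grows exponentially with qubit count.

```python
# Memory needed to hold a full statevector of n qubits:
# 2**n complex amplitudes at 16 bytes each (double-precision real + imaginary).
def statevector_bytes(n_qubits):
    return (2 ** n_qubits) * 16

print(statevector_bytes(4))   # 256 bytes: trivial for any machine
print(statevector_bytes(40))  # 17592186044416 bytes, roughly 17.6 TB
```

At 4 qubits the whole state fits in a cache line; at 40 qubits it exceeds the memory of a single server, which is the crossover the article is describing.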

Dell has also now begun to further adapt its hybrid classical-quantum computer by adding intelligent orchestration to automate some of the provisioning and management of the quantum hardware, and further optimize operations.

"We have taken that two steps further by adding machine learning into an intelligent orchestration function. And what the machine learning algorithms do is to identify the characteristics associated with the workload and then match the correct number of QPUs and the correct system, either virtual or real QPU, in order to get to the outcomes that you're looking to get to at a very specific point in time," Durazzo said.

Quantum computer hardware will continue to evolve, and may even pick up pace as interest in the field (and investment) grows, but Dell's Durazzo believes that the classical-quantum hybrid model it has developed is good for a few years yet.

"I think that diagram actually shows the future state for a very long time for quantum of a hybrid classical-quantum system, where the interactions are very tight, the interactions are very prescriptive in the world of quantum and classical for growth together into the future," he said. "As we further grow those numbers of qubits, the classical infrastructure necessary to support this quantum computation will grow as well. So, there should be a very large increase overall in the system as we start becoming more capable of solving more complex problems inside the quantum space."

Original post:
Building The Bridge To The Quantum Future With Hybrid Systems - The Next Platform