Archive for the ‘Quantum Computer’ Category

Y2K was a flop. But Q-Day could really screw us over – Sydney Morning Herald


See the article here:
Y2K was a flop. But Q-Day could really screw us over - Sydney Morning Herald

Tags:

The $1 Billion Bet on Quantum Computers That Process Light – DISCOVER Magazine

In the battle to build the world's first useful quantum computers, one company has taken an entirely different approach from the other frontrunners. The conventional approach is to gradually increase the size and power of these devices, testing as you go.

But PsiQuantum, a startup based in Palo Alto, California, is gambling on the opposite approach. The company is investing heavily in quantum technologies that are compatible with existing chip-making fabrication plants. By using these facilities, it aims to mass-produce powerful silicon-based quantum computers from the very beginning.

This week, they reveal how well this approach is going and discuss the challenges that still lie ahead.

Founded in 2016, PsiQuantum hit the headlines in 2021 when it raised $700 million to pursue its goal of building useful quantum computers within a decade. This week, it announced a similar injection from the Australian government, bringing its total funding to some $1.3 billion. That makes it one of the best-funded startups in history.

The excitement is largely because of PsiQuantum's unique approach. A key decision is its choice of quantum bits, or qubits. Other companies are focusing on superconducting qubits, ion traps, neutral atoms, quantum dots and so on.

PsiQuantum has opted to use photons. The advantage is that photons do not easily interact with the environment, so their quantum nature is relatively stable. That's important for computation.

Paradoxically, this reluctance to interact is also the main disadvantage of photons. It's hard to make them interact with each other in a way that processes information.

But various groups have demonstrated optical quantum computing and PsiQuantum was founded by researchers in this area from Imperial College London and the University of Bristol.

Optical quantum computing works by creating photons or pairs of them, guiding them through channels carved into silicon where they can interact and then measuring their properties with highly specialized detectors.
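The interference that lets photons effectively interact can be shown in a small numerical sketch. The following is illustrative only (assuming numpy, and not taken from PsiQuantum's paper): it simulates the textbook Hong-Ou-Mandel effect, in which two identical photons meeting at a 50:50 beam splitter always exit together through the same port.

```python
import numpy as np

# A minimal sketch (not PsiQuantum's design) of two-photon interference
# at a 50:50 beam splitter, the Hong-Ou-Mandel effect that underlies
# photon-photon "interaction" in linear-optical quantum computing.
dim = 3                                             # Fock space per mode: 0, 1 or 2 photons
a_dag = np.diag(np.sqrt(np.arange(1, dim)), -1)     # single-mode creation operator
I = np.eye(dim)

A = np.kron(a_dag, I)                               # creation operator, mode a
B = np.kron(I, a_dag)                               # creation operator, mode b

vac = np.zeros(dim * dim)
vac[0] = 1.0                                        # vacuum state |0,0>

# One photon enters each input port: |1,1>
state_in = A @ B @ vac

# The beam splitter maps a† -> (a† + b†)/√2 and b† -> (a† - b†)/√2
state_out = ((A + B) / np.sqrt(2)) @ ((A - B) / np.sqrt(2)) @ vac

def prob(n, m, state):
    """Probability of measuring n photons in mode a and m in mode b."""
    basis = np.zeros(dim * dim)
    basis[n * dim + m] = 1.0
    return abs(basis @ state) ** 2

print(prob(1, 1, state_in))                         # 1.0: one photon per mode going in
print(prob(1, 1, state_out))                        # ≈ 0: photons bunch, coincidences vanish
print(prob(2, 0, state_out), prob(0, 2, state_out)) # 0.5 and 0.5
```

The vanishing coincidence rate is a purely quantum signature, which is why the specialized single-photon detectors mentioned above matter so much.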

PsiQuantum intends to do all this with silicon wafers. The bold idea is that we already know how to make silicon chips on a vast scale by mass production. Chip fabrication plants cost billions to build, so there is a significant advantage in being able to use this existing technology.

And by making bigger, more densely packed chips, optical quantum computers can scale relatively easily, unlike other designs, for which scaling will be much harder.

So all the focus has been on how to make the manufacture of optical quantum computing chips compatible with conventional fabrication plants.

That's not as easy as it sounds. So this week's paper outlining the company's advances has been eagerly awaited.

The team has reached numerous goals. "We modified an established silicon photonics manufacturing flow to include high-performance single photon detection and photon pair generation," they say. "To our knowledge, this is the first realization of an integrated photonic technology platform capable of on-chip generation, manipulation, and detection of photonic qubits."

But there are significant steps ahead. PsiQuantum still needs to develop a variety of next-generation technologies to make large-scale photonic quantum computation feasible. "It will be necessary to further reduce silicon nitride materials and component losses, improve filter performance, and increase detector efficiency to push overall photon loss and fidelity," say the team.

For example, the on-chip photon detectors that are built into waveguides need to be able to count individual photons. The on-chip photon waveguides need to be lower loss. And perhaps the biggest challenge is in developing high speed optoelectronic switches that can rapidly reconfigure optical circuits.

PsiQuantum is making these switches out of barium titanate (BTO), a material that must be incorporated into the fabrication process. "We have developed a proprietary process for the growth of high-quality BTO films using molecular beam epitaxy, compatible with foundry processes," they say.

All that looks impressive, but the paper does not include a demonstration of quantum computing itself.

Perhaps it's too early to expect that. To be fair, basic quantum computation with photons has long been possible with these kinds of systems at a small scale.

"The singular intent of our development is a useful fault-tolerant quantum computer," they say. PsiQuantum has also said elsewhere that its goal is to achieve this by 2029.

Of course, it faces stiff competition from other manufacturers of quantum computers. It'll be an exciting race, and the (quantum) clock is ticking.

Ref: A manufacturable platform for photonic quantum computing: arxiv.org/abs/2404.17570

Link:
The $1 Billion Bet on Quantum Computers That Process Light - DISCOVER Magazine

Tags:

Enhancing Quantum Error Correction Effectiveness – AZoQuantum

Apr 30, 2024. Reviewed by Lexie Corner

In a study published in the journal Nature Physics, a team of scientists led by researchers from the University of Chicago's Pritzker School of Molecular Engineering (PME) created the blueprint for a quantum computer that can fix errors more efficiently.

Although quantum computers are an extremely potent computational tool, their delicate qubits challenge engineers: how can they design useful, functional quantum systems using bits that are easily disrupted and erased of data by minute changes in their environment?

Engineers have long grappled with how to make quantum computers less error-prone, frequently creating methods to identify and rectify problems rather than preventing them in the first place. However, many of these error-correction systems entail replicating information over hundreds or thousands of physical qubits simultaneously, making it difficult to scale up efficiently.

The system relies on two ingredients: a new framework based on quantum low-density parity-check (qLDPC) codes, which detect errors by examining the relationships between bits, and reconfigurable atom array hardware, which lets qubits communicate with more neighbors and therefore allows the qLDPC data to be encoded in fewer qubits.

With this proposed blueprint, we have reduced the overhead required for quantum error correction, which opens new avenues for scaling up quantum computers.

Liang Jiang, Study Senior Author and Professor, Pritzker School of Molecular Engineering, University of Chicago

While standard computers rely on digital bits, in an on or off position, to encode data, qubits can exist in states of superposition, giving them the ability to tackle new computational problems. However, qubits' unique properties also make them incredibly sensitive to their environment; they change states based on the surrounding temperature and electromagnetism.

Quantum systems are intrinsically noisy. There's really no way to build a quantum machine that won't have error. You need to have a way of doing active error correction if you want to scale up your quantum system and make it useful for practical tasks.

Qian Xu, Graduate Student, Pritzker School of Molecular Engineering, University of Chicago

For the past few decades, scientists have primarily relied on one type of error correction, known as surface codes, for quantum systems. In these systems, users encode the same logical information into several physical bits grouped in a wide two-dimensional grid. Errors can be detected by comparing qubits to their immediate neighbors. A mismatch indicates that one qubit misfired.
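The neighbor-comparison idea can be sketched with the simplest possible toy example, a classical 3-bit repetition code. (Real surface codes do this in two dimensions with quantum stabilizer measurements, which this illustration does not attempt to capture.)

```python
# Toy illustration of detecting an error by comparing neighbors,
# using a classical 3-bit repetition code: a much-simplified stand-in
# for a surface code, which does the analogous thing in a 2D qubit grid.

def encode(logical_bit):
    """Copy one logical bit into three physical bits (the redundancy)."""
    return [logical_bit] * 3

def syndrome(bits):
    """Compare each pair of neighboring bits; 1 means 'mismatch'."""
    return [bits[0] ^ bits[1], bits[1] ^ bits[2]]

def correct(bits):
    """Majority vote recovers the logical bit after a single flip."""
    return max(set(bits), key=bits.count)

code = encode(1)        # three physical copies of logical 1
code[1] ^= 1            # a stray error flips the middle bit
print(syndrome(code))   # [1, 1]: both neighbors disagree with bit 1
print(correct(code))    # 1: majority vote restores the logical value
```

The cost is exactly the overhead Xu describes below: three physical bits store one logical bit, and real surface codes push that ratio far higher.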

Xu added, "The problem with this is that you need a huge resource overhead. In some of these systems, you need one thousand physical qubits for every logical qubit, so in the long run, we don't think we can scale this up to very large computers."

Jiang, Xu, and colleagues from Harvard University, Caltech, the University of Arizona, and QuEra Computing designed a novel method to fix errors using qLDPC codes. This type of error correction had long been contemplated but not included in a realistic plan.

With qLDPC codes, data in qubits is compared to both direct neighbors and more distant qubits. It enables a smaller grid of qubits to do the same number of comparisons for error correction. However, long-distance communication between qubits has always been a challenge when implementing qLDPC.
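As a loose classical analogy (real qLDPC codes use quantum stabilizers, not the classical code shown here), the same check-beyond-your-neighbors machinery appears in the classical [7,4] Hamming code: each parity check spans non-adjacent bits, and a handful of checks pinpoints the error.

```python
import numpy as np

# Classical analogy for the qLDPC idea: a few sparse parity checks,
# each touching bits that are NOT all adjacent, locate an error.
# H is the parity-check matrix of the classical [7,4] Hamming code;
# column i holds the binary representation of i + 1.
H = np.array([
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
])

codeword = np.zeros(7, dtype=int)   # the all-zeros codeword is valid
codeword[4] ^= 1                    # noise flips bit index 4

syndrome = H @ codeword % 2         # run all three checks at once
# The syndrome, read as a binary number, names the flipped position.
position = syndrome[0] * 1 + syndrome[1] * 2 + syndrome[2] * 4
print(syndrome, "-> flipped bit index", position - 1)   # index 4
```

Note that the third check compares bits 3, 4, 5 and 6, which are far apart in the row: this is the long-distance communication between qubits that the reconfigurable atom hardware described next is designed to provide.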

The researchers devised a solution in the form of new hardware: reconfigurable atoms that can be relocated using lasers to enable qubits to communicate with new partners.

With today's reconfigurable atom array systems, we can control and manipulate more than a thousand physical qubits with high fidelity and connect qubits separated by a large distance. By matching the structure of quantum codes and these hardware capabilities, we can implement these more advanced qLDPC codes with only a few control lines, putting the realization of them within reach with today's experimental systems.

Harry Zhou, Ph.D. Student, Harvard University

When researchers paired qLDPC codes with reconfigurable neutral-atom arrays, they achieved a lower error rate than surface codes using only a few hundred physical qubits. When scaled up, quantum algorithms requiring thousands of logical qubits might be completed with fewer than 100,000 physical qubits, vastly outperforming the gold-standard surface codes.

"There's still redundancy in terms of encoding the data in multiple physical qubits, but the idea is that we have reduced that redundancy by a lot," Xu added.

Though scientists are developing atom-array platforms quickly, the framework is still theoretical and represents a step toward the real-world use of error-corrected quantum computation. The PME team is now striving to improve its design even more and ensure that reconfigurable atom arrays and logical qubits relying on qLDPC codes can be employed in computation.

Xu concluded, "We think in the long run, this will allow us to build very large quantum computers with lower error rates."

Xu, Q., et al. (2024) Constant-overhead fault-tolerant quantum computation with reconfigurable atom arrays. Nature Physics. doi:10.1038/s41567-024-02479-z

Source: https://www.uchicago.edu/en

The rest is here:
Enhancing Quantum Error Correction Effectiveness - AZoQuantum

Tags:

Unveiling the Universe’s Secrets: A Quantum Leap With AI at CERN – Indiana Daily Student


For centuries, scientists have been on a thrilling quest to understand the universe's building blocks. At CERN, the European Organization for Nuclear Research, the Large Hadron Collider (LHC) smashes particles together at near-light speed. But deciphering the mysteries hidden within these tiny collisions creates massive data hurdles. Here's where a revolutionary tool emerges: Quantum Artificial Intelligence (Quantum AI). This powerful new approach promises to transform our understanding of the universe by joining forces with CERN in groundbreaking ways.

A Game Changer for Untangling the Universe's Code

Quantum AI leverages the mind-bending properties of quantum mechanics through quantum computing. Unlike regular bits (0 or 1), quantum bits (qubits) can be in a state called superposition, representing multiple values simultaneously. This "parallel processing" superpower allows Quantum AI to tackle problems that would take classical computers an eternity, especially when simulating complex quantum systems like those probed at the LHC.
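A small illustrative sketch (assuming numpy, not CERN code) of what superposition means in practice: simulating n qubits classically requires tracking 2^n amplitudes, and one Hadamard gate per qubit spreads the state across every basis value at once.

```python
import numpy as np

# Sketch of why superposition is powerful and hard to simulate: n qubits
# need 2**n complex amplitudes. One Hadamard gate per qubit puts the
# register into an equal superposition over all 2**n classical values.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

n = 3
state = np.zeros(2 ** n)
state[0] = 1.0                                 # start in |000>

gate = H
for _ in range(n - 1):                         # build H ⊗ H ⊗ H
    gate = np.kron(gate, H)
state = gate @ state

probs = state ** 2                             # measurement probabilities
print(len(state), probs)   # 8 amplitudes, each outcome has probability 1/8
```

The exponential growth of that state vector is precisely why classical simulation of systems like those probed at the LHC becomes intractable, and why quantum hardware is attractive for the job.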

The parallels between Quantum AI and CERN's cutting-edge research are striking. Both delve into the strange world of the subatomic, seeking answers to fundamental questions. The LHC recreates the conditions of the early universe to test the Standard Model, the current framework explaining fundamental particles and their interactions. However, the Standard Model has limitations. It can't explain gravity or dark matter and dark energy, which make up most of the universe. Quantum AI offers the potential to crack these mysteries wide open.

Unlocking New Frontiers: Challenges and Opportunities

This convergence of Quantum AI and particle physics presents both challenges and exciting opportunities. One hurdle is developing efficient algorithms specifically designed for quantum computers. Existing classical algorithms might not translate smoothly. Additionally, the nascent state of quantum hardware requires physicists, computer scientists, and AI experts to work together to bridge the gap between theoretical potential and practical application.

However, the potential rewards are equally transformative. Quantum AI can analyze the LHC's massive datasets with unprecedented speed and accuracy. It can also simulate complex particle interactions that would baffle classical computers, potentially leading to the discovery of new particles or forces not yet predicted by the Standard Model. This could revolutionize our understanding of the universe's origin and evolution.

Projects at the Forefront: Quantum AI Meets CERN

Several pioneering projects demonstrate the immense potential of Quantum AI at CERN. A leading example is the 'Quantum Machine Learning for Physics Discovery' project. Here, scientists explore using quantum machine learning algorithms to identify patterns in LHC data that classical algorithms might miss. This could lead to the detection of subtle anomalies hinting at new physics beyond the Standard Model.

Another exciting project focuses on simulating quantum chromodynamics (QCD), the theory describing how quarks and gluons interact to form protons, neutrons, and other hadrons. Accurately simulating QCD is computationally expensive, but Quantum AI could significantly speed these simulations up, providing deeper insights into the strong nuclear force holding these particles together.

Hunting the Elusive Neutrinos

While Quantum AI offers a glimpse into the future of particle physics, ongoing experiments like FASER (Forward Search Experiment) at CERN highlight the power of existing technologies. Operational since 2022, FASER is specifically designed to study weakly interacting particles, particularly neutrinos. These elusive particles are crucial for understanding fundamental forces and the imbalance of matter and antimatter in the universe but are nearly impossible to detect directly.

FASER's ingenious positioning in a side tunnel of the LHC allows it to capture these weakly interacting particles that escape the main detectors. Here, Quantum AI can play a crucial role in analyzing the vast amount of data collected by FASER, identifying patterns and anomalies that could reveal the properties of these elusive particles.

The recent measurement of neutrino interaction strength (cross-section) by FASER is a testament to its capabilities. This groundbreaking achievement, the first of its kind at a particle collider, provides valuable insights into neutrino behavior and paves the way for future discoveries involving these mysterious particles.

A Bridge to Fintech: The Evolving Landscape of Financial Markets

The world of finance is another domain undergoing a transformation fueled by advancements in artificial intelligence. While companies like "Quantum AI" (quantumai.co) focus on applying these advancements to trading, the broader concept of leveraging AI for market analysis holds significant potential.

As with any emerging technology, there will be a period of refinement and optimization before its full potential is realized. However, the potential benefits of AI-powered market analysis, including identifying patterns and trends that might elude human traders, are undeniable. While the company's core research focuses on quantum technologies, its initial exploration led to the development of a trading bot. This exemplifies how advancements in quantum-powered AI could potentially influence various industries, including finance.

It's important to acknowledge, however, that the field of quantum computing is still in its early stages. While such advancements offer exciting possibilities, further research and development are needed before widespread practical applications become a reality.

Looking Ahead: A Symbiotic Future

The synergy between Quantum AI and CERN represents a significant leap forward in scientific exploration. As Quantum AI and quantum computing continue to evolve, we can expect even more remarkable breakthroughs at the intersection of artificial intelligence and particle physics.

This exciting frontier holds the potential to rewrite our scientific textbooks, from unraveling the nature of dark matter to understanding the origin of the universe itself. The combined power of cutting-edge experiments like FASER and the analytical prowess of Quantum AI will be instrumental in unlocking the universe's deepest secrets, propelling humanity further into the unknown.

Read the original:
Unveiling the Universe's Secrets: A Quantum Leap With AI at CERN - Indiana Daily Student

Tags:

Australia to fund $620M quantum computer claimed to be first at ‘utility-scale’ – The Register


Read the original:
Australia to fund $620M quantum computer claimed to be first at 'utility-scale' - The Register

Tags: