Archive for the ‘Quantum Computer’ Category

The $1 Billion Bet on Quantum Computers That Process Light – DISCOVER Magazine

In the battle to build the world's first useful quantum computers, one company has taken an entirely different approach from the other frontrunners. The conventional approach is to gradually increase the size and power of these devices, testing as you go.

But PsiQuantum, a startup based in Palo Alto, California, is gambling on the opposite approach. The company is investing heavily in quantum technologies that are compatible with chip-making fabrication plants that already exist. By using these facilities, their goal is to mass-produce powerful silicon-based quantum computers from the very beginning.

This week, they reveal how well this approach is going and discuss the challenges that still lie ahead.

Founded in 2016, PsiQuantum hit the headlines in 2021 when it raised $700 million to pursue its goal of building useful quantum computers within a decade. This week, it announced a similar injection from the Australian government, bringing its total funding to some $1.3 billion. That makes it one of the best-funded startups in history.

The excitement is largely because of PsiQuantum's unique approach. A key decision is its choice of quantum bits, or qubits. Other companies are focusing on superconducting qubits, ion traps, neutral atoms, quantum dots and so on.

PsiQuantum has opted to use photons. The advantage is that photons do not easily interact with the environment, so their quantum nature is relatively stable. That's important for computation.

Paradoxically, this reluctance to interact is also the main disadvantage of photons. It's hard to make them interact with each other in a way that processes information.

But various groups have demonstrated optical quantum computing, and PsiQuantum was founded by researchers in this area from Imperial College London and the University of Bristol.

Optical quantum computing works by creating single photons or photon pairs, guiding them through channels carved into silicon where they can interact, and then measuring their properties with highly specialized detectors.

PsiQuantum intends to do all this with silicon wafers. The bold idea is that we already know how to mass-produce silicon chips on a vast scale. Chip fabrication plants cost billions to build, so there is a significant advantage in being able to use this existing technology.

And by making bigger, more densely packed chips, optical quantum computers can scale relatively easily, unlike other designs where scaling will be much harder.

So all the focus has been on how to make the manufacture of optical quantum computing chips compatible with conventional fabrication plants.

That's not as easy as it sounds, so this week's paper outlining the company's advances has been eagerly awaited.

The team has reached numerous goals. "We modified an established silicon photonics manufacturing flow to include high-performance single photon detection and photon pair generation," they say. "To our knowledge, this is the first realization of an integrated photonic technology platform capable of on-chip generation, manipulation, and detection of photonic qubits."

But there are significant steps ahead. PsiQuantum still needs to develop a variety of next-generation technologies to make large-scale photonic quantum computation feasible. "It will be necessary to further reduce silicon nitride material and component losses, improve filter performance, and increase detector efficiency to push overall photon loss and fidelity," says the team.

For example, the on-chip photon detectors built into waveguides need to be able to count individual photons. The on-chip waveguides need to be lower loss. And perhaps the biggest challenge is developing high-speed optoelectronic switches that can rapidly reconfigure optical circuits.

PsiQuantum is making these switches out of barium titanate (BTO), a material that must be incorporated into the fabrication process. "We have developed a proprietary process for the growth of high-quality BTO films using molecular beam epitaxy, compatible with foundry processes," they say.

All that looks impressive, but the paper does not include a demonstration of quantum computing itself.

Perhaps it's too early to expect that. To be fair, basic quantum computation with photons has long been possible with these kinds of systems at a small scale.

"The singular intent of our development is a useful fault-tolerant quantum computer," they say. PsiQuantum has also said elsewhere that its goal is to achieve this by 2029.

Of course, it faces stiff competition from other manufacturers of quantum computers. It'll be an exciting race, and the (quantum) clock is ticking.

Ref: A manufacturable platform for photonic quantum computing: arxiv.org/abs/2404.17570

Link:
The $1 Billion Bet on Quantum Computers That Process Light - DISCOVER Magazine

Enhancing Quantum Error Correction Effectiveness – AZoQuantum

Apr 30, 2024. Reviewed by Lexie Corner

In a study published in the journal Nature Physics, a team of scientists led by researchers from the University of Chicago's Pritzker School of Molecular Engineering (PME) created the blueprint for a quantum computer that can fix errors more efficiently.

Although quantum computers are an extremely potent computational tool, their delicate qubits challenge engineers: how can they design useful, functional quantum systems using bits that are easily disrupted and erased of data by minute changes in their environment?

Engineers have long grappled with how to make quantum computers less error-prone, frequently creating methods to identify and rectify problems rather than preventing them in the first place. However, many of these error-correction systems entail replicating information over hundreds or thousands of physical qubits simultaneously, making it difficult to scale up efficiently.

The system makes use of a new framework based on quantum low-density parity-check (qLDPC) codes, which can detect errors by examining the relationships between bits, together with reconfigurable atom array hardware, which enables qubits to communicate with more neighbors and consequently allows the qLDPC data to be encoded in fewer qubits.

With this proposed blueprint, we have reduced the overhead required for quantum error correction, which opens new avenues for scaling up quantum computers.

Liang Jiang, Study Senior Author and Professor, Pritzker School of Molecular Engineering, University of Chicago

While standard computers rely on digital bits, in an on or off position, to encode data, qubits can exist in states of superposition, giving them the ability to tackle new computational problems. However, qubits' unique properties also make them incredibly sensitive to their environment; they change states based on the surrounding temperature and electromagnetism.

Quantum systems are intrinsically noisy. There's really no way to build a quantum machine that won't have error. You need to have a way of doing active error correction if you want to scale up your quantum system and make it useful for practical tasks.

Qian Xu, Graduate Student, Pritzker School of Molecular Engineering, University of Chicago

For the past few decades, scientists have primarily relied on one type of error correction, known as surface codes, for quantum systems. In these systems, users encode the same logical information into several physical bits grouped in a wide two-dimensional grid. Errors can be detected by comparing qubits to their immediate neighbors. A mismatch indicates that one qubit misfired.
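As a loose classical analogy (a simple repetition code, not the PME team's actual quantum construction), the neighbor-comparison idea can be sketched in a few lines of Python:

```python
# Toy analogy of neighbor-based error detection: one logical bit is
# copied into n physical bits, and each "check" compares two adjacent
# bits. A mismatch (parity 1) flags where a single bit flipped.

def syndrome(bits):
    """Parity of each adjacent pair; a 1 marks a neighbor mismatch."""
    return [bits[i] ^ bits[i + 1] for i in range(len(bits) - 1)]

def locate_single_flip(bits):
    """Index of a single flipped bit, or None if all checks pass."""
    s = syndrome(bits)
    if not any(s):
        return None
    first = s.index(1)
    # An interior flip at position i trips checks i-1 and i; a flip
    # at either end trips only the first or last check.
    if first + 1 < len(s) and s[first + 1] == 1:
        return first + 1
    return 0 if first == 0 else len(bits) - 1

codeword = [0, 0, 0, 0, 0]   # five physical copies of logical 0
codeword[2] ^= 1             # a stray environmental flip
```

Here `locate_single_flip(codeword)` returns 2, pinpointing the flip; the price is exactly the redundancy the article describes, since five physical bits carry one logical bit.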

Xu added, "The problem with this is that you need a huge resource overhead. In some of these systems, you need one thousand physical qubits for every logical qubit, so in the long run, we don't think we can scale this up to very large computers."

Jiang, Xu, and colleagues from Harvard University, Caltech, the University of Arizona, and QuEra Computing designed a novel method to fix errors using qLDPC codes. This type of error correction had long been contemplated but not included in a realistic plan.

With qLDPC codes, data in qubits is compared to both direct neighbors and more distant qubits. It enables a smaller grid of qubits to do the same number of comparisons for error correction. However, long-distance communication between qubits has always been a challenge when implementing qLDPC.
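A toy parity-check matrix (illustrative only, not the code constructed in the paper) shows what "low density" and long-range comparison mean: each row is one check, each column one qubit, and a 1 means that check includes that qubit.

```python
# Toy low-density parity-check matrix (not the paper's actual code).
# Each row is sparse ("low density"), and check 0 compares qubits
# 0 and 1 with the *distant* qubit 6 -- the kind of long-range
# comparison that moving atoms with lasers makes practical.
H = [
    [1, 1, 0, 0, 0, 0, 1, 0],
    [0, 1, 1, 0, 0, 0, 0, 1],
    [0, 0, 0, 1, 1, 0, 1, 0],
    [1, 0, 0, 0, 0, 1, 0, 1],
]

def syndrome(H, bits):
    """Each check's parity over the qubits it touches; 1 flags an error."""
    return [sum(h * b for h, b in zip(row, bits)) % 2 for row in H]

flipped = [0] * 8
flipped[6] = 1  # a single qubit error
# Checks 0 and 2 both include qubit 6, so both fire, localizing the
# error without needing a dense 2D grid of nearest neighbors.
```

For the flipped qubit, `syndrome(H, flipped)` gives `[1, 0, 1, 0]`: the pattern of fired checks localizes the error using fewer qubits per check than a surface-code grid.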

The researchers devised a solution in the form of new hardware: reconfigurable atoms that can be relocated using lasers to enable qubits to communicate with new partners.

With today's reconfigurable atom array systems, we can control and manipulate more than a thousand physical qubits with high fidelity and connect qubits separated by a large distance. By matching the structure of quantum codes and these hardware capabilities, we can implement these more advanced qLDPC codes with only a few control lines, putting the realization of them within reach with today's experimental systems.

Harry Zhou, Ph.D. Student, Harvard University

When researchers paired qLDPC codes with reconfigurable neutral-atom arrays, they achieved a lower error rate than surface codes using only a few hundred physical qubits. When scaled up, quantum algorithms requiring thousands of logical qubits might be completed with fewer than 100,000 physical qubits, vastly outperforming the gold-standard surface codes.

"There's still redundancy in terms of encoding the data in multiple physical qubits, but the idea is that we have reduced that redundancy by a lot," Xu added.

Though scientists are developing atom-array platforms quickly, the framework is still theoretical and represents a step toward the real-world use of error-corrected quantum computation. The PME team is now striving to improve its design even more and ensure that reconfigurable atom arrays and logical qubits relying on qLDPC codes can be employed in computation.

Xu concluded, "We think in the long run, this will allow us to build very large quantum computers with lower error rates."

Xu, Q., et al. (2024) Constant-overhead fault-tolerant quantum computation with reconfigurable atom arrays. Nature Physics. doi:10.1038/s41567-024-02479-z

Source: https://www.uchicago.edu/en

The rest is here:
Enhancing Quantum Error Correction Effectiveness - AZoQuantum

Unveiling the Universe’s Secrets: A Quantum Leap With AI at CERN – Indiana Daily Student

Photo By Pixabay

For centuries, scientists have been on a thrilling quest to understand the universe's building blocks. At CERN, the European Organization for Nuclear Research, the Large Hadron Collider (LHC) smashes particles together at near-light speed. But deciphering the mysteries hidden within these tiny collisions creates massive data hurdles. Here's where a revolutionary tool emerges: Quantum Artificial Intelligence (Quantum AI). This powerful new approach promises to transform our understanding of the universe by joining forces with CERN in groundbreaking ways.

A Game Changer for Untangling the Universe's Code

Quantum AI leverages the mind-bending properties of quantum mechanics through quantum computing. Unlike regular bits (0 or 1), quantum bits (qubits) can be in a state called superposition, representing multiple values simultaneously. This "parallel processing" superpower allows Quantum AI to tackle problems that would take classical computers an eternity, especially when simulating complex quantum systems like those probed at the LHC.

The parallels between Quantum AI and CERN's cutting-edge research are striking. Both delve into the strange world of the subatomic, seeking answers to fundamental questions. The LHC recreates the conditions of the early universe to test the Standard Model, the current framework explaining fundamental particles and their interactions. However, the Standard Model has limitations. It can't explain gravity or dark matter and dark energy, which make up most of the universe. Quantum AI offers the potential to crack these mysteries wide open.

Unlocking New Frontiers: Challenges and Opportunities

This convergence of Quantum AI and particle physics presents both challenges and exciting opportunities. One hurdle is developing efficient algorithms specifically designed for quantum computers. Existing classical algorithms might not translate smoothly. Additionally, the nascent state of quantum hardware requires physicists, computer scientists, and AI experts to work together to bridge the gap between theoretical potential and practical application.

However, the potential rewards are equally transformative. Quantum AI can analyze the LHC's massive datasets with unprecedented speed and accuracy. It can also simulate complex particle interactions that would baffle classical computers, potentially leading to the discovery of new particles or forces not yet predicted by the Standard Model. This could revolutionize our understanding of the universe's origin and evolution.

Projects at the Forefront: Quantum AI Meets CERN

Several pioneering projects demonstrate the immense potential of Quantum AI at CERN. A leading example is the 'Quantum Machine Learning for Physics Discovery' project. Here, scientists explore using quantum machine learning algorithms to identify patterns in LHC data that classical algorithms might miss. This could lead to the detection of subtle anomalies hinting at new physics beyond the Standard Model.

Another exciting project focuses on simulating quantum chromodynamics (QCD), the theory describing how quarks and gluons interact to form protons, neutrons, and other hadrons. Accurately simulating QCD is computationally expensive, but Quantum AI could significantly speed these simulations up, providing deeper insights into the strong nuclear force holding these particles together.

Hunting the Elusive Neutrinos

While Quantum AI offers a glimpse into the future of particle physics, ongoing experiments like FASER (Forward Search Experiment) at CERN highlight the power of existing technologies. Operational since 2022, FASER is specifically designed to study weakly interacting particles, particularly neutrinos. These elusive particles are crucial for understanding fundamental forces and the imbalance of matter and antimatter in the universe but are nearly impossible to detect directly.

FASER's ingenious positioning in a side tunnel of the LHC allows it to capture these weakly interacting particles that escape the main detectors. Here, Quantum AI can play a crucial role in analyzing the vast amount of data collected by FASER, identifying patterns and anomalies that could reveal the properties of these elusive particles.

The recent measurement of neutrino interaction strength (cross-section) by FASER is a testament to its capabilities. This groundbreaking achievement, the first of its kind at a particle collider, provides valuable insights into neutrino behavior and paves the way for future discoveries involving these mysterious particles.

A Bridge to Fintech: The Evolving Landscape of Financial Markets

The world of finance is another domain undergoing a transformation fueled by advancements in artificial intelligence. While companies like "Quantum AI" (quantumai.co) focus on applying these advancements to trading, the broader concept of leveraging AI for market analysis holds significant potential.

As with any emerging technology, there will be a period of refinement and optimization before its full potential is realized. However, the potential benefits of AI-powered market analysis, including identifying patterns and trends that might elude human traders, are undeniable. While the company's core research focuses on quantum technologies, its initial exploration led to the development of a successful trading bot. This exemplifies how advancements in quantum-powered AI can potentially revolutionize various industries, including finance.

It's important to acknowledge, however, that the field of quantum computing is still in its early stages. While such advancements offer exciting possibilities, further research and development are needed before widespread practical applications become a reality.

Looking Ahead: A Symbiotic Future

The synergy between Quantum AI and CERN represents a significant leap forward in scientific exploration. As Quantum AI and quantum computing continue to evolve, we can expect even more remarkable breakthroughs at the intersection of artificial intelligence and particle physics.

This exciting frontier holds the potential to rewrite our scientific textbooks, from unraveling the nature of dark matter to understanding the origin of the universe itself. The combined power of cutting-edge experiments like FASER and the analytical prowess of Quantum AI will be instrumental in unlocking the universe's deepest secrets, propelling humanity further into the unknown.

Read the original:
Unveiling the Universe's Secrets: A Quantum Leap With AI at CERN - Indiana Daily Student

Quantum-proofing passwords and artwork with DNA encryption – Advanced Science News

Chaotic pools of DNA could be the future of encryption, proving authenticity of artwork or securing passwords against quantum computers.

Engineers have harnessed the chaotic patterns of random DNA sequences to create a unique authentication system for securing artwork and defending passwords from the looming threat of quantum computing.

Conventional encryption techniques use math and algorithms to create what are called one-way functions. As the name implies, an input fed into the function leads to a specific output, but due to the nature of the function, the equation is difficult to compute in reverse, creating a secure barrier against unauthorized access.
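A standard illustration of such a mathematical one-way function (a generic example, not the specific encryption scheme the researchers are targeting) is a cryptographic hash, where the forward direction is cheap but inversion is infeasible:

```python
import hashlib

def one_way(secret: str) -> str:
    """Forward direction: computing the output from the input is cheap."""
    return hashlib.sha256(secret.encode("utf-8")).hexdigest()

digest = one_way("correct horse battery staple")
# Recovering the secret from `digest` requires, in effect, guessing
# inputs until one matches -- the "difficult to compute in reverse"
# property described above.
```

The concern in the article is that quantum algorithms threaten the mathematical hardness assumptions behind such constructions, whereas a physical one-way function has no equation to attack.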

Researchers are concerned that quantum computing has the power to unravel these complex functions, leaving current encryption vulnerable. Robert Grass, a chemical engineer at ETH Zurich, believes one-way functions based on physical things, like DNA, rather than theoretical mathematics are the solution.

"Our system is based on true randomness," said Grass in a press release. "The input and output values are physically linked, and it's only possible to get from the input value to the output value, not the other way round."

"Since it's a physical system and not a digital one, it can't be decoded by an algorithm, not even by one that runs on a quantum computer," added Anne Lüscher, a doctoral student in Grass's group.

Using DNA as a physical one-way function works because of its ability to store vast amounts of information as sequences of base pairs. Unlike binary code, where a position in a sequence can be a one or a zero, DNA uses four bases to build sequences, increasing the amount of data DNA can hold.

Accessing the data is possible because each base binds to only one other. Short stretches of DNA, called primers, are used to bind to a complementary piece of DNA and initiate sequencing of the complementary strand. However, using this as an encryption method requires a little chaos.

Synthesizing DNA in a lab is now quite easy, and pools of millions of random DNA sequences can be built for roughly one US dollar. The immense number of combinations possible with four bases creates a molecular chaos whereby no two randomly generated DNA pools can be the same and are impossible to simulate even with the most powerful computers.

Therefore, a given set of primers used to sequence stretches of the pool will reveal a totally unique and unpredictable output. As a one-way function, the primers act as the input and the random sequences as the output.

Because DNA is so small, these pools can easily be added into paint or sprayed onto a small section of an object. The same primers can then be used to sequence the pool on the object and the master pool, and verify that the same sequence is returned.
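That verification flow can be sketched as a toy model (every name and parameter below is hypothetical, and string prefix matching stands in for the binding chemistry; real pools hold millions of chemically synthesized strands read by a sequencer):

```python
import random

BASES = "ACGT"  # four bases, so each position carries 2 bits, not 1

def random_pool(n_sequences, length, seed):
    """Hypothetical stand-in for a synthesized pool of random DNA strands."""
    rng = random.Random(seed)
    return ["".join(rng.choice(BASES) for _ in range(length))
            for _ in range(n_sequences)]

def read_with_primer(pool, primer):
    """Toy 'sequencing': return the strands that begin with the primer.
    In the real system the primer binds its complement and initiates
    sequencing; prefix matching stands in for that chemistry."""
    return sorted(s for s in pool if s.startswith(primer))

master = random_pool(5000, 20, seed=42)    # retained by the issuer
on_object = list(master)                   # same pool, applied to the artwork
forgery = random_pool(5000, 20, seed=99)   # an independently made random pool

primer = "ACG"  # the input of the one-way function
is_authentic = read_with_primer(on_object, primer) == read_with_primer(master, primer)
is_forgery_caught = read_with_primer(forgery, primer) != read_with_primer(master, primer)
```

Because two independently generated pools essentially never contain the same strands, the forged pool returns a different output for the same primer and fails the check.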

While synthesizing the DNA pools is cheap and easy, the current limitation for this physical encryption method is the DNA sequencing, which is costlier and requires specialized labs.

"To make it practical we need more consumer-friendly sequencing technologies," said Grass in an interview with Advanced Science News. "But that's all things that are being developed for many other applications at the moment, so I'm not so afraid of that."

The broader public won't be using DNA passwords anytime soon, but the first applications could be protecting against forgery or securing the trackability of sensitive supply chains, like medicines. According to Grass, embracing the chaos of random DNA sequences is both unusual and exciting.

"Chemists hate [disorder]," he said. "Here, we are really building on that; we are going all in and saying we want as many side products as possible because we are going to work with the randomness it offers."

Reference: Anne M. Luescher, et al. Chemical unclonable functions based on operable random DNA pools, Nature Communications (2024). DOI: 10.1038/s41467-024-47187-7

Feature image: DNA can be used to confirm the authenticity of valuable art prints. Credit: AI-generated image, ETH Zurich

Go here to read the rest:
Quantum-proofing passwords and artwork with DNA encryption - Advanced Science News

ATSE Welcomes Large Quantum of Technology Investment for Queensland – AZoQuantum

The Australian Academy of Technological Sciences and Engineering (ATSE) welcomes the announcement of a new, almost $1 billion quantum computing investment in Queensland, announced today by Prime Minister Anthony Albanese and Minister for Industry and Science Ed Husic along with Queensland's Premier Steven Miles. The investment is equally co-funded by the Federal and State Governments.

This funding will enable technology start-up company PsiQuantum to establish its Asia-Pacific headquarters in Brisbane and build the world's first fault-tolerant quantum computer, advancing the local quantum industry, creating 400 jobs and supporting PhD positions. Fault tolerance is the next step in the development of useful, practical quantum computers, heralding the arrival of new computing capabilities in Australia in coming years.

As stated in ATSE's submission to the National Quantum Strategy, growing the Australian quantum industry requires supporting four interrelated areas: basic research, infrastructure, talent and business activity.

ATSE CEO Kylie Walker said today's landmark announcement will supercharge these areas and enable Australia to build on its early quantum computing success.

ATSE's response to the National Quantum Strategy called out the then-unmet need for the Australian Government to back the strategy with public funding. Today's investment in PsiQuantum, and the research, technology and manufacturing industry that will grow around it, will enable the Australian quantum industry to become an international leader.

"We applaud the Australian Government and Queensland Government for responding to our calls for large-scale quantum investment through the National Quantum Strategy consultation process, and for investing in building a technology-forward Australian industry," said Kylie Walker.

Today's announcement follows the Australian Government's announcement of $18.4 million for the University of Sydney to establish Quantum Australia to foster critical collaborations and encourage the creation and growth of quantum startups.

Fellows of the Academy such as Professor Michelle Simmons AO FTSE FAA FRS, Professor Andre Luiten FTSE and Professor Elanor Huntington FTSE are at the forefront of Australias quantum industry.

These initiatives, along with the Global Science and Technology Development Fund Strategic Element grants scheme (GSTDF), which has quantum computing as a key priority area, are placing Australia at the forefront of this emerging technology, and supporting a strong innovation culture to secure Australia's economic resilience.

Source: https://www.atse.org.au/

Read this article:
ATSE Welcomes Large Quantum of Technology Investment for Queensland - AZoQuantum