Archive for the ‘Quantum Computer’ Category

Exploring new frontiers with Fujitsu’s quantum computing research and development – Fujitsu

Fujitsu and RIKEN have already successfully developed a 64-qubit superconducting quantum computer at the RIKEN-RQC-Fujitsu Collaboration Center, which was jointly established by the two organizations (*1). Our interviewee, researcher Shingo Tokunaga, is currently participating in a joint research project with RIKEN. He majored in electronic engineering at university and worked on microwave-related research topics. After joining Fujitsu, he worked in a variety of software fields, including network firmware development as well as platform development for communication robots. Currently, he is applying his past experience in the Quantum Hardware Team at the Quantum Laboratory to embark on new challenges.

In what fields do you think quantum computing can be applied?

Shingo: Quantum computing has many potential applications, such as in finance and healthcare, but especially in the quantum chemistry calculations used in drug development. If we can use it for these calculations, we can run efficient, high-precision simulations in a short period of time. Complex calculations that traditionally take a long time to solve on conventional computers are expected to be solved quickly by quantum computers. One example is finding solutions to combinatorial optimization problems, such as searching over molecular structure patterns. The spread of the novel coronavirus made the development of vaccines and therapeutics urgent, and in situations like that, where rapid responses are needed, I believe the time will come when quantum computers can be put to use.

Fujitsu is collaborating with world-leading research institutions to advance research and development in all technology areas, from quantum devices to foundational software and applications, with the aim of realizing practical quantum computers. Additionally, we are also advancing the development of hybrid technologies (*2) for quantum computers and high-performance computing technologies, represented by the supercomputer Fugaku, which will be necessary for large-scale calculations until the full practicality of quantum computers is achieved.

What themes are you researching? What are your challenges and goals?

Shingo: One of the achievements of our collaborative research with RIKEN is the construction of a 64-qubit superconducting quantum computer. Superconducting quantum computers operate by manipulating quantum bits on quantum chips cooled to under 20 mK in ultra-low-temperature refrigerators, driving them with microwave signals of around 8 GHz, and reading out the state of the bits. However, since both bit operations and readouts are analog operations, errors are inherent. Our goal is to achieve higher fidelity in the control and readout of quantum bits, providing an environment where quantum algorithms can be executed with high computational accuracy and, ultimately, solving our customers' challenges.

What role do you play in the team?

Shingo: The Quantum Hardware Team has many members responsible for tasks such as designing quantum chips, improving semiconductor manufacturing processes, designing and building the components inside the refrigerators, and designing and building the control equipment outside them. I am responsible for building the control equipment and for controlling the quantum bits. While much attention tends to go to the development of the quantum computer itself or its quantum chips, my role is to control and read out the quantum bits with high precision so that the development team's work actually reaches users.

How do you control the quantum bits, and in what sequence or process?

Shingo: The first step is the basic evaluation of the quantum chip, followed by calibration for controlling the quantum bits. First, we receive the quantum chip from the manufacturing team and measure its performance. To evaluate the chip, it is placed inside the refrigerator; after closing the refrigerator's cover, which is multilayered for insulation, the inside is evacuated and cooling begins. It usually takes about two days to cool from room temperature to 20 mK. In the basic evaluation, we confirm parameters such as the resonance frequency of the quantum bits and the coherence time T1 (the energy-relaxation time, i.e., how long an excited qubit takes to decay back to its ground state). Then we calibrate the quantum bit operations and readouts. Bit operations and readouts do not always yield the desired results, because there are interactions between the bits: the bit being controlled may be affected by its neighbors, so control has to account for the overall state of the bits. We therefore investigate why results fall short of expectations, consult with researchers at RIKEN, and work to minimize errors further.
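
As a rough illustration of the kind of basic evaluation described above, the sketch below shows how a T1 value is typically extracted: the qubit is excited, allowed to idle for a variable delay, then read out, and the decaying excited-state population is fitted to an exponential. This is not Fujitsu's actual tooling; the measurement function is a placeholder that simulates an assumed 30-microsecond T1.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical T1 measurement sketch: excite the qubit, wait a variable delay,
# then read out. The excited-state population should decay as exp(-t/T1).
delays = np.linspace(0, 200e-6, 50)  # delay times in seconds

def measure_excited_population(delay, true_t1=30e-6):
    # Placeholder for the real control stack (pi pulse, delay, readout,
    # averaging over many shots); here it simulates an assumed 30 us T1.
    return np.exp(-delay / true_t1) + np.random.normal(0.0, 0.01)

populations = np.array([measure_excited_population(d) for d in delays])

# Fit the decay to extract T1
def decay(t, amplitude, t1, offset):
    return amplitude * np.exp(-t / t1) + offset

(amplitude, t1, offset), _ = curve_fit(decay, delays, populations,
                                       p0=(1.0, 50e-6, 0.0))
print(f"Estimated T1: {t1 * 1e6:.1f} us")
```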

How do you approach the challenge of insufficient accuracy in bit operations and readouts?

Shingo: There are various approaches we can try, such as improving semiconductor processes, implementing noise-reduction measures in the control electronics, and changing how the microwave signals are applied. Our team studies the waveform, intensity, phase, and timing of the microwave signals needed to improve the accuracy of quantum bit control. Initially, we try existing methods described in papers on our own quantum chip and then work to improve accuracy further from there.
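
The following is a minimal sketch of one common calibration of this kind, a Rabi amplitude sweep: a fixed-length microwave pulse is applied at increasing amplitudes, and the amplitude that maximizes the excited-state population is taken as the pi-pulse amplitude. The measurement function and the 0.62 "true" amplitude are illustrative assumptions, not details of Fujitsu's setup.

```python
import numpy as np

# Illustrative Rabi amplitude calibration: sweep the drive amplitude of a
# fixed-length microwave pulse and pick the amplitude that maximizes the
# excited-state population (the pi-pulse amplitude).
amplitudes = np.linspace(0.0, 1.0, 101)

def measured_population(amp, pi_amp=0.62):
    # Placeholder for a real hardware measurement: an ideal Rabi oscillation
    # with the pi pulse assumed to sit at amplitude 0.62.
    return np.sin(np.pi * amp / (2 * pi_amp)) ** 2

populations = np.array([measured_population(a) for a in amplitudes])
pi_amplitude = amplitudes[np.argmax(populations)]
print(f"Calibrated pi-pulse amplitude: {pi_amplitude:.2f}")
```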

What other areas do you focus on or innovate in, outside of your main responsibilities? Can you also explain the reasons for this?

Shingo: I actively take on tasks that contribute to further improving the performance of the quantum computer hardware. The performance of a newly fabricated quantum chip can only be evaluated by cooling it in a refrigerator and measuring it. Based on those results, it is important to determine what is needed to improve the hardware's performance and to feed that back to the quantum chip design and manufacturing teams.

For Fujitsu, the development of quantum computers marks a first-time challenge. Do you have any concerns?

Shingo: I believe that venturing into unknown territory is precisely where the value of a challenge lies, presenting opportunities for new discoveries and growth. Fujitsu is tackling quantum computer research and development by combining the many technologies it has cultivated over the years. I aim to address challenges one by one and work toward stable operation. Once stable operation is achieved, I hope to research new control methods.

What kind of activities are you undertaking to accelerate your research on quantum computers?

Shingo: Quantum computing is a new field even for me, so I advance development while consulting with researchers at RIKEN, our collaborative research partner. I aim to build a give-and-take relationship, so I actively look for ways to contribute to RIKEN's research in return.

What is your outlook for future research?

Shingo: Ultimately, our goal is to use quantum computers to solve societal issues, but quantum computing is still in the early stages of development. I believe it is the urgent responsibility of our Quantum Hardware Team to provide application development teams with a large number of qubits and high-fidelity quantum gates. In particular, improving the fidelity of two-qubit gate operations is a key challenge on the control side, and I aim to work on it. I also want to explore the development of a quantum platform that lets customers make the most of quantum computers.

We use technology to make people's lives happier. Guided by this belief, we have created various technologies and contributed to the development of society and our customers. At the Fujitsu Technology Hall, located in the Fujitsu Technology Park, you can see mock-ups of Fujitsu's quantum computers and experience the latest technologies such as AI.

Mock-up of a quantum computer exhibited at the Fujitsu Technology Hall

See original here:
Exploring new frontiers with Fujitsu's quantum computing research and development - Fujitsu

Glimpse of next-generation internet – Harvard Office of Technology Development

May 20th, 2024

By Anne Manning, Harvard Staff Writer. Published in the Harvard Gazette.

An up-close photo of a diamond silicon-vacancy center.

It's one thing to dream up a next-generation quantum internet capable of sending highly complex, hacker-proof information around the world at ultra-fast speeds. It's quite another to physically show it's possible.

That's exactly what Harvard physicists have done, using existing Boston-area telecommunication fiber, in a demonstration of the world's longest fiber distance between two quantum memory nodes. Think of it as a simple, closed internet carrying a signal encoded not by classical bits like the existing internet, but by perfectly secure, individual particles of light.

The groundbreaking work, published in Nature, was led by Mikhail Lukin, the Joshua and Beth Friedman University Professor in the Department of Physics, in collaboration with Harvard professors Marko Lončar and Hongkun Park, who are all members of the Harvard Quantum Initiative. The Nature work was carried out with researchers at Amazon Web Services.

The Harvard team established the practical makings of the first quantum internet by entangling two quantum memory nodes separated by an optical fiber link deployed over a roughly 22-mile loop through Cambridge, Somerville, Watertown, and Boston. The two nodes were located a floor apart in Harvard's Laboratory for Integrated Science and Engineering.

Showing that quantum network nodes can be entangled in the real-world environment of a very busy urban area is an important step toward practical networking between quantum computers.

Mikhail Lukin, the Joshua and Beth Friedman University Professor in the Department of Physics

Quantum memory, analogous to classical computer memory, is an important component of a quantum computing future because it allows for complex network operations and information storage and retrieval. While other quantum networks have been created in the past, the Harvard team's is the longest fiber network between devices that can store, process, and move information.

Each node is a very small quantum computer, made out of a sliver of diamond that has a defect in its atomic structure called a silicon-vacancy center. Inside the diamond, carved structures smaller than a hundredth the width of a human hair enhance the interaction between the silicon-vacancy center and light.

The silicon-vacancy center contains two qubits, or bits of quantum information: one in the form of an electron spin used for communication, and the other in a longer-lived nuclear spin used as a memory qubit to store entanglement, the quantum-mechanical property that allows information to be perfectly correlated across any distance.

(In classical computing, information is stored and transmitted as a series of discrete binary signals, say on/off, that form a kind of decision tree. Quantum computing is more fluid, as information can exist in stages between on and off, and is stored and transferred as shifting patterns of particle movement across two entangled points.)

Map showing path of two-node quantum network through Boston and Cambridge. Credit: Can Knaut via OpenStreetMap

Using silicon-vacancy centers as quantum memory devices for single photons has been a multiyear research program at Harvard. The technology solves a major problem in the theorized quantum internet: signal loss that cant be boosted in traditional ways.

A quantum network cannot use standard optical-fiber signal repeaters because simple copying of quantum information as discrete bits is impossible, which makes the information secure but also very hard to transport over long distances.

Silicon-vacancy-center-based network nodes can catch, store, and entangle bits of quantum information while correcting for signal loss. After cooling the nodes to close to absolute zero, light is sent through the first node and, by nature of the silicon-vacancy center's atomic structure, becomes entangled with it and is thus able to carry the information.

"Since the light is already entangled with the first node, it can transfer this entanglement to the second node," explained first author Can Knaut, a Kenneth C. Griffin Graduate School of Arts and Sciences student in Lukin's lab. "We call this photon-mediated entanglement."

Over the last several years, the researchers have leased optical fiber from a company in Boston to run their experiments, fitting their demonstration network on top of the existing fiber to indicate that creating a quantum internet with similar network lines would be possible.

"Showing that quantum network nodes can be entangled in the real-world environment of a very busy urban area is an important step toward practical networking between quantum computers," Lukin said.

A two-node quantum network is only the beginning. The researchers are working diligently to extend the performance of their network by adding nodes and experimenting with more networking protocols.

The paper is titled "Entanglement of Nanophotonic Quantum Memory Nodes in a Telecom Network." The work was supported by the AWS Center for Quantum Networking's research alliance with the Harvard Quantum Initiative, the National Science Foundation, the Center for Ultracold Atoms (an NSF Physics Frontiers Center), the Center for Quantum Networks (an NSF Engineering Research Center), the Air Force Office of Scientific Research, and other sources.

Harvard Office of Technology Development enabled the strategic alliance between Harvard University and Amazon Web Services (AWS) to advance fundamental research and innovation in quantum networking.

Tags: Alliances, Collaborations, Quantum Physics, Internet, Publication

Press Contact: Kirsten Mabry | (617) 495-4157

Read more:
Glimpse of next-generation internet - Harvard Office of Technology Development

The quantum internet is fast becoming a real thing – RedShark News

Researchers at Harvard University have demonstrated the longest distance fibre transmission between quantum nodes to date, highlighting that the idea of a quantum internet isn't just fanciful thinking.

Transmitting quantum information between nodes is not a new idea, and it has been demonstrated before. However, performing the feat over a long distance via fibre has always been a stumbling block, due to the degradation of the light signal. Most demonstrations of the concept of a quantum internet have taken place in laboratories using line-of-sight lasers to create the entangled photons.

Maintaining a state of entanglement over a distance has always been a stumbling block. Entanglement is when two subatomic particles are linked, with a change in one being instantly reflected in the other, even if they are separated by billions of light years. Albert Einstein referred to this phenomenon as "spooky action at a distance." When it comes to a quantum internet, the idea is that two qubits (the quantum version of a traditional computer bit) become 'linked'. Once this happens, any change in the quantum state of one qubit is instantly reflected in the other, even if they are vast distances apart on a network.

Unfortunately, it is all too easy for an entangled information transmission system to degrade, due to all sorts of reasons, from interference from the outside world to the degradation and scattering of photons. And, unlike traditional ways of transmitting data, quantum information can't be 'boosted' with the use of repeaters.

However, now researchers have managed to show how the transmission of quantum information over long distances is in fact possible in a real-world setting using already existing fibre networks. These latest demonstrations of the potential of transmitting information via entangled photons were performed by three separate research teams, based in the United States, China and the Netherlands, and utilised existing fibre optic networks, with the information being transmitted over several kilometres in busy urban areas. And, while this doesn't mean that we're going to be using a quantum internet any time soon, the experiments represent a gigantic step forward in terms of making such a network a reality.

According to Nature, the experiments were made possible by using photons in the infra-red area of the spectrum, making them more friendly to optical fibre. However, each of the three teams differed in the type of quantum memory device that they used.

The Chinese team, led by Pan Jian-Wei at the University of Science and Technology of China (USTC), utilised three separate quantum memory sites at separate labs, which used the collective states of clouds of rubidium atoms to encode the qubits' quantum states. According to Nature, "The qubits' quantum states can be set using a single photon, or can be read out by tickling the atomic cloud to emit a photon."

The three labs were connected via an optical fibre network to a separate photonic server located around 10 km away. Because the experiment relied on the photons from at least two of the atom clouds reaching the server at the same time to produce entanglement, the timing had to be incredibly precise, which reduces the practicality of such a system.

The Dutch team's experiment also relied on precise timing, but instead of rubidium atoms they utilised individual nitrogen atoms embedded in small diamond crystals. The qubits were encoded in the electron states of the nitrogen and in the nuclear states of nearby carbon atoms. Importantly, the team performed the experiment over a 25 km run of optical fibre, which serves as considerable proof that the transmission of quantum information in a real-world setting is possible.

Finally, the US team used two nodes within the same building, but the fibre network they used made its way over long distances throughout the Boston area, apparently crossing the Charles River six times. Unlike the other two teams, the US method required less precision in the timing by sending one photon to entangle itself with a silicon atom at the first node. This photon made its way around the fibre loop, grazing the second silicon atom on arrival, entangling it with the first one.

Okay, so why would we want to do this? Simply put, transmitting quantum information is highly secure, and effectively 'hacker proof'. Another possibility that has been put forward is the idea of connecting several quantum computers together over distance, effectively creating one large computer. Quantum sensor networks are another potential use, enabling high precision measurements, such as highly accurate timekeeping, more precise navigation systems, measurements of gravitational fields, magnetic fields, and other physical phenomena.

On a more 'real-world' playing field that could affect the average internet user, cloud computing could be made totally secure. It's pretty much impossible to intercept the transmission of quantum information without it being detected. AI learning and processing could also be drastically improved with faster training times, improved algorithms, and new ways to tackle data analysis and pattern recognition.

The use of a quantum network could also help improve the overall efficiency of the internet itself, with new protocols and quantum-enhanced algorithms being developed.

Now, it does sometimes seem as if anything to do with quantum computing is a bit like the promise of new battery technology; it never seems to actually arrive in a practical way. But the experiments above knock down one of the biggest barriers to creating a quantum internet.

References: Nature, Science Daily

Tags: Technology Internet Quantum Computing

Read the original:
The quantum internet is fast becoming a real thing - RedShark News

Ripple publishes math prof’s warning: ‘Public-key cryptosystems should be replaced’ – Cointelegraph

Professor Massimiliano Sala of the University of Trento in Italy recently discussed the future of blockchain technology as it relates to encryption and quantum computing with the crew at Ripple as part of the company's ongoing university lecture series.

Sala's discussion focused on the potential threat posed by quantum computers as the technology matures. According to the professor, current encryption methods could be easy for tomorrow's quantum computers to solve, thus putting entire blockchains at risk.

Per Sala:

What the professor is referring to is a hypothetical paradigm called Q-day, a point at which quantum computers become sufficiently powerful and available for bad actors to break classical encryption methods.

While this would have far-reaching implications for any field where data security is important, including emergency services, infrastructure, banking, and defense, it could theoretically devastate the world of cryptocurrency and blockchain.

Specifically, Sala warned that all classical public-key cryptosystems should be replaced with counterparts secure against quantum attacks. The idea here is that a future quantum computer or quantum attack algorithm could crack the encryption on these keys using mathematical brute force.
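
As a rough sketch of why this matters, the toy example below (textbook numbers, nowhere near secure) shows that an RSA-style public key is only safe as long as factoring the public modulus is hard; Shor's algorithm on a sufficiently large quantum computer would remove that barrier.

```python
# Toy illustration (not secure!) of why RSA-style public-key crypto rests on
# the hardness of factoring. A large-scale quantum computer running Shor's
# algorithm could factor n efficiently and recover the private key.
p, q = 61, 53                      # in practice these are enormous secret primes
n = p * q                          # public modulus (3233)
phi = (p - 1) * (q - 1)
e = 17                             # public exponent
d = pow(e, -1, phi)                # private exponent, easy only if p and q are known

message = 42
ciphertext = pow(message, e, n)    # anyone can encrypt with the public key (e, n)
recovered = pow(ciphertext, d, n)  # only the private-key holder can decrypt
assert recovered == message

# An attacker who can factor n back into p and q can recompute d and decrypt
# everything; that factoring step is exactly what quantum computers threaten.
```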

It bears mention that Bitcoin, the world's most popular cryptocurrency and blockchain, would fall under this category.

While there currently exists no practical quantum computer capable of such a feat, governments and science institutions around the globe have been preparing for Q-day as if it's an eventuality. For his part, Sala said that such an event may not be imminent. However, physicists at dozens of academic and commercial laboratories have demonstrated breakthroughs that have led many in the field to believe such systems could arrive within a matter of years.

Ultimately, Sala said he's satisfied with the progress being made in the sector and recommends that blockchain developers continue to work with encryption experts who understand the standards and innovations surrounding quantum-proofing modern systems.

Related: Harvard built hacker-proof quantum network in Boston using existing fiber cable

View post:
Ripple publishes math prof's warning: 'Public-key cryptosystems should be replaced' - Cointelegraph

Explore the Growing Role of Linux in Quantum Computing – ITPro Today

Quantum computers differ fundamentally from classical computers. Classical computer chips rely on billions of transistors, each in a binary state of either on or off. A quantum computer, on the other hand, uses qubits instead of transistors, and these qubits can exist in multiple states simultaneously, thanks to quantum mechanics principles like superposition and entanglement. This means that a qubit can be on, off, or in a combination of both states, providing a vast range of possibilities in processing. The state of a qubit is also disturbed by observation, a measurement effect famously illustrated by Schrödinger's cat thought experiment. While quantum computers excel at solving certain problems, they do not replace classical computers entirely.

As quantum computing technology advances, there is a growing need for operating systems that can support quantum computing frameworks. In this article, we will explore the intersection of Linux and quantum computing, focusing on how Linux-based operating systems are becoming pivotal in the development and deployment of quantum computing technologies. We will also examine recent advancements in quantum computing, the role of Linux in quantum programming environments, and how Linux distributions are adapting to support quantum computing frameworks.

Related: How To Get Started in Quantum: Early Adopters Offer Advice

As mentioned, quantum computing uses the principles of quantum mechanics, such as quantum entanglement, to perform calculations that would be practically impossible for classical computers, including even multi-GPU supercomputers. Because qubits can exist in multiple states at once, quantum computers can conduct parallel computations to solve the most complex of problems.

Over the past few decades, quantum computing and its theoretical underpinnings have come a long way. Major tech companies like Google and IBM have made substantial investments in the field. IBM, among others, has even made its quantum computers available online, allowing anyone to learn about the specifics of quantum computing and run workloads through quantum logic gates.

The open-source nature of Linux has enabled developers to build operating systems that are both flexible and robust. Linux is inherently compatible with most of the software and tools used in the quantum computing environment.

Several quantum programming languages and frameworks, including IBM's Qiskit, Google's Cirq, and QuTiP (Quantum Toolbox in Python), run natively on Linux-based systems. Additionally, Linux readily supports containerization technologies like Docker and container orchestration tools like Kubernetes, core components in quantum computing environments. Containerization allows developers to package quantum computing applications and their dependencies in self-contained, portable units, facilitating deployment and management, even at scale and across various hardware architectures.

Linux distributions must evolve to meet the developing needs of quantum computing programming and research. Various Linux distributions make it easy for developers to install and maintain quantum computing tools by providing specialized packages and repositories for quantum computing software. Ubuntu, Fedora, and Debian are among these distributions.

Additionally, some Linux distributors are exploring quantum computing simulators and emulators, enabling users to experiment with quantum algorithms and workflows even without physical access to hardware. This development bridges the gap for Linux users, giving them access to both classical and quantum computing systems, which had been previously available mainly to Windows and MacOS users.

There have also been advancements in the compatibility between Linux distributions and quantum processors. As quantum computing technology becomes more affordable and accessible, Linux distributions must ensure integration with quantum processing units and peripherals. This integration allows users to take advantage of quantum acceleration for specific workloads, enhancing computational capabilities.

Linux, a Unix-like operating system, is celebrated for its flexibility, scalability, and open-source ethos, making it well-suited for quantum computing applications. Several factors underscore Linux's growing role in the quantum computing environment.

Linux enables developers to customize their computing environments to suit their specific personal or organizational needs. This flexibility has proven crucial in ensuring Linux remains up to date with quantum computing demands.

Linux operating systems are inherently highly compatible with various hardware architectures, making them well-suited for quantum computing platforms.

Linux has a vibrant open-source community that encourages knowledge exchange and cooperation. This communal ethos accelerates progress in quantum computing research because of the exchange of ideas and resources.

Security is of paramount importance in quantum computing systems, especially in handling sensitive data and cryptographic algorithms. Linux stands out with its robust security features, coupled with its extensive support for encryption and authentication, making it an ideal choice for operating systems powering quantum computing systems and applications.

Several software packages for Linux have been specifically designed for quantum computing research and development. These packages come with essential tools and libraries. Here are a few examples.

Qiskit is IBM's quantum computing development framework, written in Python. It offers a toolkit for quantum circuit design, simulation, and execution. Compatible with multiple Linux distributions, Qiskit is in wide use.
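
As a small taste of the framework, the sketch below builds a two-qubit Bell-state circuit and checks its ideal output probabilities with Qiskit's built-in statevector tools (written against recent Qiskit releases; exact APIs may differ slightly between versions).

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

# Build a two-qubit Bell-state circuit
circuit = QuantumCircuit(2)
circuit.h(0)      # put qubit 0 into superposition
circuit.cx(0, 1)  # entangle qubit 1 with qubit 0

# Simulate the ideal state and inspect the outcome probabilities
state = Statevector.from_instruction(circuit)
print(state.probabilities_dict())  # expect roughly {'00': 0.5, '11': 0.5}
```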

QuTiP, short for Quantum Toolbox in Python, is a Python software package for quantum computing simulations. Built on Python and the NumPy library, QuTiP offers a wide range of functionalities for simulating quantum computing systems. QuTiP is compatible with most Linux distributions, and it is frequently used for quantum optical applications and quantum information science.
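
A minimal QuTiP sketch, simulating the free decay of a single qubit under amplitude damping (a T1-style process); the 20-microsecond lifetime is an arbitrary assumption for illustration.

```python
import numpy as np
from qutip import basis, sigmam, qeye, mesolve

# Free decay of a qubit under amplitude damping, with an assumed T1 of 20 us
t1 = 20e-6
gamma = 1.0 / t1
H = 0 * qeye(2)                  # no coherent drive; free decay only
excited = basis(2, 0)            # excited state in this convention
times = np.linspace(0, 100e-6, 200)

result = mesolve(H, excited, times,
                 c_ops=[np.sqrt(gamma) * sigmam()],
                 e_ops=[excited * excited.dag()])
print(result.expect[0][-1])      # excited-state population after 100 us, ~exp(-5)
```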

ProjectQ is an open-source quantum computing framework developed in Python. It is useful for simplifying the development of quantum computing algorithms and applications. It achieves this by providing high-level intuitive APIs (application programming interfaces) and abstractions. Compatible with most Linux distributions, ProjectQ also supports various quantum backends.
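
A comparable sketch using ProjectQ's operator-pipe syntax, preparing and measuring a Bell pair on the default simulator backend (assuming a standard ProjectQ installation).

```python
from projectq import MainEngine
from projectq.ops import H, CNOT, Measure

# Prepare and measure a Bell pair on ProjectQ's default simulator backend
engine = MainEngine()
q0 = engine.allocate_qubit()
q1 = engine.allocate_qubit()

H | q0           # superposition on the first qubit
CNOT | (q0, q1)  # entangle the second qubit with the first
Measure | q0
Measure | q1
engine.flush()

# Measurement outcomes are always correlated: both 0 or both 1
print(int(q0), int(q1))
```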

Linux has gained major traction in the quantum computing space in recent years. However, several challenges persist. One such challenge is optimizing Linux distributions for quantum computing hardware, which requires specialized drivers and low-level optimizations. Additionally, security remains an ongoing concern that requires focused attention to mitigate potential threats.

Despite these challenges, Linux is positioned favorably to play a significant role in quantum computing systems. As the field expands, Linux software packages and distributions tailored for quantum computing are becoming increasingly prevalent and evolving alongside advancements. Collaboration with open-source communities also has the potential to drive innovation and accelerate development in the space.

Linux has emerged as a foundational element in the evolution of quantum computing systems. Linuxs inherent customizability, compatibility, security, and robustness make it an ideal operating system for quantum computing. As this transformative technology continues to evolve, Linux looks set to maintain its essential role in shaping its future.

ITPro Today Linux resources

Read the original:
Explore the Growing Role of Linux in Quantum Computing - ITPro Today