Archive for the ‘Quantum Computer’ Category

New Superconducting Diode Could Improve Performance Of … – Eurasia Review

A University of Minnesota Twin Cities-led team has developed a new superconducting diode, a key component in electronic devices, that could help scale up quantum computers for industry use and improve the performance of artificial intelligence systems. Compared to other superconducting diodes, the researchers' device is more energy efficient, can process multiple electrical signals at a time, and contains a series of gates to control the flow of energy, a feature that has never before been integrated into a superconducting diode.

The paper is published in Nature Communications, a peer-reviewed scientific journal that covers the natural sciences and engineering.

A diode allows current to flow one way but not the other in an electrical circuit. It's essentially half of a transistor, the main element in computer chips. Diodes are typically made with semiconductors, but researchers are interested in making them with superconductors, which can transfer energy without losing any power along the way.

"We want to make computers more powerful, but there are some hard limits we are going to hit soon with our current materials and fabrication methods," said Vlad Pribiag, senior author of the paper and an associate professor in the University of Minnesota School of Physics and Astronomy. "We need new ways to develop computers, and one of the biggest challenges for increasing computing power right now is that they dissipate so much energy. So, we're thinking of ways that superconducting technologies might help with that."

The University of Minnesota researchers created the device using three Josephson junctions, which are made by sandwiching pieces of non-superconducting material between superconductors. In this case, the researchers connected the superconductors with layers of semiconductors. The device's unique design allows the researchers to use voltage to control its behavior.

Their device also has the ability to process multiple signal inputs, whereas typical diodes can only handle one input and one output. This feature could have applications in neuromorphic computing, a method of engineering electrical circuits to mimic the way neurons function in the brain to enhance the performance of artificial intelligence systems.

"The device we've made has close to the highest energy efficiency that has ever been shown, and for the first time, we've shown that you can add gates and apply electric fields to tune this effect," explained Mohit Gupta, first author of the paper and a Ph.D. student in the University of Minnesota School of Physics and Astronomy. "Other researchers have made superconducting devices before, but the materials they've used have been very difficult to fabricate. Our design uses materials that are more industry-friendly and deliver new functionalities."
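
The article does not quote the paper's exact figure of merit, but the superconducting diode effect is commonly quantified by the asymmetry between the critical currents in the two directions. Below is a minimal Python sketch of that metric, with illustrative numbers rather than measured values from the paper:

```python
# Hedged sketch: the superconducting diode effect is usually quantified
# by the asymmetry of the critical currents in the two directions,
# eta = (Ic+ - |Ic-|) / (Ic+ + |Ic-|). This is a common figure of merit,
# not necessarily the one used in the paper.
def diode_efficiency(ic_forward, ic_reverse):
    ic_r = abs(ic_reverse)
    return (ic_forward - ic_r) / (ic_forward + ic_r)

# Illustrative numbers, not measurements:
print(diode_efficiency(1.0, 0.2))   # 0.67: strongly nonreciprocal
print(diode_efficiency(1.0, 1.0))   # 0.0: no diode effect
```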

The method the researchers used can, in principle, be used with any type of superconductor, making it more versatile and easier to use than other techniques in the field. Because of these qualities, their device is more compatible for industry applications and could help scale up the development of quantum computers for wider use.

"Right now, all the quantum computing machines out there are very basic relative to the needs of real-world applications," Pribiag said. "Scaling up is necessary in order to have a computer that's powerful enough to tackle useful, complex problems. A lot of people are researching algorithms and usage cases for computers or AI machines that could potentially outperform classical computers. Here, we're developing the hardware that could enable quantum computers to implement these algorithms. This shows the power of universities seeding these ideas that eventually make their way to industry and are integrated into practical machines."

Read more here:
New Superconducting Diode Could Improve Performance Of ... - Eurasia Review

Graphene and Quantum Computing: A Match Made in Heaven – CityLife

Graphene, a single layer of carbon atoms arranged in a hexagonal lattice, has been hailed as a wonder material since its discovery in 2004. This ultra-thin, ultra-strong material has the potential to revolutionize industries ranging from electronics to medicine. One area where graphene's unique properties could have a particularly profound impact is the realm of quantum computing.

Quantum computing is an emerging field that seeks to harness the strange and powerful properties of quantum mechanics to perform calculations far beyond the capabilities of classical computers. While still in its infancy, quantum computing has the potential to revolutionize fields such as cryptography, drug discovery, and artificial intelligence. However, the development of practical quantum computers has been hampered by a number of technical challenges, including the need for materials that can support and manipulate delicate quantum states.

This is where graphene comes in. Graphene's remarkable electronic properties make it an ideal candidate for use in quantum computing. For one, graphene is an excellent conductor of electricity, with electrons able to move through the material with very little resistance. This property could be used to create ultra-fast, low-power quantum computing devices.

Moreover, graphene's two-dimensional structure gives it unique quantum properties. Electrons in graphene behave as if they were massless, allowing them to move at extremely high speeds and follow the rules of quantum mechanics rather than classical physics. This means that graphene could potentially be used to create quantum bits, or qubits, the fundamental building blocks of quantum computers.

Qubits are the quantum equivalent of classical bits, which represent information as either a 0 or a 1. However, qubits can exist in a superposition of both 0 and 1 simultaneously, allowing quantum computers to perform many calculations at once. This parallelism is what gives quantum computers their immense potential for solving complex problems.
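
A superposition is easy to make concrete in code. The following is a minimal Python sketch (using numpy, with illustrative values) of a qubit prepared in an equal superposition and then sampled; both outcomes coexist in the state until measurement picks one:

```python
import numpy as np

rng = np.random.default_rng(0)

ket0 = np.array([1.0, 0.0])                    # the |0> state
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
qubit = H @ ket0                               # equal superposition of 0 and 1

def measure(state, shots=1000):
    # Outcome k occurs with probability |amplitude_k|^2.
    probs = np.abs(state) ** 2
    return np.bincount(rng.choice(2, size=shots, p=probs), minlength=2)

print(measure(qubit))  # roughly [500, 500]: each shot yields 0 or 1
```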

One of the key challenges in building a quantum computer is maintaining the delicate quantum states of qubits. Quantum states are easily disturbed by their environment, leading to errors in calculations. This phenomenon, known as decoherence, is a major obstacle to the development of practical quantum computers.

Graphene's unique properties could help address this issue. The material's two-dimensional structure means that it can be easily integrated with other materials, such as superconductors, which are essential for maintaining quantum states. Additionally, graphene's high electron mobility could be used to create devices that manipulate and control qubits with high precision.

Recent research has demonstrated the potential of graphene for quantum computing applications. In one study, scientists at the Massachusetts Institute of Technology (MIT) were able to create a graphene-based device that could control the flow of electrons with a high degree of precision. This device, known as a valleytronics system, could potentially be used to create qubits that are less susceptible to decoherence.

In another study, researchers at the University of Cambridge were able to use graphene to create a new type of qubit that is both more stable and more easily controlled than existing designs. This topological qubit could be a major step forward in the development of practical quantum computers.

While there is still much work to be done, it is clear that graphene has the potential to play a crucial role in the development of quantum computing. The marriage of these two cutting-edge fields could lead to breakthroughs that were once thought to be the stuff of science fiction. As researchers continue to explore the potential of graphene and quantum computing, we may be on the cusp of a new era of technological innovation that will reshape our world in ways we can only begin to imagine.

Read more:
Graphene and Quantum Computing: A Match Made in Heaven - CityLife

Cyberwarfare: How the IDF safeguards strategic assets in the digital … – Ynetnews

The artificial intelligence craze sweeping the planet has not skipped intelligence and defense systems, especially since the Israel Defense Forces and many other Western militaries have been utilizing it for years - but it's the leap in generative AI that is noteworthy.

The speed with which any child can now transform into a professional painter, author or even hacker is a phenomenon we should pause over and be mindful of, as it exemplifies how quickly powerful technology has shifted from obscure laboratories, hidden from public view, to the average child's bedroom.

The IDF is spearheading cyberwarfare (Photo: Dana Koppel)

Take quantum computing, for instance.

The crumbling of cipher keys has become every security system's biggest nightmare scenario for 2023. We're talking about a situation in which internal communications, computer networks and operational documents become publicly exposed, which would signal an unprecedented security breach.

As far as Israel goes, the precedent dates back to 1997 and the Ansariya ambush, in which a team from the Israeli Navy's special operations unit, Shayetet 13, on a mission in South Lebanon, stumbled into a deadly ambush by Islamic Resistance guerrillas that left 12 operatives dead.

While in a civilian context the day-to-day war of attrition against hackers is conducted in the name of protecting private clients and patents, in the military realm, it is about protecting a country's strategic assets.

In a more narrowly defined Israeli context, it means protecting the Iron Dome missile defense system, the digital emergency alarm array and operational details ingrained in top secret IDF plans.

Cyberwarfare is divided between military intelligence and the C4I Corps, the IDF's elite technological unit. The Cyber Defense Brigade was established six years ago, and its most intriguing component is the Center of Encryption and Information Security.

That's where ciphers and codes are developed, serving the IDF, Shin Bet, Mossad and many other governmental bodies.

The Center of Encryption and Information Security officials say that the most convenient part of cyber is dealing with what's known and familiar. The future, on the other hand, is trickier to deal with, and that is where quantum computing comes in.

It is a rather advanced processing method, based on observations made in quantum mechanics. "Quantum computers will be able to instantaneously perform tasks that today's computers would require at least a millennium to complete. They would easily crack today's ciphers," a lieutenant colonel from the unit says.

"When you currently connect to your bank account, work, email or WhatsApp, various components ensure the security of your access. One crucial element is an algorithm called RSA, which relies on intricate mathematical problems," he says.

IBM's quantum computer at an exhibition in Germany (Photo: Shutterstock)

"While these problems can theoretically be solved, they are notoriously complex and time-consuming, even for supercomputers. However, with the advent of quantum computers, RSA encryption could be defeated within seconds.

"This implies that hackers or adversaries would possess nearly limitless computing power to decrypt traditional ciphers. Consequently, sensitive and encrypted data could be compromised today, with the potential to decrypt it once a quantum computer of sufficient strength becomes accessible," the lieutenant colonel explains.
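
To make the lieutenant colonel's point concrete, here is a hedged toy illustration in Python of why RSA stands or falls with the hardness of factoring. The numbers are deliberately tiny (real RSA uses keys of 2048 bits or more, plus padding), so the brute-force "attack" that Shor's algorithm on a quantum computer would make practical at full scale is visible here:

```python
# Toy RSA with tiny primes; purely illustrative, not secure.
p, q = 61, 53
n, e = p * q, 17                      # public key (n, e)
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)                   # private exponent (Python 3.8+)

message = 42
cipher = pow(message, e, n)           # encrypt with the public key
assert pow(cipher, d, n) == message   # decrypt with the private key

# An attacker who can factor n recovers the private key and reads
# everything. Brute force works only because n is tiny here; Shor's
# algorithm would do the equivalent for realistic key sizes.
pf = next(x for x in range(2, n) if n % x == 0)   # pf == 53
d_attacker = pow(e, -1, (pf - 1) * (n // pf - 1))
print(pow(cipher, d_attacker, n))     # -> 42
```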

Could this danger materialize tomorrow? "That would depend on your definition of tomorrow. Major technology companies are already demonstrating remarkable advancements in this domain, with estimates suggesting that they will develop a stable and dependable quantum computer within the next five to 10 years.

"From the perspective of the IDF, this timeline is alarmingly brief. We consider it highly likely that within the coming decade, quantum computers will fall into the hands of entities interested in accessing the IDF's classified information. Consequently, we have been diligently studying this subject since the mid-2000s."

"Keep in mind, this is uncharted territory," says a major in the unit. "Here, we do not rely on pre-existing textbooks or established foundations. We are tasked with starting from scratch, immersing ourselves in comprehensive self-learning and research. What's more, we take on the responsibility of developing our own curriculum and training individuals from the ground up."

Aside from its computational applications, quantum technology has the kind of applications that could rival an episode of "Star Trek." Many of these advancements are poised to have a profound impact on the military system, with some already being partially realized.

An example of this can be observed in the use of Lidar technology, which employs quantum sensors for laser-based object mapping. It is already integrated into autonomous vehicles and smartphones, and is instrumental in generating highly detailed maps.

Quantum sensors will also enable remarkably precise navigation, independent of GPS satellites or similar systems. Furthermore, quantum communication promises stable and secure connections over considerable distances, often spanning dozens of miles.

Cyberwarfare could soon replace traditional battlefields (Photo: Courtesy)

But with many of those serving in these specialized cyber units aged between 18 and 30, a question arises: how would a bunch of kids solve problems that the planet's finest minds are still struggling with?

The lieutenant colonel is optimistic about that. "First, and this may sound trite, I firmly believe in the exceptionalism of the 'Jewish mind,' particularly in the realm of mathematics. Since its inception, a remarkable proportion of the unit's graduates in mathematics have gone on to become esteemed doctors and professors in academia.

"Second, the IDF possesses a unique advantage in its ability to bring together the brightest minds in one place, all working toward solving the same problems. Unlike academia, where minds are dispersed and lack a unified mission, the IDF provides a concrete operational context for our missions.

"Moreover, we receive continuous support from reserve personnel and external consultants who have successfully passed through rigorous security clearance protocols. The IDF benefits from a wealth of research knowledge accumulated over decades."

How do you research quantum computing without a quantum computer? "The research we conduct is based on algorithms and, in theory, it can be performed since we understand the behavior involved. However, it's evident that for demonstration and testing purposes, a quantum computer is necessary, which is currently unavailable in Israel.

"To overcome this limitation, we rely on quantum computing services provided by prominent international software giants through the cloud. We make use of these services extensively for our research endeavors."

Original post:
Cyberwarfare: How the IDF safeguards strategic assets in the digital ... - Ynetnews

The 5 Most Promising AI Hardware Technologies – MUO – MakeUseOf

Artificial Intelligence (AI) has made remarkable advancements since the end of 2022. Increasingly sophisticated AI-based software applications are revolutionizing various sectors by providing inventive solutions. From seamless customer service chatbots to stunning visual generators, AI is enhancing our daily experiences. However, behind the scenes, AI hardware is pivotal in fueling these intelligent systems.

AI hardware refers to specialized computer hardware designed to perform AI-related tasks efficiently. This includes specific chips and integrated circuits that offer faster processing and energy-saving capabilities. In addition, they provide the necessary infrastructure to execute AI algorithms and models effectively.

The role of AI hardware in machine learning is crucial as it aids in the execution of complex programs for deep learning models. Furthermore, compared to conventional computer hardware like central processing units (CPUs), AI hardware can accelerate numerous processes, significantly reducing the time and cost required for algorithm training and execution.

Furthermore, with the growing popularity of AI and machine learning models, there has been an increased demand for acceleration solutions. As a result, companies like Nvidia, the world's leading GPU manufacturer, have witnessed substantial growth. In June 2023, The Washington Post reported that Nvidia's market value had surpassed $1 trillion, exceeding the worth of Tesla and Meta. Nvidia's success highlights the significance of AI hardware in today's technology landscape.

If you're familiar with what edge computing is, you likely have some understanding of edge computing chips. These specialized processors are designed specifically to run AI models at the network's edge. With edge computing chips, users can process data and perform crucial analytical operations directly at the source of the data, eliminating the need for data transmission to centralized systems.

The applications for edge computing chips are diverse and extensive. They find utility in self-driving cars, facial recognition systems, smart cameras, drones, portable medical devices, and other real-time decision-making scenarios.

The advantages of edge computing chips are significant. Firstly, they greatly reduce latency by processing data near its source, enhancing the overall performance of AI ecosystems. Additionally, edge computing enhances security by minimizing the amount of data that needs to be transmitted to the cloud.

Some might wonder, "What is quantum computing, and is it even real?" Quantum computing is indeed a real and advanced computing system that operates based on the principles of quantum mechanics. While classical computers use bits, quantum computing utilizes quantum bits (qubits) to perform computations. These qubits enable quantum computing systems to process large datasets more efficiently, making them highly suitable for AI, machine learning, and deep learning models.

The applications of quantum hardware have the potential to revolutionize AI algorithms. For example, in drug discovery, quantum hardware can simulate the behavior of molecules, aiding researchers in accurately identifying new drugs. Similarly, in materials science, it can contribute to climate change predictions. The financial sector can benefit from quantum hardware by developing price prediction tools.

Application Specific Integrated Circuits (ASICs) are designed for targeted tasks like image processing and speech recognition (though you may have heard about ASICs through cryptocurrency mining). Their purpose is to accelerate AI procedures to meet the specific needs of your business, providing an efficient infrastructure that enhances overall speed within the ecosystem.

ASICs are cost-effective compared to traditional central processing units (CPUs) or graphics processing units (GPUs), owing to their power efficiency and superior performance on the tasks they are built for. As a result, ASICs facilitate AI algorithms across various applications.

These integrated circuits can handle substantial volumes of data, making them instrumental in training artificial intelligence models. Their applications extend to diverse fields, including natural language processing of texts and speech data. Furthermore, they simplify the deployment of complex machine-learning mechanisms.

Neuromorphic hardware represents a significant advancement in computer hardware technology, aiming to mimic the functioning of the human brain. This innovative hardware emulates the human nervous system and adopts a neural network infrastructure, operating with a bottom-up approach. The network comprises interconnected processors, referred to as neurons.

In contrast to traditional computing hardware that processes data sequentially, neuromorphic hardware excels at parallel processing. This parallel processing capability enables the network to simultaneously execute multiple tasks, resulting in improved speed and energy efficiency.

Furthermore, neuromorphic hardware offers several other compelling advantages. It can be trained with extensive datasets, making it suitable for a wide range of applications, including image detection, speech recognition, and natural language processing. Additionally, the accuracy of neuromorphic hardware is remarkable, as it rapidly learns from vast amounts of data.
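
As a concrete illustration of the neuron-style processing described above, here is a minimal Python sketch of a leaky integrate-and-fire neuron, the kind of spiking model that neuromorphic hardware is loosely built around. All parameters are illustrative and not taken from any particular chip:

```python
import numpy as np

def simulate_lif(input_current, dt=1e-3, tau_m=20e-3,
                 v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron; returns spike times in seconds."""
    v = v_rest
    spikes = []
    for step, i_in in enumerate(input_current):
        # Membrane potential leaks toward rest while integrating input.
        v += (-(v - v_rest) + i_in) * (dt / tau_m)
        if v >= v_thresh:              # threshold crossing -> spike
            spikes.append(step * dt)
            v = v_reset                # reset after firing
    return spikes

# 100 ms of silence, then a constant driving current.
current = np.concatenate([np.zeros(100), 1.5 * np.ones(400)])
print(simulate_lif(current))           # spikes appear once input turns on
```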

A Field Programmable Gate Array (FPGA) is an advanced integrated circuit that offers valuable benefits for implementing AI software. These specialized chips can be customized and programmed to meet the specific requirements of the AI ecosystem, earning them the name "field-programmable."

FPGAs consist of configurable logic blocks (CLBs) that are interconnected and programmable. This inherent flexibility allows for a wide range of applications in the field of AI. In addition, these chips can be programmed to handle operations of varying complexity levels, adapting to the system's specific needs.

Operating like a read-only memory chip but with a higher gate capacity, FPGAs offer the advantage of re-programmability. This means they can be programmed multiple times, allowing for adjustments and scalability per the evolving requirements. Furthermore, FPGAs are more efficient than traditional computing hardware, offering a robust and cost-effective architecture for AI applications.
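
The building block behind that re-programmability is the lookup table (LUT) inside each CLB: an N-input LUT is simply a 2^N-entry truth table whose contents are written at configuration time. A hedged Python sketch of the idea (the helper name make_lut is illustrative, not a real toolchain API):

```python
def make_lut(truth_table):
    """Model an N-input LUT: the truth table is the 'configuration'."""
    def lut(*inputs):
        index = 0
        for bit in inputs:             # input bits index into the table
            index = (index << 1) | (bit & 1)
        return truth_table[index]
    return lut

# "Program" a 2-input LUT as XOR, then "re-program" the same fabric as AND.
xor_gate = make_lut([0, 1, 1, 0])
and_gate = make_lut([0, 0, 0, 1])
print(xor_gate(1, 0), and_gate(1, 1))  # -> 1 1
```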

In addition to their customization and performance advantages, FPGAs also provide enhanced security measures. Their complete architecture ensures robust protection, making them reliable for secure AI implementations.

AI hardware is on the cusp of transformative advancements. Evolving AI applications demand specialized systems to meet computational needs. Innovations in processors, accelerators, and neuromorphic chips prioritize efficiency, speed, energy savings, and parallel computing. Integrating AI hardware into edge and IoT devices enables on-device processing, reduced latency, and enhanced privacy. Convergence with quantum computing and neuromorphic engineering unlocks the potential for exponential power and human-like learning.

The future of AI hardware holds the promise of powerful, efficient, and specialized computing systems that will revolutionize industries and reshape our interactions with intelligent technologies.

Originally posted here:
The 5 Most Promising AI Hardware Technologies - MUO - MakeUseOf

Quantum computing: The five biggest breakthroughs – Engineers Ireland

Quantum computing is a revolutionary technology already making waves in many industries, such as drug discovery, cryptography, finance, and logistics. It works by exploiting quantum mechanical phenomena to perform complex computations in a fraction of the time classical computers require. Two main quantum mechanical phenomena drive quantum computers' speed and computational prowess: superposition and entanglement.

Unlike classical computers, which operate on binary bits (0 and 1), quantum computers operate on quantum bits or qubits. Qubits can exist in a state of superposition. This means that any qubit has some probability of existing simultaneously in the 0 and 1 states, exponentially increasing the computational power of quantum computers.

Another unique property of qubits is their ability to become entangled. Two entangled qubits, no matter how physically far apart, are correlated such that knowing the state of one immediately tells us something about the state of the other. This correlation can be harnessed for processing vast amounts of data and solving complex problems that classical computers cannot.
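
Entanglement can be demonstrated in a few lines of linear algebra. The sketch below (Python with numpy) builds the Bell state (|00> + |11>)/sqrt(2) from two unentangled qubits using a Hadamard and a CNOT gate, and shows that the two qubits' measurement outcomes are perfectly correlated:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard on one qubit
CNOT = np.array([[1, 0, 0, 0],                 # flips the second qubit
                 [0, 1, 0, 0],                 # when the first is |1>
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.array([1.0, 0, 0, 0])               # start in |00>
state = CNOT @ np.kron(H, np.eye(2)) @ state   # H on qubit 1, then CNOT

# Only |00> and |11> carry probability: measuring one qubit
# immediately fixes the outcome of the other.
print(np.abs(state) ** 2)                      # [0.5, 0, 0, 0.5]
```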

Classical computers only have the power to simulate phenomena based on classical physics, making it more difficult or slower to solve problems that rely on quantum phenomena. This is where the true importance of quantum computers lies.

Since quantum computers are based on qubits, they can solve problems that are intractable for classical computers and revolutionise many industries. For example, quantum computers can rapidly simulate molecules and chemical reactions, aiding the discovery of new drugs and materials with exceptional properties.

Although significant breakthroughs have been made in quantum computing, we are still in the nascent stages of its development.

The objective of quantum supremacy is to demonstrate that a quantum computer can solve a problem that no classical computer can solve in any reasonable length of time, regardless of the problem's practical usefulness. Achieving this goal demonstrates the power of a quantum computer over a classical computer in complex problem-solving.

In October 2019, Google confirmed that it had achieved quantum supremacy using its fully programmable 54-qubit processor called Sycamore. It solved a sampling problem in 200 seconds that, by Google's estimate, would take a supercomputer nearly 10,000 years. This marked a significant achievement in the development of quantum computing.

Richard Feynman first theorised the idea of using quantum mechanics to perform calculations impossible for classical computers. Image: Unknown/Wikimedia Commons

Since then, many researchers have demonstrated quantum supremacy by solving various sampling problems. The impact of achieving quantum supremacy cannot be overstated. It validates the potential of quantum computing to solve problems beyond the capabilities of classical computers, as first theorised by Richard Feynman in the 1980s.

Apart from sampling problems, other applications have been proposed for demonstrating quantum supremacy, such as Shor's algorithm for factoring integers, a problem of central importance in encryption. However, implementing Shor's algorithm for large numbers is not feasible with existing technology, so sampling problems remain the preferred route for demonstrating supremacy.
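
The structure of Shor's algorithm is still worth sketching. Factoring N reduces to finding the period r of f(x) = a^x mod N; the quantum processor's only job is to find r exponentially faster than brute force. The Python sketch below performs the same reduction with a classical brute-force period search, purely to expose the structure (it assumes N is an odd composite and a is coprime to N):

```python
from math import gcd

def find_period(a, n):
    """Smallest r with a^r = 1 (mod n); brute force stands in for the
    quantum subroutine, which finds r exponentially faster."""
    x, r = a % n, 1
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a=2):
    if gcd(a, n) != 1:
        return gcd(a, n)                  # lucky: a already shares a factor
    r = find_period(a, n)
    if r % 2 == 1 or pow(a, r // 2, n) == n - 1:
        return None                       # bad luck: retry with another a
    return gcd(pow(a, r // 2) - 1, n)     # a non-trivial factor of n

print(shor_classical(15))                 # -> 3, since 2 has period 4 mod 15
```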

The most pressing concern with quantum computers is their sensitivity to errors induced by environmental noise and imperfect control. This hinders their practical usability, as data stored on a quantum computer can become corrupted.

Classical error correction relies on redundancy, i.e., repetition. However, quantum information cannot be cloned or copied due to the no-cloning theorem (which states that it is impossible to create an independent and identical copy of an arbitrary unknown quantum state). Therefore, a new error correction method is required for quantum computing systems.

QEC for a single qubit. Image: Self/Wikimedia Commons

Quantum error correction (QEC) is a way to mitigate these errors and ensure that the data stored on a quantum computer is error-free, thus improving the reliability and accuracy of quantum computers.

The principle of QEC is to encode the data stored on a quantum computer such that the errors can be detected and corrected without disrupting the computation being performed on it.

This is done using quantum error-correction codes (QECCs). QECCs work by encoding the information onto a larger state space and can then detect and correct errors without measuring the encoded quantum state directly, thereby preventing its collapse.
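
The simplest QECC, the 3-qubit bit-flip code, shows the principle. The sketch below (Python with numpy; an idealized simulation that assumes at most one flip, not anything hardware-specific) encodes a|0> + b|1> as a|000> + b|111>, injects a bit-flip error, and locates it from parity checks alone, without ever learning a or b:

```python
import numpy as np

a, b = 0.6, 0.8                      # arbitrary normalized amplitudes
state = np.zeros(8)
state[0b000] = a                     # encode: a|000> ...
state[0b111] = b                     # ... + b|111>

def flip(state, qubit):              # X error; qubit 0 is the leftmost bit
    out = np.zeros_like(state)
    for i, amp in enumerate(state):
        out[i ^ (1 << (2 - qubit))] = amp
    return out

corrupted = flip(state, 1)           # unknown error hits the middle qubit

def syndrome(state):
    # Parities of qubit pairs (Z0Z1, Z1Z2). With at most one flip, every
    # populated basis state gives the same parities, so reading them
    # reveals the error location but nothing about a or b.
    i = int(np.argmax(np.abs(state) > 0))
    bits = [(i >> 2) & 1, (i >> 1) & 1, i & 1]
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

s = syndrome(corrupted)                               # (1, 1): middle qubit
recovered = flip(corrupted, {(1, 0): 0, (1, 1): 1, (0, 1): 2}[s])
print(np.allclose(recovered, state))                  # -> True
```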

The first experimental demonstration of QEC was done in 1998 with nuclear magnetic resonance qubits. Since then, several experiments demonstrating QEC have been performed using, among other platforms, linear optics and trapped ions.

A significant breakthrough came in 2016, when researchers extended the lifespan of a quantum bit using QEC. Their research showed the advantage of using hardware-efficient qubit encoding over traditional QEC methods for improving the lifetime of a qubit.

The detection and elimination of errors is critical to developing realistic quantum computers. QEC handles errors in the stored quantum information, but what about the errors after performing operations? Is there a way to correct those errors and ensure that the computations are not useless?

Fault-tolerant quantum computing is a method to ensure that these errors are detected and corrected using a combination of QECCs and fault-tolerant gates. This ensures that errors arising during the computations don't accumulate and render them worthless.

Quantum computing features. Image: Akash Sain/iStock

The biggest challenge in achieving fault-tolerant quantum computing is the need for many qubits. QECCs themselves require a lot of qubits to detect and correct errors.

Additionally, fault-tolerant gates also require a large number of qubits. However, two independent theoretical studies, published in 1998 and 2008, proved that fault-tolerant quantum computers can be built. This result has come to be known as the threshold theorem, which states that if the physical error rate of a quantum computer is below a certain threshold, the logical error rate can be suppressed to arbitrarily low values.
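
A quick numeric illustration of what the threshold theorem promises, using the scaling law commonly quoted for surface-code-like schemes, p_L ~ A * (p/p_th)^((d+1)/2) for code distance d. The constants here are illustrative, not taken from any specific code:

```python
def logical_error_rate(p, d, p_th=1e-2, a=0.1):
    """Model logical error rate for physical rate p and code distance d."""
    return a * (p / p_th) ** ((d + 1) // 2)

p = 1e-3                       # physical error rate, 10x below threshold
for d in (3, 5, 7, 9):
    print(d, logical_error_rate(p, d))
# Each increase in distance suppresses the logical error rate by
# another factor of p_th / p = 10, at the cost of more qubits.
```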

No experiment has yet demonstrated full fault-tolerant quantum computing, due to the high number of qubits needed. The closest we've come to an experimental realisation is a 2022 study published in Nature demonstrating fault-tolerant universal quantum gate operations.

We have seen teleportation one too many times in science fiction movies and TV shows. But are any researchers close to making it a reality? Well, yes and no. Quantum teleportation allows for transferring one quantum state from one physical location to another without physically moving the quantum state itself. It has a wide range of applications, from secure quantum communication to distributed quantum computing.

Quantum teleportation was first investigated in 1993 by scientists who proposed it as a way to send and receive quantum information. It was experimentally realised only four years later, in 1997, by two independent research groups. The basic principle behind quantum teleportation is entanglement (when two particles remain connected even when separated by vast distances).

Since 1997, many research groups have demonstrated the quantum teleportation of photons, atoms, and other quantum particles. It is the only real form of teleportation that exists.
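
The whole protocol fits in a short state-vector simulation. The Python sketch below teleports an arbitrary qubit state from Alice to Bob using one shared Bell pair and two classical bits; the quantum state itself never travels:

```python
import numpy as np

rng = np.random.default_rng(0)
I2 = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

def kron3(a, b, c):
    return np.kron(np.kron(a, b), c)

def cnot(control, target, n=3):
    """Permutation matrix flipping `target` when `control` is 1."""
    m = np.zeros((2 ** n, 2 ** n))
    for i in range(2 ** n):
        c = (i >> (n - 1 - control)) & 1
        m[i ^ (c << (n - 1 - target)), i] = 1
    return m

alpha, beta = 0.6, 0.8                     # the state Alice will teleport
psi, zero = np.array([alpha, beta]), np.array([1.0, 0.0])
state = np.kron(np.kron(psi, zero), zero)  # qubits: Alice's, pair A, pair B

state = kron3(I2, H, I2) @ state           # make the Bell pair on
state = cnot(1, 2) @ state                 # qubits 1 (Alice) and 2 (Bob)
state = cnot(0, 1) @ state                 # Alice entangles her qubit
state = kron3(H, I2, I2) @ state           # and rotates before measuring

outcome = rng.choice(8, p=np.abs(state) ** 2)   # measure; keep bits 0 and 1
m0, m1 = (outcome >> 2) & 1, (outcome >> 1) & 1

# Collapse onto the measured (m0, m1) branch and read off Bob's qubit.
bob = np.array([state[(m0 << 2) | (m1 << 1) | b] for b in (0, 1)])
bob /= np.linalg.norm(bob)

if m1: bob = X @ bob                       # corrections driven by the
if m0: bob = Z @ bob                       # two classical bits Alice sends
print(np.allclose(bob, psi))               # -> True: Bob holds the state
```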

In fact, the 2022 Nobel Prize in Physics was awarded to three scientists, Alain Aspect, John Clauser, and Anton Zeilinger, for experiments with entangled photons. Their work demonstrated quantum entanglement and showed it could be used to teleport quantum information from one photon to another.

Quantum teleportation is the cornerstone for building a quantum internet. This is because it enables the distribution of entanglement over long distances.

Another important application of quantum teleportation is enabling remote quantum operations, meaning that a quantum computation can be performed on a distant processor without transmitting the qubits. This could be useful for secure communication and for performing quantum computations in inaccessible or hostile environments.

Topology is a branch of mathematics concerned with studying the properties of shapes and spaces preserved when deformed. But what does it have to do with quantum computing?

In essence, topological quantum computing is a theoretical model that encodes and manipulates qubits using quasiparticles called anyons, which exist only in two-dimensional space.

The method is founded on the topological properties of matter: the world lines of anyons (the paths the particles trace through spacetime) form braids, and these braids make up the logic gates that are the building blocks of the computer.

No experimental studies demonstrate topological quantum computing. Image: FMNLab/Wikimedia Commons

Topological qubits are protected against local perturbations and can be manipulated with high precision, making them less susceptible to decoherence. Additionally, topological quantum computing is more resistant to errors due to its inherent redundancy and topological protection, making it a promising candidate for fault-tolerant quantum computing.

Most topological quantum computing research is theoretical; currently, no studies provide substantial experimental support for it. But developments in this area are vital for building practical and scalable quantum computers.

With a mix of theoretical and experimental demonstrations, quantum computing is still in the early stages of research and development. These developments can potentially revolutionise several industries and academic disciplines, including financial services, materials science, cryptography, and artificial intelligence.

Even though more study is needed, the outlook for quantum computing is promising. We may anticipate further developments and innovations in the years to come.

Continued here:
Quantum computing: The five biggest breakthroughs - Engineers Ireland