Archive for the ‘Quantum Computer’ Category

Quantum Mechanics, the Chinese Room Experiment and the Limits of Understanding – Scientific American

Like great art, great thought experiments have implications unintended by their creators. Take philosopher John Searle's Chinese room experiment. Searle concocted it to convince us that computers don't really think as we do; they manipulate symbols mindlessly, without understanding what they are doing.

Searle meant to make a point about the limits of machine cognition. Recently, however, the Chinese room experiment has goaded me into dwelling on the limits of human cognition. We humans can be pretty mindless too, even when engaged in a pursuit as lofty as quantum physics.

Some background. Searle first proposed the Chinese room experiment in 1980. At the time, artificial intelligence researchers, who have always been prone to mood swings, were cocky. Some claimed that machines would soon pass the Turing test, a means of determining whether a machine thinks.

Computer pioneer Alan Turing proposed in 1950 that questions be fed to a machine and a human. If we cannot distinguish the machine's answers from the human's, then we must grant that the machine does indeed think. Thinking, after all, is just the manipulation of symbols, such as numbers or words, toward a certain end.

Some AI enthusiasts insisted that thinking, whether carried out by neurons or transistors, entails conscious understanding. Marvin Minsky espoused this strong AI viewpoint when I interviewed him in 1993. After defining consciousness as a record-keeping system, Minsky asserted that LISP software, which tracks its own computations, is "extremely conscious," much more so than humans. When I expressed skepticism, Minsky called me a racist.

Back to Searle, who found strong AI annoying and wanted to rebut it. He asks us to imagine a man who doesn't understand Chinese sitting in a room. The room contains a manual that tells the man how to respond to a string of Chinese characters with another string of characters. Someone outside the room slips a sheet of paper with Chinese characters on it under the door. The man finds the right response in the manual, copies it onto a sheet of paper and slips it back under the door.

Unknown to the man, he is replying to a question, like "What is your favorite color?", with an appropriate answer, like "Blue." In this way, he mimics someone who understands Chinese even though he doesn't know a word. That's what computers do, too, according to Searle. They process symbols in ways that simulate human thinking, but they are actually mindless automatons.
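In computational terms, the manual Searle describes is just a lookup table. Here is a minimal sketch; the phrases in the mapping are illustrative inventions, not from Searle's paper:

```python
# A toy "Chinese room": the manual is a lookup table mapping input
# strings to scripted output strings. The room (and the function)
# produces each reply without understanding either string.

manual = {
    "你最喜欢什么颜色？": "蓝色。",   # "What is your favorite color?" -> "Blue."
    "你好吗？": "我很好，谢谢。",     # "How are you?" -> "I'm fine, thanks."
}

def room(message: str) -> str:
    """Return the scripted reply, with no understanding involved."""
    return manual.get(message, "？")  # fall back to a shrug for unknown input

print(room("你最喜欢什么颜色？"))  # prints "蓝色。"
```

The function produces fluent-looking replies while "understanding" nothing, which is Searle's point about pure symbol manipulation.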

Searle's thought experiment has provoked countless objections. Here's mine. The Chinese room experiment is a splendid case of begging the question (not in the sense of raising a question, which is what most people mean by the phrase nowadays, but in the original sense of circular reasoning). The meta-question posed by the Chinese Room Experiment is this: How do we know whether any entity, biological or non-biological, has a subjective, conscious experience?

When you ask this question, you are bumping into what I call the solipsism problem. No conscious being has direct access to the conscious experience of any other conscious being. I cannot be absolutely sure that you or any other person is conscious, let alone that a jellyfish or smartphone is conscious. I can only make inferences based on the behavior of the person, jellyfish or smartphone.

Now, I assume that most humans, including those of you reading these words, are conscious, as I am. I also suspect that Searle is probably right, and that an intelligent program like Siri only mimics understanding of English. It doesn't feel like anything to be Siri, which manipulates bits mindlessly. That's my guess, but I can't know for sure, because of the solipsism problem.

Nor can I know what it's like to be the man in the Chinese room. He may or may not understand Chinese; he may or may not be conscious. There is no way of knowing, again, because of the solipsism problem. Searle's argument assumes that we can know what's going on, or not going on, in the man's mind, and hence, by implication, what's going on or not in a machine. His flawed initial assumption leads to his flawed, question-begging conclusion.

That doesn't mean the Chinese room experiment has no value. Far from it. The Stanford Encyclopedia of Philosophy calls it "the most widely discussed philosophical argument in cognitive science to appear since the Turing Test." Searle's thought experiment continues to pop up in my thoughts. Recently, for example, it nudged me toward a disturbing conclusion about quantum mechanics, which I've been struggling to learn over the last year or so.

Physicists emphasize that you cannot understand quantum mechanics without understanding its underlying mathematics. You should have, at a minimum, a grounding in logarithms, trigonometry, calculus (differential and integral) and linear algebra. Knowing Fourier transforms wouldnt hurt.

That's a lot of math, especially for a geezer and former literature major like me. I was thus relieved to discover Q Is for Quantum by physicist Terry Rudolph. He explains superposition, entanglement and other key quantum concepts with a relatively simple mathematical system, which involves arithmetic, a little algebra and lots of diagrams with black and white balls falling into and out of boxes.

Rudolph emphasizes, however, that some math is essential. Trying to grasp quantum mechanics without any math, he says, is like having van Gogh's Starry Night described in words to you by someone who has only seen a black and white photograph. One that a dog chewed.

But here's the irony. Mastering the mathematics of quantum mechanics doesn't make it easier to understand and might even make it harder. Rudolph, who teaches quantum mechanics and co-founded a quantum-computer company, says he feels cognitive dissonance when he tries to connect quantum formulas to sensible physical phenomena.

Indeed, some physicists and philosophers worry that physics education focuses too narrowly on formulas and not enough on what they mean. Philosopher Tim Maudlin complains in Philosophy of Physics: Quantum Theory that most physics textbooks and courses do not present quantum mechanics as a theory, that is, a description of the world; instead, they present it as a recipe, or set of mathematical procedures, for accomplishing certain tasks.

Learning the recipe can help you predict the results of experiments and design microchips, Maudlin acknowledges. But if a physics student happens to be unsatisfied with just learning these mathematical techniques for making predictions and asks instead what the theory claims about the physical world, she or he is likely to be met with a canonical response: "Shut up and calculate!"

In his book, Maudlin presents several attempts to make sense of quantum mechanics, including the pilot-wave and many-worlds models. His goal is to show that we can translate the Schrödinger equation and other formulas into intelligible accounts of what's happening in, say, the double-slit experiment. But to my mind, Maudlin's ruthless examination of the quantum models subverts his intention. Each model seems preposterous in its own way.

Pondering the plight of physicists, I'm reminded of an argument advanced by philosopher Daniel Dennett in From Bacteria to Bach and Back: The Evolution of Minds. Dennett elaborates on his long-standing claim that consciousness is overrated, at least when it comes to doing what we need to do to get through a typical day. We carry out most tasks with little or no conscious attention.

Dennett calls this competence without comprehension. Adding insult to injury, Dennett suggests that we are virtual zombies. When philosophers refer to zombies, they mean not the clumsy, grunting cannibals of The Walking Dead but creatures that walk and talk like sentient humans but lack inner awareness.

When I reviewed Dennett's book, I slammed him for downplaying consciousness and overstating the significance of unconscious cognition. Competence without comprehension may apply to menial tasks like brushing your teeth or driving a car but certainly not to science and other lofty intellectual pursuits. Maybe Dennett is a zombie, but I'm not! That, more or less, was my reaction.

But lately I've been haunted by the ubiquity of competence without comprehension. Quantum physicists, for example, manipulate differential equations and matrices with impressive competence (enough to build quantum computers!) but no real understanding of what the math means. If physicists end up like information-processing automatons, what hope is there for the rest of us? After all, our minds are habituation machines, designed to turn even complex tasks, like being a parent, husband or teacher, into routines that we perform by rote, with minimal cognitive effort.

The Chinese room experiment serves as a metaphor not only for physics but also for the human condition. Each of us sits alone within the cell of our subjective awareness. Now and then we receive cryptic messages from the outside world. Only dimly comprehending what we are doing, we compose responses, which we slip under the door. In this way, we manage to survive, even though we never really know what the hell is happening.

Further Reading:

Is the Schrödinger Equation True?

Will Artificial Intelligence Ever Live Up to Its Hype?

Can Science Illuminate Our Inner Dark Matter?

See original here:
Quantum Mechanics, the Chinese Room Experiment and the Limits of Understanding - Scientific American

After year of reset expectations, D-Wave secures $40-million from Ottawa for quantum computing – The Globe and Mail

D-Wave is the first company to offer a commercially available quantum computer.


One of Canada's most heavily financed technology development companies, quantum computer maker D-Wave Systems Inc., has secured a $40-million financial contribution from the federal government.

The funding, through Ottawa's Strategic Innovation Fund, follows a year of reset expectations for D-Wave, a leader in the global race to develop computers whose chips harness natural properties of subatomic particles to perform complex calculations faster than conventional computers.

Burnaby, B.C.-based D-Wave is the first company to offer a commercially available quantum computer, but after 20-plus years of development and more than US$300-million in funds raised, it is still in the early stages of building a sustainable business.


Last year D-Wave promoted Silicon Valley veteran executive Alan Baratz to chief executive officer, replacing Vern Brownell, to step up commercialization efforts. The company also parted ways with other top executives and long-time board members.

Mr. Baratz, who led Sun Microsystems Inc.'s effort in the 1990s to transform Java from a nascent programming language into the internet's main software-writing platform, directed D-Wave to stop selling its shed-sized computers, which listed for US$15-million and had just a handful of customers including NASA, Google, Lockheed Martin and the U.S. Los Alamos National Laboratory.

Instead, D-Wave has focused on selling online access to the technology and expanded its software applications, which Mr. Baratz had started developing after joining as chief product officer in 2017. Customers including Volkswagen and biotechnology startups have used D-Wave's technology to find answers to dense optimization problems, such as improving traffic flows in big cities, identifying proteins that could become breakthrough drugs and improving the efficiency of painting operations on vehicle production assembly lines.

D-Wave also completed a costly US$40-million refinancing last year that wiped out most of the value of some long-time investors, including the U.S. Central Intelligence Agency's venture capital arm, Amazon CEO Jeff Bezos and fund giant Fidelity Investments. The capital restructuring cut D-Wave's valuation to less than US$170-million, down from US$450-million, The Globe reported in October. Investors that ponied up, including Public Sector Pension Investment Board, D-Wave's top shareholder, BDC Capital and Goldman Sachs, maintained their relative stakes, limiting their writedowns.

"Over the years [D-Wave has] had to raise money and more money and more money ... and as such you end up getting diluted over time because every third quarter it seems like you run out of the $50-million that you raised," Kevin Rendino, CEO and portfolio manager of D-Wave investor 180 Degree Capital Corp., told his investors last November. "D-Wave has been a source of bitter disappointment for all of us."

Meanwhile, D-Wave faces years and tens of millions of dollars more in costs to continue developing its core technology. The government aid will support a $120-million project to advance D-Wave's hardware and software and "will help place Canada at the forefront of quantum technology development, and will create new jobs and opportunities to help Canadians and advance the economy," François-Philippe Champagne, Minister of Innovation, Science and Industry, said in a release.

During a press conference to discuss the funding, the minister was asked if the government would review potential takeovers of quantum computing companies, as the U.S. government is considering doing. Mr. Champagne provided a non-committal response, saying, "I'm sure you would expect us to be eyes wide open when it comes to whatever we would need to take in terms of steps to protect ... [intellectual property] that has been developed in Canada."


"We're always out there looking at how we can improve to make sure that new technologies and inventions and improvements and IP that has been developed in Canada stays in Canada."

D-Wave faces a slew of competitors including Google, Microsoft, Intel, IBM and Honeywell that are also trying to build the first quantum machine that can outperform classical or conventional computers. In addition, a new class of startups including Toronto's Xanadu Quantum Technologies Inc. and College Park, Md.-based IonQ Inc. believe they can build quantum chips that don't have to be supercooled to function, as D-Wave's system and others in development do. IonQ said this week it would go public through a special purpose acquisition company, becoming the first publicly traded quantum computing-focused company.

Mr. Baratz said in an emailed statement that since D-Wave's launch last September of its latest quantum chip and expanded efforts to sell online access to its computers, "we've been encouraged by the positive customer response to the value delivered by a quantum system designed for practical, in-production business-scale applications. We're eager to see even more developers, academics, and companies leverage it to solve larger, more complex problems."


Read more here:
After year of reset expectations, D-Wave secures $40-million from Ottawa for quantum computing - The Globe and Mail

Can Photonic Computing Solve The Rising Cost & Energy Issues Of AI? – Analytics India Magazine

According to OpenAI data, the amount of computational power needed to train large AI models has grown massively, doubling every three and a half months since 2012. GPT-3, which required 3.14E23 FLOPs of computing to train, is a good case in point.

Typically, to carry out high-performance computing tasks, conventional AI chips are equipped with transistors that work with electrons. Although they perform a wide array of complex, high-performance tasks, energy consumption and engineering glitches pose a challenge. Thus, the growing need for computing power has set researchers on a quest for a workaround that boosts these chips' power without increasing energy consumption.

And that's when experts turned to photons, the particles of light, which can substitute for electrons in AI chips to reduce heat, leading to a massive reduction in energy consumption and a dramatic upgrade in processor speed.

While electronic chips perform calculations by reducing information to a series of 1s and 0s, photonic chips split and mix beams of light within tiny channels to carry out their tasks. Compared to regular AI chips, photonic chips are designed to perform only a certain kind of mathematical calculation, one critical for running large AI models.
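In most photonic accelerator designs, that kind of calculation is matrix-vector multiplication, the core operation of neural-network inference. A plain-Python sketch of the operation that photonic meshes carry out in the analog domain:

```python
# Matrix-vector multiplication: the workhorse operation that photonic
# AI chips implement by splitting and interfering light. Shown here
# digitally, for reference; values are illustrative.

def matvec(W, x):
    """Multiply matrix W (list of rows) by vector x."""
    return [sum(w_ij * x_j for w_ij, x_j in zip(row, x)) for row in W]

W = [[1.0, 2.0],
     [3.0, 4.0]]
x = [1.0, 1.0]
print(matvec(W, x))  # [3.0, 7.0]
```

A neural network layer is essentially this operation repeated at scale, which is why a chip specialized for it can still run large AI models.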

Lightmatter, an MIT-backed startup, last year developed Envise, an AI chip that leverages photons (light particles) to perform computing tasks.

Light travels faster than electrons. The concept of using light to carry out heavy computing tasks (aka photonic computing, or optical computing) dates back to the 1980s, when Bell Labs (now Nokia Bell Labs), an American industrial research and scientific development company, tried to develop a light-based processor. However, due to the impracticality of creating a working optical transistor, the concept didn't take off.

We experience optical technology in cameras, CDs, and even Blu-ray discs. But these photons are usually converted into electrons before being processed in chips. Four decades later, photonic computing gained momentum when IBM and researchers from the Universities of Oxford and Münster developed a system that uses light instead of electricity to perform several AI model-based computations.

Meanwhile, Lightmatter's new AI chip has created a buzz in the industry. According to the company website, Envise can run the largest neural networks at three times the inferences/second of the Nvidia DGX-A100, with seven times the inferences/second/watt on BERT-Base with the SQuAD dataset.

Japan-based NTT has also been developing an optical computer believed to outpace quantum computers at solving optimisation problems. Last year, Chinese quantum physicist Chao-Yang Lu also announced light-based quantum computing.

Other companies like US-based Honeywell and IonQ have also been working around the issue by using trapped ions.

Such developments have led experts to believe photonic computing will gain ground once the big tech companies throw their weight behind it and recognise the importance of using light in their AI chips.

On the other hand, like any other remarkable technology, photonic computing comes with certain challenges. Despite their lower energy consumption, photonic chips are considered less accurate and precise than electron-based chips. Much of this can be attributed to their analogue calculations, which limits them mainly to running pre-trained models and deep neural networks.

On the design side, silicon-based computer chips do not integrate easily with photons, which limits their usage in computing.

The cost issues and environmental impact of digital chips might set the stage for photonic computing to rise as a substitute. With startups like Lightmatter and giants like IBM committing resources to this computing paradigm, AI might get a photonic boost.

Visit link:
Can Photonic Computing Solve The Rising Cost & Energy Issues Of AI? - Analytics India Magazine

What is Quantum Computing | Microsoft Azure

It's the use of quantum mechanics to run calculations on specialized hardware.

To fully define quantum computing, we need to define some key terms first.

The quantum in "quantum computing" refers to the quantum mechanics that the system uses to calculate outputs. In physics, a quantum is the smallest possible discrete unit of any physical property. It usually refers to properties of atomic or subatomic particles, such as electrons, neutrinos, and photons.

A qubit is the basic unit of information in quantum computing. Qubits play a similar role in quantum computing as bits play in classical computing, but they behave very differently. Classical bits are binary and can hold only a position of 0 or 1, but qubits can hold a superposition of all possible states.

Quantum computers harness the unique behavior of quantum physics, such as superposition, entanglement, and quantum interference, and apply it to computing. This introduces new concepts to traditional programming methods.

In superposition, quantum particles are a combination of all possible states. They fluctuate until they're observed and measured. One way to picture the difference between binary position and superposition is to imagine a coin. Classical bits are measured by "flipping the coin" and getting heads or tails. However, if you were able to look at a coin and see both heads and tails at the same time, as well as every state in between, the coin would be in superposition.
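One way to make the coin analogy concrete is to model a qubit as a pair of complex amplitudes whose squared magnitudes give the measurement probabilities. A minimal simulation sketch (just the textbook math, not real quantum hardware):

```python
import math
import random

# A qubit as a pair of amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. Measurement yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2 -- the "coin" collapses
# to heads or tails only when observed.

def measure(alpha, beta):
    p0 = abs(alpha) ** 2
    return 0 if random.random() < p0 else 1

# Equal superposition: the quantum analogue of the spinning coin.
alpha = beta = 1 / math.sqrt(2)
counts = [0, 0]
for _ in range(10000):
    counts[measure(alpha, beta)] += 1
print(counts)  # roughly [5000, 5000]
```

Before measurement the state carries both amplitudes at once; the 50/50 statistics only appear when you measure repeatedly.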

Entanglement is the ability of quantum particles to correlate their measurement results with each other. When qubits are entangled, they form a single system and influence each other. We can use the measurements from one qubit to draw conclusions about the others. By adding and entangling more qubits in a system, quantum computers can calculate exponentially more information and solve more complicated problems.
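The correlation can be sketched by sampling the Bell state (|00> + |11>)/sqrt(2), whose only possible measurement outcomes are 00 and 11. A toy simulation of the joint statistics:

```python
import random

# Two entangled qubits in the Bell state (|00> + |11>)/sqrt(2):
# the joint state has only two outcomes, 00 and 11, each with
# probability 1/2, so the two measurement results always agree.

def measure_bell_pair():
    # Sample from the joint outcome distribution of the Bell state.
    outcome = random.choice(["00", "11"])
    return outcome[0], outcome[1]

for _ in range(5):
    a, b = measure_bell_pair()
    assert a == b  # knowing one qubit's result fixes the other's
print("results always correlated")
```

This is what "using the measurements from one qubit to draw conclusions about the others" means in the simplest two-qubit case.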

Quantum interference is the intrinsic behavior of a qubit, due to superposition, to influence the probability of it collapsing one way or another. Quantum computers are designed and built to reduce interference as much as possible and ensure the most accurate results. To this end, Microsoft uses topological qubits, which are stabilized by manipulating their structure and surrounding them with chemical compounds that protect them from outside interference.

A quantum computer has three primary parts: an area that houses the qubits, a method for transferring signals to the qubits, and a classical computer to run a program and send instructions.

For some methods of qubit storage, the unit that houses the qubits is kept at a temperature just above absolute zero to maximize their coherence and reduce interference. Other types of qubit housing use a vacuum chamber to help minimize vibrations and stabilize the qubits.

Signals can be sent to the qubits by a variety of methods, including microwaves, lasers, and voltage.

Quantum computer uses and application areas

A quantum computer can't do everything faster than a classical computer, but there are a few areas where quantum computers have the potential to make a big impact.

Quantum computers work exceptionally well for modeling other quantum systems because they use quantum phenomena in their computation. This means that they can handle the complexity and ambiguity of systems that would overload classical computers. Examples of quantum systems that we can model include photosynthesis, superconductivity, and complex molecular formations.

Classical cryptography, such as the Rivest-Shamir-Adleman (RSA) algorithm that's widely used to secure data transmission, relies on the intractability of problems such as integer factorization or discrete logarithms. Many of these problems can be solved more efficiently using quantum computers.
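A quick illustration of why factoring matters here: trial division, the naive classical approach, takes on the order of sqrt(n) steps, while Shor's quantum algorithm factors in time polynomial in the number of digits. A toy factorizer (the modulus below is the small textbook RSA example, nothing like a realistic 600-digit key):

```python
# RSA's security rests on factoring n = p*q being hard classically.
# Trial division takes ~sqrt(n) steps, which becomes astronomical
# for real key sizes; Shor's algorithm would break this scaling.

def factor(n):
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # n is prime

print(factor(3233))  # (53, 61): 3233 is the textbook RSA modulus 53*61
```

For a 2048-bit modulus the same loop would need on the order of 2^1024 iterations, which is the intractability the paragraph refers to.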

Optimization is the process of finding the best solution to a problem given its desired outcome and constraints. In science and industry, critical decisions are made based on factors such as cost, quality, and production timeall of which can be optimized. By running quantum-inspired optimization algorithms on classical computers, we can find solutions that were previously impossible. This helps us find better ways to manage complex systems such as traffic flows, airplane gate assignments, package deliveries, and energy storage.
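Quantum and quantum-inspired optimizers often phrase such problems as QUBO (quadratic unconstrained binary optimization): minimize x^T Q x over binary vectors x. A brute-force sketch for a tiny, made-up instance, which is exactly the exponential search that better methods aim to beat:

```python
from itertools import product

# QUBO: minimize x^T Q x over binary vectors x. Brute force works
# only for tiny instances; the cost of enumerating all 2^n vectors
# is why quantum-inspired heuristics are interesting at scale.

def qubo_min(Q):
    n = len(Q)
    best = None
    for x in product([0, 1], repeat=n):
        cost = sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))
        if best is None or cost < best[0]:
            best = (cost, x)
    return best

# Toy instance: setting either bit is rewarded, setting both is penalized.
Q = [[-1, 2],
     [0, -1]]
print(qubo_min(Q))  # (-1, (0, 1))
```

Traffic routing, gate assignment and similar problems from the paragraph above can all be encoded as (much larger) instances of this form.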

Machine learning on classical computers is revolutionizing the world of science and business. However, training machine learning models comes with a high computational cost, and that has hindered the scope and development of the field. To speed up progress in this area, we're exploring ways to devise and implement quantum software that enables faster machine learning.

Grover's algorithm, developed in 1996, dramatically sped up the solution to unstructured data searches, running the search in fewer steps than any classical algorithm could.
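Grover's speedup can be reproduced at the level of amplitudes in a few lines. For N = 4 items, a single iteration of the oracle-plus-diffusion step drives the marked item's probability to 1, versus an average of N/2 classical guesses; in general about (pi/4)*sqrt(N) iterations suffice:

```python
import math

# Amplitude-level simulation of Grover's search over N items with
# one marked index: start in uniform superposition, then repeat
# (oracle sign flip + inversion about the mean).

def grover(n_items, marked, iterations):
    amps = [1 / math.sqrt(n_items)] * n_items   # uniform superposition
    for _ in range(iterations):
        amps[marked] = -amps[marked]            # oracle flips the mark's sign
        mean = sum(amps) / n_items
        amps = [2 * mean - a for a in amps]     # diffusion: reflect about mean
    return [a * a for a in amps]                # measurement probabilities

probs = grover(4, marked=2, iterations=1)
print(probs)  # [0.0, 0.0, 1.0, 0.0]
```

This is a classical simulation of the quantum bookkeeping, so it takes no fewer steps than ordinary search; the speedup only materializes on quantum hardware, where the amplitudes evolve physically.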

Azure Quantum resources

Build quantum solutions today as an early adopter of Azure Quantum Preview, a full-stack open cloud ecosystem. Access software, hardware, and pre-built solutions and start developing on a trusted, scalable, and secure platform.

Continued here:
What is Quantum Computing | Microsoft Azure

In battle with U.S., China to focus on 7 ‘frontier’ technologies from chips to brain-computer fusion – CNBC

GUANGZHOU, China China is looking to boost research into what it calls "frontier technology" including quantum computing and semiconductors, as it competes with the U.S. for supremacy in the latest innovations.

In its five-year development plan, the 14th of its kind, Beijing said it would make "science and technology self-reliance and self-improvement a strategic pillar for national development," according to a CNBC translation.

Premier Li Keqiang said on Friday that China would increase research and development spending by more than 7% per year between 2021 and 2025, in pursuit of "major breakthroughs" in technology.

China's technology champions such as Huawei and SMIC have been targeted by U.S. sanctions as tensions between Beijing and Washington have ramped up in the past few years.

As such, China has concentrated on boosting its domestic expertise in areas it sees as strategically important, such as semiconductors. And now it has laid out seven "frontier technologies" that it will prioritize not just for the next five years, but beyond too.

China plans to focus on specialized chip development for AI applications and on developing so-called open source algorithms. Open source technology is usually developed by one entity and made freely available for other companies to use and build on.

There will also be an emphasis on machine learning in areas such as decision making. Machine learning is the development of AI programs trained on vast amounts of data. The program "learns" as it is fed more data.

AI has been a key field for Chinese companies and the central government over the last few years. Major companies such as Alibaba and Baidu have been investing in the technology.

China and the U.S. are competing for AI dominance. A group of experts chaired by former Google CEO Eric Schmidt said China could soon replace the U.S. as the world's "AI superpower."

Semiconductors are a critical area for China and one it has invested a lot in over the past few years but the country has struggled to catch up to the U.S., Taiwan and South Korea.

The problem is the complexity of the semiconductor supply chain. Taiwan's TSMC and South Korea's Samsung are the two most advanced chip manufacturers but they rely on tools from the U.S. and Europe.

Washington has put SMIC, China's biggest chip manufacturer, on an export blacklist called the Entity List. SMIC cannot get its hands on American technology. And the U.S. has reportedly pushed to stop Dutch company ASML from shipping a key tool that could help SMIC catch up to rivals.

Since China doesn't have the companies that can design and make the tools that its chip manufacturers require, it relies on companies from other countries. This is something China wants to change.

In its five-year plan, China says it will focus on research and development in integrated circuit design tools, key equipment and key materials.

Chips are incredibly important because they go into many of the devices we use such as smartphones but are also important for other industries.

China plans to research areas such as how to stop diseases of the brain.

But it also says that it plans to look into "brain-inspired computing" as well as "brain-computer fusion technology," according to a CNBC translation. The five-year plan did not elaborate on what that could look like.

China laid out seven "frontier" technologies in its 14th Five Year Plan. These are areas that China will focus research on and include semiconductors and brain-computer fusion.


However, such work is already underway in the U.S. at Elon Musk's company Neuralink. Musk is working on implantable brain-chip interfaces to connect humans and computers.

With the outbreak of the coronavirus last year, biotechnology has grown in importance.

China says it will focus on "innovative vaccines" and "research on biological security."

China's research will concentrate on understanding the progression of cancer, cardiovascular, respiratory and metabolic diseases.

The government also says that it will research some "cutting-edge" treatment technologies such as regenerative medicine. This involves medicine that can regrow or repair damaged cells, tissues and organs.

China says it will also be looking at key technologies in the prevention and treatment of major transmissible diseases.

Space exploration has been a top priority for China recently. Beijing said it will focus on research into the "origin and evolution of the universe," exploration of Mars as well as deep sea and polar research.

In December, a Chinese spacecraft returned to Earth carrying rocks from the moon. It was the first time China had launched a spacecraft from an extraterrestrial body and the first time it had collected moon samples.

And in July, China launched a mission to Mars called Tianwen-1.

CNBC's Iris Wang contributed to this report.

The rest is here:
In battle with U.S., China to focus on 7 'frontier' technologies from chips to brain-computer fusion - CNBC