Archive for the ‘Quantum Computing’ Category

Qubits and Defining the Quantum Computer | HowStuffWorks

The Turing machine, developed by Alan Turing in the 1930s, is a theoretical device that consists of a tape of unlimited length divided into little squares. Each square can either hold a symbol (1 or 0) or be left blank. A read-write head reads these symbols and blanks, which gives the machine its instructions to perform a certain program. Does this sound familiar? Well, in a quantum Turing machine the difference is that the tape exists in a quantum state, as does the read-write head. This means that the symbols on the tape can be 0, 1, or a superposition of 0 and 1; in other words, the symbols are both 0 and 1 (and all points in between) at the same time. While a normal Turing machine can only perform one calculation at a time, a quantum Turing machine can perform many calculations at once.

Today's computers, like a Turing machine, work by manipulating bits that exist in one of two states: a 0 or a 1. Quantum computers aren't limited to two states; they encode information as quantum bits, or qubits, which can exist in superposition. Qubits can be built from atoms, ions, photons or electrons, together with their respective control devices, working in concert to act as computer memory and a processor. Because a quantum computer can hold these multiple states simultaneously, it has the potential to be millions of times more powerful than today's most powerful supercomputers.
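As a rough illustration (a toy sketch, not anything from the original article), a qubit can be modelled as a pair of complex amplitudes whose squared magnitudes give the odds of measuring 0 or 1:

```python
import numpy as np

# A qubit is a pair of complex amplitudes; measuring it yields 0 or 1 with
# probabilities given by the squared magnitudes of those amplitudes.
ket0 = np.array([1, 0], dtype=complex)   # the |0> state
ket1 = np.array([0, 1], dtype=complex)   # the |1> state

# An equal superposition of |0> and |1> (what a Hadamard gate produces).
plus = (ket0 + ket1) / np.sqrt(2)

probabilities = np.abs(plus) ** 2        # -> [0.5, 0.5]
print("P(0), P(1):", probabilities)

# Sampling a measurement collapses the superposition to one definite value.
outcome = np.random.choice([0, 1], p=probabilities)
print("measured:", outcome)
```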

This superposition of qubits is what gives quantum computers their inherent parallelism. According to physicist David Deutsch, this parallelism allows a quantum computer to work on a million computations at once, while your desktop PC works on one. A 30-qubit quantum computer would equal the processing power of a conventional computer that could run at 10 teraflops (trillions of floating-point operations per second). Today's typical desktop computers run at speeds measured in gigaflops (billions of floating-point operations per second).
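One way to see where such figures come from is simple arithmetic: an n-qubit register is described by 2^n complex amplitudes, and even simulating 30 qubits classically means holding more than a billion amplitudes in memory. A back-of-the-envelope sketch (illustrative only, not a benchmark):

```python
# An n-qubit register is described by 2**n complex amplitudes, and a quantum
# operation acts on all of them at once. Holding that state classically
# quickly becomes enormous.
n_qubits = 30
amplitudes = 2 ** n_qubits                 # ~1.07 billion amplitudes
bytes_needed = amplitudes * 16             # 16 bytes per double-precision complex
print(f"{amplitudes:,} amplitudes -> ~{bytes_needed / 2**30:.0f} GiB of state vector")
```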

Quantum computers also utilize another aspect of quantum mechanics known as entanglement. One problem with the idea of quantum computers is that if you try to look at the subatomic particles, you could bump them, and thereby change their value. If you look at a qubit in superposition to determine its value, the qubit will assume the value of either 0 or 1, but not both (effectively turning your spiffy quantum computer into a mundane digital computer). To make a practical quantum computer, scientists have to devise ways of making measurements indirectly to preserve the system's integrity. Entanglement provides a potential answer. In quantum physics, if you apply an outside force to two atoms, it can cause them to become entangled, so that the second atom takes on properties of the first. Left alone, an atom will spin in all directions; the instant it is disturbed it chooses one spin, or one value, and at the same time the second, entangled atom will choose the opposite spin, or value. This allows scientists to know the value of the qubits without actually looking at them.
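That "opposite value" behaviour can be sketched with a tiny state-vector simulation (again, a toy illustration rather than anything from the article):

```python
import numpy as np

# Two-qubit state over the basis |00>, |01>, |10>, |11>. The entangled state
# (|01> - |10>)/sqrt(2) always gives opposite results for the two qubits,
# mirroring the "opposite spin" behaviour described above.
state = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)
probs = np.abs(state) ** 2                 # probability of each basis state

for _ in range(5):
    idx = np.random.choice(4, p=probs)
    a, b = divmod(idx, 2)                  # outcome for qubit A, outcome for qubit B
    print(f"qubit A = {a}, qubit B = {b}") # always 0,1 or 1,0, never the same
```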

Next, we'll look at some recent advancements in the field of quantum computing.


Google's Quantum Supremacy will mark the End of the Bitcoin in 2020 – The Coin Republic

Ritika Sharma | Monday, 13 January 2020, 03:49 EST | Modified: Monday, 13 January 2020, 05:00 EST

Whenever quantum computing has hit the headlines, it has left not just Bitcoin holders but every cryptocurrency holder worried about the uncertainty around their holdings.

It is widely believed that Bitcoin's underlying technology, the blockchain, is immutable, meaning it cannot be changed by anyone without authority over the encryption keys.

However, with quantum computers it may become possible to break a blockchain's cryptographic codes. Quantum computing could undermine the blockchain's most significant features, such as immutability and security, making it vulnerable.

Google claimed to have achieved quantum supremacy in late 2019, which poses a potential threat to Bitcoin. It would also be a threat to blockchain generally, as quantum computing could affect key blockchain features such as inalterability and security, making blockchain a highly vulnerable technology.

Later, China joined Google in the quantum supremacy race and announced its own work on quantum technology. With this, the year 2020 might witness the end of the crypto era.

How can quantum computing break the blockchain?

The reason behind this fear is quite simple and straightforward: Bitcoin, like any cryptocurrency, depends on cryptography; its hash functions and asymmetric (public-key) cryptography are secure mainly because of the limits of classical computing power. The hash function calculates an effectively random number for each block.

The results obtained by this process are easy to verify but hard to find. Quantum computing, however, brings powerful algorithmic capabilities that are precisely the enemy of this kind of key.
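A toy proof-of-work sketch makes the easy-to-verify, hard-to-find asymmetry concrete; the snippet below is a deliberate simplification and does not reflect Bitcoin's actual block format or difficulty rules:

```python
import hashlib

# Toy proof-of-work: finding a nonce whose SHA-256 hash starts with several
# zero hex digits takes many attempts, but verifying a claimed nonce needs
# only a single hash.
def find_nonce(block_data: str, difficulty: int = 4) -> int:
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce
        nonce += 1

def verify(block_data: str, nonce: int, difficulty: int = 4) -> bool:
    digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

data = "block 1: Alice pays Bob 1 BTC"
nonce = find_nonce(data)                    # slow: brute-force search
print(nonce, verify(data, nonce))           # fast: a single hash
```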

Quantum computing uses subatomic particles that can exist in more than one state at a time. This feature makes quantum computers far faster, for certain problems, than the technology we use today.

Quantum computers are claimed to work 100 million times faster than current systems; such computational power could solve in a matter of seconds certain problems that current systems would take 10,000 years to solve.

With such computational power, quantum computers could invert the one-way functions that current encryption relies on, making one-way encryption obsolete.

The risk to the blockchain is greater if such a machine gets into the wrong hands. Hackers with a quantum computer could attack the cryptocurrency ledger and take complete control of the blockchain.

Will Google's quantum computing wipe out your Bitcoins?

Google's quantum supremacy is only relative to traditional computers on a specific classical problem; it is not yet practical quantum technology. It was presented bluntly as "quantum supremacy", though it is just one step in the quantum computing space.

Even if Google's quantum computer demonstrates that its computing power on specific problems far exceeds that of the best-performing supercomputers, the results of this research do not mean much for Bitcoin. This is not even close to what we could call breaking Bitcoin or the blockchain.

Although Google's quantum supremacy does not pose any immediate threat to Bitcoin, many people in the space are still stressed about the quantum threat. Many analysts claim that Shor's quantum algorithm can crack private keys, but again, there is a long way to go before it could break Bitcoin's blockchain.

According to researchers, a quantum computer with around 4,000 qubits would be able to break the blockchain. Google's quantum computer has only 53 qubits, which cannot cause any harm to the blockchain, and it is worth mentioning that the higher the qubit count, the more difficult such a machine is to build.

Satoshi Nakamoto's proposed solution to beat quantum supremacy

Satoshi was a true visionary: the things we are concerned about today had already been answered by him. In 2010, Satoshi Nakamoto responded to a question about quantum computers posed by the username llama on the Bitcointalk forum.

He replied that if Bitcoin were suddenly cracked, the signature scheme would be destroyed; but if it were broken gradually, the system would still have time to convert to a stronger function and re-sign all assets. Another, blunter answer to this question was suggested by the author of Mastering Bitcoin, Andreas Antonopoulos: "If the quantum computer comes, we will upgrade."

The quantum supremacy threat isn't new to the crypto world. Many cryptocurrency projects, such as Ethereum, quantum chains and others, have focused on making their blockchains quantum resistant, and experts in the cryptocurrency space are also advocating the development of quantum-resistant encryption technology to ensure the security of funds.

Until a threat from actual quantum computing with far more powerful processors materialises, Bitcoin and its developers still have time to secure it. Even so, with the continuous development of quantum technology and of chips with ever more qubits, a sword of Damocles will keep hanging over the head of cryptocurrency.


Bleeding edge information technology developments – IT World Canada

What are some bleeding-edge information technology developments that a forward-thinking CIO should keep an eye on?

Here are a few emerging technologies that have caught my attention. These are likely to have an increasing impact on the world of business in the future. Consider which ones you should follow a little more closely.

A recent advance achieved by a Google team indicates that quantum computing technology is moving out of the lab and closing in on practical business applications. Quantum computing is not likely to change routine business transaction processing or data analytics applications. However, it is likely to dramatically change the computationally intense applications required for:

Since most businesses can benefit from at least a few of these applications, quantum computing is worth evaluating. For a more detailed discussion of specific applications in various topic areas, please read: Applying Paradigm-Shifting Quantum Computers to Real-World Issues.

Machine learning is the science of computers acting without software developers writing detailed code to handle every case in the data that the software will encounter. Machine learning software develops its own algorithms that discover knowledge from specific data and the software's prior experience. Machine learning is based on statistical concepts and computational principles.

The leading cloud computing infrastructure providers offer machine learning routines that are quite easy to integrate into machine learning applications. These routines greatly reduce expertise barriers that have slowed machine learning adoption at many businesses.
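As a small illustration of the idea, and not of any particular cloud provider's API, a few lines of scikit-learn are enough to train a model that derives its own decision rules from data:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# No hand-written classification rules: the model derives its own decision
# logic from the training data, which is the core idea described above.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```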

Selected business applications of machine learning include:

For summary descriptions of specific applications, please read: 10 Companies Using Machine Learning in Cool Ways.

Distributed ledger technology is often called blockchain. It enables new business and trust models. A distributed ledger enables all parties in a business community to see agreed information about all transactions, not just their own. That visibility builds trust within the community.

Bitcoin, a cryptocurrency, is the most widely known example application of blockchain.
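A minimal hash-linked ledger, sketched below purely for illustration (it omits proof-of-work, networking and consensus), shows why tampering with any past entry is immediately visible to anyone holding a copy:

```python
import hashlib
import json

# Each block commits to the hash of its predecessor, so changing any past
# transaction changes every later hash and the tampering becomes visible to
# all parties holding a copy of the ledger.
def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

GENESIS = "0" * 64

chain, prev = [], GENESIS
for tx in ["Alice->Bob: 5", "Bob->Carol: 2", "Carol->Dave: 1"]:
    block = {"tx": tx, "prev": prev}
    chain.append(block)
    prev = block_hash(block)

def is_valid(chain: list) -> bool:
    prev = GENESIS
    for block in chain:
        if block["prev"] != prev:
            return False
        prev = block_hash(block)
    return True

print(is_valid(chain))              # True
chain[0]["tx"] = "Alice->Bob: 500"  # tamper with an old transaction
print(is_valid(chain))              # False: later links no longer match
```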

Distributed ledger technology has great potential to revolutionize the way governments, institutions, and corporations interact with each other and with their clients or customers. Selected business applications of distributed ledger technology include:

For descriptions of industry-specific distributed ledger applications, please read: 17 Blockchain Applications That Are Transforming Society.

The Industrial Internet of Things (IIoT) is a major advance on Supervisory Control and Data Acquisition (SCADA). SCADA, in many forms, has been used for decades to safely operate major industrial facilities including oil refineries, petrochemical plants, electrical power generation stations, and assembly lines of all kinds.

IIoT is a major advance over relatively expensive SCADA. IIoT relies on dramatically cheaper components including sensors, network bandwidth, storage and computing resources. As a result, IIoT is feasible in many smaller facilities and offers a huge increase in data points for larger facilities. Business examples where IIoT delivers considerable value include production plants, trucks, cars, jet engines, elevators, and weather buoys.

The aggressive implementation of IIoT can:

For summary descriptions of specific IIoT applications, please read: The Top 20 Industrial IoT Applications.

RISC-V is an open-source hardware instruction set architecture (ISA) for CPU microprocessors that is growing in importance. It's based on established reduced instruction set computer (RISC) principles. The open-source nature of the RISC-V ISA is a significant change compared with the proprietary ISA designs of the dominant computer chip companies, Intel and Arm.

RISC-V offers a way around paying ISA royalties for CPU microprocessors to either of the monopolists. The royalties may not be significant for chips used in expensive servers or smartphones, but they are significant for the cheap chips required in large numbers to implement the IIoT applications listed above.

For an expanded discussion of RISC-V, please read: A new blueprint for microprocessors challenges the industry's giants.

What bleeding edge information technology developments would you add to this list? Let us know in the comments below.


New York University Partners with IBM to Explore Quantum Computing for Simulation of Quantum Systems and Advancing Quantum Education – Quantaneo, the…

The announcement of the agreement was made during CES 2020, the annual global technology conference and showcase in Las Vegas.

Together with the Air Force Research Lab (AFRL) and IBM, NYU will explore quantum computing research to study measurement-based quantum computing, materials discovery with variational quantum eigensolver, and emulating new phases on small quantum systems.

"We are excited to join AFRL and IBM to transform quantum computing concepts into a powerful technology by educating a new quantum workforce, expanding our scientific partnership and engaging in cross-disciplinary collaboration," said Javad Shabani, an assistant professor of physics at NYU.

Under the agreement to join the AFRL hub, NYU will be part of a community of Fortune 500 companies, startups, academic institutions, and research labs working to advance quantum computing and explore practical applications. NYU will leverage IBM's quantum expertise and resources, Qiskit software and developer tools, and will have cloud-based access to IBM's Quantum Computation Center. IBM offers, through the cloud, 15 of the most advanced universal quantum computing systems available, including a 53-qubit system, the largest commercially available system in the industry.


We're approaching the limits of computer power – we need new programmers now – The Guardian

Way back in the 1960s, Gordon Moore, the co-founder of Intel, observed that the number of transistors that could be fitted on a silicon chip was doubling every two years. Since the transistor count is related to processing power, that meant that computing power was effectively doubling every two years. Thus was born Moore's law, which for most people working in the computer industry – or at any rate those younger than 40 – has provided the kind of bedrock certainty that Newton's laws of motion did for mechanical engineers.

There is, however, one difference. Moore's law is just a statement of an empirical correlation observed over a particular period in history and we are reaching the limits of its application. In 2010, Moore himself predicted that the laws of physics would call a halt to the exponential increases. "In terms of size of transistor," he said, "you can see that we're approaching the size of atoms, which is a fundamental barrier, but it'll be two or three generations before we get that far – but that's as far out as we've ever been able to see. We have another 10 to 20 years before we reach a fundamental limit."
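Taken literally, that forecast is simple arithmetic: doubling every two years for another 10 to 20 years leaves only five to ten more doublings (a rough reading, not Moore's own calculation):

```python
# Doubling every two years for another 10 to 20 years is only five to ten
# more doublings before the physical limit is reached.
for years in (10, 20):
    doublings = years // 2
    print(f"{years} years -> {doublings} doublings -> {2 ** doublings:,}x transistor count")
```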

We've now reached 2020 and so the certainty that we will always have sufficiently powerful computing hardware for our expanding needs is beginning to look complacent. Since this has been obvious for decades to those in the business, there's been lots of research into ingenious ways of packing more computing power into machines, for example using multi-core architectures in which a CPU has two or more separate processing units called cores in the hope of postponing the awful day when the silicon chip finally runs out of road. (The new Apple Mac Pro, for example, is powered by a 28-core Intel Xeon processor.) And of course there is also a good deal of frenzied research into quantum computing, which could, in principle, be an epochal development.

But computing involves a combination of hardware and software and one of the predictable consequences of Moore's law is that it made programmers lazier. Writing software is a craft and some people are better at it than others. They write code that is more elegant and, more importantly, leaner, so that it executes faster. In the early days, when the hardware was relatively primitive, craftsmanship really mattered. When Bill Gates was a lad, for example, he wrote a Basic interpreter for one of the earliest microcomputers, the TRS-80. Because the machine had only a tiny read-only memory, Gates had to fit it into just 16 kilobytes. He wrote it in assembly language to increase efficiency and save space; there's a legend that for years afterwards he could recite the entire program by heart.

There are thousands of stories like this from the early days of computing. But as Moore's law took hold, the need to write lean, parsimonious code gradually disappeared and incentives changed. Programming became industrialised as software engineering. The construction of sprawling software ecosystems such as operating systems and commercial applications required large teams of developers; these then spawned associated bureaucracies of project managers and executives. Large software projects morphed into the kind of death march memorably chronicled in Fred Brooks's celebrated book, The Mythical Man-Month, which was published in 1975 and has never been out of print, for the very good reason that it's still relevant. And in the process, software became bloated and often inefficient.

But this didn't matter because the hardware was always delivering the computing power that concealed the bloatware problem. Conscientious programmers were often infuriated by this. "The only consequence of the powerful hardware I see," wrote one, "is that programmers write more and more bloated software on it. They become lazier, because the hardware is fast they do not try to learn algorithms nor to optimise their code – this is crazy!"

It is. In a lecture in 1997, Nathan Myhrvold, who was once Bill Gates's chief technology officer, set out his Four Laws of Software. 1: software is like a gas – it expands to fill its container. 2: software grows until it is limited by Moore's law. 3: software growth makes Moore's law possible – people buy new hardware because the software requires it. And, finally, 4: software is only limited by human ambition and expectation.

As Moore's law reaches the end of its dominion, Myhrvold's laws suggest that we basically have only two options. Either we moderate our ambitions or we go back to writing leaner, more efficient code. In other words, back to the future.

What just happened? Writer and researcher Dan Wang has a remarkable review of the year in technology on his blog, including an informed, detached perspective on the prospects for Chinese domination of new tech.

Algorithm says no: There's a provocative essay by Cory Doctorow on the LA Review of Books blog on the innate conservatism of machine learning.

Fall of the big beasts: "How to lose a monopoly: Microsoft, IBM and antitrust" is a terrific long-view essay about company survival and change by Benedict Evans on his blog.
