Archive for the ‘Quantum Computing’ Category

World High Performance Computing (HPC) Markets to 2025 – AI, IoT, and 5G will be Major Drivers for HPC Growth as they Facilitate the Need to Process…

DUBLIN, Jan. 9, 2020 /PRNewswire/ -- The "High Performance Computing (HPC) Market by Component, Infrastructure, Services, Price Band, HPC Applications, Deployment Types, Industry Verticals, and Regions 2020-2025" report has been added to ResearchAndMarkets.com's offering.

This report evaluates the HPC market, including companies, solutions, use cases, and applications. Analysis covers HPC by organizational size, software and system type, server type, price band, and industry vertical. The report also assesses the market for integration of various artificial intelligence technologies in HPC, and evaluates the exascale-level HPC market with analysis by component, hardware type, service type, and industry vertical.

High Performance Computing (HPC) may be provided via a supercomputer or via parallel processing techniques such as leveraging clusters of computers to aggregate computing power. HPC is well-suited for applications that require high performance data computation such as certain financial services, simulations, and various R&D initiatives.

The demand side of the market is currently dominated by large corporations, universities, and government institutions, whose HPC capabilities are typically used to solve very specific problems. Examples include financial services organizations, government R&D facilities, and university research programs.

However, the cloud-based as-a-Service model allows HPC offerings to be extended via HPC-as-a-Service (HPCaaS) to a much wider range of industry verticals and companies, thereby providing computational services to solve a much broader array of problems. Industry use cases that benefit from HPC-level computing are increasingly emerging, many of which split processing between a localized device or platform and HPCaaS.

In fact, HPCaaS is poised to become much more commonly available, partly due to new on-demand supercomputer service offerings and partly as a result of emerging AI-based tools for engineers. Accordingly, the report projects that up to 45% of HPC revenue will be directly attributable to the cloud-based business model via HPCaaS.

In a recent study, we conducted interviews with major players in the market as well as smaller, lesser-known companies believed to be influential in delivering innovative solutions likely to drive adoption and usage of both cluster-based HPC and supercomputing.

In an effort to identify growth opportunities for the HPC market, we investigated market gaps, including unserved and underserved markets and submarkets. The research uncovered a market in which HPC currently suffers from an accessibility problem, as well as from inefficiencies and supercomputing skills gaps.

Stated differently, the market for HPC as a Service (i.e., access to high-performance computing services) currently suffers from problems related to utilization, scheduling, and the set-up time required to run jobs on a supercomputer. We identified start-ups and small companies working to solve these problems.

One of the challenge areas identified is low utilization combined, ironically, with high wait times for most supercomputers. Scheduling is difficult largely because workload runtimes are hard to estimate: about 20% of jobs are computationally heavy, and roughly 30% of jobs cannot be characterized well in terms of how long they will take (within a 3-minute window at best). In many instances, users request substantial resources and then do not actually use the computing time.

In addition to the scheduling challenge, we also identified a company focused on solving additional problems such as computational planning and engineering. We spoke with the principal of a little-known company called MicroSurgeonBot Inc. (doing business as MSB.ai), which is developing a tool for setting up computing jobs for supercomputers.

The company is working to remove major obstacles to the accessibility and usability of HPC resources, focusing on a very important problem in HPC: the supercomputer job set-up and skills gap. Its solution, known as "Guru," is poised to make supercomputing much more accessible, especially to engineers in small and medium-sized businesses that do not have the same resources or expertise as large corporate entities.

Key Topics Covered

1 Executive Summary
1.1 Companies in Report
1.2 Target Audience
1.3 Methodology

2 Introduction
2.1 Next Generation Computing
2.2 High Performance Computing
2.2.1 HPC Technology
2.2.1.1 Supercomputers
2.2.1.2 Computer Clustering
2.2.2 Exascale Computation
2.2.2.1 United States
2.2.2.2 China
2.2.2.3 Europe
2.2.2.4 Japan
2.2.2.5 India
2.2.2.6 Taiwan
2.2.3 High Performance Technical Computing
2.2.4 Market Segmentation Considerations
2.2.4.1 Government, NGOs, and Universities
2.2.4.2 Small Companies and Middle Market
2.2.5 Use Cases and Application Areas
2.2.5.1 Computer Aided Engineering
2.2.5.2 Government
2.2.5.3 Financial Services
2.2.5.4 Education and Research
2.2.5.5 Manufacturing
2.2.5.6 Media and Entertainment
2.2.5.7 Electronic Design Automation
2.2.5.8 Bio-Sciences and Healthcare
2.2.5.9 Energy Management and Utilities
2.2.5.10 Earth Science
2.2.6 Regulatory Framework
2.2.7 Value Chain Analysis
2.2.8 AI to Drive HPC Performance and Adoption

3 High Performance Computing Market Analysis and Forecast 2020-2025
3.1 Global High Performance Computing Market 2020-2025
3.1.1 Total High Performance Computing Market 2020-2025
3.1.2 High Performance Computing Market by Component 2020-2025
3.1.2.1 High Performance Computing Market by Hardware and Infrastructure Type 2020-2025
3.1.2.1.1 High Performance Computing Market by Server Type 2020-2025
3.1.2.2 High Performance Computing Market by Software and System Type 2020-2025
3.1.2.3 High Performance Computing Market by Professional Service Type 2020-2025
3.1.3 High Performance Computing Market by Deployment Type 2020-2025
3.1.4 High Performance Computing Market by Organization Size 2020-2025
3.1.5 High Performance Computing Market by Server Price Band 2020-2025
3.1.6 High Performance Computing Market by Application Type 2020-2025
3.1.6.1 High Performance Technical Computing Market by Industry Vertical 2020-2025
3.1.6.2 Critical High Performance Business Computing Market by Industry Vertical 2020-2025
3.1.7 High Performance Computing Deployment Options: Supercomputer vs. Clustering 2020-2025
3.1.8 High Performance Computing as a Service (HPCaaS) 2020-2025
3.1.9 AI Powered High Performance Computing Market
3.1.9.1 AI Powered High Performance Computing Market by Component
3.1.9.2 AI Powered High Performance Computing Market by AI Technology
3.2 Regional High Performance Computing Market 2020-2025
3.3 Exascale Computing Market 2020-2025
3.3.1 Exascale Computing Driven HPC Market by Component 2020-2025
3.3.2 Exascale Computing Driven HPC Market by Hardware Type 2020-2025
3.3.3 Exascale Computing Driven HPC Market by Service Type 2020-2025
3.3.4 Exascale Computing Driven HPC Market by Industry Vertical 2020-2025
3.3.5 Exascale Computing as a Service 2020-2025

4 High Performance Computing Company Analysis
4.1 HPC Vendor Ecosystem
4.2 Leading HPC Companies
4.2.1 Amazon Web Services Inc.
4.2.2 Atos SE
4.2.3 Advanced Micro Devices Inc.
4.2.4 Cisco Systems
4.2.5 DELL Technologies Inc.
4.2.6 Fujitsu Ltd.
4.2.7 Hewlett Packard Enterprise (HPE)
4.2.8 IBM Corporation
4.2.9 Intel Corporation
4.2.10 Microsoft Corporation
4.2.11 NEC Corporation
4.2.12 NVIDIA
4.2.13 Rackspace Inc.
4.3 Companies to Watch
4.3.1 Braket Inc.
4.3.2 MicroSurgeonBot Inc. (MSB.ai)

5 Conclusions and Recommendations
5.1 AI to Support Adoption and Usage of HPC
5.2 5G and 6G to Drive Increased Demand for HPC

6 Appendix: Future of Computing
6.1 Quantum Computing
6.1.1 Quantum Computing Technology
6.1.2 Quantum Computing Considerations
6.1.3 Market Challenges and Opportunities
6.1.4 Recent Developments
6.1.5 Quantum Computing Value Chain
6.1.6 Quantum Computing Applications
6.1.7 Competitive Landscape
6.1.8 Government Investment in Quantum Computing
6.1.9 Quantum Computing Stakeholders by Country
6.2 Other Future Computing Technologies
6.2.1 Swarm Computing
6.2.2 Neuromorphic Computing
6.2.3 Biocomputing
6.3 Market Drivers for Future Computing Technologies
6.3.1 Efficient Computation and High Speed Storage
6.3.2 Government and Private Initiatives
6.3.3 Flexible Computing
6.3.4 AI-enabled, High Performance Embedded Devices, Chipsets, and ICs
6.3.5 Cost Effective Computing Powered by Pay-as-you-go Model
6.4 Future Computing Market Challenges
6.4.1 Data Security Concerns in Virtualized and Distributed Cloud
6.4.2 Funding Constraints for R&D Activities
6.4.3 Lack of Skilled Professionals across the Sector
6.4.4 Absence of Uniformity among NGC Branches including Data Format

For more information about this report visit https://www.researchandmarkets.com/r/xa4mit

Research and Markets also offers Custom Research services providing focused, comprehensive and tailored research.

Media Contact:

Research and Markets
Laura Wood, Senior Manager
press@researchandmarkets.com

For E.S.T Office Hours Call +1-917-300-0470
For U.S./CAN Toll Free Call +1-800-526-8630
For GMT Office Hours Call +353-1-416-8900

U.S. Fax: 646-607-1907 Fax (outside U.S.): +353-1-481-1716

SOURCE Research and Markets

http://www.researchandmarkets.com

Excerpt from:
World High Performance Computing (HPC) Markets to 2025 - AI, IoT, and 5G will be Major Drivers for HPC Growth as they Facilitate the Need to Process...

Quantum Computers Finally Beat Supercomputers in 2019 – Discover Magazine

In his 2013 book, Schrödinger's Killer App, Louisiana State University theoretical physicist Jonathan Dowling predicted what he called "super-exponential growth." He was right. Back in May, during Google's Quantum Spring Symposium, computer engineer Hartmut Neven reported that the company's quantum computing chip had been gaining power at breakneck speed.

The subtext: We are venturing into an age of quantum supremacy, the point at which quantum computers outperform the best classical supercomputers in solving a well-defined problem.

Engineers test the accuracy of quantum computing chips by using them to solve a problem, and then verifying the work with a classical machine. But in early 2019, that process became problematic, reported Neven, who runs Google's Quantum Artificial Intelligence Lab. Google's quantum chip was improving so quickly that his group had to commandeer increasingly large computers, and then clusters of computers, to check its work. It's become clear that eventually, they'll run out of machines.

Case in point: Google announced in October that its 53-qubit quantum processor had needed only 200 seconds to complete a problem that would have required 10,000 years on a supercomputer.

Neven's group observed a double exponential growth rate in the chip's computing power over a few months. Plain old exponential growth is already really fast: it means that from one step to the next, the value of something multiplies. Bacterial growth can be exponential if the number of organisms doubles during an observed time interval. So can the computing power of classical computers under Moore's Law, the idea that it doubles roughly every year or two. But under double exponential growth, the exponents have exponents. That makes a world of difference: instead of a progression from 2 to 4 to 8 to 16 to 32 bacteria, for example, a double-exponentially growing colony in the same time would grow from 2 to 4 to 16 to 256 to 65,536.
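A few lines of Python (an illustrative sketch, not from the article) reproduce the two progressions and show how quickly they part ways:

```python
# Compare plain exponential growth with double exponential growth.
for n in range(5):
    exponential = 2 ** (n + 1)          # 2, 4, 8, 16, 32
    double_exponential = 2 ** (2 ** n)  # 2, 4, 16, 256, 65536
    print(f"step {n}: {exponential} vs {double_exponential}")
```

By the fifth step the double-exponential sequence is already more than two thousand times larger, and the gap itself keeps widening double-exponentially.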

Neven credits the growth rate to two factors: the predicted way that quantum computers improve on the computational power of classical ones, and the rapid improvement of the quantum chips themselves. Some began referring to this growth rate as Neven's Law. Some theorists say such growth was unavoidable.

We talked to Dowling (who suggests a more fitting moniker: the Dowling-Neven Law) about double exponential growth, his prediction and his underappreciated Beer Theory of Quantum Mechanics.

Q: You saw double exponential growth on the horizon long before it showed up in a lab. How?

A: Anytime there's a new technology, if it is worthwhile, eventually it kicks into exponential growth in something. We see this with the internet; we saw this with classical computers. You eventually hit a point where all of the engineers figure out how to make this work, miniaturize it, and then you suddenly run into exponential growth in terms of the hardware. If it doesn't happen, that hardware falls off the face of the Earth as a nonviable technology.

Q: So you weren't surprised to see Google's chip improving so quickly?

A: I'm only surprised that it happened earlier than I expected. In my book, I said within the next 50 to 80 years. I guessed a little too conservatively.

Q: You're a theoretical physicist. Are you typically conservative in your predictions?

A: People say I'm fracking nuts when I publish this stuff. I like to think that I'm the crazy guy who always makes the least conservative prediction. I thought this was far-out, wacky stuff, and I was making the most outrageous prediction. That's why it's taking everybody by surprise. Nobody expected double exponential growth in processing power to happen this soon.

Q: Given that quantum chips are getting so fast, can I buy my own quantum computer now?

A: Most people think the quantum computer is a solved problem, that we can just wait and Google will sell you one that can do whatever you want. But no. We're in the [prototype] era. The number of qubits is doubling every six months, but the qubits are not perfect. They fail a lot and have imperfections and so forth. But Intel and Google and IBM aren't going to wait for perfect qubits. The people who made the [first computers] didn't say, "We're going to stop making bigger computers until we figure out how to make perfect vacuum tubes."

Q: What's the big deal about doing problems with quantum mechanics instead of classical physics?

A: If you have 32 qubits, it's like you have 2^32 parallel universes that are working on parts of your computation. Or like you have a parallel processor with 2^32 processors. But you only pay the electric bill in our universe.
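The arithmetic behind that remark: describing the state of an n-qubit register takes a vector of 2^n complex amplitudes, which is the sense in which 32 qubits give 2^32 simultaneously evolving components. A quick sketch (illustrative only):

```python
# Number of amplitudes needed to describe an n-qubit quantum state.
for n in (1, 2, 8, 16, 32):
    print(f"{n} qubits -> {2 ** n:,} amplitudes")
```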

Q: Quantum mechanics gets really difficult, really fast. How do you deal with that?

A: Everybody has their own interpretation of quantum mechanics. Mine is the Many Beers Interpretation of Quantum Mechanics. With no beer, quantum mechanics doesn't make any sense. After one, two or three beers, it makes perfect sense. But once you get to six or 10, it doesn't make any sense again. I'm on my first bottle, so I'm in the zone.

[This story originally appeared in print as "The Rules of the Road to Quantum Supremacy."]

See original here:

Quantum Computers Finally Beat Supercomputers in 2019 - Discover Magazine

Quantum computing: Solving problems beyond the power of classical computing – Economic Times

Weather forecasting today is good. Can it get better? Sure it can, if computers can be better. This is where quantum computers come into the picture. They promise computing capacity beyond anything that today's classical computers can achieve, because quantum computers can run certain calculations exponentially faster than today's conventional binary computers. That makes them powerful enough to bridge gaps that exist in today's weather forecasting, drug discovery, financial modelling and many other complex areas.

Classical computing has been the backbone of modern society. It gave us satellite TV, the internet and digital commerce. It put robots on Mars and smartphones in our pockets.

"But many of the world's biggest mysteries and potentially greatest opportunities remain beyond the grasp of classical computers," says Stefan Filipp, quantum scientist at IBM Research. "To continue the pace of progress, we need to augment the classical approach with a new platform, one that follows its own set of rules. That is quantum computing."

Classical computing is based on the binary system, where the fundamental carriers of information, bits, can take on a value of either 0 or 1.

All information is stored and read as a sequence of 0s and 1s: a state of 0 is off (or false) and a state of 1 is on (or true). Unlike bits, quantum bits, or qubits, can occupy a superposition, a blend of the states 0 and 1, enabling them to carry far richer information than a classical bit.

Superposition and entanglement are two fundamental properties of quantum objects. The ability to manipulate these properties is what makes quantum algorithms fundamentally different from classical algorithms.
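To make those two properties concrete, here is a minimal state-vector sketch in plain Python/NumPy (an illustration under textbook conventions, not tied to any particular hardware or vendor): a Hadamard gate puts one qubit into an equal superposition of 0 and 1, and a CNOT gate then entangles it with a second qubit.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],   # basis order: |00>, |01>, |10>, |11>
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.array([1, 0, 0, 0], dtype=complex)  # start in |00>
state = np.kron(H, I) @ state  # superposition: (|00> + |10>) / sqrt(2)
state = CNOT @ state           # entanglement: (|00> + |11>) / sqrt(2)
print(state.round(3))          # amplitudes ~0.707 on |00> and |11>
```

The final state is a Bell state: neither qubit has a definite value on its own, yet measuring one instantly determines the other.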

Quantum computers working with classical systems have the potential to solve complex real-world problems such as simulating chemistry, modelling financial risk and optimising supply chains.

For example, Exxon Mobil plans to use quantum computing to better understand catalytic and molecular interactions that are too difficult to calculate with classical computers. Potential applications include more predictive environmental models and highly accurate quantum chemistry calculations to enable the discovery of new materials for more efficient carbon capture.

JP Morgan Chase is focusing on use cases for quantum computing in the financial industry, including trading strategies, portfolio optimisation, asset pricing and risk analysis.

In India, the government has launched two initiatives in the emerging field a networked programme on Quantum Information Science and Technology (QuST) and the National Mission on Quantum Technologies & Applications (NMQTA).

Despite all the progress, practical, working quantum systems might take most of the 2020s to arrive. And you won't see or need a quantum machine on your desk. These will be used by governments and large enterprises, unless you want to find aliens or figure out and execute ways to boil the ocean while sitting at home.

This story is part of the 'Tech that can change your life in the next decade' package

View original post here:

Quantum computing : Solving problems beyond the power of classical computing - Economic Times

The Impact of Quantum Computing on Banking will be gigantic says Deltec Bank, Bahamas – Quantaneo, the Quantum Computing Source

However, even with that progression, there are still jobs that classical computers are not powerful enough to do. The answer looks set to come from quantum computing. In this post, we will look at what quantum computing is and how it could revolutionize a long-standing industry such as banking.

What is Quantum Computing?

Quantum computers are expected to be a new kind of technology that can solve complex problems well beyond the capabilities of traditional systems. If you take an everyday problem like climate change, the intricacies of solving it are incredibly complex. A standard computer does not have the power or ability to even get close to genuinely understanding everything that is going on. The main reason is the endless amounts of data that computers need to process to generate an accurate decision.

A quantum computer is often referred to, loosely, as a kind of supercomputer. It has highly advanced processing power that can take masses of variables into account, helping predict weather patterns and natural disasters in the case of climate change.

Brief Technical Summary

A typical computer stores information in what are known as bits. In quantum computing, the equivalents are known as qubits. Qubits have properties that mean a connected group of them can provide far more processing power than the same number of binary bits. In short, where binary bits store 1s and 0s to handle a task, qubits can represent numerous possible combinations of these simultaneously.

Practical Example

An example of this could be running a travel agency. Let's say three people need to move from one place to another: Jenny, Anna and Steve. There are two taxis, and the problem you want to solve is who gets into which taxi. We also know that Jenny and Anna are friends, Jenny and Steve are enemies, and Anna and Steve are enemies.

The aim is to maximize the number of friend pairs and minimize the number of enemy pairs sharing the same taxi. A classical computer would evaluate each possible assignment with bits, one at a time, before arriving at a solution. A quantum computer, however, can use qubits to represent all the candidate assignments at once, finding the best one in a few milliseconds by, in effect, collapsing everything into a single operation.

The difference is that a traditional computer must perform more and more calculations as the data scales up, whereas a quantum computer, at least in principle, can evaluate the growing space of possibilities in one pass, as the sketch below illustrates.
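The taxi puzzle is small enough to check by brute force. The sketch below (hypothetical code; the names and scoring are taken from the example above) does what a classical computer would do, scoring each of the 2^3 possible assignments one at a time. A quantum approach would instead encode the same objective into qubits, for example as a QUBO problem for a quantum optimizer, and evaluate the candidates in superposition.

```python
from itertools import product

people = ["Jenny", "Anna", "Steve"]
friends = {("Jenny", "Anna")}
enemies = {("Jenny", "Steve"), ("Anna", "Steve")}

def score(assignment):
    """+1 for each friend pair sharing a taxi, -1 for each enemy pair."""
    total = 0
    for i in range(len(people)):
        for j in range(i + 1, len(people)):
            if assignment[i] == assignment[j]:  # same taxi
                pair = (people[i], people[j])
                total += 1 if pair in friends else -1 if pair in enemies else 0
    return total

# Classical brute force: enumerate all 2**3 = 8 taxi assignments.
best = max(product([0, 1], repeat=len(people)), key=score)
print(best, score(best))  # e.g. (0, 0, 1): Jenny and Anna ride together
```

With three people there are only 8 assignments; with 50 people there would be 2^50, which is why enumerating them one by one stops scaling.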

In the real world, one industry that could benefit heavily from this technology and processing power is banking.

Quantum Computing in Banking

In an article from Banco Bilbao Vizcaya Argentaria (BBVA) from October 2019, it was suggested that this kind of quantum computing power might fundamentally change the face of banking in time.

Encryption of personal data is critical to banking, with RSA-2048 used at the highest levels. For a classical computer, finding the key needed to break the encryption would take roughly 10^34 steps. To put that into context, a processor capable of a trillion operations per second would still take around 317 trillion years to solve the problem. Realistically, that makes decryption impossible.

However, a quantum computer could in principle solve the decryption problem in just 10^7 steps. Even running at only a million operations per second, that calculation would take 10 seconds to complete. The potential of quantum computing in this context is quite amazing. That said, we are still a long way off having enough processing power to reach those heights, but experts are working on it.
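The timescales quoted are straightforward to verify from the step counts (a sketch; the step counts themselves are the article's order-of-magnitude estimates):

```python
SECONDS_PER_YEAR = 3.15e7

# Classical: ~10**34 steps at a trillion (10**12) operations per second.
classical_years = 10**34 / 10**12 / SECONDS_PER_YEAR
print(f"classical: ~{classical_years:.3g} years")  # ~3.17e+14, i.e. ~317 trillion years

# Quantum (as quoted): ~10**7 steps at a million (10**6) operations per second.
print(f"quantum: {10**7 / 10**6:.0f} seconds")     # 10 seconds
```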

Barclays

Researchers at Barclays Bank, in collaboration with IBM, have created a proof-of-concept quantum-optimized application. The solution revolves around the transaction settlement process. Settlement works on a transaction-by-transaction basis: trades are pushed into a queue and settled in batches, and during each processing window as many trades as possible from the queue are settled.

Trades are complex by nature, according to Lee Braine, director of research and engineering at Barclays. Traders can tap into funds before a transaction has cleared, and trades are settled only if funding is available or if there is some sort of credit collateral facility.

To put the scale of the problem in context: a handful of trades could, in theory, be settled in your head. As you get up to 10 or 20 transactions, you might need pen and paper. Beyond that, you are into classical computing territory. But as the queue reaches hundreds of trades, even those machines begin to hit their limits.

A bit like the travel agency example given earlier, a quantum computer can juggle the many complex, interacting aspects of trade settlement at once. Using a seven-qubit system, the team could identify instances of the problem of sufficient complexity that the same calculations would need about 200 traditional computers.

JP Morgan

Using an IBM machine, researchers at JP Morgan have demonstrated that they can simulate the future value of a financial product. They are testing the use of quantum computers to speed up intensive pricing calculations that would take traditional machines hours to compute. As portfolios become larger, the algorithms grow in complexity and can reach a point where they are effectively impossible to calculate.

The research by the team has shown that a commercial-grade quantum computer can run the same calculations in a matter of seconds.
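For a sense of the classical workload such experiments benchmark against, here is a minimal Monte Carlo pricing sketch (hypothetical code; the instrument and parameters are invented for illustration and are not from JP Morgan's research). Quantum amplitude estimation targets exactly this kind of sampling loop, promising quadratically fewer samples for the same accuracy.

```python
import math
import random

# Hypothetical European call option parameters (illustration only).
S0, K, r, sigma, T = 100.0, 105.0, 0.01, 0.2, 1.0
N = 100_000  # simulated price paths

payoff_sum = 0.0
for _ in range(N):
    z = random.gauss(0.0, 1.0)
    # Terminal price under geometric Brownian motion.
    ST = S0 * math.exp((r - 0.5 * sigma**2) * T + sigma * math.sqrt(T) * z)
    payoff_sum += max(ST - K, 0.0)

price = math.exp(-r * T) * payoff_sum / N
print(f"Estimated option value: {price:.2f}")
# Monte Carlo error shrinks as 1/sqrt(N); amplitude estimation scales as ~1/N.
```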

Summary

According to Deltec Bank, Bahamas, banks are successfully testing quantum computers to solve problems that were previously very resource-intensive or impossible to complete. Although the technology is still some years away from changing the way banks calculate financial models, owing to complex hardware requirements, it is important to start testing now.

IBM itself has stated that it is still some way from a perfect solution, with big breakthroughs still required, but the time will certainly come.

See the article here:

The Impact of Quantum Computing on Banking will be gigantic says Deltec Bank, Bahamas - Quantaneo, the Quantum Computing Source

Science stories that shaped 2019 – Telegraph India

This was the year of quantum physics, which redefined the kilogram and the computer. It was also the year of teamwork. Hundreds of scientists across the globe worked together to do the seemingly impossible: capture an image of a black hole. A global collaboration of scientists journeyed into the heart of the Arctic to measure how the climate is changing in this crucial spot. This was also the year we lost a portion of the Amazon rainforest to a fire fuelled by greed.

First image of a black hole

After more than a decade of work, the Event Horizon Telescope, a large telescope array consisting of a global network of radio telescopes, stunned the world by capturing the first direct image of a black hole, situated at the centre of the Messier 87 galaxy, 54 million light years away. The image shows a circular silhouette outlined by emission from hot gas swirling around it, lending credibility to Einstein's theory of general relativity near all black holes.

Evidence of black holes, regions from which nothing, not even light, can escape, has existed for aeons. And astronomers have long observed the effects of these mysterious phenomena on their surroundings. But because no light escapes them, it was believed that you could not snap an image of these caverns in space.

Polarstern breaks ice

The German icebreaker ship RV Polarstern is right now stuck in the midst of the frozen Arctic sea at the North Pole. It's on a mission known as the Multidisciplinary drifting Observatory for the Study of Arctic Climate (MOSAiC), the largest climate-change research expedition to the central Arctic. This region, one of the most inaccessible places on our planet, is critical to Earth's climate, and it's essential to study it thoroughly.

During the year-long expedition (September 2019 to September 2020) that has taken 20 years to organise, over 600 researchers will rotate on and off the ship, supported by many more in research institutes across the world. The data harvested should give us an accurate picture of ice or its absence near the North Pole and is expected to silence climate change sceptics forever.

Google's quantum claim

Google claims to have reached a long-sought breakthrough called "quantum supremacy" that allows computers to calculate at inconceivable speeds. While some scientists are cautious about the implications, major tech companies in the US and China are investing heavily in quantum computing. IBM, a Google competitor, described the term "quantum supremacy" as misleading and proposed another metric, "quantum volume."

Denisovan discoveries

A jawbone of a 1,60,000-year-old Denisovan, a hominid species that lived alongside Neanderthals and disappeared 50,000 years ago, was recently discovered on the Tibetan Plateau. This is the first time a fossil of this species has been found outside the Denisova Cave in Siberia, confirming the theory that these relatives of modern humans once lived across much of central and eastern Asia. The find also suggests Denisovans may have evolved genetic adaptations to high altitudes, which Tibetans inherited thanks to interbreeding between Denisovans and modern humans.

Crispr in clinical trials

Crispr/Cas9, a gene editing technique akin to molecular scissors that can snip, repair or insert genes into DNA, went into a spate of clinical trials. The technique holds the promise of curing nearly 6,000 known genetic diseases. There is already clinical evidence that it has cured two patients in the US, one suffering from beta thalassaemia and the other from sickle cell disease.

Crash course on the moon

The race to land on the moon is back in vogue. While China's Chang'e-4 lander touched down smoothly on the moon's far side in January, probes sent by the Israeli organisation SpaceIL and the Indian Space Research Organisation crash-landed. China plans to launch another lunar lander next year. The European Space Agency, Russia and Nasa hope to follow in its footsteps.

Kilogram, redefined

In the biggest overhaul of the International System of Units, four units, the kilogram, kelvin, ampere and mole, were redefined in terms of constants of nature. The new definition anchors the value of the kilogram to the Planck constant, an unvarying and infinitesimal number at the heart of quantum physics. Previously, the kilogram was defined as the mass of a specific object (stored in a Paris vault) that represented the mass of one litre of pure water at its freezing point.

Amazon ablaze

The Amazon rainforest, the world's largest carbon sink, was irreversibly damaged after settlers allegedly set fire to it, with tacit support from the Brazilian government. Data released by Brazil's National Institute for Space Research shows that from January to July, fires consumed 4.6 million acres of the Brazilian part of the Amazon rainforest. The nation's right-wing President, Jair Bolsonaro, wants to facilitate the interests of industries in the forest, uncaring of the worldwide environmental concern.

Read the original here:

Science stories that shaped 2019 - Telegraph India