Archive for the ‘Quantum Computing’ Category

SAP innovation wing aims to shape future of ERP technology – TechTarget

Few people understand SAP's research into emerging technologies like blockchain, AI, quantum computing and the metaverse better than Martin Heinig and Yaad Oren.

Heinig heads New Ventures and Technologies, a group of several hundred people working in labs to define SAP innovation and long-term strategy. "We look at technologies that have the potential to disrupt the market," he said.

Oren heads a subgroup, the SAP Innovation Center Network, which he likened to a high school for research projects. "Once they graduate, they move into the real world," he said.

Heinig's is one group within an SAP R&D function that he said is divided into three parts: academic groups like the Hasso Plattner Institute, named for SAP's co-founder, which take the longest view; product engineering, which operates on a roughly two-year timeframe; and his own group, which sits between them.

"We sit right in the middle," Heinig said. "We're looking at opportunities for SAP that are five-plus years out and then try to figure out what something can be. We create prototypes to find out if we can really build it and if is feasible for customers."

A common theme of these technological investigations emerged during separate interviews with Heinig and Oren at this month's SAP Sapphire 2022 conference in Orlando. Much of the work serves the ambitious goal of extending business processes beyond the walls of an organization. Doing so requires breaking business processes into smaller pieces that can be securely shared between software systems and corporate entities. Besides being shareable, these new processes are designed to be autonomous, "smart" and composable so they can be endlessly reconfigured to do exactly what people need them to do.

The interviews are combined and edited for brevity and clarity. Heinig and Oren both were emphatic that mentioning a technology does not mean SAP is committed to productizing it.

What important technologies are you working on that could lead to major changes in ERP?

Martin Heinig: Quantum computing, definitely, but it can also be simulated quantum computers like digital annealers, where we have a step change in computing power that can open up new scenarios. For example, in the supply chain, where optimization problems that used to take hours would take minutes.

Things like homomorphic encryption can also be a game changer. The beauty of it is you can do analytics on encrypted data, so it will not reveal the actual information but you can still do some basic calculations. For example, I can give you sales data, but it would be encrypted so you don't know the company that I'm working with, but you would see the order number or quantity. Whole industries could package the data and do analytics. It could be interesting in the healthcare sector, where you don't want to reveal patient names.
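To make the idea concrete, here is a minimal Python sketch of additively homomorphic encryption using a toy Paillier scheme: sums can be computed on ciphertexts without decrypting the inputs. This is an illustration of the general concept only, not SAP's technology -- the primes are tiny and there is no hardening.

```python
# Toy Paillier cryptosystem illustrating additively homomorphic encryption.
# Illustration only -- tiny primes, no padding, not secure.
import math
import random

p, q = 101, 103                         # toy primes; real keys use ~1024-bit primes
n = p * q
n_sq = n * n
g = n + 1                               # standard generator choice
lam = math.lcm(p - 1, q - 1)
mu = pow((pow(g, lam, n_sq) - 1) // n, -1, n)   # inverse of L(g^lam mod n^2) mod n

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:          # r must be coprime to n
        r = random.randrange(1, n)
    return pow(g, m, n_sq) * pow(r, n, n_sq) % n_sq

def decrypt(c):
    return (pow(c, lam, n_sq) - 1) // n * mu % n

# Two parties' order quantities stay encrypted, yet their total is computable.
c1, c2 = encrypt(1200), encrypt(345)
total = c1 * c2 % n_sq                  # multiplying ciphertexts adds the plaintexts
print(decrypt(total))                   # 1545
```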

The problem is it still requires a lot of computing time, so we need to go into the hardware space with partners and find out if there are some technologies, like specific chips, that can be a kind of coprocessor to minimize the penalty on the computing side.

When do you think quantum computing could be practical for business use?

Heinig: We see a lot of progress, and the number of qubits is increasing tremendously, but we have not found a quantum computer that can solve real-world problems yet. It's hard to estimate, but it's not 10 years out. Maybe the first real use cases are three to five years out.

We're currently testing how it would work. The scenarios we are looking at are more in the optimization space, like supply chain warehouse management or production planning, where you have a lot of very complex problems to solve that need a lot of compute power. We try out how to translate these kinds of problems into quantum computing language.

Why should people care? What's going to be so great about quantum computing?

Yaad Oren: It's a whole new paradigm for computing. It's not only the computation but how people will build software.

With classical programming, you interact with the processor in a certain way. If you have a quantum processor, you interact differently. Even the development languages get affected.

The disruptive potential is across the stack, from infrastructure to platform to software development languages.

There is a lot of hype, but SAP is currently looking at three areas where we see the potential for disruption.

The first is optimization. Quantum computing is not good for arithmetic, like one plus one equals two, but it's very good for combinatorial problems like the traveling salesman problem, when you have many nodes and a factorial level of complexity.

Optimization problems fit quantum computing like a glove -- for example, supply chain optimization, when you have so many parameters to evaluate regarding the route, pandemic regulations and weather.
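As a rough illustration of why such problems strain classical machines, here is a brute-force traveling-salesman sketch in Python. The city names and distances are invented; a quantum annealer would instead take the same problem recast as a QUBO rather than this exhaustive loop.

```python
# Brute-force TSP: with n cities there are (n-1)! directed tours to check,
# which is why combinatorial optimization problems explode classically.
import random
from itertools import permutations

random.seed(1)
cities = ["A", "B", "C", "D", "E", "F"]
dist = {(a, b): random.randint(10, 99) for a in cities for b in cities if a != b}

def tour_length(order):
    legs = zip(order, order[1:] + order[:1])     # return to the starting city
    return sum(dist[leg] for leg in legs)

tours = permutations(cities[1:])                 # fix the first city; 5! = 120 tours
best = min((tuple(cities[:1]) + t for t in tours), key=tour_length)
print(best, tour_length(best))                   # 20 cities would mean ~1e17 tours
```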

We're using a lot of quantum simulations -- partner solutions, quantum annealing and other technologies -- because the quantum computer is not there yet. SAP is also involved in a government-funded project with another German company on building quantum as a service. We use a lot of simulation technologies that have already helped us understand the power of this.

The second is called post-quantum cryptography, which is the security and encryption aspect of quantum, a big thing given the number of phishing attacks, ransomware and password hacks.

Quantum holds a lot of promise to create encryption at a level the industry has never known before. It's about creating a new type of password that is not breakable. The quantum code is the means to the end.

In browsers today, you have auto-generated passwords that are done by algorithms. Quantum computing can give birth to new types of algorithms that create passwords at a new level of complexity.

Eventually any password is breakable if you spend enough time. With quantum computing, this becomes much harder or impossible. Of course, it's a matter of time until hackers catch up.
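The "breakable given enough time" point is easy to quantify. The sketch below estimates exhaustive-search time for a random 12-character password over 94 printable characters; the guess rate of 10^12 per second is an illustrative assumption, not a measured figure.

```python
# Rough brute-force estimate for one password; numbers are illustrative.
combinations = 94 ** 12
guesses_per_second = 1e12
years = combinations / guesses_per_second / (3600 * 24 * 365)
print(f"{combinations:.3e} combinations, roughly {years:,.0f} years to exhaust")
```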

The third benefit of quantum computing is AI. Machine learning is based on unique data, and you need computation power to train models. With quantum computing, you can create new types of AI models and applications that you couldn't train before, because now you have a strong computer that can learn more and solve new problems. It will give birth to new types of automation and predictive analytics.

What kind of research are you doing in AI and machine learning?

Heinig: Enterprise knowledge graphs are a concept based on machine learning technology that we're looking into. It's basically the idea of modeling the connections between business objects and bringing in the relationships between them. This is a very important technology for creating context for situational awareness and personalization.
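A minimal sketch of the idea: business objects as nodes and typed relationships as edges that can be traversed to build context. The entity names below are invented for illustration and are not SAP data models.

```python
# Tiny enterprise knowledge graph: (head, relation, tail) triples.
graph = {
    ("Customer:ACME", "placed", "Order:4711"),
    ("Order:4711", "contains", "Material:Piston"),
    ("Material:Piston", "supplied_by", "Vendor:Bolt GmbH"),
    ("Vendor:Bolt GmbH", "located_in", "Region:EMEA"),
}

def neighbors(node, relation=None):
    """Follow outgoing edges from a node, optionally filtered by relation type."""
    return [tail for head, rel, tail in graph
            if head == node and (relation is None or rel == relation)]

# Context for situational awareness: everything one hop away from the order.
print(neighbors("Order:4711"))                 # ['Material:Piston']
print(neighbors("Customer:ACME", "placed"))    # ['Order:4711']
```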

Oren: Regarding the future of AI and analytics, we have a lot of advances in this area. SAP is also focusing on infusing AI into the core application.

We are working a lot on the future of planning and introducing new types of AI like reinforcement learning to create new types of simulations.

Today, if you want to have planning solutions, you cannot always get the full perspective on uncertainties, and you cannot have recommendations and simulations for scenarios that you didn't ask about.

We're working on a self-learning system that provides continuous intelligence. It's not a product yet, but we're working on it with customers. You don't need to train the model and build the machine learning model yourself. It can keep learning even in areas that you didn't explicitly ask to explore, to fight uncertainty. This was requested by customers during the coronavirus pandemic and all the disruption in supply chain, where you need to deal with a lot of uncertainty.

Does the metaverse have implications for ERP, realistically?

Heinig: Yes, but the question is when and to what extent? What does it mean from a process perspective? Companies have already crossed the borders between physical goods and digital goods. The basic research question that we are looking into is how can we make these processes seamless?

You have your physical store where you sell physical goods and you have a digital store. You can sell physical goods, order them and get them delivered. So how can we extend this so you can also sell in your digital store a physical good with a digital good, like a non-fungible token (NFT)? No matter where you want to do business in the metaverse, the ERP system should help you run your processes.

Where do you see the most promising business applications of the metaverse?

Oren: The metaverse is also a lot of hype, and we need to distill the noise. For me, the magic happens more in the practical -- I would even say boring -- side of the metaverse, not the avatars and the UIs.

We're looking more at the Web 3.0 aspects of things. Web 3.0 is kind of what greases the skids of the metaverse -- for example, all the crypto payments.

Of the top three long-term observations we have about the metaverse, the first one is everything regarding digital finance. There is huge demand from the industry. The number of transactions and volume of the new generation of buyers is huge -- how you accept crypto payments and how you sell NFTs.

Let's say an avatar is buying something. You need a profit and loss statement that can take fiat money and crypto money. How you do the balance sheets and audit them may not be sexy, but it's really important.

The metaverse is a combination of real-world technologies and the digital world. How do you do analytics and planning if you have functionalities and workflows and things that are both digital and real? Those are different areas that need a bridge between them.

The third thing is the augmented employee. They're going to have digital representations in the metaverse. We're evaluating how you can connect those representations into the enterprise system from SAP SuccessFactors to all the other data sources you have in the organization.

Some members of the U.S. Congress asked the Environmental Protection Agency to consider regulating bitcoin miners because they use a lot of compute power and water. Are you trying to improve the efficiency of blockchain?

Heinig: We take this into consideration, especially proof-of-work authentication, which is very energy consuming, but it's not what we research. It's more about how would we use blockchain technology, hopefully in a very energy-efficient way, to find customer use cases we can enable with blockchain.

One good example would be a green token for tracing raw materials, using the concept of tokenization and blockchain technologies in cross-company scenarios.

Another idea is cross-company workflows. For example, how you can have different process steps across companies and across systems, store them and make sure they are auditable.

A third example would be self-sovereign identity (SSI). The idea is you store your identity in a personal wallet. Today you have a central register where you store the identities and proof of identity against one central database. A good example is when you use your Google identity to log in at different websites.

The idea is similar to a bitcoin wallet. You would have credentials that are verifiable in your personal wallet, and you can verify yourself against different systems. We would make sure it is auditable so you always know that an identity is real.

The beauty of this concept is that you can work with different systems seamlessly. Maybe in the future it would be a way to have more personalized experiences with systems because it could also store information that a system could use to personalize your experience.
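A minimal sketch of the verifiable-credential flow behind self-sovereign identity, assuming the third-party `cryptography` package: an issuer signs a claim, the holder keeps it in a wallet, and any relying system can verify it without a central register. The issuer, DID, and claim names are invented.

```python
# Issue and verify a credential with Ed25519 signatures (pip install cryptography).
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

issuer_key = Ed25519PrivateKey.generate()          # e.g. a chamber of commerce
issuer_pub = issuer_key.public_key()               # published, e.g. anchored on a ledger

credential = json.dumps({"subject": "did:example:acme",
                         "claim": "registered-vendor"}).encode()
signature = issuer_key.sign(credential)            # stored in the holder's wallet

# Later, any relying system checks the credential against the issuer's public key.
issuer_pub.verify(signature, credential)           # raises InvalidSignature if tampered
print("credential accepted")
```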

What blockchain mechanisms are you looking at for connecting business processes and building trust?

Oren: It started with onboarding. Let's take a supply chain or order-to-cash -- any process with many vendors. Today, when you onboard a new player to a business network, there are a lot of time-consuming manual steps and authorization. The mechanism we use, self-sovereign identity, harnesses the power of a blockchain so that everything is auditable and immutable. You can quickly onboard vendors to the network, supply chain and any process.

With blockchain tokens, you can onboard vendors with ease because everything is documented. In enterprise processes like order-to-cash, any step, like when you deliver something -- let's say you're manufacturing an engine, to use an example from the keynote -- you deliver the piston, someone else provides another component. For anything you send between vendors, you need to have proof-of-delivery documents, which are legal documents. You need to call a lawyer and have a notary service sign the document. It is paper based.

Using the token, you can do self-authorization. You don't need to call those legal services to sign documents. Using the blockchain, everything is immediate, auditable and transparent. It's part of a proof of concept. It's not a product but they talk about it publicly.
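As an illustration of what "auditable and immutable" means mechanically, here is a minimal hash-chained trail of cross-company process steps, linked the way blockchain blocks are. The vendors, events, and PO number are invented, and a real network would add signatures and consensus.

```python
# Append-only, hash-chained record of process steps.
import hashlib
import json

def append(chain, step):
    prev = chain[-1]["hash"] if chain else "0" * 64
    body = {"step": step, "prev": prev}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})

def verify(chain):
    prev = "0" * 64
    for block in chain:
        body = {"step": block["step"], "prev": block["prev"]}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev"] != prev or block["hash"] != digest:
            return False                      # any tampering breaks every later link
        prev = block["hash"]
    return True

ledger = []
append(ledger, {"vendor": "Bolt GmbH", "event": "piston delivered", "po": "4711"})
append(ledger, {"vendor": "Engine AG", "event": "assembly complete", "po": "4711"})
print(verify(ledger))                         # True until any step is altered
```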

We also have this carbon data network project that was also mentioned in the keynote where you have track and trace to see the CO2 emissions of each part by each vendor in each stage of the supply chain.

You are doing some research on composable business processes. What specifically are you looking at?

Heinig: We have 50 years of business knowledge that's basically all packaged in our S/4HANA system. How do we find a way to make it composable, so that it's more flexible and third-party solutions are easier to include?

Integration today is basically happening on a technical level, but we would like to lift it up on the business process level. Today we sell software that's packaged, and you have the business processes inside the software. But I think we should change that so we would sell you business processes and you would not even need to bother with what kind of software you're using, because these would be packaged, orchestrated functions that are already pre-integrated.

Oren: If you have, let's say, a need in order-to-cash for a new type of vendor verification or some compliance, it should be very easy -- like plug and play -- to add services from SAP or not. We want to have this orchestration layer of having two services working together. This is something that requires a lot of technical underpinning to both have the abstraction and orchestration of services to work together.
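A minimal sketch of that plug-and-play idea: process steps as interchangeable services behind one orchestration layer, so swapping a vendor-verification service means swapping one element of a list. The step names and order fields are invented for illustration.

```python
# Composable order-to-cash flow: steps are plain functions over an order dict.
from typing import Callable, Dict, List

Order = Dict[str, object]
Step = Callable[[Order], Order]

def orchestrate(steps: List[Step], order: Order) -> Order:
    """Run a business process as a sequence of composable steps."""
    for step in steps:
        order = step(order)
    return order

def verify_vendor(order: Order) -> Order:      # could come from SAP or a third party
    return {**order, "vendor_verified": True}

def invoice(order: Order) -> Order:
    return {**order, "invoiced": True}

print(orchestrate([verify_vendor, invoice], {"id": "4711"}))
```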

Analytics is a major focus of your group. Why is analytics worth looking at?

Heinig: It's basically analytics plus planning, and we see two major differences arising.

One is the role of ERP systems and business networks. Let's take sustainability KPIs. It's not enough to try to analyze and optimize them on a company level. You need to look at the whole supply chain on your business network. This means your analytics capabilities need to go cross-company.

From a planning perspective, if you really want to optimize it, you also need to have these planning capabilities along your whole supply chain. This is where things get really, really complicated.

The second one is around how can we lift up analytics and planning to the next level? Today, it's really manual and static. You look at your dashboards and maybe find some anomalies and try to react.

We're trying to change this so it's possible for the system to automatically detect anomalies in data flows and trigger creation of a dashboard that is personalized to your role in the company. The system says, 'we found something, please have a look at it, and these are your three most appropriate options.'
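A minimal sketch of the "system finds it for you" step: flag anomalies in a stream of KPI values with a rolling z-score, which could then trigger a role-specific dashboard. The numbers and threshold are invented, not from any SAP product.

```python
# Simple rolling z-score anomaly detector over a KPI time series.
from statistics import mean, stdev

def anomalies(values, window=12, threshold=3.0):
    flagged = []
    for i in range(window, len(values)):
        history = values[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma and abs(values[i] - mu) / sigma > threshold:
            flagged.append((i, values[i]))   # trigger: build a personalized dashboard
    return flagged

daily_orders = [100, 103, 98, 101, 99, 102, 97, 100, 104, 99, 101, 100, 240, 101, 98]
print(anomalies(daily_orders))               # [(12, 240)]
```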

More here:
SAP innovation wing aims to shape future of ERP technology - TechTarget

RIT offers new minor in emerging field of quantum information science and technology | RIT – Rochester Institute of Technology

Rochester Institute of Technology students can soon begin earning a minor in an emerging field that could disrupt the science, technology, engineering, and math (STEM) disciplines. RIT students can now take classes toward a minor in quantum information science and technology.

"This is a hot field garnering a lot of attention and we are excited to offer students a chance to gain some technical depth in quantum so they can take this knowledge and go the next step with their careers," said Ben Zwickl, associate professor in RIT's School of Physics and Astronomy and advisor for the minor. "It will provide a pathway for students from any STEM major to take two core courses that introduce them to quantum and some of its applications, as well as strategically pick some upper-level courses within or outside their program."

Quantum physics seeks to understand the rules and effects of manipulating the smallest amount of energy at the subatomic level. Scientists and engineers are attempting to harness the strange, unintuitive properties of quantum particles to make advances in computing, cryptography, communications, and many other applications. Developers of the minor said there is a growing industry that will need employees knowledgeable about quantum physics and its applications.

"We're seeing a lot of giant tech companies like IBM, Intel, Microsoft, and Google get involved with quantum, but there's also a lot of venture capital going to startup companies in quantum," said Gregory Howland, assistant professor in the School of Physics and Astronomy. Howland will teach one of the minor's two required courses this fall, Principles and Applications of Quantum Technology. "You have both sides of it really blossoming now."

The minor, much like the field itself, is highly interdisciplinary in nature, with faculty from the College of Science, Kate Gleason College of Engineering, College of Engineering Technology, and Golisano College of Computing and Information Sciences offering classes that count toward the minor. The minor grew out of RIT's Future Photon Initiative and funding from the NSF's Quantum Leap Challenge Institutes program.

Associate Professor Sonia Lopez Alarcon from RIT's Department of Computer Engineering will teach the other required course, Introduction to Quantum Computing and Information Science, starting this spring. She said taking these courses will provide valuable life skills in addition to lessons about cutting-edge science and technology.

"They'll learn more than just the skills from the courses, they'll learn how to get familiar with a topic that's not in the textbooks officially yet," said Lopez Alarcon. "That's a very important skill for industry. Companies want to know they're hiring people with the ability to learn about something that is emerging, especially in science and technology because it's such a rapidly changing field."

The faculty involved noted that they hope to attract a diverse group of students to enroll in the minor. They said that although the disciplines feeding into quantum have struggled with inclusion related to gender and race and ethnicity, they will work with affinity groups on campus to try to recruit students to the program and ultimately advance the field's inclusivity.

To learn more about the minor, contact Ben Zwickl.

Continue reading here:
RIT offers new minor in emerging field of quantum information science and technology | RIT - Rochester Institute of Technology

ANL Special Colloquium on The Future of Computing – HPCwire

There are, of course, a myriad of ideas regarding computing's future. At yesterday's Argonne National Laboratory Director's Special Colloquium, The Future of Computing, guest speaker Sadasivan Shankar did his best to convince the audience that the high energy cost of the current computing paradigm -- not (just) economic cost; we're talking entropy here -- is fundamentally undermining computing's progress, such that it will never be able to solve today's biggest challenges.

The broad idea is that the steady abstracting away of informational content from each piece of modern computing's complicated assemblage (chips, architecture, programming) inexorably increases the cumulative energy cost, leading toward a hard ceiling. Leaving aside, for a moment, the decline of Moore's law (just a symptom, really), it is the separation (abstraction) of information from direct computation that's the culprit, argues Shankar. Every added step adds energy cost.

Nature, on the other hand, bakes information into things. Consider, said Shankar, how a string of amino acids folds into its intended 3-D conformation on a tiny energy budget and in a very short time just by interacting with its environment, and contrast that with the amount of compute required -- i.e., energy expended -- to accurately predict protein folding from a sequence of amino acids. Shankar, research technology manager at the SLAC National Laboratory and adjunct Stanford professor, argues computing must take a lesson from nature and strive to pack information more tightly into applications and compute infrastructure.
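The scale of that energy gap can be sketched in a few lines. The comparison below sets the Landauer limit (the thermodynamic minimum of k_B T ln 2 per erased bit) against an assumed ~10 pJ per operation for a conventional processor; the 10 pJ figure is an illustrative assumption, not a figure from Shankar's talk.

```python
# Back-of-the-envelope: Landauer limit vs. an assumed conventional cost per operation.
import math

k_B = 1.380649e-23                      # Boltzmann constant, J/K
T = 300                                 # room temperature, K
landauer = k_B * T * math.log(2)        # ~2.9e-21 J per bit erased
conventional = 10e-12                   # assumed energy per operation, J (illustrative)

print(f"Landauer limit: {landauer:.2e} J/bit")
print(f"Gap to assumed conventional cost: ~{conventional / landauer:.0e}x")
```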

Information theory is a rich field with a history of rich debate. Turning theory into practice has often proven more difficult and messy. Shankar (and his colleagues) have been developing a formal framework for classifying the levels of information content in human-made computation schemes and natural systems in a way that permits direct comparison between the two. The resulting scale has eight classification levels (0-7).

There's a lot to digest in Shankar's talk. Rather than going off the rails here with a garbled explanation, it's worth noting that Argonne has archived the video and Shankar has a far-along paper that's expected in a couple of months. No doubt some of his ideas will stir conversation. Given that Argonne will be home to Aurora, the exascale supercomputer now being built at the lab, it was an appropriate site for a talk on the future of computing.

Before jumping into what the future may hold, here's a quick summary of Shankar's two driving points: 1) Moore's law, or more properly the architecture and semiconductor technology on which it rests, is limited, and 2) the growing absolute energy cost of information processing using traditional (von Neumann) methods is limiting.

A big part of the answer to the question of how computing must progress, suggested Shankar, is to take a page from Feynman's reverberating idea -- not just for quantum computing -- and emulate the way nature computes, "pack[ing] all of the information needed for the computing into the things themselves," or at least reducing abstraction as much as possible.

Argonne assembled an expert panel to bat Shankar's ideas around. The panel included moderator Rick Stevens (associate laboratory director and Argonne distinguished fellow), Salman Habib (director, Argonne computational science division and Argonne distinguished fellow), Yanjing Li (assistant professor, department of computer science, University of Chicago), and Fangfang Xia (computer scientist, data science and learning division, ANL).

Few quibbled with the high energy cost of computing as described by Shankar, but the panelists had a variety of perspectives on moving forward. One of the more intriguing comments came from Xia, an expert in neuromorphic computing, who suggested that using neuromorphic systems to discover new algorithms is a potentially productive approach.

"My answer goes back to the earlier point Sadas and Rick made, which is, if we're throwing away efficiency in the information-power conversion process, why don't we stay with biological systems for a bit longer. There's this interesting field called synthetic biological intelligence. They are trying to do these brain-computer interfaces, not in a Neuralink way, because that's still shrouded in uncertainty. But there is a company and they grow these brain cells in a petri dish. Then they connect this to an Atari Pong game. And you can see that after just 10 minutes, these brain cells self-organize into neural networks, and they can learn to play the game," said Xia.

"Keep in mind, this is 10 minutes in real life, it's not simulation time. It's only dozens of games, just like how we pick up games. So this data efficiency is enormous. What I find particularly fascinating about this is that in this experiment there was no optimization goal. There is no loss function you have to tweak. The system, when connected in this closed-loop fashion, will just learn in an embodied way. That opens so many possibilities; you think about all these dishes, just consuming glucose, you can have them learn latent representations, maybe to be used in digital models."

Li, a computer architecture expert, noted that general purpose computing infrastructure has existed for a long time.

"I remember this is the same architecture of processor design I learned at school, and I still teach the same materials today. For the most part, when we're trying to understand how CPUs work, and even some of the GPUs, those have been around for a long time. I don't think there have been a lot of very revolutionary kinds of changes for those architectures. There's a reason for that, because we have developed good tool chains, the compiler tool chains; people are educated to understand and program and build those systems. So anytime we want to make a big change, [it has] to be competitive and as usable as what we know of today," Li said.

On balance, she expects more incremental changes. "I think it's not going to be just a big jump and we'll get there tomorrow. We have to build on small steps, looking at building on existing understanding and also evolving along with the application requirements. I do think that there will be places where we can increase energy efficiency. If we're looking at the memory hierarchy, for example, we know caches and that they help us with performance. But it's also super inefficient from an energy performance standpoint. This has worked for a long time because traditional applications have good locality, but we are increasingly seeing new applications where [there] may not be as much locality, so there's a way for innovation in the memory hierarchy path. For example, we can design different memory, kind of reference patterns and infrastructures or applications that do not activate locality, for example. That will be one way of making the whole computing system much more efficient."

Li noted the trend toward specialized computing was another promising approach: "If we use a general-purpose computing system like a CPU, there's overhead that goes into fetching the instructions, decoding them. All of those overheads are not directly solving the problem, but it's just what you need to get the generality you need to solve all problems. Increasing specialization towards offloading different specialized tasks would be another kind of interesting perspective of approaching this problem."

There was an interesting exchange between Shankar and Stevens over the large amount of energy consumed in training today's large natural language processing models.

Shankar said, "I'm quoting from literature on deep neural networks or any of these image recognition networks. They scale quadratically with the number of data points. One of the latest things that is being hyped in the last few weeks is a trillion-parameter natural language processing [model]. So here are the numbers. To train one of those models, it takes the energy equivalent of four cars being driven a whole year, just to train the model, including the manufacturing cost of the car. That is how much energy is spent in the training on this, so there is a real problem, right?"

Not so fast, countered Stevens. "Consider using the same numbers for how much energy is going into Bitcoin, right? So the estimate is maybe something like 5 percent of global energy production. At least these neural network models are useful. They're not just used for natural language processing. You can use them for distilling knowledge. You can use them for imaging and so forth. I want to shift gears a little bit. Governments around the world and VCs are putting a lot of money into quantum computing, and based on what you were talking about, it's not clear to me that that's actually the right thing we should be doing. We have lots of opportunities for alternative computing models, alternative architectures that could open up spaces that we know in principle can work. We have classical systems that can do this," he said.

Today, there's an army of computational scientists around the world seeking ways to advance computing, some of them focused on the energy aspect of the problem, others on areas such as performance or capacity. It will be interesting to see if the framework and methodology embodied in Shankar's forthcoming paper not only provoke discussion but also provide a concrete way of comparing computing system efficiency.

Link to ANL video: https://vimeo.com/event/2081535/17d0367863

Brief Shankar Bio

Sadasivan (Sadas) Shankar is Research Technology Manager at SLAC National Laboratory and Adjunct Professor in Stanford Materials Science and Engineering. He is also an Associate in the Department of Physics in the Harvard Faculty of Arts and Sciences, was the first Margaret and Will Hearst Visiting Lecturer at Harvard University, and was the first Distinguished Scientist in Residence at the Harvard Institute of Applied Computational Sciences. He has co-instructed classes related to materials, computing, and sustainability and was awarded the Harvard University Teaching Excellence Award. He is involved in research in materials, chemistry, and specialized AI methods for complex problems in the physical and natural sciences, and new frameworks for studying computing. He is a co-founder and the Chief Scientist of Material Alchemy, a last-mile translational and independent venture for sustainable design of materials.

Dr. Shankar was a Senior Fellow at UCLA-IPAM during a program on Machine Learning and Many-body Physics, an invited speaker at The Camille and Henry Dreyfus Foundation on the application of Machine Learning to chemistry and materials, a Carnegie Science Foundation panelist for Brain and Computing, a National Academies speaker on Revolutions in Manufacturing through Mathematics, an invitee to a White House event for the Materials Genome, a Visiting Lecturer at the Kavli Institute of Theoretical Physics at UC-Santa Barbara, and the first Intel Distinguished Lecturer at Caltech and MIT. He has given several colloquia and lectures at universities all over the world. Dr. Shankar also worked in the semiconductor industry in the areas of materials, reliability, processing, and manufacturing, and is a co-inventor on over twenty patent filings. His work has also been featured in the journal Science and as a TED talk.

Go here to read the rest:
ANL Special Colloquium on The Future of Computing - HPCwire

$5 million from Boeing will support UCLA quantum science and technology research | UCLA – UCLA Newsroom

UCLA has received a $5 million pledge from Boeing Co. to support faculty at the Center for Quantum Science and Engineering.

The center, which is jointly operated by the UCLA College Division of Physical Sciences and the UCLA Samueli School of Engineering, brings together scientists and engineers at the leading edge of quantum information science and technology. Its members have expertise in disciplines spanning physics, materials science, electrical engineering, computer science, chemistry and mathematics.

"We are grateful for Boeing's significant pledge, which will help drive innovation in quantum science," said Miguel García-Garibay, UCLA's dean of physical sciences. "This remarkable investment demonstrates confidence that UCLA's renowned faculty and researchers will spur progress in this emerging field."

"UCLA faculty and researchers are already working on exciting advances in quantum science and engineering," García-Garibay said. "And the division's new one-year master's program, which begins this fall, will help meet the huge demand for trained professionals in quantum technologies."

Quantum science explores the laws of nature that apply to matter at the very smallest scales, like atoms and subatomic particles. Scientists and engineers believe that controlling quantum systems has vast potential for advancing fields ranging from medicine to national security.

"Harnessing quantum technologies for the aerospace industry is one of the great challenges we face in the coming years," said Greg Hyslop, Boeing's chief engineer and executive vice president of engineering, test and technology. "We are committed to growing this field of study and our relationship with UCLA moves us in that direction."

In addition to its uses in aerospace, examples of quantum theory already in action include superconducting magnets, lasers and MRI scans. The next generation of quantum technology will enable powerful quantum computers, sensors and communication systems and transform clinical trials, defense systems, clean water systems and a wide range of other technologies.

"Quantum information science and technology promises society-changing capabilities in everything from medicine to computing and beyond," said Eric Hudson, UCLA's David S. Saxon Presidential Professor of Physics and co-director of the center. "There is still, however, much work to be done to realize these benefits. This work requires serious partnership between academia and industry, and the Boeing pledge will be an enormous help in both supporting cutting-edge research at UCLA and creating the needed relationships with industry stakeholders."

The Boeing gift complements recent support from the National Science Foundation, including a $25 million award in 2020 to the multi-university NSF Quantum Leap Challenge Institute for Present and Future Quantum Computation, which Hudson co-directs. And in 2021, the UCLA center received a five-year, $3 million traineeship grant for doctoral students from the NSF.

Founded in 2018, the Center for Quantum Science and Engineering draws from the talents and creativity of dozens of faculty members and students.

"Boeing's support is a huge boost for quantum science and engineering at UCLA," said Mark Gyure, executive director of the center and a UCLA adjunct professor of electrical and computer engineering at the UCLA Samueli School of Engineering. "Enhancing the Center for Quantum Science and Engineering will attract additional world-class faculty in this rapidly growing field and, together with Boeing and other companies in the region, establish Los Angeles and Southern California as a major hub in quantum science and technology."

Go here to see the original:
$5 million from Boeing will support UCLA quantum science and technology research | UCLA - UCLA Newsroom

Learn Quantum Computing with Python and Q# – iProgrammer

Author: Dr. Sarah Kaiser and Dr. Chris Granade
Publisher: Manning
Date: June 2021
Pages: 384
ISBN: 978-1617296130
Print: 1617296139
Kindle: B098BNK1T9
Audience: Developers interested in quantum computing
Rating: 4.5
Reviewer: Mike James

Quantum - it's the future...

...or not, depending on your view of the idea. The idea is fairly simple even if the implementation turns out to be next to impossible. Quantum Mechanics is a strange theory, but it is one that seems to work, and the idea of using its insights to compute things is fairly reasonable. After all, QM is the way the world works things out as it creates reality. This book is an attempt to convey the ideas of quantum computing to the average programmer with minimal math. I say minimal because getting the idea isn't really possible without math and implementing the ideas involves math, so you can't avoid it.

I started off with the idea that this task, quantum computing with minimal math, wasn't doable and at the end of reading the book I'm even more convinced that it isn't the way to go. Quantum computing is, as already suggested, heavy on math. If you can't handle the math then you are going to have a tough time understanding what is going on. More to the point, ideas that I have in my head that are summarized by math occupy pages of a book that avoids that math. Far from being more complex, the math makes it simpler and provides shortcuts that make thinking about it actually possible.

I have to say that my BSc degree was heavy on QM and more recently I did an advanced course on quantum computing, so I was expecting this book to be a quick read and a refresher. Far from it. I had to read, and re-read several times, descriptions of things that I thought I knew in an effort to make the connection between the long descriptions and the simple math in my head. I'm sure that this is going to be the experience of many readers who lack the math in their heads and are trying to see the general principles in the very wordy explanations. This is not the book's fault. If there could be a book that did the job this would be it - well written with a dash of humour, interest and passion - but I don't think it works.

The first section is called Getting Started and this is a very slow and gentle intro to the basics of what quantum computing is all about - qubits, states, randomness and so on. The examples are quantum encryption, key distribution, non-local games and teleportation. They all sound exciting, but the reality is fairly simple once you get the idea. All of the programs in this section are in Python.

Part 2 is about algorithms and it is expressed in Q#. On balance I think that the entire book would be better just using Q#, but it's a matter of opinion. A whole chapter is devoted to the Deutsch-Jozsa algorithm which, if you understand QM, is one of the easiest of the quantum algorithms to understand. It is also the simplest such algorithm that shows an advantage over a classical algorithm. It took me a short time to understand using the math when I first encountered it, but here it took me some hours to dig through the non-math explanation, and at the end I still don't think that you get the idea that it's all based on parity. Classically parity is difficult to measure, but in QM it's a natural measurement.
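For readers who do want the math, the parity point fits in a few lines. Below is a minimal NumPy simulation of Deutsch's algorithm (the two-qubit special case of Deutsch-Jozsa), written from the standard textbook circuit rather than taken from the book's Q# listings: one oracle query decides whether f is constant or balanced, where a classical approach needs two.

```python
# Deutsch's algorithm simulated with a 2-qubit state vector.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

def oracle(f):
    """U_f |x>|y> = |x>|y XOR f(x)> as a 4x4 permutation matrix."""
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1
    return U

def deutsch(f):
    state = np.kron([1, 0], [0, 1])            # |0>|1>
    state = np.kron(H, H) @ state              # superposition over both inputs
    state = oracle(f) @ state                  # single oracle call; phase kickback
    state = np.kron(H, np.eye(2)) @ state      # interfere, then measure qubit 0
    p1 = state[2] ** 2 + state[3] ** 2         # probability qubit 0 reads 1
    return "balanced" if p1 > 0.5 else "constant"

print(deutsch(lambda x: 0))      # constant
print(deutsch(lambda x: x))      # balanced
```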

Part 3 is called Applied Quantum Computing and I was looking forward to this because the only really deep quantum algorithms I learned back in the day were Grover's and Shor's. I was hoping to broaden my horizons. The first chapter covers quantum annealing and this was interesting because it's not a mainstream area of quantum computing but one that has many practical applications. The only problem is quantum annealing is really too close to quantum analog computing for my tastes. It is basically a universal quantum simulator that can solve many ground state problems - invaluable but not inspiring. After this I encountered two more algorithms - Grover's and Shor's. Well, yes, any book on quantum computing has to cover them, but there is nothing else. Are we really expending huge efforts on building quantum computers just to implement two algorithms? My guess is that the answer is no - we are expending huge effort to run just Shor's algorithm so that we can crack codes. This book does little to convince me that quantum computers have much more to offer, but I hope I'm wrong.
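Grover's algorithm is similarly compact when written as linear algebra. The sketch below is the standard textbook construction for a four-item search, not code from the book: one oracle call plus one diffusion step finds the marked index with certainty, versus an average of two to three classical lookups.

```python
# One Grover iteration over 4 items.
import numpy as np

N, marked = 4, 2
state = np.full(N, 1 / np.sqrt(N))                       # uniform superposition

oracle = np.eye(N); oracle[marked, marked] = -1          # flip the marked item's phase
diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)       # inversion about the mean

state = diffusion @ (oracle @ state)                     # ~sqrt(N) iterations in general
print(np.argmax(state ** 2), (state ** 2).round(3))      # 2 [0. 0. 1. 0.]
```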

My final verdict is that this is about as good as a non-math-oriented introduction to quantum computing gets. Be warned, there are equations and mathematics that keep peeking through at every turn. You cannot avoid it, but you don't need much math to cope. What I would conclude, however, is that it is much easier to learn the math first and then learn the QM that is needed for quantum computing. In my opinion the math makes it easier.

To keep up with our coverage of books for programmers, follow @bookwatchiprog on Twitter or subscribe to I Programmer's Books RSS feed for each day's new addition to Book Watch and for new reviews.

See the article here:
Learn Quantum Computing with Python and Q# - iProgrammer