ANL Special Colloquium on The Future of Computing – HPCwire
There are, of course, myriad ideas about computing's future. At yesterday's Argonne National Laboratory Director's Special Colloquium, The Future of Computing, guest speaker Sadasivan Shankar did his best to convince the audience that the high energy cost of the current computing paradigm (not just the economic cost; we're talking entropy here) is fundamentally undermining computing's progress, such that it will never be able to solve today's biggest challenges.
The broad idea is that the steady abstracting away of informational content from each piece of modern computing's complicated assemblage (chips, architecture, programming) inexorably increases the cumulative energy cost, leading toward a hard ceiling. Leaving aside, for a moment, the decline of Moore's law (just a symptom, really), it is the separation (abstraction) of information from direct computation that is the culprit, argues Shankar. Every added step adds energy cost.
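To ground the entropy framing: Landauer's principle (a standard thermodynamic result, added here for context rather than a figure from the talk) sets a floor of kT ln 2 joules per bit erased, so every abstraction step that shuffles and discards information carries an irreducible energy cost. A back-of-envelope sketch:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)
T = 300.0           # room temperature, kelvin

# Landauer limit: minimum energy dissipated to erase one bit at temperature T.
landauer_j_per_bit = K_B * T * math.log(2)

# Erasing one gigabyte (8e9 bits) at the Landauer limit.
# Real hardware dissipates many orders of magnitude more than this floor.
gigabyte_j = landauer_j_per_bit * 8e9
```

At room temperature this works out to roughly 3e-21 J per bit; the gap between that floor and what conventional hardware actually spends is the headroom Shankar's argument points at.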
Nature, on the other hand, bakes information into things. Consider, said Shankar, how a string of amino acids folds into its intended 3-D conformation on a tiny energy budget and in a very short time, just by interacting with its environment, and contrast that with the amount of compute required (i.e., energy expended) to accurately predict protein folding from a sequence of amino acids. Shankar, research technology manager at SLAC National Laboratory and adjunct Stanford professor, argues that computing must take a lesson from nature and strive to pack information more tightly into applications and compute infrastructure.
Information theory is a rich field with a long history of debate. Turning theory into practice has often proven more difficult and messy. Shankar and his colleagues have been developing a formal framework for classifying the levels of information content in human-made computation schemes and natural systems in a way that permits direct comparison between the two. The resulting scale has eight classification levels (0-7).
There's a lot to digest in Shankar's talk. Rather than go off the rails here with a garbled explanation, it's worth noting that Argonne has archived the video and that Shankar has a far-along paper expected in a couple of months. No doubt some of his ideas will stir conversation. Given that Argonne will be home to Aurora, the exascale supercomputer now being built at the lab, it was an appropriate site for a talk on the future of computing.
Before jumping into what the future may hold, here's a quick summary of Shankar's two driving points: 1) Moore's law, or more properly the architecture and semiconductor technology on which it rests, is limited, and 2) the growing absolute energy cost of information processing using traditional (von Neumann) methods is limiting.
A big part of the answer to the question of how computing must progress, suggested Shankar, is to take a page from Feynman's reverberating idea (not just for quantum computing) and emulate the way nature computes, "pack[ing] all of the information needed for the computing into the things themselves," or at least reducing abstraction as much as possible.
Argonne assembled an expert panel to bat Shankar's ideas around. The panel included moderator Rick Stevens (associate laboratory director and Argonne distinguished fellow), Salman Habib (director, Argonne computational science division and Argonne distinguished fellow), Yanjing Li (assistant professor, department of computer science, University of Chicago), and Fangfang Xia (computer scientist, data science and learning division, ANL).
Few quibbled with the high energy cost of computing as described by Shankar, but the panelists had a variety of perspectives on moving forward. One of the more intriguing comments came from Xia, an expert in neuromorphic computing, who suggested that using neuromorphic systems to discover new algorithms is a potentially productive approach.
"My answer goes back to the earlier point Sadas and Rick made, which is: if we're throwing away efficiency in the information-power conversion process, why don't we stay with biological systems for a bit longer? There's this interesting field called synthetic biological intelligence. They are trying to do these brain-computer interfaces, not in a Neuralink way, because that's still shrouded in uncertainty. But there is a company that grows these brain cells in a petri dish. Then they connect this to an Atari Pong game. And you can see that after just 10 minutes, these brain cells self-organize into neural networks, and they can learn to play the game," said Xia.
"Keep in mind, this is 10 minutes in real life, not simulation time. It's only dozens of games, just like how we pick up games. So this data efficiency is enormous. What I find particularly fascinating about this is that in this experiment there was no optimization goal. There is no loss function you have to tweak. The system, when connected in this closed-loop fashion, will just learn in an embodied way. That opens so many possibilities; you think about all these dishes, just consuming glucose, you can have them learn latent representations, maybe to be used in digital models."
Li, a computer architecture expert, noted that general-purpose computing infrastructure has existed in largely its present form for a long time.
"I remember this is the same architecture of processor design I learned at school, and I still teach the same materials today. For the most part, when we're trying to understand how CPUs work, and even some of the GPUs, those have been around for a long time. I don't think there have been a lot of revolutionary changes to those architectures. There's a reason for that: we have developed good toolchains, the compiler toolchains; people are educated to understand and program and build those systems. So anytime we want to make a big change, [it has] to be competitive and as usable as what we know of today," Li said.
On balance, she expects more incremental changes. "I think it's not going to be just a big jump and we'll get there tomorrow. We have to build on small steps, building on existing understanding and also evolving along with the application requirements. I do think that there will be places where we can increase energy efficiency. If we're looking at the memory hierarchy, for example, we know caches help us with performance. But they're also super inefficient from an energy standpoint. This has worked for a long time because traditional applications have good locality, but we are increasingly seeing new applications that may not have as much locality, so there's room for innovation in the memory hierarchy path. For example, we can design different memory reference patterns and infrastructures for applications that do not exhibit locality. That would be one way of making the whole computing system much more efficient."
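Li's locality point can be illustrated with a toy traversal (my sketch, not an example from the talk): the same sum over a matrix, walked row by row versus column by column. On cached hardware the sequential walk enjoys spatial locality, while the strided walk tends to miss; the effect is most pronounced in compiled languages over contiguous arrays, so treat this pure-Python version as a picture of the access pattern rather than a benchmark.

```python
# Build a small matrix as a list of rows; consecutive elements of a
# row are the "cache-friendly" direction in a row-major layout.
N = 512
matrix = [[i * N + j for j in range(N)] for i in range(N)]

def sum_row_major(m):
    # Sequential access: walks each row in order, exploiting spatial locality.
    total = 0
    for row in m:
        for x in row:
            total += x
    return total

def sum_col_major(m):
    # Strided access: each step jumps to the next row, defeating locality
    # on hardware where a row's elements share cache lines.
    total = 0
    n = len(m)
    for j in range(n):
        for i in range(n):
            total += m[i][j]
    return total

# Both traversals compute the same value; only the memory access
# pattern (and thus cache behavior on real arrays) differs.
assert sum_row_major(matrix) == sum_col_major(matrix)
```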
Li noted the trend toward specialized computing as another promising approach: "If we use a general-purpose computing system like a CPU, there's overhead that goes into fetching the instructions, decoding them. All of those overheads are not directly solving the problem; they're just what you need to get the generality you need to solve all problems. Increasing specialization toward offloading different specialized tasks would be another interesting perspective on approaching this problem."
There was an interesting exchange between Shankar and Stevens over the large amount of energy consumed in training today's large natural language processing models.
Shankar said, "I'm quoting from literature on deep neural networks or any of these image recognition networks. They scale quadratically with the number of data points. One of the latest things being hyped in the last few weeks is a trillion-parameter natural language processing [model]. So here are the numbers. To train one of those models takes the energy equivalent of four cars being driven for a whole year, including the manufacturing cost of the cars, just to train the model. That is how much energy is spent in the training, so there is a real problem, right?"
Not so fast, countered Stevens. "Consider using the same numbers for how much energy is going into Bitcoin, right? The estimate is maybe something like 5 percent of global energy production. At least these neural network models are useful. They're not just used for natural language processing. You can use them for distilling knowledge. You can use them for imaging and so forth. I want to shift gears a little bit. Governments around the world and VCs are putting a lot of money into quantum computing, and based on what you were talking about, it's not clear to me that that's actually the right thing we should be doing. We have lots of opportunities for alternative computing models, alternative architectures that could open up spaces that we know in principle can work. We have classical systems that can do this," he said.
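The quadratic-scaling claim in the exchange above implies a stark doubling rule: twice the data, four times the training cost. A toy model (my illustration, not Shankar's numbers):

```python
def training_cost(n_points, cost_per_unit=1.0):
    """Hypothetical model in which training cost scales as n^2
    with the number of data points, per the scaling Shankar cites."""
    return cost_per_unit * n_points ** 2

# Doubling the dataset quadruples the cost under quadratic scaling,
# which is why energy budgets balloon as models ingest more data.
ratio = training_cost(2_000_000) / training_cost(1_000_000)
```

Any constant factor (hardware efficiency, cost per unit) cancels in the ratio; the scaling exponent alone drives the growth.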
Today, there's an army of computational scientists around the world seeking ways to advance computing, some of them focused on the energy aspect of the problem, others on areas such as performance or capacity. It will be interesting to see whether the framework and methodology embodied in Shankar's forthcoming paper not only provoke discussion but also provide a concrete way to compare computing system efficiency.
Link to ANL video: https://vimeo.com/event/2081535/17d0367863
Brief Shankar Bio
Sadasivan (Sadas) Shankar is Research Technology Manager at SLAC National Laboratory and Adjunct Professor in Materials Science and Engineering at Stanford. He is also an Associate in the Department of Physics in the Harvard Faculty of Arts and Sciences, and was the first Margaret and Will Hearst Visiting Lecturer at Harvard University and the first Distinguished Scientist in Residence at the Harvard Institute for Applied Computational Science. He has co-instructed classes related to materials, computing, and sustainability and was awarded the Harvard University Teaching Excellence Award. His research spans materials, chemistry, specialized AI methods for complex problems in the physical and natural sciences, and new frameworks for studying computing. He is a co-founder and the Chief Scientist of Material Alchemy, a last-mile translational and independent venture for sustainable design of materials.
Dr. Shankar was a Senior Fellow at UCLA-IPAM during a program on machine learning and many-body physics, an invited speaker at The Camille and Henry Dreyfus Foundation on applications of machine learning to chemistry and materials, a Carnegie Science Foundation panelist on the brain and computing, a National Academies speaker on revolutions in manufacturing through mathematics, an invitee to the White House event for the Materials Genome Initiative, a Visiting Lecturer at the Kavli Institute for Theoretical Physics at UC Santa Barbara, and the first Intel Distinguished Lecturer at Caltech and MIT. He has given colloquia and lectures at universities all over the world. Dr. Shankar also worked in the semiconductor industry in the areas of materials, reliability, processing, and manufacturing, and is a co-inventor on over twenty patent filings. His work has been featured in the journal Science and in a TED talk.