Quantum Computing: Is It The Next Data Centre Revolution, Or Just Another Tech Hype Cycle?

An Impact News article titled “The Rise of Quantum is Shifting Attention to the World’s Data Centers” reports that quantum computing is moving out of research laboratories and closer to commercial use. For decades, the technology lived in academic journals. Now executives speak about deployment inside data centres before the decade ends.

Zulfi Alam, corporate vice president of quantum at Microsoft, said, “By the end of the decade, we are confident that we will have machines in data centers that have commercial value.” He added, “I would not be able to say this with this much clarity last year, but this year, I can state to claim that by 2029 you will have machines that will have commercial [value], meaning that they will be doing calculations that classical machines cannot do.”

Microsoft last year unveiled Majorana 1, a quantum computing chip. Rivals such as Google and Amazon are also racing to develop quantum systems through their cloud networks. The Impact News report says industry road maps point to 2028 to 2032 for early commercial deployment.

Madeleine Jenkins, an analyst at UBS, said, “A lot of companies are telling me that 2027 is going to be a big year for quantum in terms of roadmap, in terms of what’s achieved.” A 103-page UBS report in January said the sector is nearing completion of a quantum computer that could cost tens of millions of dollars to build but solve in 200 seconds a problem that would take a conventional supercomputer 10,000 years.


How Could Quantum Change Data Centres?

The Impact News article argues that quantum’s advance could reshape the design and economics of data centres. Quantum computers use qubits, which can exist in multiple states at once, allowing them to tackle certain problems far faster than classical systems.
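To make the qubit idea concrete: a single qubit can be simulated classically as a pair of complex amplitudes, and a Hadamard gate rotates the definite state |0⟩ into an equal superposition. A minimal sketch in plain Python (illustrative only; no quantum SDK or vendor hardware is assumed):

```python
import math

# A qubit is a pair of complex amplitudes (a, b) for the basis states |0> and |1>.
ket0 = (1 + 0j, 0 + 0j)  # the definite state |0>

def hadamard(state):
    """Apply a Hadamard gate, which rotates |0> into an equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

state = hadamard(ket0)

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = [abs(amp) ** 2 for amp in state]
print(probs)  # roughly [0.5, 0.5]: equal chance of reading 0 or 1
```

Real machines hold many such qubits whose joint state grows exponentially (n qubits need 2^n amplitudes), which is exactly why classical simulation runs out of road and native quantum hardware becomes interesting for certain problems.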

Jenkins said quantum computing would require a “fraction of what a data center would use” for certain workloads. “The big thing is time; if you’re taking the same problem that would take thousands and thousands of… hours, and you’re replacing that with a quantum computer that takes seconds or minutes, then obviously you just need a lot less energy,” she said.
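Jenkins’ argument reduces to energy = power × time. A back-of-envelope comparison makes the scale visible; all figures below are illustrative assumptions, not numbers from the report:

```python
# Energy = power * time. All figures are illustrative assumptions.
classical_power_w = 20e6          # assume a 20 MW supercomputer
classical_time_s = 1000 * 3600    # "thousands of hours": take 1,000 hours

quantum_power_w = 10e3            # assume a ~10 kW quantum system (plus its host HPC)
quantum_time_s = 200              # "seconds or minutes": take 200 seconds

classical_kwh = classical_power_w * classical_time_s / 3.6e6  # joules -> kWh
quantum_kwh = quantum_power_w * quantum_time_s / 3.6e6

print(f"classical: {classical_kwh:,.0f} kWh, quantum: {quantum_kwh:.2f} kWh")
# -> classical: 20,000,000 kWh, quantum: 0.56 kWh
```

Even if the real numbers differ by an order of magnitude, the time term dominates: cutting hours to seconds is what collapses the energy bill, which is Jenkins’ point.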

Alam said Majorana 1 is “showcasing more power than the entire computation of the entire planet [in] the palm of your hands and it’s not running super-hot. It’s running cold.” He also said, “A quantum machine is not a standalone entity. It’s a hybrid tool. It’s a quantum accelerator that needs a high-performance computer very close to it.”

Ellie Brown of S&P Global Market Intelligence said, “Ideally, the entire efficiency of a problem-solving workload will go down, but it’s not going to be a complete substitution.” Patrick Moorhead of Moor Insights and Strategy said quantum systems may operate inside dedicated “quantum pods” within data centres, adding new operational complexity.

Is Quantum Computing The Next Data Centre Revolution, Or Just Another Tech Hype Cycle?

Experts share their thoughts…

Our Experts:

  • Burkan Bur, MBA, Managing Director, Head of SEO, The Ad Firm
  • Pavel Efremov, Director, FinchTrade
  • David Blumenfeld, NextRivet
  • Kevin Hilscher, Sr. Director, Product Management – PQC & Device Trust, DigiCert
  • Michael J. Attisha, Attorney on Quantum Technologies, Greenberg Traurig Boston
  • Abhishek Chopra, Founder, CEO, and Chief Scientific Officer, BQP
  • Mike Litvinenko, Founder, Medical Excellence League
  • Arie Brish, St Edward’s University
  • Edward Tian, CEO, GPTZero
  • Scott Dylan, Founder, NexaTech Ventures


Burkan Bur, MBA, Managing Director, Head of SEO, The Ad Firm

“I have watched server farms process huge amounts of data for twenty years. Most leaders think existing hardware is fine, but big models demand processing power that standard server configurations simply can’t deliver. Adding more chips brings heat and expense problems. That is why I look towards quantum processing: such systems can handle complex generation, such as building large synthetic mirrored-text data sets, far faster than traditional hardware.

“That infrastructure shift changes the game for long-term growth. The way we build hardware today is running into a wall, and brands keep hitting the limits of classical computation.

“Server farms are hitting a massive wall as we push for more automation. Most facilities don’t have the room to hold the millions upon millions of training examples required to develop a modern LLM. These spaces take years of planning to construct, so we need to start now. Based on my years in the field, we need specialised hardware that can run these heavy models within tight VRAM limits. The hardware shift comes down to speed and accuracy when scoring text written by humans, and naturally bots too, and older setups just can’t keep up with the layers of revision people expect today. If you want efficiency, you have to move to these new architectures.

“Research indicates that around two thirds of writing prompts ask a model to change something already in the text. That pattern proves people want real tools to edit and polish their work right now. With the help of our fast training algorithms, we select examples that are difficult to learn, where the learning signal is high.

“Actual data on human-bot collaboration is more powerful than any blanket wishes about the future of technology. Verified detection scores are more powerful than trends, and focusing on these real usage patterns is how we can build something that will last.”


Pavel Efremov, Director, FinchTrade

“The way I see it, quantum computing will matter enormously. Just not yet, and probably not in the way most people expect. From what we have seen in tech over the past few years, the pattern repeats: a promising technology emerges, investment accelerates, timelines compress, and then reality asserts itself. Call it quantum, call it AI. The name changes, but the cycle doesn’t. And right now, quantum is nowhere near where AI was when serious commercial money started moving.

“Is it hype? Partly. But the more useful question for data centres isn’t whether quantum is coming. It’s what to do in the meantime. The real near-term pressure isn’t new processing power. It’s security. The first encryption standards built to withstand quantum computers have already been published. For most people, that went unnoticed. For governments and security teams, it started a clock. Migration deadlines are already being set, and the organisations paying attention are preparing now.”
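
The standards Efremov refers to are NIST’s post-quantum suite finalized in August 2024: FIPS 203 (ML-KEM), FIPS 204 (ML-DSA), and FIPS 205 (SLH-DSA). A hedged sketch of the first migration step, classifying a crypto inventory by quantum exposure (the inventory and helper below are illustrative, not a real audit tool):

```python
# Illustrative classification of crypto algorithms by quantum exposure.
# Public-key schemes based on factoring or discrete logs fall to Shor's
# algorithm at scale; the PQC set lists NIST's 2024 standards.
QUANTUM_VULNERABLE = {"RSA-2048", "RSA-4096", "ECDSA-P256", "ECDH-X25519"}
PQC_STANDARDS = {"ML-KEM-768", "ML-DSA-65", "SLH-DSA-SHA2-128s"}  # FIPS 203/204/205

def classify(algorithm: str) -> str:
    """Return a migration verdict for one algorithm name."""
    if algorithm in QUANTUM_VULNERABLE:
        return "migrate"   # replace before cryptographically relevant QCs arrive
    if algorithm in PQC_STANDARDS:
        return "ok"        # already post-quantum
    return "review"        # symmetric ciphers, hashes, anything unrecognised

inventory = ["RSA-2048", "ML-KEM-768", "AES-256-GCM"]
print({alg: classify(alg) for alg in inventory})
# -> {'RSA-2048': 'migrate', 'ML-KEM-768': 'ok', 'AES-256-GCM': 'review'}
```

In practice this is the "clock" Efremov describes: the hard work is finding every place a vulnerable algorithm is used, long before swapping it out.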


David Blumenfeld, NextRivet

“In short, the practical use of quantum computers will rely on a “symbiotic” relationship between traditional computers and quantum computers, with AI playing a big part in that connective tissue. Quantum will therefore increase data centre and energy demand for both traditional and quantum machines. This is definitely an instance where more means more.”


Kevin Hilscher, Sr. Director, Product Management – PQC & Device Trust, DigiCert

“Quantum computers don’t resemble today’s rack-mount servers and storage. Running quantum computers in a data centre will resemble a process-manufacturing facility more than a traditional data centre, with floorspace dominated by support equipment such as cryogenics.”


Michael J. Attisha, Attorney on Quantum Technologies, Greenberg Traurig Boston

“Quantum computing will integrate with AI and contribute to a new phase of data centre evolution, but will not grow in the same way that AI has. The AI boom has reshaped data centres because it scales with traditional computing: more GPUs, more memory, more power, etc. Quantum computers don’t scale that way, as they are instead specialized processors with tightly controlled environmental requirements such as cryogenic cooling. Quantum computers will also, at least for the foreseeable future, have relatively low throughput, making them more like cloud-accessible accelerators than replacements for data centre computing racks. Their impact will therefore be incremental and targeted rather than a wholesale infrastructure revolution.

“Is this actually a plausible, realistic expectation, or is it just adding to the AI bubble? The underlying science and engineering of quantum computing has been advancing for decades and has reached early commercial maturity. Quantum computers should dramatically improve certain computationally intensive tasks such as molecular simulation, materials discovery and cryptography, but are unlikely to transform consumer-facing AI applications like chatbots or media generation.

“Adoption will likely be driven by research institutions, government programs and specialized commercial sectors rather than mass-market demand, and integration will likely occur through a limited number of specialized facilities connected to conventional data centres. Although AI and quantum computing target similar classes of computationally intensive problems, the AI-era hype should not be projected onto quantum computing, which is progressing along a credible but narrower and longer-term path.”


Abhishek Chopra, Founder, CEO, and Chief Scientific Officer, BQP

“Quantum computing is often described as either the next infrastructure revolution or just another hype cycle. The reality is more measured. Today’s quantum systems are not positioned to replace traditional data centres. The hardware remains early stage, limited in scale, and highly specialized, while data centres are built for reliability and broad enterprise workloads.

“Where quantum does show promise is in specific, high-complexity problems such as advanced engineering simulations and optimization. In those areas, even incremental gains can matter. The more realistic path forward is hybrid infrastructure, where classical high-performance computing remains foundational and quantum systems are integrated selectively.

“AI and quantum are on very different maturity curves. AI scales on existing infrastructure, while quantum still requires significant hardware and engineering progress. Rather than a sudden data centre revolution, what is more plausible is gradual integration into targeted industrial use cases over time.”


Mike Litvinenko, Founder, Medical Excellence League

“I’d say quantum computing in its current form is still hype.

“The technology still needs a highly controlled environment, and the industry is still working toward fault-tolerant machines that can run useful workloads reliably. IBM’s own plan targets a large-scale fault-tolerant system in 2029, which tells you where that timeline sits for practical, durable machines. Even Google’s recent progress on quantum error correction points the same way.”


Arie Brish, St Edward’s University

“I have been a technology executive and investor for almost 50 years.

“I designed an AI tool as early as 1985-ish, and have been involved on and off with other AI-related projects ever since.

“During those 50 years I have been involved in several “stretch the envelope”, state-of-the-art technology cycles.

“I authored “Lay an Egg and Make Chicken Soup” which is an industry agnostic practical guide for commercializing innovations. The book became an Amazon best seller in the Entrepreneurship and Innovation genres. The last chapter in the book talks about Business & Technology futures, issues, opportunities, and risks.

“Most recently, I teach Innovation, Entrepreneurship, and Business Strategy classes at St Edward’s University in Austin, Texas.

“AI is here, alive and kicking. Quantum computing is still in its infancy. Whether it will end up being the next bubble remains to be seen; in the meantime it is at the beginning of its “pregnancy”. Many smart people are working on it, with some early signs of life, but high-volume commercial deployment is years out.”


Edward Tian, CEO, GPTZero

“Traditional data centres will not be dethroned by quantum computing. And unlike the rapid growth of AI, we should expect much slower growth and much more calibration of our expectations before we experience the growth of quantum infrastructure.

“The current generation of quantum systems is so specialized, requires such precise environmental conditions, and is limited to so few problem types (optimisation, cryptography, and certain kinds of simulation) that they are best considered complementary to classical infrastructure rather than an outright replacement for it.

“From today through the next several years, data centres will not be going quantum. Instead, we can expect a hybrid architecture in which a quantum processor is used for a very limited number of workloads, much as GPUs were first integrated into data centres to improve AI performance.

“There is much excitement around quantum computing, and typically that excitement is exaggerated, especially when quantum is hailed as the next exponential computing wave. But in contrast to AI, which can have an exponential impact on every area of an enterprise, quantum will only be able to affect certain domains, owing to the limits of economics and physics.

“In the next decade, especially with advances in AI scaling, efficiency measures, and specialised processors, data centres will see continued transformation, but little if any of it will come from quantum computing.”


Scott Dylan, Founder, NexaTech Ventures

“Quantum computing and data centres is a question I find genuinely interesting, because it sits at the intersection of real scientific progress and some deeply premature expectations. The short answer is: it’s neither pure hype nor an imminent revolution. It’s something more nuanced, and that nuance matters.

“Right now, we’re still firmly in what the industry calls the NISQ era — noisy intermediate-scale quantum computing. The machines that exist today are impressive as engineering achievements, but they remain error-prone, difficult to scale, and limited to a very narrow set of problems. Jensen Huang raised eyebrows at CES in early 2025 when he said practical quantum computing was 15 to 30 years away. Microsoft’s quantum team has been more optimistic, with their corporate VP of Quantum saying publicly that by 2029 we’ll have machines in data centres with genuine commercial value. The truth likely sits somewhere between those two positions.

“What I would push back on is the framing that quantum is simply “adding to the AI bubble.” They’re fundamentally different technologies solving fundamentally different problems. AI is already generating returns, reshaping business models, and straining data centre capacity to breaking point. UK data centres alone now account for roughly 2.5 per cent of national electricity consumption, and that figure is climbing fast. Quantum isn’t competing with that — it’s a parallel track entirely.

“Where quantum becomes relevant to data centres is in its potential to address the very energy problem AI is creating. A quantum processor can, for certain classes of computation, deliver results using a fraction of the power a classical supercomputer would need. QuEra’s 256-qubit machine, for instance, consumes less than 10 kilowatts — roughly a thousand times less than the average top-performing supercomputer. If we can get past the error correction challenges and actually scale these systems, the efficiency gains for specific workloads could be extraordinary.

“But that “if” is doing a lot of heavy lifting. The infrastructure requirements are substantial. Most quantum systems need cryogenic cooling to near absolute zero. The talent pipeline is thin — S&P analysts have flagged that we simply don’t have enough qualified quantum engineers to support broad deployment. And the industry still lacks basic standards for integrating quantum hardware into existing data centre environments, though the Open Compute Project is working on a quantum-ready certification framework, with a first draft expected sometime this year.

“The realistic timeline, based on where the investment and the engineering actually stand, puts meaningful commercial quantum integration into data centres somewhere in the 2028 to 2032 window. China has committed close to $18 billion in public quantum investment. The UK has laid out a £2 billion quantum roadmap, with £670 million earmarked for quantum computing and long-term funding for the National Quantum Computing Centre in Oxfordshire. These aren’t speculative bets — they’re strategic infrastructure plays by governments that see quantum as a matter of sovereignty as much as innovation.

“What data centre operators should be doing now isn’t ripping out racks to make room for quantum hardware. It’s thinking about hybrid architectures — environments where classical computing, AI accelerators, and eventually quantum processors can coexist. Microsoft, IBM, and others are already building cloud-accessible quantum systems that work alongside classical infrastructure. That hybrid model is almost certainly how this plays out in practice, not some wholesale replacement of existing technology.

“So is it hype? Parts of it, absolutely. The idea that quantum computing will transform data centres in the next two or three years is wishful thinking. But writing it off entirely would be a mistake. The science is real, the investment is serious, and the engineering is advancing faster than most people outside the field appreciate. The smart money isn’t betting on quantum replacing classical computing — it’s positioning for a world where the two work together, and the data centre of 2030 looks quite different from the one we’re building today.”