Predicting the Failure of Quantum Computing

Quantum computing will fail before it succeeds. That’s not a criticism of quantum computing. It’s more a commentary on the difficulty of deploying solutions based on cutting-edge innovation.

In 2020, the human species has extensive experience with new technologies. I can predict the trajectory of quantum computing with a fair degree of confidence (having lived through it many times in the data industry):

  1. Well-intentioned researchers and developers articulate the promise of quantum computing. Journalists hype it. Leaders and executives come to believe that quantum will solve their most challenging problems. Enterprises spend large sums on quantum tools and infrastructure. Analysts start to track the quantum computing market. Investors bestow millions on entrepreneurs.
  2. Thought leaders narrowly focus on metrics like computational speed. Enterprises deploy the first quantum computing applications without understanding that the quantum component is actually just a small part of a much larger (and much less exciting) system. Enterprises spend a lot of money on quantum and too little on everything else. No one thinks about annoying things like field service, corner cases, enhancements, maintenance, and customer support.
  3. The highly compensated gurus who know how to create quantum solutions become bottlenecks. They work around the clock, yet still deliver fewer features, and later, than initially planned. Data riddled with errors flows into quantum applications, corrupting results. The slow and error-prone processes for deploying quantum application enhancements and updates rely on complex manual steps. Groups managing subsystems of quantum applications engage in turf wars. Projects exceed their allocated budgets. Applications underdeliver on their promise. After painful field trials, prospects refrain from becoming customers.
  4. Researchers and developers transparently discuss their underwhelming results at conferences. Industry analysts publish surveys indicating that only 3% of quantum projects meet performance targets. Journalists write articles about how quantum is “not ready for prime time.” CIOs and CDOs decide to spend more time with family.
  5. Everyone else resumes doing what they were doing before they ever heard the term “quantum computing.”

The disaster scenario described above says nothing about quantum computing or its eventual success. It applies to any new technology. Today, we see it happening in AI and machine learning, which are transitioning from step 3 to step 4. Like AI and machine learning, quantum computing will have a bright future when business leaders design processes and methodologies that deliver high-quality results with minimal cycle time. The keys to success with quantum computing include governing data quality using statistical and process controls, delivering new applications rapidly and continuously using DevOps methods, and iterating on quantum models, guided by customer feedback, using Agile development. In the context of analytics and big data, these methods are collectively called DataOps.
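To make the statistical-process-control idea concrete, here is a minimal sketch in Python of the kind of check a DataOps pipeline might run before data reaches a downstream application, quantum or classical. The sample values, the three-sigma limits, and the function names are illustrative assumptions, not part of any particular product or toolchain.

```python
# A minimal sketch of a statistical process control (SPC) gate on incoming data.
# All values and names below are hypothetical examples for illustration only.
from statistics import mean, stdev

def control_limits(history, sigmas=3.0):
    """Derive lower/upper control limits from historical 'known good' values."""
    mu, sd = mean(history), stdev(history)
    return mu - sigmas * sd, mu + sigmas * sd

def out_of_control(batch, lower, upper):
    """Return the values in the new batch that fall outside the control limits."""
    return [x for x in batch if not (lower <= x <= upper)]

if __name__ == "__main__":
    history = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.7, 10.3]  # past, trusted readings
    lower, upper = control_limits(history)

    new_batch = [10.1, 9.9, 42.0]  # 42.0 is an obvious out-of-control value
    bad = out_of_control(new_batch, lower, upper)
    if bad:
        # Fail fast: stop the pipeline before corrupted data reaches the application.
        raise SystemExit(f"Out-of-control values detected: {bad}")
    print("Batch passed statistical process control; safe to publish downstream.")
```

The design point is the same one the hype-cycle story makes: the check itself is mundane, but running it automatically on every batch is what keeps error-riddled data from silently corrupting results downstream.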

The ability to execute quantum algorithms very fast will surely change everything. The fact that humans will deploy and manage these capabilities will prove that nothing at all has changed. Enterprises can accelerate the success of quantum computing by decades using DataOps to eliminate errors and compress application development cycle time. The irony is that the DataOps tools and methods that enterprises implement today can make or break their future with quantum computing.
