For the past decade, quantum computing has struggled to balance promise and practicality. While the world’s most advanced systems remain engineering marvels, they’re bedeviled by the same flaw: the fragility of qubits—the fundamental units of quantum information—and the delicate hardware required to control them. A single fluctuation in temperature or electromagnetic noise, for example, can collapse a quantum state, invalidating a computation.
Most quantum systems also depend on large-scale refrigeration colder than deep space, with cryogenic racks that often occupy multiple rooms. Scaling them demands exponential increases in cost, energy, and environmental stability. So while the U.N. has designated 2025 the International Year of Quantum Science and Technology, quantum’s commercial trajectory, for all the field’s scientific significance, remains narrow.
But Japanese conglomerate Nippon Telegraph and Telephone Corp. (NTT) is attempting to rewrite that equation. In partnership with Japan-based quantum technology developer OptQC, NTT is challenging current orthodoxy through what is known as optical quantum computing, which uses photons instead of electrical currents to perform calculations. Because photons generate less heat than electrons and can travel without resistance, these systems consume far less power.
NTT argues that optical systems can be faster and more energy-efficient, forming the basis for greener, more sustainable computing. “This combination not only accelerates computational capability but also reduces environmental impact, positioning quantum technology as a foundation for a sustainable digital future,” says Shingo Kinoshita, SVP and head of R&D planning at NTT.
Rather than relying on cryogenic cooling systems, NTT’s design uses light sources and error-correction technologies developed under its Innovative Optical and Wireless Network (IOWN) initiative.
Japan’s broader industrial strategy sits just beneath the surface of this partnership. With the U.S. and China locked in geopolitical competition over quantum supremacy, Japan’s photonic-first model is being positioned as an alternative: one that favors energy efficiency and manufacturability over extreme-environment engineering.
“Today, the energy footprint of AI is emerging as a global challenge. Optical quantum computing processes information with light, enabling dramatically lower power consumption and scalable qubit growth through optical multiplexing,” Kinoshita says.
A million-qubit road map
The approach builds on a series of rapid scientific breakthroughs across Japan’s quantum ecosystem.
Over the past year, NTT—alongside RIKEN, Fixstars Amplify, the University of Tokyo, and the National Institute of Information and Communications Technology—demonstrated the world’s first general-purpose optical quantum computing platform capable of running calculations without any external cooling.
The upcoming platform fits inside a single room, a feat few leading quantum-systems developers can claim.
NTT and OptQC outlined a five-year plan leading to the 2030 milestone. During the first year, the companies will conduct technical studies and begin codesigning while identifying early use cases with external partners. In the second year, they plan to build complete development environments for hardware and software.
In year three, they expect to begin verifying enterprise use cases such as drug development, financial optimization, materials science, and climate modeling. The final phase will focus on scaling the system to reach millions of qubits and making it reliable enough to handle real-world use cases, thereby preparing the technology for adoption among companies, governments, and industries.
For quantum computing to surpass the current capabilities of AI, qubit counts must scale into the thousands. Unlike the classical bits used in general-purpose computing, which exist as either 0 or 1, qubits can occupy a superposition of both states at once, allowing certain complex calculations to be processed exponentially faster.
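Why qubit count matters so much can be made concrete with a minimal NumPy sketch (illustrative only, not a model of any real hardware): an n-qubit register is described by 2**n complex amplitudes, so each added qubit doubles the state space that a classical machine would have to track.

```python
import numpy as np

def equal_superposition(n_qubits: int) -> np.ndarray:
    """State vector for n qubits in an equal superposition.

    A classical n-bit register holds one of 2**n values at a time;
    the quantum state below carries an amplitude for every value
    simultaneously.
    """
    dim = 2 ** n_qubits
    return np.full(dim, 1 / np.sqrt(dim))

state = equal_superposition(3)   # 3 qubits -> 2**3 = 8 amplitudes
probs = np.abs(state) ** 2       # Born rule: probability = |amplitude|^2

print(len(state))                # 8 basis states tracked at once
print(probs.sum())               # probabilities sum to 1
```

Simulating even a few hundred qubits this way is already intractable classically, which is the core of quantum computing's promised advantage.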
“The 2030 vision of 1 million qubits is not just about performance, it’s about redefining how we align advanced computing with planetary limits,” Kinoshita says. “In the near term, as we aim for 10,000 qubits by 2027, the first impact will be within NTT’s own communications infrastructure.”
Japan’s photonic bet to power AI
As AI models grow in size and complexity, the demand for simulation, optimization, and high-dimensional problem-solving has also increased exponentially. NTT asserts that photonic quantum systems will become essential accelerators for next-generation AI and telecom networks such as 6G.
In classical systems, electrical signals travel through semiconductor processors. Photonic systems replace those electrons with light, transmitting information through properties such as photon number, polarization, and amplitude.
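One of those properties, polarization, maps naturally onto a qubit. The sketch below is a hedged illustration using standard Jones-vector notation in NumPy, not NTT's actual encoding: horizontal and vertical polarization serve as the two basis states, and a half-wave plate oriented at 22.5 degrees acts as a single-qubit Hadamard gate.

```python
import numpy as np

# Jones vectors: polarization states of a single photon as qubit states.
H = np.array([1, 0], dtype=complex)   # horizontal polarization -> |0>
V = np.array([0, 1], dtype=complex)   # vertical polarization   -> |1>
D = (H + V) / np.sqrt(2)              # diagonal: an equal superposition

# A half-wave plate at 22.5 degrees implements a Hadamard gate,
# mixing the horizontal and vertical components.
hadamard = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

out = hadamard @ H
print(np.allclose(out, D))   # horizontal maps to diagonal polarization
```

Applying the same plate twice returns the photon to its original polarization, since the Hadamard gate is its own inverse.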
However, practical commercial quantum computing requires a scale of 1 million logical qubits, along with reliable quantum error correction, a mechanism that detects and corrects the subtle errors qubits constantly make. Today’s machines—even the most advanced systems by IBM, Google, and others—sit orders of magnitude below that mark and remain extremely sensitive to environmental disturbances.
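Quantum error correction itself is far more involved than its classical counterpart, but the core idea can be shown with the classical three-bit repetition code: redundancy plus majority voting turns a noisy physical layer into a more reliable logical one. This is a simplified classical analogue, not any vendor's actual scheme.

```python
import random

def encode(bit: int) -> list[int]:
    # Repetition code: one logical bit stored as three physical copies.
    return [bit] * 3

def corrupt(codeword: list[int], p: float) -> list[int]:
    # Each physical bit flips independently with probability p,
    # standing in for environmental noise on a qubit.
    return [b ^ (random.random() < p) for b in codeword]

def decode(codeword: list[int]) -> int:
    # Majority vote: any single flipped bit is corrected.
    return int(sum(codeword) >= 2)

random.seed(0)
p, trials = 0.05, 100_000
raw_errors = sum(corrupt([0], p)[0] for _ in range(trials))
logical_errors = sum(decode(corrupt(encode(0), p)) for _ in range(trials))
print(raw_errors / trials, logical_errors / trials)
```

With a 5% physical error rate, the logical error rate drops to roughly 3p², which is why error-corrected machines need so many physical qubits per logical qubit.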
NTT claims that photonics changes the math. “Scaling to 1 million qubits by 2030 and then moving into mass deployment will demand a robust supply chain. Achieving high-performance quantum light sources and improving yield in precision fabrication will be critical steps,” Kinoshita explains. In essence, this means NTT must be able to reliably manufacture the key components, such as high-quality light sources, and improve production yields so the hardware can be built at scale.
“By 2030, with 1 million qubits, the scope expands beyond telecom,” he adds. “NTT plans to explore these opportunities through partnerships with leaders in chemistry, finance, and industrial sectors.”
The global stakes of a photonic strategy
This is not the first attempt at room-temperature quantum hardware, as companies like Sydney-based Quantum Brilliance are also pursuing cryogenics-free architectures. Quantum Brilliance is targeting edge and data-center deployments with compact diamond-based devices, while Atom Computing, headquartered in Berkeley, California, is building large-scale, room-temperature systems that use neutral atoms.
“We truly believe that optically controllable neutral atom qubits bring a level of flexibility and practicality to the challenge of controlling millions of qubits with high-fidelity, low-crosstalk signals at room temperature,” says Ben Bloom, founder and CEO of Atom Computing.
But NTT argues that photons, not electrons or atoms, offer an architecture capable of reaching true commercial scale. Its thesis is simple: Light is inherently more stable, generates less heat, and is ultimately more manufacturable than any matter-based system. “This shift transforms quantum computing from a niche technology into a broadly available resource,” Kinoshita says.
Still, experts caution that the light-based computation path comes with its own unresolved challenges.
“Photonics faces significant challenges that often get glossed over in the room-temperature narrative,” says Yuval Boger, chief commercial officer at Boston-based QuEra Computing. “You need near-perfect sources and detectors at scale, plus efficient photon-photon interactions, which don’t occur naturally and require complex optical elements. The engineering complexity of building a fault-tolerant photonic quantum computer with thousands of high-fidelity qubits is immense.”
If NTT stays on track, the world’s first million-qubit system may come from a room-temperature optical platform in Tokyo, engineered for real-world use cases including molecular simulation for drug discovery and materials science, financial risk modeling, and manufacturing optimization.
“Beyond technology, global coordination for specialized materials and resilience against geopolitical risks remains essential,” Kinoshita says. “When these systems can run in standard IT environments with ultra-low power consumption and rack-scale integration, enterprises will see cost-effective performance, governments will recognize strategic advantage, and the public will experience tangible benefits like greener networks and faster innovation. That moment will mark quantum’s shift from experimental to essential.”