Jeremy O’Brien & the Rise of Scalable Photonic Quantum Computing
- Gokul Rangarajan
- Sep 21
- 13 min read
How Integrated Photonics, PsiQuantum, and Milestone Advances Are Shifting Quantum Computing from Labs to Factories
Quantum computing promises to leap beyond classical limits in cryptography, chemistry, optimization, simulation, and more. One of the biggest challenges standing between promise and reality is error correction and scaling up: how do you build systems with thousands to millions of fault‑tolerant qubits? Jeremy L. O’Brien, a pioneer in quantum photonics, is among those leading a path toward scalable, manufacturable quantum systems. Through his research and as co‑founder & CEO of PsiQuantum, O’Brien has helped shift the focus from small proof‑of‑principle experiments to industrial, manufacturable technologies. This blog reviews his validated contributions, recent breakthroughs (especially the Omega chipset), popular questions in quantum computing, and what else is happening in the field.

At Pitchworks, our Quantum100 initiative is more than a directory: it’s a living map of the people and ideas accelerating the quantum revolution. Through deep-dive features, we highlight leaders whose vision is shaping the enterprise adoption of quantum technologies. We’ve already explored the pioneering role of Dr. Jay Gambetta in superconducting quantum computing, the transformative leadership of Peter Chapman at IonQ, and, in recent blogs, Hartmut Neven, the architect behind Google’s Quantum AI Lab, and Robert Sutor. Building on that journey, this blog turns the spotlight toward another figure whose impact has redefined the pace and perception of quantum progress: Jeremy O’Brien.
From Early Research to Integrated Quantum Photonics
Jeremy O’Brien’s early work laid much of the groundwork for the field of integrated photonic quantum circuits. Among his validated contributions:
- In 2003, O’Brien and his collaborators experimentally demonstrated one of the first controlled‑NOT (CNOT) gates for single photons using bulk optical components (mirrors, beam splitters).
- Later, needing a more scalable architecture, his group turned to integrated photonics. In 2008, Politi, Cryan, Rarity, Yu, and O’Brien published Silica‑on‑Silicon Waveguide Quantum Circuits, showing high‑fidelity two‑photon interference (visibility ~94.8 ± 0.5 %) and a CNOT gate with logical basis fidelity ~94.3 ± 0.2 %.
- They also demonstrated multi‑photon state generation and small implementations of algorithms or algorithmic fragments, such as a compiled version of Shor’s algorithm in on‑chip circuits. The integrated circuits included phase shifters and programmable interferometers.
These experiments established proof of principle: photonic qubits can be manipulated on chip, interference and entanglement are possible with reasonably high fidelity, and integrated optics begins to solve stability, size, reproducibility issues compared with bulk optics. But those early systems were small scale—few photons, few gates, still demanding custom optics, and not yet at error‑correction thresholds.
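The two‑photon interference visibility quoted above is typically extracted from a Hong‑Ou‑Mandel‑type measurement: coincidence counts drop when the photons are indistinguishable, and visibility compares the dip to the baseline. A minimal sketch of that calculation (with made‑up counts chosen for illustration, not data from the paper):

```python
# Hong-Ou-Mandel visibility: V = (C_baseline - C_dip) / C_baseline,
# where C_baseline is the coincidence rate for distinguishable photons
# and C_dip is the rate at maximal indistinguishability.
def hom_visibility(c_baseline: float, c_dip: float) -> float:
    return (c_baseline - c_dip) / c_baseline

# Illustrative counts chosen to reproduce a ~94.8% visibility
print(f"visibility ≈ {hom_visibility(1000.0, 52.0):.1%}")
```

Higher visibility means the photons are closer to perfectly indistinguishable, which is a prerequisite for the entangling operations described later.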
Jeremy O’Brien is an Australian physicist (born 1975) whose fields of expertise include quantum optics, optical quantum metrology, and quantum information science. Before cofounding PsiQuantum, he held academic positions, including Professorial Research Fellow at the University of Bristol, where he directed the Centre for Quantum Photonics. One of his early landmark achievements was demonstrating the first optical quantum controlled‑NOT (CNOT) gate for single photons. In 2003, while at the University of Queensland, his group built a CNOT logic gate using standard, bulk optical components—mirrors, beam splitters—to operate on two photonic qubits. Although the setup occupied a full laboratory bench and did not use waveguides or integrated optics, it established one of the first two‑qubit logic gates in photonic quantum computing. This demonstration addressed a foundational requirement: showing that two qubits of light could be made to interact (or be made conditional) under controlled experimental conditions.
As photonic quantum information progressed, O’Brien and collaborators shifted toward integrated photonics—waveguide circuits etched or patterned on chips. These allow much greater stability, reproducibility, compactness, and potential for mass fabrication. In 2008, for example, O’Brien’s group published Silica‑on‑Silicon Waveguide Quantum Circuits, which demonstrated high‑fidelity two‑photon interference and a CNOT gate in an integrated circuit, rather than in bulk optics. These experiments included several photons on chip, phase shifters, interferometers, control and manipulation, heralded generation of small entangled states, and more. They showed that integrated photonic quantum circuits could perform key quantum operations with increasingly high reliability.
These early works laid the theoretical, technical, and experimental foundations for what was to come. They proved that quantum interference of photons, entanglement, small quantum logic circuits, and even compiled versions of small algorithms were possible using photonic waveguides and integrated optics, moving away from tables of mirrors and beamsplitters. But the question remained: could this be done at industrial scale, with fabrication yield, integration of sources, detectors, switches, interconnects, cooling, etc., to reach error rates low enough for fault tolerance?
That question is now being answered in part by PsiQuantum’s recent announcement of Omega, a photonic quantum computing chipset purpose‑built for utility‑scale systems, publicized in early 2025. In a Nature paper (and accompanying materials), PsiQuantum revealed that Omega integrates many photonic components into a semiconductor fab process with record performance metrics. The technologies include high‑performance single‑photon sources; superconducting single‑photon detectors; advanced optical switches; new materials such as Barium Titanate (BTO) for optical switching; and fabrication in partnership with GlobalFoundries. Omega is designed not merely for experiments, but for manufacture: full‑size, industry‑standard semiconductor wafers are used.
To assess whether Omega meets the demands of fault tolerance, PsiQuantum developed benchmarking circuits. Through these, they have measured single‑qubit state preparation and measurement fidelity (SPAM) at 99.98 % ± 0.01 %, chip‑to‑chip qubit interconnect fidelity at 99.72 % ± 0.04 %, quantum interference visibility (which measures how indistinguishable photons from different sources are) at 99.50 % ± 0.25 %, and two‑qubit fusion gate (i.e. entangling operation) fidelity at 99.22 % ± 0.12 %. These are among the highest reported in photonic quantum computing for those classes of operations.
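To make these benchmarks concrete, a quick back‑of‑the‑envelope conversion from fidelity to error rate helps. This is a naive `1 - F` reading of the published figures, not PsiQuantum’s own error model:

```python
# Convert the published Omega fidelities into approximate error rates,
# using the simple reading: error ≈ 1 - fidelity.
benchmarks = {
    "SPAM": 0.9998,
    "chip-to-chip interconnect": 0.9972,
    "two-photon interference visibility": 0.9950,
    "two-qubit fusion gate": 0.9922,
}

for name, fidelity in benchmarks.items():
    error = 1.0 - fidelity
    print(f"{name}: fidelity {fidelity:.2%}, error ≈ {error:.2e}")
```

Framed as error rates, SPAM sits around 2 × 10⁻⁴ while the fusion gate sits around 8 × 10⁻³, which is the framing used when comparing against fault‑tolerance thresholds below.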
An especially important aspect of Omega is the so‑called “chip‑to‑chip qubit interconnect.” Scaling up a quantum computer not only means adding more qubits; it also means that qubits on different chips must be connected (optically, in the case of photonics) with high fidelity. Losses, timing jitter, mismatches among components, background noise, and photon distinguishability all degrade performance. That Omega demonstrates high fidelity over chip‑to‑chip links is a key enabler of modular, large‑scale quantum architectures, rather than monolithic quantum chips of limited size.
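One way to see why interconnect fidelity matters so much is to note that, under a simple model where each link’s error is independent, fidelity compounds multiplicatively across a chain of links. This toy calculation is our illustration, not a PsiQuantum figure:

```python
# Under an independent-error assumption, end-to-end fidelity across a
# chain of optical links is roughly the product of per-link fidelities.
def chain_fidelity(per_link: float, n_links: int) -> float:
    return per_link ** n_links

for n in (1, 10, 100):
    print(f"{n:>3} links at 99.72%: end-to-end fidelity ≈ "
          f"{chain_fidelity(0.9972, n):.4f}")
```

Even at 99.72% per link, a hundred chained links would drop end‑to‑end fidelity to roughly 76% under this simple model, which is why modular architectures budget interconnect error so carefully.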
Another critical aspect is manufacturability: Omega is built in GlobalFoundries’ fabs, using industry standard processes, and produced on full‑size silicon photonics wafers. PsiQuantum claims yields that “match those of standard semiconductors,” and that they have characterized millions of devices across thousands of wafers. They perform about half a million measurements per month to validate performance. This scale of testing and production is unprecedented in quantum photonics, moving beyond experiments with dozens of photons to mass measurement and quality control.
Moreover, the company has introduced innovations not just in the quantum components, but also in infrastructure: cooling, for example. Omega uses a high‑power cryogenic module with a “cuboid” design rather than the traditional “chandelier” style dilution refrigerator. The new cooling architecture is more compatible with datacenter form‑factors, easier to scale, and in many ways more manufacturable. For large numbers of qubits, cooling and operational support become major bottlenecks; solving that is as critical as improving gate fidelities.
PsiQuantum has also committed to building large quantum compute centers in Brisbane, Australia, and Chicago, USA. These are intended to be datacenter‑sized facilities housing utility‑scale, million‑qubit quantum photonic hardware. They sit alongside the industrial manufacturing infrastructure, indicating that the path to commercialization is being taken seriously.
It is worth pausing for a moment to consider how Jeremy O’Brien’s earlier research connects to Omega and why his contributions are central rather than peripheral. The first photonic CNOT gate in lab settings, though bulky and fragile, showed that conditional interactions among photons were physically possible. The integrated photonic experiments later led by his groups showed that chip‑scale circuits, waveguides, interferometers, and phase shifters could reproduce and manipulate quantum interference, entanglement, and small gate logic operations with reasonably high fidelity. O’Brien also contributed to popularizing and demonstrating hybrid algorithms like the Variational Quantum Eigensolver (VQE) on photonic platforms, which help connect theoretical quantum algorithms to hardware performance. Verified sources list his known research fields as quantum optics, quantum metrology, and quantum information science, and recognize his role in both academic and now commercial efforts.
In popular discussions, many questions about quantum computing recur. One is: Will photons remain competitive versus other platforms? Superconducting qubits, trapped ions, neutral atoms, spins in semiconductors—all have strengths and weaknesses. Photonics has long been hampered by challenges: efficient, identical single photon sources; low‑loss optical components; integration of detectors; and the probabilistic nature of some entangling operations. What O’Brien’s work shows is that many of these challenges are being addressed: sources and detectors are being integrated, component performance is approaching fidelity thresholds, inter‑chip links are made with low loss using industrial processes, and cooling infrastructure is being rethought.
Another frequent question is: What are the error thresholds needed for fault-tolerant quantum computing, and how close are current systems? Error correction theory gives different thresholds depending on architecture and error model, but two‑qubit operations with error rates in the neighborhood of 10⁻³ to 10⁻⁴ (or better) are often needed for many fault‑tolerant codes. The Omega results—with two‑qubit fusion gate fidelity ~99.22 % (error ~0.78 %)—are not yet fully below the most stringent thresholds, but are within the range that many architects believe is entering the “practical domain” when combined with large redundancy, error correction, and scalable fabrication. Single‑qubit operations and interconnects are already approaching errors of ~10⁻⁴–10⁻³.
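A rough heuristic often used for surface‑code‑like schemes estimates the logical error rate as p_L ≈ A · (p/p_th)^((d+1)/2) for code distance d. The sketch below is our illustration, with the prefactor A = 0.1 and threshold p_th = 10⁻² as stand‑in values, not parameters of PsiQuantum’s actual code:

```python
# Heuristic logical error scaling for a surface-code-like scheme:
#   p_logical ≈ A * (p / p_th) ** ((d + 1) / 2)
# A and p_th are placeholder values used for illustration only.
def logical_error(p: float, d: int, p_th: float = 1e-2, A: float = 0.1) -> float:
    return A * (p / p_th) ** ((d + 1) / 2)

p = 0.0078  # ~error of the Omega two-qubit fusion gate (1 - 99.22%)
for d in (3, 7, 15, 25):
    print(f"distance {d:>2}: logical error ≈ {logical_error(p, d):.3e}")
```

Because 0.78% sits close to the assumed 1% threshold, suppression with distance is slow in this toy model, which illustrates the point above: operating just under threshold is possible but costly, and pushing physical error rates further down pays off exponentially.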
People also ask: When will quantum computers outperform classical ones—when will “quantum advantage” or “quantum supremacy” be achieved for useful applications? As of the Omega paper and related reporting, nobody has publicly surpassed classical systems in broadly useful tasks using photonic quantum computing in a real‑world application. But with gate‑ and interconnect‑ fidelities this high, with scalable manufacturing in place, the timeline is getting shorter. PsiQuantum aims to build systems in the coming years (they plan quantum compute centers) that could tackle tasks beyond simulation or specific theoretical exercises.
Furthermore, many ask: What remains to be solved? Several hard problems remain: reliably generating many identical photons with low loss (photon loss is a particularly crippling issue in photonics); ensuring that probabilistic, fusion‑style entangling operations can be performed efficiently without excessive overhead; integrating all components with extremely low background noise; maintaining coherence and indistinguishability at scale; achieving high manufacturing yield; building cryogenic or cooling systems that are cost‑effective, reliable, and maintainable in datacenters; implementing error correction in full logical‑qubit layers; and solving control, calibration, packaging, system stability, and software‑stack integration.
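The overhead question for probabilistic, fusion‑style entangling operations can be made concrete: if each attempt succeeds with probability p, the expected number of attempts is 1/p, and k multiplexed attempts succeed at least once with probability 1 − (1 − p)^k. A small sketch with illustrative numbers, not PsiQuantum’s actual fusion success rates:

```python
import math

# Expected attempts and multiplexing overhead for a probabilistic
# entangling operation with per-attempt success probability p.
def expected_attempts(p: float) -> float:
    return 1.0 / p

def attempts_for_target(p: float, target: float) -> int:
    # Smallest k such that 1 - (1 - p)**k >= target
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - p))

p = 0.5  # illustrative success probability for a linear-optics fusion
print(f"expected attempts: {expected_attempts(p):.1f}")
print(f"attempts for 99.9% success: {attempts_for_target(p, 0.999)}")
```

This is why multiplexing (many attempts in parallel, with switching to route out the successes) is central to photonic architectures: modest per‑attempt success probabilities can be boosted to near‑deterministic operation at the cost of extra hardware.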
Regarding PsiQuantum: its strategy, under O’Brien’s leadership, has been to combine academic insight, engineering rigor, and industrial fabrication. The company sources advanced photonic and superconducting materials; uses large semiconductor fabs (GlobalFoundries operating on standard geometries, full‑size wafers); introduces new optical switch materials (such as Barium Titanate) for low loss optical switching; integrates superconducting photon detectors; tests devices in large numbers (millions of devices across thousands of wafers), with high measurement throughput. All of these are features typical in mature semiconductor industries but new to quantum photonics at this scale. This shows that the mode of operation is transitioning from bespoke, lab‑based prototyping toward reproducible, mass manufacturable quantum hardware.
As for Jeremy O’Brien’s academic contribution beyond hardware: he has bridged hardware and algorithms, for example in variational algorithms, simulation, quantum metrology, and optical quantum interferometry. These are crucial because hardware alone is insufficient; without error correction, algorithms designed to tolerate noise, a software stack, and error models, even high‑fidelity gates may not yield useful quantum advantage. O’Brien’s career has combined both sides: building physical systems and pushing for algorithmic or circuit designs that demonstrate what those systems can do.
All of this leads to one major conclusion: the recent Omega chipset announcement is less a single “breakthrough” than a convergence of many breakthroughs—photonic sources, detectors, optical switching, manufacturing, cooling, interconnects—all maturing to the point where building large, fault‑tolerant quantum photonic machines is plausible rather than wildly speculative. Jeremy O’Brien is one of the architects of this transition from lab to factory.
It is important to remain realistic: “million‑qubit‐scale” does not automatically mean “useful quantum computer tomorrow.” Fault tolerance demands logical qubits, overhead, error correction cycles, resource management. But with component fidelities in the high 99s, with manufacturing yield, with inter‑chip connectivity, and with cooling and infrastructure starting to be scalable, the field has crossed a boundary. The vision of a practical quantum photonic computer is now being built, not just imagined.
Jeremy O’Brien: Research Foundations and Early Contributions
- O’Brien (born 1975, Australia) is an expert in quantum optics, photonic quantum information, and quantum metrology.
- He was part of the team that built the first optical controlled‑NOT (CNOT) gate for single photons using bulk optical components (mirrors, beam splitters) in the early 2000s.
- He also led work in integrated photonics: implementing silica‑on‑silicon (“waveguide”) quantum circuits that packaged interferometers, beamsplitters, phase shifters, etc., in small‑scale circuits rather than on large optical tables. These enabled demonstrations such as two‑photon interference, high‑fidelity CNOT gates, small compiled versions of algorithms (e.g. factoring), and entangled multi‑photon states.
- His group also helped popularize the Variational Quantum Eigensolver (VQE) algorithm in applied settings, bridging theory and experiment.
PsiQuantum, Omega Chipset, and Recent Milestones
Under O’Brien’s leadership at PsiQuantum, the company has taken photonic quantum computing from research into manufacturable hardware. Key validated facts include:
- Omega chipset: Announced in February 2025; a “manufacturable chipset” for photonic quantum computing designed for million‑qubit scale. It integrates high‑performance single‑photon sources, superconducting single‑photon detectors, and advanced optical switches (including novel materials like Barium Titanate), and is fabricated in silicon photonic fabs rather than in purely custom lab processes.
- Performance benchmarks:

| Metric | Fidelity / Performance |
| --- | --- |
| Single‑Qubit State Preparation & Measurement (SPAM) | ~99.98% ± 0.01% |
| Two‑Photon Quantum Interference Visibility | ~99.50% ± 0.25% |
| Chip‑to‑Chip (optical interconnect) Fidelity | ~99.72% ± 0.04% |
| Two‑Qubit Fusion Gate Fidelity | ~99.22% ± 0.12% |

- Fabrication is done in partnership with GlobalFoundries, using full‑size, high‑volume semiconductor wafers (silicon photonics) rather than custom lab‑only devices.
- PsiQuantum is also working toward building large‑scale quantum data centers, with announced plans for facilities in Brisbane, Australia, and Chicago, Illinois, USA.
These steps show movement toward fault-tolerant quantum computing, where error rates must be kept below threshold and where qubits, gates, and interconnects are reliable, manufacturable, and scalable.
Below are some commonly‑asked questions in quantum computing, with how O’Brien’s work and PsiQuantum help to address them.
| Question | Why it matters | How O’Brien / PsiQuantum contribute |
| --- | --- | --- |
| What physical platform will scale best? Superconducting, trapped ions, photonics, neutral atoms, etc. | Scalability requires high fidelity, manufacturability, and error correction. | O’Brien’s focus: photonic qubits, integrated photonics, and silicon fabrication in high‑volume fabs. This approach offers potential advantages in manufacturing and connectivity (especially optical interconnects), and photons do not decohere from environmental noise as easily. |
| How to achieve two‑qubit entangling operations with high fidelity? | Two‑qubit gates are necessary for universal quantum computing; errors in these operations largely drive demands for error‑correction overhead. | PsiQuantum’s Omega shows high‑fidelity fusions, high two‑photon interference visibility, and more. Their fusion‑based approach uses probabilistic entangling operations but with extremely high component performance. |
| What about error correction thresholds? | To build large‑scale fault‑tolerant systems, gate errors must be below certain thresholds (commonly quoted ~10⁻³ to 10⁻⁴, depending on scheme). | The fidelity numbers above show that single‑qubit and interconnect operations are very close to those thresholds; two‑qubit fusion gates are still somewhat above the most stringent thresholds in some schemes but rapidly improving. |
| Can quantum computers outperform classical computers, i.e. achieve “quantum advantage”? | It’s one thing to have qubits; another to do things classical supercomputers can’t. | Photonics offers a path with highly parallel optical interconnects, high bandwidth, and long photon coherence. There is no definitive large‑scale quantum advantage demonstration via photonics yet (as of validated sources), but progress such as Omega brings the community closer. |
| What are the engineering challenges? Cooling, fabrication defects, loss, interconnects, error rates, scaling. | Many platforms suffer from demanding cooling requirements (mK), loss of qubits, difficulty connecting qubits over distances, etc. | PsiQuantum’s Omega design includes new cooling architectures (replacing the “chandelier” dilution fridge with more scalable cooling modules) and focuses on inter‑chip optics. Using mature fabrication techniques also helps manage defect rates and yield. |
Jeremy O’Brien’s career, from leading research in integrated photonic quantum circuits to steering PsiQuantum toward industrial‑scale, manufacturing‑based quantum hardware, represents one of the clearest paths today toward scalable, fault‑tolerant quantum computing. His contributions have addressed many of the biggest challenges: high‑fidelity operations, manufacturability, loss and error, inter‑chip interconnects, photon sources and detectors, and cooling.
The recent Omega chipset marks a milestone: benchmarks like ~99.98% single‑qubit fidelity, ~99.72% chip‑to‑chip interconnect fidelity, and ~99.22% fidelity on two‑qubit fusion begin to make the thresholds for practical error correction realistic. While there remains work ahead—especially in reliably producing large numbers of resource states, full error correcting codes, integrated fault tolerance, and demonstrating applications beyond what classical systems can simulate—O’Brien and PsiQuantum are among the strongest players working in that direction.
Quantum computing remains a field of both wonder and complexity: many unknowns persist. But thanks to effort and vision like Jeremy O’Brien’s, we are increasingly moving from demonstrations toward devices and systems that might one day transform computation.
At Pitchworks VC Studio Group, we invest in frontier technologies that can shift civilization forward. Our mission is to empower startups, technical founders, and transformative projects that expand human potential across healthcare, productivity, AI, energy, materials, and planetary-scale infrastructure. We believe that quantum computing is not just another emerging trend—it is one of the most important technological inflection points of the 21st century.
To accelerate this transformation, we’ve established the Global Competency Center for Quantum (GCC-Q): a dedicated, cross-functional initiative designed to support the research, commercialization, and deployment of quantum computing in real-world, high-impact domains.
GCC-Q is designed to:
- Support quantum-focused startups with access to capital, research infrastructure, and technical mentorship
- Convene researchers, engineers, and domain experts working on quantum algorithms, compilers, and hardware integration
- Pilot real-world applications of quantum computing in healthcare, biotech, logistics, materials science, cybersecurity, and AI
- Bridge the gap between academic research and commercial deployment, using our venture studio platform to help deep tech teams build fast, hire talent, and find product-market fit
- Build partnerships with global infrastructure players, including cloud platforms, HPC centers, semiconductor fabs, and sovereign research labs
This is not a theoretical project. GCC-Q is already engaged in pilots with startups and enterprise R&D teams working on hybrid quantum-classical algorithms for:
- Quantum chemistry and molecular modeling using VQE (Variational Quantum Eigensolver)
- AI/ML acceleration using quantum kernel methods
- Combinatorial optimization in logistics, transport, and smart grid management
- Quantum-safe encryption and post-quantum cryptography for data security resilience
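As context for the VQE pilots mentioned above, the algorithm’s core loop is simple: a parameterized circuit prepares a trial state, the device estimates the energy, and a classical optimizer updates the parameters. A minimal classical simulation of that loop, using a toy single‑qubit Hamiltonian of our own choosing (not one of the pilot systems):

```python
import numpy as np

# Toy single-qubit Hamiltonian H = 0.5 Z + 0.3 X (illustrative only).
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
H = 0.5 * Z + 0.3 * X

def ansatz(theta: float) -> np.ndarray:
    # |psi(theta)> = Ry(theta)|0> = (cos(theta/2), sin(theta/2))
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def energy(theta: float) -> float:
    # Expectation value <psi| H |psi> for the trial state
    psi = ansatz(theta)
    return float(psi @ H @ psi)

# Classical "optimizer": a dense grid search over the single parameter.
thetas = np.linspace(0.0, 2.0 * np.pi, 4001)
vqe_energy = min(energy(t) for t in thetas)

exact = float(np.linalg.eigvalsh(H)[0])
print(f"VQE energy:   {vqe_energy:.6f}")
print(f"Exact ground: {exact:.6f}")
```

On real hardware the energy is estimated from repeated measurements rather than computed exactly, and a gradient‑based or gradient‑free optimizer replaces the grid search, but the structure of the loop is the same.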
We are also in active dialogue with global institutions to ensure that GCC-Q aligns with broader public-good missions: climate modeling, public health preparedness, and sustainable supply chain systems.
Quantum computing is a general-purpose technology—meaning its eventual impact will not be limited to one domain. But we are particularly focused on areas that align with our mission to uplift humanity through health, productivity, and intelligence. Through GCC-Q and our broader deep tech portfolio, we’re aiming to:
- Accelerate therapeutic discovery for rare and complex diseases
- Unlock new materials for energy storage, carbon capture, and sustainable manufacturing
- Redesign logistics systems with higher efficiency and lower emissions
- Enable more powerful AI models with less training data and better interpretability
- Build resilient cybersecurity systems that protect data and communication in a post-quantum world
Each of these goals represents not just commercial opportunity—but real progress toward a more advanced, equitable, and sustainable society.
At Pitchworks VC Studio Group, we don’t just follow trends—we invest in paradigm shifts. Quantum computing represents one of the greatest paradigm shifts of our time. And with the launch of our Global Competency Center for Quantum, we are committing time, capital, and talent to ensure that this technology fulfills its potential—not just as a scientific triumph, but as a foundation for real-world, human-scale solutions.
We invite researchers, engineers, founders, institutions, and strategic partners who share this vision to connect with us. Whether you’re building a new quantum protocol, designing fault-tolerant hardware, or applying quantum tools to solve global problems, GCC-Q is here to help you scale your impact.
Let’s build the future. One qubit at a time.


