
Quantum computing refers to the processing of information using quantum states, where qubits exploit superposition and entanglement to perform operations beyond classical limits. Different physical platforms have been developed to realize quantum computation, including superconducting circuits, trapped ions, neutral atoms, and photonic systems.
In photonic quantum computing, quantum information is encoded in the states of individual photons, such as polarization, time-bin, or path. These states are generated using deterministic or probabilistic single-photon sources and routed through optical circuits before being measured using single-photon detectors, often based on superconducting nanowire technology. The realization of photonic quantum computing systems therefore relies on controlled state preparation, stable optical transformations, and reliable multi-photon detection.
Across different implementations, photonic quantum computing follows a common principle: quantum information is encoded in multiple photons that propagate through an optical network, where their probability amplitudes interfere. The resulting quantum state evolves according to the structure of the optical circuit.
Since photons do not interact directly, computation is inherently probabilistic and is revealed only through measurement. The output of a computation is encoded in the pattern of detected photons across multiple channels, often requiring the identification of multi-photon coincidence events. In addition, photon-number resolution (PNR) can play a crucial role in distinguishing between different multi-photon contributions, enabling more detailed access to the underlying quantum state and improving the verification of computational outcomes. As system size increases, the ability to resolve these correlated detection events with high temporal precision and across many channels becomes a central requirement for experimental realization.
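The coincidence identification described above can be sketched in a few lines of code. The following is a minimal illustration, not tied to any specific hardware API: given sorted time-tag streams from two detector channels (arrival times in picoseconds, values chosen for illustration), it finds all event pairs falling within a coincidence window.

```python
# Sketch: identifying two-channel coincidences in time-tag streams.
# Channel data and window size are illustrative, not from real hardware.
import bisect

def find_coincidences(tags_a, tags_b, window_ps):
    """Return pairs (ta, tb) with |ta - tb| <= window_ps.
    Both inputs must be sorted arrival times in picoseconds."""
    pairs = []
    for ta in tags_a:
        # locate candidate tags in channel B near ta via binary search
        lo = bisect.bisect_left(tags_b, ta - window_ps)
        hi = bisect.bisect_right(tags_b, ta + window_ps)
        pairs.extend((ta, tb) for tb in tags_b[lo:hi])
    return pairs

# Example: three events per channel, 500 ps coincidence window
ch_a = [1000, 5000, 9000]
ch_b = [1200, 6000, 9100]
print(find_coincidences(ch_a, ch_b, 500))  # [(1000, 1200), (9000, 9100)]
```

Binary search keeps the scan efficient even for the high count rates typical of superconducting nanowire detectors, since each event only inspects a narrow slice of the partner stream.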
Photonic quantum computing can be broadly divided into two paradigms: discrete-variable (DV) and continuous-variable (CV) approaches. In DV photonic quantum computing, quantum information is encoded in individual photons, for example using polarization, path, or time-bin states. In many implementations, PNR detection provides additional information about multi-photon events and improves state discrimination. In CV photonic quantum computing, quantum information is encoded in continuous degrees of freedom of the electromagnetic field, such as quadratures. These approaches rely on homodyne detection and squeezed states rather than single-photon detection. The following sections focus on DV approaches, where precise timing, coincidence detection, and photon-number resolution are essential.
Linear optical quantum computing (LOQC) implements quantum computation using linear optical elements such as beam splitters and phase shifters. Since photons do not directly interact, effective quantum gates are realized probabilistically through measurement-induced interactions.
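The two-photon interference underlying these measurement-induced gates can be illustrated with the Hong-Ou-Mandel effect: two indistinguishable photons entering a 50:50 beam splitter never exit in different output modes. The sketch below computes this from the permanent of the beam splitter matrix; the code is illustrative and uses only the standard library.

```python
# Sketch: Hong-Ou-Mandel interference at a 50:50 beam splitter,
# computed from the permanent of the transfer matrix (illustrative).
import itertools
import math

def permanent(m):
    """Permanent of a small square matrix via the permutation-sum definition."""
    n = len(m)
    return sum(
        math.prod(m[i][p[i]] for i in range(n))
        for p in itertools.permutations(range(n))
    )

# 50:50 beam splitter unitary
s = 1 / math.sqrt(2)
U = [[s, s], [s, -s]]

# One photon in each input mode; the probability of one photon in each
# output mode (a coincidence) is |perm(U)|^2.
p_coincidence = abs(permanent(U)) ** 2
print(round(p_coincidence, 12))  # 0.0 -> photons bunch, no coincidences
```

The vanishing coincidence probability is exactly the interference effect that heralded gates and fusion operations rely on.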
KLM Protocol
The Knill–Laflamme–Milburn (KLM) protocol provides a blueprint for universal quantum computing with linear optics. It uses ancillary photons and conditional measurements to implement quantum gates. Successful gate operations are heralded by specific multi-photon coincidence detections, which indicate that the desired transformation has occurred.
This approach places strong requirements on photon indistinguishability, ancilla-state preparation, and the high-efficiency, time-resolved detection of the multi-photon coincidence events that herald successful gates.
Boson sampling with multiplexed photon input and multi-port interference. Photons are distributed into multiple modes, interfere in a linear optical circuit, and are detected across many channels, where coincidence and photon-number-resolved (PNR) events encode the output distribution. Time tagging enables precise correlation and scalable readout of multi-photon events.
Boson Sampling
Boson sampling is a specialized, non-universal model that exploits the interference of multiple indistinguishable photons in a large interferometric network.
Photons are injected into a multi-port interferometer, where they propagate along many possible paths. Due to quantum interference, the probability of detecting photons at specific output modes follows a highly complex distribution. The computational task consists of sampling this multi-photon output distribution, which becomes intractable for classical systems as photon number and system size increase.
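The classical hardness stems from the fact that each output probability is given by the permanent of a submatrix of the interferometer unitary, a quantity with no known efficient classical algorithm. As a minimal sketch (the 3-mode Fourier interferometer below is an illustrative choice, not from any specific experiment), the probability that three photons entering one per mode also exit one per mode is:

```python
# Sketch: boson-sampling output probability from a matrix permanent.
# The 3-mode discrete Fourier transform serves as an illustrative unitary;
# real experiments use large programmable interferometers.
import cmath
import itertools

def permanent(m):
    """Permanent via the permutation-sum definition (fine for small n)."""
    n = len(m)
    total = 0
    for p in itertools.permutations(range(n)):
        term = 1
        for i in range(n):
            term *= m[i][p[i]]
        total += term
    return total

# 3-mode Fourier interferometer: U[j][k] = exp(2*pi*i*j*k/3) / sqrt(3)
n = 3
U = [[cmath.exp(2j * cmath.pi * j * k / n) / n ** 0.5 for k in range(n)]
     for j in range(n)]

# One photon enters each mode; the probability that one photon leaves
# each output mode is |perm(U)|^2 (all occupation factorials are 1 here).
p = abs(permanent(U)) ** 2
print(round(p, 6))  # 0.333333
```

For n photons the permutation sum grows factorially, which is precisely why sampling the full output distribution becomes intractable for classical computers as the system scales.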
Experimentally, this requires sources of highly indistinguishable single photons, a stable low-loss interferometric network, and multichannel single-photon detection with precise time tagging to record the output patterns.
Measurement-based quantum computing (MBQC) shifts the computational paradigm from circuit-based operations to measurement-driven processing. Instead of applying gates sequentially, computation is performed by preparing a highly entangled resource state, known as a cluster state, and applying sequences of single-qubit measurements.
A key advantage of this approach is its natural compatibility with fault-tolerant quantum computing. By encoding quantum information in large-scale entangled states and using measurement-based protocols, errors can be detected and corrected during the computation. This makes MBQC, and in particular fusion-based approaches, a promising pathway toward scalable and fault-tolerant photonic quantum computing architectures.
Fusion-based quantum computing (FBQC) using resource state generators (RSG) and heralded fusion operations (F). Entangled resource states are synchronized and combined into large cluster states via coincidence-based fusion, while low-latency feedforward enables adaptive computation. A time tagging unit provides the timing precision and coincidence analysis required for scalable multi-photon processing.
Fusion-Based Quantum Computing (FBQC)
In FBQC, small entangled resource states are generated independently and combined into large-scale cluster states using fusion operations. These operations are inherently probabilistic and are heralded by multi-photon coincidence detection, ensuring that only successful entanglement events contribute to the growing computational resource.
A key feature of this approach is its compatibility with fault-tolerant quantum computing, as errors can be identified and mitigated during the construction and measurement of the cluster state. This makes FBQC a promising route toward scalable photonic quantum computing architectures.
Unlike circuit-based approaches, computation in FBQC is driven by sequential measurements on the cluster state, where each measurement outcome determines subsequent operations. This requires fast feedforward, meaning that detection events must be processed in real time to adapt the computation dynamically.
As a result, FBQC places stringent demands on detection latency, channel synchronization, and the real-time processing of coincidence data for feedforward.
In this context, time tagging plays a central role, enabling the identification of coincidence events, real-time correlation of detection signals, and scalable processing of multi-photon data streams required for adaptive quantum computation.
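The heralding and feedforward logic described above can be sketched as follows. All detector labels, the window size, and the basis-update rule are illustrative assumptions, not any published FBQC protocol: a fusion is heralded as successful when both heralding detectors click within the coincidence window, and the parity of earlier outcomes selects the next measurement setting.

```python
# Sketch of heralded fusion with feedforward. The 200 ps window and the
# parity-based basis update are illustrative assumptions only.

WINDOW_PS = 200  # coincidence window in picoseconds (assumed value)

def fusion_succeeded(t_det1, t_det2, window_ps=WINDOW_PS):
    """Herald success if both heralding clicks arrive within the window."""
    return (t_det1 is not None and t_det2 is not None
            and abs(t_det1 - t_det2) <= window_ps)

def next_basis(outcome_bits):
    """Feedforward rule: adapt the next single-qubit measurement basis
    to the parity of earlier outcomes (illustrative Pauli-frame update)."""
    return "X" if sum(outcome_bits) % 2 == 0 else "Z"

# Attempt 1: clicks 150 ps apart -> fusion heralded as successful
print(fusion_succeeded(10_000, 10_150))  # True
# Attempt 2: only one detector fired -> fusion failed, state discarded
print(fusion_succeeded(10_000, None))    # False
# Outcomes (1, 0, 1) have even parity -> measure in X next
print(next_basis([1, 0, 1]))             # X
```

In a real system this decision loop must complete within the photon storage time, which is why low-latency time tagging and on-the-fly coincidence analysis are emphasized throughout this section.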
Photonic quantum computing offers several advantages compared to other quantum computing platforms. Photons exhibit low decoherence and can propagate over long distances with minimal interaction with the environment, making them well suited for distributed and networked quantum systems. In addition, photonic systems can operate at or near room temperature, reducing the need for complex cryogenic infrastructure required by other platforms.
Another key advantage is the natural compatibility with optical communication technologies, enabling the integration of quantum computing with quantum networks. Photonic systems can also be scaled by increasing the number of optical modes and detection channels, allowing parallelization of quantum operations.
At the same time, the experimental complexity shifts toward state preparation, synchronization, and especially detection, where resolving multi-photon events with high fidelity becomes the central challenge.
HydraHarp 500 and MultiHarp 160, suited for photonic quantum computing.
The realization of photonic quantum computing systems places stringent requirements on detection and data acquisition. Since computation is encoded in multi-photon detection patterns, the ability to record and process large numbers of time-correlated photon events is critical.
Key requirements include high temporal resolution, short dead time, a large number of synchronized channels, and sustained high data throughput.
In practice, photonic quantum computing experiments rely on time-tagging techniques that record individual photon events with high temporal resolution. Efficient data handling, real-time processing, and scalable analysis architectures are therefore essential to enable larger and more complex quantum systems.
These time-tagging & TCSPC electronics address different experimental requirements in photonic quantum computing.

HydraHarp 500 is best suited for large-scale quantum computing experiments using superconducting nanowire detectors. It delivers picosecond timing precision, ultra-short dead time, and scalable multichannel architectures, enabling high-fidelity quantum state readout and multi-photon coincidence measurements.

MultiHarp 160 enables scalable quantum computing and multipixel detector experiments with up to 64 synchronized channels, ultra-short dead time, and high-throughput time tagging. It supports parallel quantum state readout and multi-photon coincidence detection in complex photonic quantum computing architectures.
Poster on high-precision time tagging for scalable photonic quantum experiments using SNSPD arrays and multichannel TCSPC systems.