Research

Pushing the boundaries of quantum-native AI.

Five research programs exploring how quantum computing can fundamentally transform artificial intelligence.

01

Quantum Attention

A quantum-enhanced transformer architecture where the attention mechanism, the core of modern language models, runs natively on quantum hardware.

By computing attention weights through quantum superposition and interference, we can evaluate all token relationships in a single coherent computation rather than pair by pair, enabling richer contextual understanding with computational properties that have no classical analogue.
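As a rough illustration of the idea (not our production method), the sketch below simulates attention scores as squared state overlaps |⟨q|k⟩|², the quantity a swap test would estimate on quantum hardware. All names, dimensions, and the amplitude-encoding step are illustrative assumptions; the simulation runs entirely in NumPy.

```python
import numpy as np

rng = np.random.default_rng(0)

def amplitude_encode(v):
    """Illustrative embedding: a classical vector becomes a unit-norm
    amplitude vector, i.e. a pure quantum state."""
    return v / np.linalg.norm(v)

# Toy sequence: 4 tokens, 8-dim embeddings (a simulated 3-qubit register).
tokens = [amplitude_encode(rng.normal(size=8)) for _ in range(4)]

def quantum_attention_weights(query, keys):
    """Attention scores as squared overlaps |<q|k>|^2.

    On hardware each overlap could be estimated with a swap test;
    here we just compute it from the simulated state vectors.
    """
    scores = np.array([abs(np.dot(query, k)) ** 2 for k in keys])
    return scores / scores.sum()  # normalize to a distribution

weights = quantum_attention_weights(tokens[0], tokens)
# The query's overlap with itself is 1, so it receives the largest weight.
```

The overlap-based score replaces the dot-product-plus-softmax of a classical transformer; how the full weighted mixing is done coherently on hardware is exactly the open research question.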

Active Research
02

Hilbert Space Tracking

Models that measure quantum states as they evolve during training and inference, mapping exactly how tokens move through Hilbert space and how thoughts are formed.

By tracking these quantum trajectories, we gain unprecedented visibility into the reasoning process: not just what a model decides, but the full path of how it gets there.
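A minimal sketch of what "tracking a trajectory" could mean, under the assumption that each model layer acts as a unitary on a simulated state vector: record the state after every layer and summarize the path, here via fidelity with the starting state. The layer unitaries are random placeholders, not trained circuits.

```python
import numpy as np

rng = np.random.default_rng(1)

def random_unitary(n):
    """Placeholder layer: a random unitary built via QR decomposition."""
    q, r = np.linalg.qr(rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))
    d = np.diag(r)
    return q * (d / np.abs(d))  # fix column phases so the result is unitary

dim = 8                                    # simulated 3-qubit register
state = np.zeros(dim, dtype=complex)
state[0] = 1.0                             # initial basis state |000>
layers = [random_unitary(dim) for _ in range(5)]

# Record the full trajectory through Hilbert space, one state per layer.
trajectory = [state]
for U in layers:
    state = U @ state
    trajectory.append(state)

# One simple view of the path: fidelity with the initial state at each step.
fidelities = [abs(np.vdot(trajectory[0], s)) ** 2 for s in trajectory]
```

On real hardware the states cannot be read out directly, so the analogous measurements (swap tests, shadow tomography) are themselves part of the research program.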

Active Research
04

Quantum Grassmann Models

Leveraging quantum computing for geometrical tensor lifting on Grassmann manifolds.

Grassmann geometry provides a natural framework for representing subspaces. By performing these tensor operations on quantum hardware, we access computational shortcuts that classical methods cannot exploit, enabling more expressive model representations.
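To make "Grassmann geometry represents subspaces" concrete, here is a small classical sketch of the basic primitive: the principal angles between two subspaces, computed via SVD, which give the geodesic distance on the Grassmann manifold. This is the classical baseline the quantum lifting would accelerate; the dimensions are arbitrary examples.

```python
import numpy as np

rng = np.random.default_rng(2)

def principal_angles(A, B):
    """Principal angles between span(A) and span(B), the core distance
    notion on a Grassmann manifold."""
    qa, _ = np.linalg.qr(A)   # orthonormal basis for span(A)
    qb, _ = np.linalg.qr(B)   # orthonormal basis for span(B)
    s = np.linalg.svd(qa.T @ qb, compute_uv=False)
    return np.arccos(np.clip(s, -1.0, 1.0))

# Two random 3-dimensional subspaces of R^8.
A = rng.normal(size=(8, 3))
B = rng.normal(size=(8, 3))

angles = principal_angles(A, B)
dist = np.linalg.norm(angles)  # geodesic distance on the Grassmannian
```

A subspace compared with itself yields all-zero angles and distance zero, which is a quick sanity check on any implementation.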

Early-Stage Research
05

Quantum Transformers

Transformer-based models that run fully on quantum hardware. Not just quantum-enhanced components, but end-to-end quantum architectures.

This is the ultimate goal: a complete transformer where embeddings, attention, feedforward layers, and output projections are all implemented as quantum circuits, unlocking the full potential of quantum computation for language understanding.
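A toy end-to-end pass, simulated classically, showing how the four stages could chain together: amplitude encoding as the embedding, overlap-weighted mixing as attention, a parameterized unitary as the feedforward layer, and measurement probabilities as the output projection. Every component here is a stand-in assumption, not our actual architecture.

```python
import numpy as np

rng = np.random.default_rng(3)

def amplitude_encode(v):
    """Embedding stage: classical vector -> unit-norm quantum state."""
    return v / np.linalg.norm(v)

def random_orthogonal(n):
    """Stand-in for a trained parameterized circuit layer."""
    q, _ = np.linalg.qr(rng.normal(size=(n, n)))
    return q

# Embeddings: 4 tokens on a simulated 3-qubit (8-dim) register.
tokens = [amplitude_encode(rng.normal(size=8)) for _ in range(4)]

# Attention: mix token states, weighted by squared overlap with the query.
query = tokens[-1]
w = np.array([abs(np.dot(query, t)) ** 2 for t in tokens])
w /= w.sum()
mixed = amplitude_encode(sum(wi * t for wi, t in zip(w, tokens)))

# Feedforward: one parameterized unitary layer.
out_state = random_orthogonal(8) @ mixed

# Output projection: Born-rule probabilities over the 8 basis states.
probs = out_state ** 2
```

The point of the sketch is the composition: each stage consumes and produces a normalized state, so the whole pipeline stays a valid quantum computation until the final measurement.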

Active Research