Data Reuploading
Re-encodes classical data multiple times across layers, achieving universal function approximation.
Qubits
4
Depth
12
Total Gates
21
Simulability
Not efficiently simulable at scale (classically tractable at small qubit counts)
Description
Data reuploading is a variational encoding strategy that re-encodes the classical input data at every layer of the quantum circuit, interleaved with entangling gates. This repeated injection of data, analogous in spirit to the universal approximation theorem for classical neural networks, provably enables approximation of any continuous function (shown for a single-qubit classifier by Pérez-Salinas et al. [1]).
Each layer applies RY rotations with feature-dependent angles to encode data, followed by a CNOT ladder for entanglement. Features are cyclically mapped to qubits: feature x_i is encoded on qubit (i mod n_qubits), so all features are represented even when n_qubits < n_features. The circuit's output is a truncated Fourier series, f(x) = Σ_{k=-L}^{L} c_k e^{ikx}, where L is the number of layers, so the accessible frequency spectrum grows linearly with depth [2].
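As an illustration of this layer structure (independent of the atlas API; all function names here are hypothetical), the circuit can be simulated directly as a statevector with NumPy:

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def apply_1q(state, gate, q, n):
    """Apply a 1-qubit gate to qubit q of an n-qubit statevector."""
    psi = state.reshape([2] * n)
    psi = np.tensordot(gate, psi, axes=([1], [q]))
    psi = np.moveaxis(psi, 0, q)
    return psi.reshape(-1)

def apply_cnot(state, ctrl, tgt, n):
    """Apply CNOT(ctrl -> tgt): flip the target amplitudes on the ctrl=1 slice."""
    psi = state.reshape([2] * n).copy()
    idx = [slice(None)] * n
    idx[ctrl] = 1
    axis = tgt if tgt < ctrl else tgt - 1  # target axis after removing ctrl axis
    psi[tuple(idx)] = np.flip(psi[tuple(idx)], axis=axis)
    return psi.reshape(-1)

def data_reuploading_state(x, n_qubits, n_layers):
    """Per layer: RY(x_i) on qubit (i mod n_qubits), then a CNOT ladder."""
    state = np.zeros(2 ** n_qubits)
    state[0] = 1.0
    for _ in range(n_layers):
        for i, xi in enumerate(x):            # cyclic feature mapping
            state = apply_1q(state, ry(xi), i % n_qubits, n_qubits)
        for q in range(n_qubits - 1):         # CNOT ladder
            state = apply_cnot(state, q, q + 1, n_qubits)
    return state

psi = data_reuploading_state(np.array([0.1, 0.5, 1.2, 2.3]), n_qubits=4, n_layers=3)
```

This is a pedagogical sketch only; the atlas's `DataReuploading` class is the supported interface.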
The encoding excels at tasks requiring high expressibility and is particularly well-suited for time series data due to its sequential, recurrent-like structure. However, the deep circuits can suffer from barren plateaus and noise accumulation on current NISQ hardware.
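The linear growth of the Fourier spectrum can be checked numerically in the simplest setting: a single qubit with pure RY(x) encoding layers and no trainable gates in between. Stacked RY rotations compose, RY(x)^L = RY(Lx), so ⟨Z⟩ = cos(Lx) and the dominant frequency equals the layer count (with trainable rotations interleaved, all frequencies up to L become accessible [2]):

```python
import numpy as np

def ry(t):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(t / 2), np.sin(t / 2)
    return np.array([[c, -s], [s, c]])

def expval_z(x, n_layers):
    """<Z> of (RY(x))^n_layers |0>."""
    psi = np.array([1.0, 0.0])
    for _ in range(n_layers):
        psi = ry(x) @ psi
    return psi[0] ** 2 - psi[1] ** 2

# Sample f(x) over one period and inspect its Fourier spectrum.
L = 3
xs = np.linspace(0, 2 * np.pi, 128, endpoint=False)
fx = np.array([expval_z(x, L) for x in xs])
spec = np.abs(np.fft.rfft(fx)) / len(xs)
print(int(np.argmax(spec)))  # dominant frequency equals L -> 3
```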
Circuit Diagram
Property Radar
Properties
Resource Scaling
How resource requirements grow with the number of input features, assuming one qubit per feature and 3 layers (matching the code example below).
| Features | Qubits | Depth | Gates | 2Q Gates |
|---|---|---|---|---|
| 2 | 2 | 6 | 9 | 3 |
| 4 | 4 | 12 | 21 | 9 |
| 8 | 8 | 24 | 45 | 21 |
| 16 | 16 | 48 | 93 | 45 |
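The table's pattern is consistent with L = 3 layers and one qubit per feature: each layer contributes n RY gates (depth 1) plus an (n−1)-CNOT ladder (depth n−1), giving gates = 3(2n−1), two-qubit gates = 3(n−1), and depth = 3n. A quick sketch (function name hypothetical):

```python
def reuploading_resources(n_features, n_layers=3):
    """Resource counts assuming one qubit per feature and a per-layer
    structure of n RY gates (depth 1) + an (n-1)-CNOT ladder (depth n-1)."""
    n = n_features
    gates = n_layers * (2 * n - 1)   # n RY + (n-1) CNOT per layer
    two_q = n_layers * (n - 1)
    depth = n_layers * n
    return {"qubits": n, "depth": depth, "gates": gates, "2q_gates": two_q}

for f in (2, 4, 8, 16):
    print(f, reuploading_resources(f))
```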
Code Examples
Data reuploading with PennyLane using 3 layers and 4 features.

```python
from encoding_atlas import DataReuploading
import pennylane as qml
import numpy as np

enc = DataReuploading(n_features=4, n_layers=3)
dev = qml.device("default.qubit", wires=enc.n_qubits)

@qml.qnode(dev)
def circuit(x):
    enc.get_circuit(x, backend="pennylane")  # apply the encoding gates
    return qml.state()

x = np.array([0.1, 0.5, 1.2, 2.3])
state = circuit(x)
```

When to Use This Encoding
- Universal function approximation in quantum ML
- Time series classification and forecasting
- Tasks requiring high expressibility and rich Fourier spectra
- Variational quantum eigensolvers with data-dependent ansätze
- Quantum neural networks (QNNs) with data-driven layers
Pros & Cons
Advantages
- Universal approximation capability (proven for single-qubit case)
- Fourier expressivity grows linearly with number of layers
- Cyclic feature mapping enables qubit-efficient encoding
- Natural recurrent structure suits sequential data
- Good trainability for moderate layer counts
Limitations
- Deep circuits — depth grows with layers and qubit count
- Barren plateau risk grows with depth (the implementation warns above 10 layers)
- Not NISQ-friendly for large feature counts
- Feature count recommended ≤8 for practical use
- Noise accumulation in deep circuits degrades fidelity
References
- [1] Pérez-Salinas, A., et al. (2020). Data re-uploading for a universal quantum classifier. Quantum, 4, 226.
- [2] Schuld, M., Sweke, R., & Meyer, J.J. (2021). Effect of data encoding on the expressive power of variational quantum machine learning models. Physical Review A, 103(3), 032430.
- [3] Vidal, G. & Dawson, C.M. (2004). Universal quantum circuit for two-qubit transformations with three controlled-NOT gates. Physical Review A, 69(1), 010301.