Decomposition Examples¶
Tensor decomposition factorizes a tensor into simpler components while preserving symmetry in each factor. This is crucial for compression, for canonical forms, and for many-body algorithms such as DMRG and iPEPS.
Decomposition modes:
- SVD: Full singular value decomposition \(T = U \cdot S \cdot V^\dagger\)
- UR: Left-orthogonal form \(T = U \cdot R\), where \(U^\dagger U = I\)
- LV: Right-orthogonal form \(T = L \cdot V^\dagger\), where \(V^\dagger V = I\)
- QR: QR decomposition \(T = Q \cdot R\), where \(Q^\dagger Q = I\) and \(R\) is upper triangular
Each mode performs the decomposition block-by-block: sectors with different charges are decomposed independently. This block structure ensures that the decomposed factors maintain proper charge conservation and can be efficiently contracted in subsequent operations.
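This block structure mirrors a simple dense fact, sketched here in plain NumPy (illustrative only; these names are not part of the library): the singular values of a block-diagonal matrix are the union of the singular values of its blocks, so each charge block can be decomposed independently.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))   # stand-in for a charge-0 block
B = rng.standard_normal((2, 2))   # stand-in for a charge-1 block

# Full matrix with the two blocks on the diagonal
T = np.block([[A, np.zeros((3, 2))],
              [np.zeros((2, 3)), B]])

# Decompose each block independently
sA = np.linalg.svd(A, compute_uv=False)
sB = np.linalg.svd(B, compute_uv=False)

# The spectrum of the full matrix is the union of the block spectra
s_full = np.linalg.svd(T, compute_uv=False)
assert np.allclose(np.sort(s_full), np.sort(np.concatenate([sA, sB])))
```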
Truncation compresses tensors by keeping only the most significant singular values, controlled either by a maximum count (nkeep) or by a threshold (thresh).
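A dense NumPy analogue of count-based truncation (an illustrative sketch, not the library API): keep the k largest singular values and drop the rest; by the Eckart-Young theorem the Frobenius error equals the norm of the discarded values.

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((6, 6))
U, s, Vh = np.linalg.svd(M)

# Keep only the k largest singular values (dense stand-in for nkeep)
k = 3
M_trunc = U[:, :k] @ np.diag(s[:k]) @ Vh[:k, :]

# The discarded weight gives the truncation error exactly
err = np.linalg.norm(M - M_trunc)
assert np.isclose(err, np.linalg.norm(s[k:]))
```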
Basic SVD¶
group = U1Group()
idx = Index(Direction.OUT, group, sectors=(Sector(0, 3), Sector(1, 2)))
# Create a tensor
T = Tensor.random([idx, idx.flip()], itags=["i", "j"], seed=7)
# Perform SVD: T ≈ U @ S @ Vh
U, S, Vh = decomp(T, axes=0, mode="SVD")
print(f"U:\n{U}\n\nS:\n{S}\n\nVh:\n{Vh}\n")
# Verify reconstruction
US = contract(U, S) # U @ S
T_reconstructed = contract(US, Vh) # (U @ S) @ Vh
error = (T - T_reconstructed).norm() / T.norm()
print(f"Reconstruction error: {error:.2e}")
U:
info: 2x { 2 x 1 } having 'A', Tensor, { i*, _bond_L* }
data: 2-D float64 (104 B) 5 x 5 => 5 x 5 @ norm = 2.23607
1. 3x3 | 1x1 [ 0 ; 0 ] 72 B
2. 2x2 | 1x1 [ 1 ; -1 ] 32 B
S:
info: 2x { 2 x 1 } having 'A', Diagonal, { _bond_L, _bond_R }
data: 2-D float64 (104 B) 5 x 5 => 5 x 5 @ norm = 3.32699
1. 2x2 | 1x1 [ -1 ; 1 ] 32 B
2. 3x3 | 1x1 [ 0 ; 0 ] 72 B
Vh:
info: 2x { 2 x 1 } having 'A', Tensor, { _bond_R*, j }
data: 2-D float64 (104 B) 5 x 5 => 5 x 5 @ norm = 2.23607
1. 3x3 | 1x1 [ 0 ; 0 ] 72 B
2. 2x2 | 1x1 [ 1 ; 1 ] 32 B
Reconstruction error: 4.73e-16
UR Decomposition¶
# Get U and R = S @ Vh combined
U_ur, R = decomp(T, axes=0, mode="UR")
print(f"U:\n{U_ur}\n\nR:\n{R}\n")
# Verify
T_reconstructed_ur = contract(U_ur, R)
error = (T - T_reconstructed_ur).norm() / T.norm()
print(f"Reconstruction error: {error:.2e}")
U:
info: 2x { 2 x 1 } having 'A', Tensor, { i*, _bond_L* }
data: 2-D float64 (104 B) 5 x 5 => 5 x 5 @ norm = 2.23607
1. 3x3 | 1x1 [ 0 ; 0 ] 72 B
2. 2x2 | 1x1 [ 1 ; -1 ] 32 B
R:
info: 2x { 2 x 1 } having 'A', Tensor, { _bond_L, j }
data: 2-D float64 (104 B) 5 x 5 => 5 x 5 @ norm = 3.32699
1. 2x2 | 1x1 [ -1 ; 1 ] 32 B
2. 3x3 | 1x1 [ 0 ; 0 ] 72 B
Reconstruction error: 4.36e-16
LV Decomposition¶
# Get L = U @ S and V combined
L, V = decomp(T, axes=0, mode="LV")
print(f"L:\n{L}\n\nV:\n{V}\n")
# Verify
T_reconstructed_lv = contract(L, V)
error = (T - T_reconstructed_lv).norm() / T.norm()
print(f"Reconstruction error: {error:.2e}")
L:
info: 2x { 2 x 1 } having 'A', Tensor, { i*, _bond_R }
data: 2-D float64 (104 B) 5 x 5 => 5 x 5 @ norm = 3.32699
1. 3x3 | 1x1 [ 0 ; 0 ] 72 B
2. 2x2 | 1x1 [ 1 ; 1 ] 32 B
V:
info: 2x { 2 x 1 } having 'A', Tensor, { _bond_R*, j }
data: 2-D float64 (104 B) 5 x 5 => 5 x 5 @ norm = 2.23607
1. 3x3 | 1x1 [ 0 ; 0 ] 72 B
2. 2x2 | 1x1 [ 1 ; 1 ] 32 B
Reconstruction error: 4.73e-16
Truncation: Keep N Values¶
# Keep at most 10 singular values globally
U_trunc, S_trunc, Vh_trunc = decomp(T, axes=0, mode="SVD", trunc={"nkeep": 10})
print(f"Original bond dim: {U.indices[1].dim}")
print(f"Truncated bond dim: {U_trunc.indices[1].dim}\n")
# Compare norms
print(f"Original norm: {T.norm():.4f}")
US_trunc = contract(U_trunc, S_trunc)
T_trunc = contract(US_trunc, Vh_trunc)
print(f"Truncated norm: {T_trunc.norm():.4f}\n")
# Truncation error
error = (T - T_trunc).norm() / T.norm()
print(f"Relative error: {error:.4f}")
Truncation: Threshold¶
# Keep singular values >= 1e-10
U_thresh, S_thresh, Vh_thresh = decomp(T, axes=0, mode="SVD", trunc={"thresh": 1e-10})
print(f"Threshold truncated bond dim: {U_thresh.indices[1].dim}")
Multi-Index Decomposition¶
# Decompose 4-index tensor
idx4 = Index(Direction.OUT, group, sectors=(Sector(0, 2), Sector(1, 1)))
T4 = Tensor.random([idx4, idx4.flip(), idx4, idx4.flip()], itags=["a", "b", "c", "d"], seed=42)
# Partition: (a, b) | (c, d)
U4, S4, Vh4 = decomp(T4, axes=[0, 1], mode="SVD")
print(f"U:\n{U4}\n\nVh:\n{Vh4}")
U:
info: 3x { 4 x 1 } having 'A', Tensor, { a*, b, _bond_L* }
data: 3-D float64 (264 B) 3 x 3 x 9 => 3 x 3 x 9 @ norm = 3
1. 2x2x5 | 1x1x1 [ 0 ; 0 ; 0 ] 160 B
2. 2x1x2 | 1x1x1 [ 0 ; 1 ; 1 ] 32 B
3. 1x2x2 | 1x1x1 [ 1 ; 0 ; -1 ] 32 B
4. 1x1x5 | 1x1x1 [ 1 ; 1 ; 0 ] 40 B
Vh:
info: 3x { 4 x 1 } having 'A', Tensor, { _bond_R*, c*, d }
data: 3-D float64 (264 B) 9 x 3 x 3 => 9 x 3 x 3 @ norm = 3
1. 2x1x2 | 1x1x1 [ -1 ; 1 ; 0 ] 32 B
2. 5x2x2 | 1x1x1 [ 0 ; 0 ; 0 ] 160 B
3. 5x1x1 | 1x1x1 [ 0 ; 1 ; 1 ] 40 B
4. 2x2x1 | 1x1x1 [ 1 ; 0 ; 1 ] 32 B
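The (a, b) | (c, d) partition corresponds, in the dense picture, to reshaping the tensor into a matrix before the SVD and restoring the legs afterwards. A plain NumPy sketch (illustrative only, without symmetry blocks):

```python
import numpy as np

rng = np.random.default_rng(3)
T4 = rng.standard_normal((3, 3, 3, 3))

# Group (a, b) as rows and (c, d) as columns, then SVD the matrix
mat = T4.reshape(9, 9)
U, s, Vh = np.linalg.svd(mat, full_matrices=False)

# Restore the tensor legs on each factor
U4 = U.reshape(3, 3, 9)    # legs a, b, bond
Vh4 = Vh.reshape(9, 3, 3)  # legs bond, c, d

# Contracting the factors over the bond reproduces the tensor exactly
recon = np.einsum('abk,k,kcd->abcd', U4, s, Vh4)
assert np.allclose(recon, T4)
```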
Examining Singular Values¶
# Low-level SVD for singular value access
U_sv, S_dict, Vh_sv = svd(T, axis=0)
# S_dict maps block keys to 1D singular value arrays
for key, s_values in S_dict.items():
    print(f"Block {key}:")
    print(f"  Number of singular values: {len(s_values)}")
    print(f"  Largest: {s_values[0]:.4f}")
    print(f"  Smallest: {s_values[-1]:.4f}")
    print(f"  Ratio: {s_values[0] / s_values[-1]:.2e}")
QR Decomposition¶
QR decomposition factorizes a tensor, block by block, into a factor Q with orthonormal columns (Q†Q = I) and an upper triangular factor R. Unlike SVD, no truncation is applied, which makes it useful for obtaining canonical forms without compression.
# QR decomposition using high-level interface
Q, R = decomp(T, axes=0, mode="QR")
print(f"Q:\n{Q}\n\nR:\n{R}\n")
# Verify reconstruction
T_reconstructed_qr = contract(Q, R)
error_qr = (T - T_reconstructed_qr).norm() / T.norm()
print(f"Reconstruction error: {error_qr:.2e}")
# Verify orthogonality: Q†Q = I
QdagQ = contract(conj(Q), Q, excl=((1,),()))
print(f"\nOrthogonality check (Q†Q should be identity-like):\n{QdagQ}")
Q:
info: 2x { 2 x 1 } having 'A', Tensor, { i*, _bond_L* }
data: 2-D float64 (104 B) 5 x 5 => 5 x 5 @ norm = 2.23607
1. 3x3 | 1x1 [ 0 ; 0 ] 72 B
2. 2x2 | 1x1 [ 1 ; -1 ] 32 B
R:
info: 2x { 2 x 1 } having 'A', Tensor, { _bond_L, j }
data: 2-D float64 (104 B) 5 x 5 => 5 x 5 @ norm = 3.32699
1. 2x2 | 1x1 [ -1 ; 1 ] 32 B
2. 3x3 | 1x1 [ 0 ; 0 ] 72 B
Reconstruction error: 2.07e-16
Orthogonality check (Q†Q should be identity-like):
info: 2x { 2 x 1 } having 'A', Tensor, { _bond_L, _bond_L* }
data: 2-D float64 (104 B) 5 x 5 => 5 x 5 @ norm = 2.23607
1. 2x2 | 1x1 [ -1 ; -1 ] 32 B
2. 3x3 | 1x1 [ 0 ; 0 ] 72 B
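The same properties can be checked for a dense matrix with plain NumPy (illustrative sketch, independent of the library): Q has orthonormal columns, R is upper triangular, and the reconstruction is exact.

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((5, 5))
Q, R = np.linalg.qr(M)

assert np.allclose(Q.T @ Q, np.eye(5))  # orthonormal columns: Q†Q = I
assert np.allclose(R, np.triu(R))       # R is upper triangular
assert np.allclose(Q @ R, M)            # exact reconstruction, no truncation
```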
Alternatively, use the low-level qr() function directly:
# Low-level QR function
Q_low, R_low = qr(T, axis=0)
print(f"Q (low-level):\n{Q_low}\n\nR (low-level):\n{R_low}")
Q (low-level):
info: 2x { 2 x 1 } having 'A', Tensor, { i*, _bond }
data: 2-D float64 (104 B) 5 x 5 => 5 x 5 @ norm = 2.23607
1. 3x3 | 1x1 [ 0 ; 0 ] 72 B
2. 2x2 | 1x1 [ 1 ; 1 ] 32 B
R (low-level):
info: 2x { 2 x 1 } having 'A', Tensor, { _bond*, j }
data: 2-D float64 (104 B) 5 x 5 => 5 x 5 @ norm = 3.32699
1. 3x3 | 1x1 [ 0 ; 0 ] 72 B
2. 2x2 | 1x1 [ 1 ; 1 ] 32 B