PyTorch – How to compute the singular value decomposition (SVD) of a matrix?
torch.linalg.svd() computes the singular value decomposition (SVD) of a matrix or a batch of matrices. The SVD is returned as a named tuple (U, S, Vh).
U and Vh are orthogonal for real input matrices and unitary for complex input matrices.
Vh is the transpose of V when V is real-valued, and the conjugate transpose of V when V is complex.
S is always real-valued, even when the input is complex.
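These properties can be checked numerically. A minimal sketch with a complex input (the dtypes shown assume the default cfloat/float32 precision):

```python
import torch

A = torch.randn(4, 4, dtype=torch.cfloat)
U, S, Vh = torch.linalg.svd(A)

# S is real-valued even though the input is complex
print(S.dtype)  # torch.float32

# U is unitary: U^H @ U equals the identity matrix
I = torch.eye(4, dtype=torch.cfloat)
print(torch.allclose(U.conj().T @ U, I, atol=1e-5))  # True
```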
Syntax
U, S, Vh = torch.linalg.svd(A, full_matrices=True)
Parameters
A – a PyTorch tensor (a matrix or a batch of matrices).
full_matrices – if True, the full SVD is computed; otherwise the reduced SVD. Default: True.
Output
It returns a named tuple (U, S, Vh).
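To see what full_matrices changes, compare the factor shapes for a 3×4 input: with the full SVD, U is m×m and Vh is n×n; with the reduced SVD they shrink to m×k and k×n, where k = min(m, n). A small sketch:

```python
import torch

A = torch.randn(3, 4)  # m = 3, n = 4, so k = min(m, n) = 3

# full SVD: U is 3x3, S has 3 entries, Vh is 4x4
U, S, Vh = torch.linalg.svd(A, full_matrices=True)
print(U.shape, S.shape, Vh.shape)  # torch.Size([3, 3]) torch.Size([3]) torch.Size([4, 4])

# reduced SVD: Vh shrinks to k x n = 3x4
U, S, Vh = torch.linalg.svd(A, full_matrices=False)
print(U.shape, S.shape, Vh.shape)  # torch.Size([3, 3]) torch.Size([3]) torch.Size([3, 4])
```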
Steps
Import the required library.
import torch
Create a matrix or a batch of matrices.
A = torch.randn(3,4)
Compute the SVD of the matrix or batch of matrices created above.
U, S, Vh = torch.linalg.svd(A)
Display U, S, and Vh.
print("U:\n",U)
print("S:\n",S)
print("Vh:\n",Vh)
Example 1
The following Python program shows how to compute the SVD of a matrix.
# import necessary library
import torch
# create a matrix
A = torch.randn(3,4)
print("Matrix:\n", A)
# compute SVD
U, S, Vh = torch.linalg.svd(A)
# print U, S, and Vh
print("U:\n",U)
print("S:\n",S)
print("Vh:\n",Vh)
Output
Matrix:
tensor([[-1.5122, -0.4714, -0.1173, -0.3914],
[ 0.4288, -1.9329, 0.9171, -1.0288],
[ 0.1143, 0.1989, 0.3290, 0.3031]])
U:
tensor([[ 0.1769, 0.9716, 0.1569],
[ 0.9815, -0.1860, 0.0448],
[-0.0728, -0.1460, 0.9866]])
S:
tensor([2.4383, 1.6226, 0.4119])
Vh:
tensor([[ 0.0595, -0.8182, 0.3508, -0.4516],
[-0.9649, -0.0787, -0.2050, -0.1438],
[-0.2554, 0.0864, 0.8433, 0.4650],
[ 0.0092, -0.5629, -0.3519, 0.7478]])
Example 2
The following Python program shows how to compute the SVD of a complex matrix.
# import necessary library
import torch
# create a matrix of complex random numbers
A = torch.randn(2, 2, dtype=torch.cfloat)
print("Complex Matrix:\n", A)
# compute SVD
U, S, Vh = torch.linalg.svd(A)
# print U, S, and Vh
print("U:\n",U)
print("S:\n",S)
print("Vh:\n",Vh)
Output
Complex Matrix:
tensor([[-0.2761-0.6619j, -1.4248-0.3026j],
[-0.2797+0.2036j, 0.2143+1.3459j]])
U:
tensor([[-0.2670-0.7083j, 0.3372+0.5597j],
[-0.4943+0.4273j, -0.4737+0.5905j]])
S:
tensor([2.1358, 0.2259])
Vh:
tensor([[ 0.3595+0.0000j, 0.4981-0.7891j],
[-0.9332+0.0000j, 0.1919-0.3040j]])
Example 3
The following Python program shows how to compute the SVD of a batch of three matrices.
# import necessary library
import torch
# create a batch of three 2x3 matrices
A = torch.randn(3,2,3)
print("Matrices:\n", A)
# compute SVD
U, S, Vh = torch.linalg.svd(A)
# print U, S, and Vh
print("U:\n",U)
print("S:\n",S)
print("Vh:\n",Vh)
Output
Matrices:
tensor([[[ 0.2195, -1.3015, -1.0770],
[-0.5884, -0.8269, 0.0135]],
[[ 1.0753, -1.7080, -0.3692],
[-1.3024, 0.2581, -1.2018]],
[[-0.3576, -1.0531, -0.6192],
[ 0.8453, 0.4187, -0.1622]]])
U:
tensor([[[ 0.9242, -0.3818],
[ 0.3818, 0.9242]],
[[ 0.8178, 0.5755],
[-0.5755, 0.8178]],
[[ 0.8604, 0.5097],
[-0.5097, 0.8604]]])
S:
tensor([[1.8131, 0.8030],
[2.2789, 1.4912],
[1.4146, 0.7317]])
Vh:
tensor([[[-0.0120, -0.8376, -0.5462],
[-0.7815, -0.3329, 0.5276],
[ 0.6237, -0.4332, 0.6506]],
[[ 0.7148, -0.6781, 0.1710],
[-0.2993, -0.5176, -0.8015],
[-0.6321, -0.5217, 0.5730]],
[[-0.5221, -0.7913, -0.3182],
[ 0.7448, -0.2413, -0.6221],
[ 0.4155, -0.5617, 0.7154]]])
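You can check any of the decompositions above by multiplying the factors back together: A ≈ U @ diag(S) @ Vh. A minimal sketch for the batched case (using the reduced SVD so the factor shapes multiply directly; torch.diag_embed turns the batch of singular-value vectors S into a batch of diagonal matrices):

```python
import torch

# a batch of three 2x3 matrices, as in Example 3
A = torch.randn(3, 2, 3)

# reduced SVD: U is (3, 2, 2), S is (3, 2), Vh is (3, 2, 3)
U, S, Vh = torch.linalg.svd(A, full_matrices=False)

# rebuild the batch: A ≈ U @ diag(S) @ Vh
A_rec = U @ torch.diag_embed(S) @ Vh
print(torch.allclose(A, A_rec, atol=1e-5))  # True
```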