Rest in peace, Kobe. May the Mamba mentality live on forever!
For detailed usage, refer to the official PyTorch documentation.
```python
import torch

# Create a tensor and tell autograd to track operations on it
x = torch.ones(2, 2, requires_grad=True)
print(x)
```
```python
# y is produced by an operation on x, so it carries a grad_fn
y = x + 2
print(y)

print(x.grad_fn)  # None: x was created directly by the user
print(y.grad_fn)  # an AddBackward node recording how y was made
```
```python
z = y * y * 3
out = z.mean()
print(z, out)
```
```python
# requires_grad defaults to False; requires_grad_() flips it in place
a = torch.randn(2, 2)
a = ((a * 3) / (a - 1))
print(a.requires_grad)   # False
a.requires_grad_(True)
print(a.requires_grad)   # True
b = (a * a).sum()
print(b.grad_fn)
```
```python
print(x.requires_grad, y.requires_grad, z.requires_grad)
print(out.requires_grad)
out.backward()   # out is a scalar, so backward() needs no gradient argument
print(x.grad)
```
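The value 4.5 that `backward()` puts into every entry of `x.grad` can be checked by hand. A minimal sketch, assuming `y = x + 2` as in the blocks above, so that `out = (1/4) * sum(3 * (x_i + 2)**2)`:

```python
import torch

x = torch.ones(2, 2, requires_grad=True)
y = x + 2            # each y_i = 3
z = y * y * 3        # each z_i = 27
out = z.mean()       # out = (1/4) * sum(3 * (x_i + 2)**2) = 27

out.backward()
# Chain rule: d(out)/dx_i = (1/4) * 6 * (x_i + 2) = 1.5 * (x_i + 2) = 4.5
print(x.grad)        # a 2x2 tensor filled with 4.5
```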
```python
print(x.requires_grad)           # True
print((x ** 2).requires_grad)    # True

# Inside a torch.no_grad() block, autograd tracking is disabled
with torch.no_grad():
    print((x ** 2).requires_grad)  # False
```
Note:
1. Gradients are only computed for user-created tensors, i.e. the leaf Variables; each leaf gets its gradient in `.grad`.
2. The `requires_grad` flag of a tensor produced by an operation cannot be changed directly; it is inferred from the operation's inputs and does not change.
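Both points can be seen in a minimal sketch (the tensor names here are just for illustration):

```python
import torch

x = torch.ones(2, 2, requires_grad=True)  # user-created: a leaf tensor
y = x + 2                                 # operation result: not a leaf
print(x.is_leaf, y.is_leaf)               # True False

out = (y * y).sum()
out.backward()
print(x.grad)   # populated: d(out)/dx_i = 2 * (x_i + 2) = 6.0
print(y.grad)   # None: gradients of non-leaf tensors are not retained

# Point 2: requires_grad of a non-leaf tensor cannot be assigned directly
try:
    y.requires_grad = False
except RuntimeError as e:
    print(e)    # only leaf variables allow changing this flag
```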