What is the PyTorch equivalent of tf.stop_gradient() (which provides a way to avoid computing gradients with respect to some variables during back-propagation)?

aerin
    Do any of these answer your question? https://datascience.stackexchange.com/questions/32651/what-is-the-use-of-torch-no-grad-in-pytorch https://stackoverflow.com/questions/56816241/difference-between-detach-and-with-torch-nograd-in-pytorch/56817594 – Stef Sep 16 '20 at 14:30

2 Answers


You can use x.detach().
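
For illustration, a minimal sketch of how detach() cuts a tensor out of the autograd graph (the tensor names are just illustrative):

    import torch

    x = torch.ones(3, requires_grad=True)
    y = x * 2

    # y.detach() shares y's data but is disconnected from the autograd
    # graph, so no gradient flows back through this branch.
    z = y.detach() * 3 + y

    z.sum().backward()
    print(x.grad)  # tensor([2., 2., 2.]) -- only the non-detached branch contributes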

Deepali

Tensors in PyTorch have a requires_grad attribute. Set it to False to prevent gradient computation for those tensors.
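
A minimal sketch of this approach, assuming two leaf tensors where only one should receive gradients (names are illustrative):

    import torch

    w = torch.randn(3, requires_grad=True)   # gradients will be computed for w
    b = torch.randn(3)                       # requires_grad is False by default for plain tensors
    b.requires_grad_(False)                  # or set it explicitly in place

    loss = (2 * w + b).sum()
    loss.backward()

    print(w.grad)  # tensor([2., 2., 2.])
    print(b.grad)  # None -- autograd skipped b entirely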

Shai