
I would like to convert a PyTorch tensor to a numpy array while using CUDA.

This is the line of code when not using CUDA:

A = self.tensor.weight.data.numpy()

How can I do the same operation when using CUDA? According to this post: https://discuss.pytorch.org/t/how-to-transform-variable-into-numpy/104/3 it seems to be:

A = self.tensor.weight.data.cpu().numpy()

Noa Yehezkel
    Possible duplicate of [How to convert Pytorch autograd.Variable to Numpy?](https://stackoverflow.com/questions/44340848/how-to-convert-pytorch-autograd-variable-to-numpy) – Fábio Perez Nov 25 '18 at 12:20

2 Answers


I believe you also have to use `.detach()`. I had to convert my Tensor to a numpy array on Colab, which uses CUDA and a GPU. I did it like the following:

embedding = learn.model.u_weight

embedding_list = list(range(0, 64382))

input = torch.cuda.LongTensor(embedding_list)
tensor_array = embedding(input)
# the output of the line below is a numpy array
tensor_array.cpu().detach().numpy()
azizbro
    You only need to call `detach` if the `Tensor` has associated gradients. When `detach` is needed, you want to call `detach` before `cpu`. Otherwise, PyTorch will create the gradients associated with the Tensor on the CPU then immediately destroy them when `numpy` is called. Calling `detach` first eliminates that superfluous step. For more information see: https://discuss.pytorch.org/t/should-it-really-be-necessary-to-do-var-detach-cpu-numpy/35489/8?u=zayd – ZaydH Sep 14 '19 at 08:05
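A minimal, self-contained sketch of the conversion described in this answer, using a small hypothetical embedding in place of `learn.model.u_weight` (which has 64382 rows), and with `detach()` called before `cpu()` as the comment above recommends:

```python
import torch

# Small hypothetical embedding standing in for learn.model.u_weight
embedding = torch.nn.Embedding(10, 3)

indices = torch.arange(10)  # LongTensor of row indices
# On a CUDA machine you would first move both to the GPU, e.g.:
# embedding = embedding.cuda(); indices = indices.cuda()

out = embedding(indices)  # requires grad (and may live on the GPU)

# detach() first, then cpu(): detaching before the device copy avoids
# building a throwaway autograd graph for the CPU-side tensor.
arr = out.detach().cpu().numpy()
print(arr.shape)  # (10, 3)
```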

If the tensor is on the GPU (i.e., CUDA), as you say:

You can use `self.tensor.weight.data.cpu().numpy()`. It will copy the tensor to the CPU and convert it to a numpy array.

If the tensor is already on the CPU, you can do `self.tensor.weight.data.numpy()`, as you correctly figured out. But you can also do `self.tensor.weight.data.cpu().numpy()` in this case; since the tensor is already on the CPU, the `.cpu()` call has no effect. So this can be used as a device-agnostic way to convert a tensor to a numpy array.

Umang Gupta
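To illustrate the device-agnostic pattern from the answer above, here is a short sketch using a hypothetical `torch.nn.Linear` weight in place of `self.tensor.weight` (note that `.detach()` is also needed here, since a layer's weight carries gradients):

```python
import torch

# Hypothetical weight tensor standing in for self.tensor.weight
weight = torch.nn.Linear(4, 2).weight

# Device-agnostic conversion: .detach() drops the autograd graph,
# and .cpu() is a no-op if the tensor already lives on the CPU.
arr = weight.detach().cpu().numpy()

print(type(arr))  # <class 'numpy.ndarray'>
print(arr.shape)  # (2, 4)
```

The same line works unchanged whether `weight` is on the CPU or the GPU.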