
I have an embedding with a limited number of rows (say 5):

        self.embedding = torch.nn.Embedding(length, embedding_dim)

I receive input ids such as (7, 18, 6, ...) as a PyTorch tensor. However, the embedding for id 7 is stored in the first row of the embedding, the one for 18 in the second row, and so on.

I want a map from these ids to consecutive indices 0, 1, 2, ... so I can look up their stored vectors in the embedding. It seems I can't use a dictionary as follows:

    def forward(self, prompt_token_ids, pids=None):
        # fails: iterating a tensor yields 0-d tensors, not the plain ints used as dict keys
        prompt_token_ids = [self.id_map[x] for x in prompt_token_ids]
        return self.embedding(prompt_token_ids)
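
For what it's worth, I can get this version to run by round-tripping through Python ints and rebuilding a tensor (a sketch; it works, but it still loops over the tokens in Python):

    def forward(self, prompt_token_ids, pids=None):
        # .tolist() yields plain ints that match the dict keys;
        # the mapped rows are wrapped back into a LongTensor for nn.Embedding.
        rows = torch.tensor(
            [self.id_map[i] for i in prompt_token_ids.tolist()],
            dtype=torch.long,
            device=prompt_token_ids.device,
        )
        return self.embedding(rows)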

How can I do this mapping directly on tensors, without the Python-level loop?
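
One tensor-level approach I am considering is to precompute a dense lookup tensor indexed by raw id and register it as a buffer (a sketch, assuming the raw ids are non-negative and reasonably bounded; `known_ids` is a hypothetical list of the ids the module supports):

    import torch

    class PromptEmbedding(torch.nn.Module):
        def __init__(self, known_ids, embedding_dim):
            super().__init__()
            self.embedding = torch.nn.Embedding(len(known_ids), embedding_dim)
            # Dense table: id_map[raw_id] -> row in the embedding, -1 for unknown ids.
            id_map = torch.full((max(known_ids) + 1,), -1, dtype=torch.long)
            for row, raw_id in enumerate(known_ids):
                id_map[raw_id] = row
            # A buffer moves with the module on .to(device) but is not trained.
            self.register_buffer("id_map", id_map)

        def forward(self, prompt_token_ids, pids=None):
            # Vectorized gather: one tensor indexes another, no Python loop.
            return self.embedding(self.id_map[prompt_token_ids])

    # Ids 7, 18, 6 map to rows 0, 1, 2.
    model = PromptEmbedding(known_ids=[7, 18, 6], embedding_dim=4)
    out = model(torch.tensor([7, 18, 6, 7]))
    print(out.shape)  # torch.Size([4, 4])

An unknown id hits the -1 sentinel, and nn.Embedding rejects negative indices, so the lookup fails loudly instead of silently returning a wrong row. Is this reasonable, or is there a more idiomatic way?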

  • A similar question that may already answer this: https://stackoverflow.com/questions/65565461/how-to-map-element-in-pytorch-tensor-to-id/65567587 – Ahmad Dec 05 '21 at 19:29

0 Answers