
I'm testing Redis, but I have a problem: if I store over 8000 items, I lose more than 2 records. The amount of data lost is proportional to the number of records inserted.

    class ProdutoDTO
    {
        public long id { get; set; }
        public string name { get; set; }
    }

    // redisClient is an existing ServiceStack.Redis client instance
    using (var produtosRedis = redisClient.GetTypedClient<ProdutoDTO>())
    {
        for (var i = 0; i < 15000; i++)
        {
            ProdutoDTO produto = new ProdutoDTO();
            produto.id = produtosRedis.GetNextSequence();
            produto.name = "test" + produto.id.ToString();
            produtosRedis.Store(produto);
        }
    }
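
One way to quantify the loss is to read everything back after the loop and compare the count with the 15000 that were stored. A minimal sketch, assuming the same redisClient instance as above:

    using (var produtosRedis = redisClient.GetTypedClient<ProdutoDTO>())
    {
        // GetAll() returns every ProdutoDTO entry currently held by Redis
        var stored = produtosRedis.GetAll();
        Console.WriteLine("Stored {0} of 15000 records", stored.Count);
    }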

I solved it:

Old code:

    class ProdutoDTO
    {
        public long id { get; set; }
        public string name { get; set; }
    }

New code:

    class ProdutoDTO
    {
        public long Id { get; set; }
        public string name { get; set; }
    }

This is because the C# Redis Client works with any POCO that has a single primary key, which by default is expected to be named Id.
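
With the property renamed to Id, the typed client builds a distinct key from each record's Id, which would explain why records stopped disappearing: they are no longer written over one another. Below is a minimal end-to-end sketch of the corrected version; it assumes a Redis server on localhost and the same ServiceStack.Redis API used above (GetTypedClient, GetNextSequence, Store):

    using System;
    using ServiceStack.Redis;

    class ProdutoDTO
    {
        // ServiceStack's typed client uses the Id property as the primary key
        public long Id { get; set; }
        public string name { get; set; }
    }

    class Program
    {
        static void Main()
        {
            using (var redisClient = new RedisClient("localhost"))
            using (var produtosRedis = redisClient.GetTypedClient<ProdutoDTO>())
            {
                for (var i = 0; i < 15000; i++)
                {
                    var produto = new ProdutoDTO
                    {
                        Id = produtosRedis.GetNextSequence()  // unique, incrementing key
                    };
                    produto.name = "test" + produto.Id;
                    produtosRedis.Store(produto);             // one key per record
                }

                // All 15000 records should now come back
                Console.WriteLine(produtosRedis.GetAll().Count);
            }
        }
    }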

For more information about the C# Redis client, see: ServiceStack.Net Redis: Storing Related Objects vs. Related Object Ids

  • As far as I understand it, Redis uses a least recently used algorithm. I think depending on the memory limitations you've set up for your redis server it may be discarding items. – devshorts Apr 05 '12 at 18:02
  • Well, I used the default config. The PC has 1 GB of free memory, but I always lose data. – Fernando JS Apr 05 '12 at 18:21
  • Seems odd. First thing to do is to find out if it's a problem with a) your code, b) the C# driver, or c) Redis itself. Can you reproduce this behavior with another C# driver? – Tim Skauge Apr 05 '12 at 18:40
  • @FernandoJS the PC may have 1 GB of free memory, but you have to check the configuration to see how much memory Redis will allow itself to use. – brimble2010 Mar 01 '13 at 16:17
