
My application (ASP.NET MVC) needs to download, zip, and return one or more files from Amazon S3. I am using the .NET SDK and GetObject to retrieve the files, and I want to use DotNetZip to zip them up and return the generated zip file as a file stream result for the user to download.

Can anyone suggest the most efficient way of doing this? I am seeing OutOfMemory exceptions when downloading large files from S3; they could be up to 1 GB in size, for example.

My code so far:

        using (
            var client = AWSClientFactory.CreateAmazonS3Client(
                "apikey",
                "apisecret",
                new AmazonS3Config { RegionEndpoint = RegionEndpoint.EUWest1 })
            )
        {
            foreach (var file in files)
            {
                var request = new GetObjectRequest { BucketName = "bucketname", Key = file };

                using (var response = client.GetObject(request))
                {
                    // TODO: add the response to the zip without buffering the whole file in memory
                }
            }
        }

If I copy the response into a MemoryStream and add that to the zip, everything works fine for small files, but with large files I assume I cannot hold the entire object in memory?
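One way to avoid buffering each file is DotNetZip's `AddEntry(string, WriteDelegate)` overload: the delegate is only invoked while `Save` runs, so each S3 object can be fetched lazily and copied straight into the zip output stream in chunks. Below is a rough sketch under those assumptions (bucket name, credentials, and the action signature are placeholders from the question; `Stream.CopyTo` requires .NET 4+), not a tested implementation:

```csharp
// Sketch: stream each S3 object directly into the zip as the zip is
// written to the HTTP response, so only a small buffer is held in memory.
public void DownloadZip(IEnumerable<string> files)
{
    Response.ContentType = "application/zip";
    Response.AddHeader("Content-Disposition", "attachment; filename=files.zip");
    Response.BufferOutput = false; // don't let ASP.NET buffer the whole zip

    using (var client = AWSClientFactory.CreateAmazonS3Client(
        "apikey",
        "apisecret",
        new AmazonS3Config { RegionEndpoint = RegionEndpoint.EUWest1 }))
    using (var zip = new ZipFile())
    {
        foreach (var file in files)
        {
            var key = file; // local copy to avoid modified-closure capture
            // The delegate runs during zip.Save, so objects are downloaded
            // one at a time and copied in chunks, never fully in memory.
            zip.AddEntry(key, (entryName, zipStream) =>
            {
                var request = new GetObjectRequest { BucketName = "bucketname", Key = key };
                using (var response = client.GetObject(request))
                {
                    response.ResponseStream.CopyTo(zipStream);
                }
            });
        }
        // Writes the zip directly to the response as entries are produced.
        zip.Save(Response.OutputStream);
    }
}
```

Because the zip is produced on the fly, the Content-Length is unknown up front, so the response is sent chunked; that is usually acceptable for a download action.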

    Have you found a solution? Some additional info might be found here: http://stackoverflow.com/questions/8563933/c-sharp-out-of-memory-exception . So it is during the download and not the zip? – Par6 Mar 19 '15 at 18:50

0 Answers