My application (MVC) needs to download, zip and return one or more files from Amazon S3. I am using the .NET SDK and GetObject to receive the files, and want to use DotNetZip to zip them up and return the generated zip as a FileStreamResult for the user to download.
Can anyone suggest the most efficient way of doing this? I am seeing OutOfMemoryException errors when downloading large files from S3; they could be up to 1 GB in size, for example.
My code so far:
using (var client = AWSClientFactory.CreateAmazonS3Client(
    "apikey",
    "apisecret",
    new AmazonS3Config { RegionEndpoint = RegionEndpoint.EUWest1 }))
{
    foreach (var file in files)
    {
        var request = new GetObjectRequest { BucketName = "bucketname", Key = file };

        using (var response = client.GetObject(request))
        {
            // TODO: add response.ResponseStream to the zip here without
            // buffering the whole object in memory.
        }
    }
}
If I copy the response into a MemoryStream and add that to the zip, everything works fine (on small files), but with large files I assume I cannot hold the entire thing in memory?
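For reference, this is roughly what the small-file version looks like. It is only a sketch of the approach described above, assuming an MVC controller action: the controller class, the action name, its files parameter and the "download.zip" file name are placeholders, and the credentials and bucket name are the same dummies as in the snippet above.

using System.Collections.Generic;
using System.IO;
using System.Web.Mvc;
using Amazon;
using Amazon.S3;
using Amazon.S3.Model;
using Ionic.Zip;

public class DownloadController : Controller
{
    public ActionResult DownloadZip(IEnumerable<string> files)
    {
        var outputStream = new MemoryStream();

        using (var client = AWSClientFactory.CreateAmazonS3Client(
            "apikey",
            "apisecret",
            new AmazonS3Config { RegionEndpoint = RegionEndpoint.EUWest1 }))
        using (var zip = new ZipFile())
        {
            foreach (var file in files)
            {
                var request = new GetObjectRequest { BucketName = "bucketname", Key = file };

                using (var response = client.GetObject(request))
                using (var ms = new MemoryStream())
                {
                    // Copy the S3 response stream into memory, then hand the
                    // bytes to DotNetZip. The whole object is buffered here,
                    // which is what fails on files approaching 1 GB.
                    response.ResponseStream.CopyTo(ms);
                    zip.AddEntry(file, ms.ToArray());
                }
            }

            // The archive itself is also assembled entirely in memory
            // before anything is returned to the client.
            zip.Save(outputStream);
        }

        outputStream.Position = 0;
        return File(outputStream, "application/zip", "download.zip");
    }
}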