
I've read quite a few blog posts and Stack Overflow questions about moving files between S3 buckets or from an S3 bucket to an EC2 instance, but I've had a hard time finding examples of moving files from an EC2 instance to an S3 bucket with boto3.

I've been looking at the put_object, copy, put, upload_file, etc. functions, but I'm stuck on the syntax, especially when working with shapefiles rather than plain text files. I've been able to put text objects from my EC2 instance into my S3 bucket, but I can't figure out how to move a shapefile along with its metadata; it always turns up as text data. I've looked at this [solution][1] quite a few times, but again, that's text data.
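
For reference, this is roughly the kind of thing that does work for me with plain text (the bucket and key names here are just placeholders):

import boto3

s3 = boto3.client('s3')  # credentials resolved from the instance role / environment

# Putting a plain text object like this works fine
s3.put_object(
    Bucket='bucket-output',   # placeholder bucket name
    Key='example.txt',        # placeholder key
    Body=b'some text data',
)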

Can someone nudge me in the correct direction?

import boto3
import geopandas

# Creating the low level functional client
client = boto3.client(
    's3',
    aws_access_key_id='#####',
    aws_secret_access_key='#####',
    region_name='####',
)

# Creating the high level object oriented interface
resource = boto3.resource(
    's3',
    aws_access_key_id='#####',
    aws_secret_access_key='#####',
    region_name='####',
)

# Create the S3 object
obj = client.get_object(
    Bucket='Bucket-input',
    Key='input-file.geojson'
)

# Read data from the S3 object
data = geopandas.read_file(obj['Body'])

# Run the processing code, then write the result out as a shapefile
data.to_file('obj.shp')

# Put the shapefile into the output S3 bucket
# Goal: move 'obj.shp' and its companion files (.dbf, .cpg, .shx, .prj, etc.)
# to S3 Bucket='bucket-output' -- this is the step I can't get right
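
My best guess for that last step is something like the sketch below, reusing the client created above and uploading each component file one at a time with upload_file. The glob pattern and the output bucket name are assumptions on my part, and I don't know whether this is the idiomatic approach:

import glob
import os

# A shapefile is really a bundle of sidecar files ('obj.shp', 'obj.dbf',
# 'obj.shx', 'obj.prj', 'obj.cpg', ...), so each one presumably has to be
# uploaded as its own S3 object
for path in glob.glob('obj.*'):
    client.upload_file(
        Filename=path,
        Bucket='bucket-output',      # assumed output bucket name
        Key=os.path.basename(path),  # keep the local file name as the key
    )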


  [1]: https://stackoverflow.com/questions/40336918/how-to-write-a-file-or-data-to-an-s3-object-using-boto3