How to upload a file to a directory in an S3 bucket using boto

I want to copy a file into an S3 bucket using Python.

For example: I have a bucket named "testing", and inside the bucket I have two folders named "dump" and "input". Now I want to copy a file from a local directory into the S3 "dump" folder using Python. Can anyone help me?
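Some context that may help (not from the answers below): S3 has no real directories; a "folder" such as dump is just a prefix of the object key, so "copying into a folder" only means building a key that starts with dump/. A minimal sketch of that join, using posixpath so the separator is always '/' (the folder and file names here are placeholders):

```python
import posixpath

def s3_key_for(folder, filename):
    # S3 "folders" are key prefixes, so joining with '/' is all
    # that is needed; posixpath keeps '/' even on Windows.
    return posixpath.join(folder, filename)

print(s3_key_for('dump', 'report.csv'))  # dump/report.csv
```

Whatever key you pass to the upload call below, this is all the "folder" amounts to.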

Try this…

```python
import boto
import boto.s3
import sys
from boto.s3.key import Key

AWS_ACCESS_KEY_ID = ''
AWS_SECRET_ACCESS_KEY = ''

bucket_name = AWS_ACCESS_KEY_ID.lower() + '-dump'
conn = boto.connect_s3(AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY)

bucket = conn.create_bucket(bucket_name,
    location=boto.s3.connection.Location.DEFAULT)

testfile = "replace this with an actual filename"
print 'Uploading %s to Amazon S3 bucket %s' % (testfile, bucket_name)

def percent_cb(complete, total):
    sys.stdout.write('.')
    sys.stdout.flush()

k = Key(bucket)
k.key = 'my test file'
k.set_contents_from_filename(testfile, cb=percent_cb, num_cb=10)
```
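The percent_cb above prints one dot per callback. If you want an actual percentage instead, a small variant is possible; this is a sketch, assuming only the (complete, total) byte counts that boto passes to progress callbacks:

```python
def percent_done(complete, total):
    # boto calls progress callbacks with the bytes transferred so far
    # and the total byte count; guard against a zero total.
    if total == 0:
        return 100
    return int(complete * 100 / total)

print(percent_done(512, 2048))  # 25
```

Passing e.g. `cb=lambda c, t: sys.stdout.write('%d%% ' % percent_done(c, t))` to set_contents_from_filename would then print percentages instead of dots.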

[Update] I'm not a Pythonista, so thanks for the heads-up about the import statements. Also, I wouldn't recommend placing credentials in your own source code. If you're running this inside AWS, use IAM credentials with instance profiles ( http://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_use_switch-role-ec2_instance-profiles.html ), and to keep the same behavior in your dev/test environment, use Hologram from AdRoll ( https://github.com/AdRoll/hologram ).

I use this; it's really simple to implement:

```python
import tinys3

conn = tinys3.Connection('S3_ACCESS_KEY', 'S3_SECRET_KEY', tls=True)

f = open('some_file.zip', 'rb')
conn.upload('some_file.zip', f, 'my_bucket')
```

https://www.smore.com/labs/tinys3/

It doesn't need to be that complicated:

```python
import boto
import boto.s3.key

s3_connection = boto.connect_s3()
bucket = s3_connection.get_bucket('your bucket name')
key = boto.s3.key.Key(bucket, 'some_file.zip')
with open('some_file.zip', 'rb') as f:  # open in binary mode
    key.send_file(f)
```

This will also work:

```python
import os
import boto
import boto.s3.connection
from boto.s3.key import Key

try:
    conn = boto.s3.connect_to_region(
        'us-east-1',
        aws_access_key_id='AWS-Access-Key',
        aws_secret_access_key='AWS-Secret-Key',
        # host='s3-website-us-east-1.amazonaws.com',
        # is_secure=True,  # uncomment if you are not using ssl
        calling_format=boto.s3.connection.OrdinaryCallingFormat(),
    )
    bucket = conn.get_bucket('YourBucketName')
    key_name = 'FileToUpload'
    path = 'images/holiday'  # directory under which the file should be uploaded
    full_key_name = os.path.join(path, key_name)
    k = bucket.new_key(full_key_name)
    k.set_contents_from_filename(key_name)
except Exception as e:
    print(str(e))
    print("error")
```
```python
from boto3.s3.transfer import S3Transfer
import boto3

# have all the variables below populated before running
client = boto3.client('s3', aws_access_key_id=access_key,
                      aws_secret_access_key=secret_key)
transfer = S3Transfer(client)
transfer.upload_file(filepath, bucket_name, folder_name + "/" + filename)
```
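A side note on building the key with folder_name + "/" + filename (an observation of mine, not from the answer above): if folder_name already ends with '/', the resulting key contains a double slash, which S3 treats as an extra empty folder level. A small helper that normalizes this, as a sketch:

```python
def make_key(folder_name, filename):
    # Strip any trailing '/' so 'dump' and 'dump/' both produce
    # the key 'dump/<filename>' with a single slash.
    return folder_name.rstrip('/') + '/' + filename

print(make_key('dump/', 'report.csv'))  # dump/report.csv
```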
```python
import boto
from boto.s3.key import Key

AWS_ACCESS_KEY_ID = ''
AWS_SECRET_ACCESS_KEY = ''
END_POINT = ''  # eg. us-east-1
S3_HOST = ''    # eg. s3.us-east-1.amazonaws.com
BUCKET_NAME = 'test'
FILENAME = 'upload.txt'
# include folders in the file path; if they don't exist, they will be created
UPLOADED_FILENAME = 'dumps/upload.txt'

s3 = boto.s3.connect_to_region(END_POINT,
                               aws_access_key_id=AWS_ACCESS_KEY_ID,
                               aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
                               host=S3_HOST)
bucket = s3.get_bucket(BUCKET_NAME)
k = Key(bucket)
k.key = UPLOADED_FILENAME
k.set_contents_from_filename(FILENAME)
```
```python
import boto3

s3 = boto3.resource('s3')
BUCKET = "test"
s3.Bucket(BUCKET).upload_file("your/local/file", "dump/file")
```