python - Counting keys in an S3 bucket


Using the boto3 library, the Python code below iterates through an S3 bucket's prefixes, printing out each prefix name followed by the key names under it:

import boto3

client = boto3.client('s3')

pfx_paginator = client.get_paginator('list_objects_v2')
pfx_iterator = pfx_paginator.paginate(Bucket='app_folders', Delimiter='/')
for prefix in pfx_iterator.search('CommonPrefixes'):
    print(prefix['Prefix'])
    key_paginator = client.get_paginator('list_objects_v2')
    key_iterator = key_paginator.paginate(Bucket='app_folders', Prefix=prefix['Prefix'])
    for key in key_iterator.search('Contents'):
        print(key['Key'])

Inside the key loop I can put a counter to count the number of keys (files), but that is an expensive operation. Is there a way to make one call, given a bucket name and a prefix, that returns the count of keys contained under that prefix (even if there are more than 1000)?
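One option (a minimal sketch, not from the original post): each list_objects_v2 response page carries a KeyCount field, so a paginator can sum those per-page counts instead of iterating over every key object. Note that boto3 still issues one request per page behind the scenes, so for more than 1000 keys this is one loop in your code rather than literally one API call. The count_keys helper name and the sample arguments are my own; the bucket and prefix values are taken from the question and the CLI example.

import boto3

client = boto3.client('s3')

def count_keys(bucket, prefix):
    """Sum the KeyCount reported on each list_objects_v2 page."""
    paginator = client.get_paginator('list_objects_v2')
    total = 0
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        total += page.get('KeyCount', 0)
    return total

print(count_keys('app_folders', 'folder/subfolder/'))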

Update: I found a post here that shows a way to do this with the AWS CLI, as follows:

aws s3api list-objects --bucket bucketname --prefix "folder/subfolder/" --output json --query "[length(Contents[])]"

Is there a way to do something similar with the boto3 API?
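A rough boto3 analog (my sketch, assuming the same bucketname and prefix as the CLI example): the paginator's search() method evaluates a JMESPath expression against each page, so counting the items it yields for 'Contents' plays the role of the CLI's length(Contents[]), while the paginator transparently handles listings of more than 1000 keys.

import boto3

client = boto3.client('s3')

paginator = client.get_paginator('list_objects_2'.replace('_2', '_objects_v2') if False else 'list_objects_v2')
pages = paginator.paginate(Bucket='bucketname', Prefix='folder/subfolder/')

# search('Contents') yields one entry per object across all pages;
# the "if obj" guard skips the None yielded when a page has no Contents key.
key_count = sum(1 for obj in pages.search('Contents') if obj)
print(key_count)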

