
I am trying to access an AWS S3 bucket from an Azure HDInsight cluster VM. I generated new keys and added them to .aws/credentials, and "aws s3 ls" works fine on the Azure VM. But if I run hadoop distcp or read an S3 file in spark-shell, I get a 403 error: "The AWS Access Key Id you provided does not exist in our records." I tried exporting AWS_SESSION_TOKEN, AWS_ACCESS_KEY_ID, and AWS_SECRET_ACCESS_KEY, but no luck. Please help me resolve this issue.

1 Answer


I found a fix for this issue: adding the following properties to core-site.xml.

Property 1:

Name: fs.s3a.aws.credentials.provider

Value: org.apache.hadoop.fs.s3a.TemporaryAWSCredentialsProvider

Property 2:

Name: fs.s3a.access.key

Value: XXXXXXXXX

Property 3:

Name: fs.s3a.secret.key

Value: XXXXXXX

Property 4:

Name: fs.s3a.session.token

Value: XXXXX
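In core-site.xml these four properties take the usual Hadoop configuration form (the XXXX values are placeholders for your own temporary credentials):

```xml
<!-- Use the S3A provider that sends the session token along with the key pair -->
<property>
  <name>fs.s3a.aws.credentials.provider</name>
  <value>org.apache.hadoop.fs.s3a.TemporaryAWSCredentialsProvider</value>
</property>
<property>
  <name>fs.s3a.access.key</name>
  <value>XXXXXXXXX</value>
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <value>XXXXXXX</value>
</property>
<property>
  <name>fs.s3a.session.token</name>
  <value>XXXXX</value>
</property>
```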

Initially I was trying to access S3 from spark-shell without the "fs.s3a.aws.credentials.provider" property. Without that provider the session token is never sent, so AWS does not recognize the temporary access key on its own, which is why it reports that the key "does not exist in our records".
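If you don't want to edit core-site.xml, the same settings can also be passed directly when launching spark-shell, since Spark forwards any `spark.hadoop.*` property into the Hadoop configuration. A sketch (the XXXX values are placeholders):

```shell
spark-shell \
  --conf spark.hadoop.fs.s3a.aws.credentials.provider=org.apache.hadoop.fs.s3a.TemporaryAWSCredentialsProvider \
  --conf spark.hadoop.fs.s3a.access.key=XXXXXXXXX \
  --conf spark.hadoop.fs.s3a.secret.key=XXXXXXX \
  --conf spark.hadoop.fs.s3a.session.token=XXXXX
```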
