
403 Forbidden using STS creds from OB.DAAC /s3credentials with s3fs (HeadObject / ListBucket denied)

Posted: Wed Mar 04, 2026 11:59 pm America/New_York
by benjaminmouldayveolia
Hi all,

I’m trying to access an OB.DAAC Aqua MODIS NRT granule from Google Colab via its direct S3 URL, e.g.:

s3://ob-cumulus-prod-public/AQUA_MODIS.20260303T053500.L2.OC.NRT.nc
creds endpoint: https://obdaac-tea.earthdatacloud.nasa.gov/s3credentials
I can successfully retrieve temporary AWS credentials from /s3credentials (HTTP 200; accessKeyId/secretAccessKey/sessionToken returned).
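For reference, the retrieval step looks roughly like the sketch below (stdlib only; the bearer token is a placeholder, and the function names are mine, not from any SDK):

```python
import json
import urllib.request

# Credentials endpoint from the post; the bearer token must be supplied by you.
CREDS_URL = "https://obdaac-tea.earthdatacloud.nasa.gov/s3credentials"

def creds_request(bearer_token):
    """Build the authenticated request for the TEA credentials endpoint."""
    return urllib.request.Request(
        CREDS_URL, headers={"Authorization": f"Bearer {bearer_token}"}
    )

def fetch_s3_credentials(bearer_token):
    """Fetch temporary STS credentials (accessKeyId / secretAccessKey /
    sessionToken / expiration) using an Earthdata Login bearer token."""
    with urllib.request.urlopen(creds_request(bearer_token)) as resp:
        return json.loads(resp.read())
```

This returns HTTP 200 with the accessKeyId/secretAccessKey/sessionToken fields mentioned above.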

However, when using those STS credentials with boto3/s3fs, I consistently get 403s:

s3fs.open() fails immediately with:
"An error occurred (403) when calling the HeadObject operation: Forbidden"

boto3 list also fails with explicit deny:
"AccessDenied ... not authorized to perform: s3:ListBucket on arn:aws:s3:::ob-cumulus-prod-public with an explicit deny in an identity-based policy"

The assumed role shown in the error is:
arn:aws:sts::<acct-id>:assumed-role/s3-same-region-access-role/<my-username>

Code snippet:

import boto3
import s3fs

# creds is the parsed JSON from the /s3credentials endpoint
fs = s3fs.S3FileSystem(
    key=creds['accessKeyId'],
    secret=creds['secretAccessKey'],
    token=creds['sessionToken'],
    client_kwargs={'region_name': 'us-west-2'},
)

with fs.open('s3://ob-cumulus-prod-public/AQUA_MODIS.20260303T053500.L2.OC.NRT.nc', 'rb') as f:
    data = f.read()

I can download the same granule over HTTPS using an Earthdata bearer token (Authorization: Bearer ...) and open it with netCDF4, so the account and authentication themselves work. Is something wrong with my account's permissions for S3 credentials? Or is there some limitation on ListBucket/HeadObject, and if so, how should I change my code to do direct S3 reads?
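For completeness, the HTTPS path that works looks roughly like this sketch (the token is a placeholder, and the HTTPS URL is my assumption that the TEA endpoint mirrors the bucket path; substitute whatever URL you actually used):

```python
import urllib.request

def bearer_headers(token):
    """Authorization header for an Earthdata Login bearer token."""
    return {"Authorization": f"Bearer {token}"}

# Assumed TEA mapping of the bucket path -- adjust if your URL differs.
url = ("https://obdaac-tea.earthdatacloud.nasa.gov/"
       "ob-cumulus-prod-public/AQUA_MODIS.20260303T053500.L2.OC.NRT.nc")

# Actual download (commented out here; requires a valid token and network):
# req = urllib.request.Request(url, headers=bearer_headers("EDL_TOKEN"))
# with urllib.request.urlopen(req) as resp, open("granule.nc", "wb") as out:
#     out.write(resp.read())
```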

Thank you in advance.