browse.pl
I was pretty sure I'd seen documentation on all of the options available for use with browse.pl, but I haven't been able to formulate a query to the forum that returns useful information.
My goal (in case there's a better way) is to, given a date/time/lat/lon, retrieve all granule *names* that are likely to contain a pixel with data at that location. I don't want to download the data, but want to be able to quickly locate granule names for later download.
- Subject Matter Expert
- Posts: 147
- Joined: Tue Feb 09, 2021 8:19 am America/New_York
browse.pl
Hi Bruce,
There have been many changes (added missions, file-name-convention changes, etc.)
since this forum thread that have caused the suggested wget calls to no longer work.
My old browse code (never intended for such uses) is still limping along while others
work out a design for a new browser that will hopefully be better at this sort of
non-interactive use. In the meantime, here is an updated possibility for getting
the file names you seek. (I assume a bash shell below.)
# Find SNPP/VIIRS scenes likely to include Bigelow Lab on the ides of March this year.
lat=43.86
lon=-69.578
date=2020-03-15
sen=vrsn

# browse.pl takes the day as days since the Unix epoch;
# for date=2020-03-15 this works out to day=18336.
epoch=`date -d $date -u "+%s"`
day=`echo "scale=0;$epoch/86400"|bc -l`

url=https://oceancolor.gsfc.nasa.gov/cgi/browse.pl

# Request the day's scene list for the given point, then scrape out the granule
# names: if the response points at a filenamelist page, fetch that page;
# otherwise print the file name from a direct getfile link.
wget -qO - \
  $url'?sub=level1or2list&sen='$sen'&per=DAY&day='$day'&n='$lat'&w='$lon'&dnm=D&prm=TC' \
  | perl -n -0777 \
    -e 'if(/filenamelist&id=(\d+\.\d+)/){' \
    -e '  print `wget "'$url'?sub=filenamelist&id=$1&prm=TC" -qO -`;' \
    -e '}' \
    -e 'elsif(/\/getfile\/([^"]+)/){' \
    -e '  print "$1\n";' \
    -e '}'
Here is a list of the sensor codes currently understood by the old browser.
hico => HICO(ISS)
swml => SeaWiFS(MLAC)
swga => SeaWiFS(GAC)
amod => Aqua
tmod => Terra
octs => OCTS(ADEOS)
czcs => CZCS(Nimbus-7)
mefr => MERIS(FRS)
merr => MERIS(RR)
vrsn => Suomi-NPP
vrj1 => NOAA-20
s3ar => Sentinel3A(ERR)
s3af => Sentinel3A(EFR)
s3br => Sentinel3B(ERR)
s3bf => Sentinel3B(EFR)
goci => GOCI(COMS)
If you need more than one sensor, string them together with the "@" character
(e.g. sen=vrsn@vrj1 ).
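For example, reusing the lat/lon/day/url variables from the script above, a combined SNPP and NOAA-20 query would look like this (a sketch of the same call, just with the combined sensor code):

# Same point and day as above, but querying both VIIRS instruments at once;
# pipe the result through the same perl one-liner to extract file names.
sen=vrsn@vrj1
wget -qO - \
  $url'?sub=level1or2list&sen='$sen'&per=DAY&day='$day'&n='$lat'&w='$lon'&dnm=D&prm=TC'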
Once again, this is largely unsupported but provided here in case you find it useful.
Regards,
Norman
-
- Posts: 1519
- Joined: Wed Sep 18, 2019 6:15 pm America/New_York
- Been thanked: 9 times
browse.pl
Bruce,
Another method is to use the CMR API
Following on Norman's example:
$ curl "https://cmr.earthdata.nasa.gov/search/granules.csv?provider=OB_DAAC&short_name=VIIRSN_L2_OC&point=-69.578,43.86&temporal\%5b\%5d=2020-03-15T00:00:00Z,2020-03-16T00:00:00Z"
Granule UR,Producer Granule ID,Start Time,End Time,Online Access URLs,Browse URLs,Cloud Cover,Day/Night,Size
2018_VIIRSN_L2_OC_V2020075154200.L2_SNPP_OC.nc,V2020075154200.L2_SNPP_OC.nc,2020-03-15T15:42:01Z,2020-03-15T15:47:58Z,https://oceandata.sci.gsfc.nasa.gov/cmr/getfile/V2020075154200.L2_SNPP_OC.nc,,,DAY,
2018_VIIRSN_L2_OC_V2020075171800.L2_SNPP_OC.nc,V2020075171800.L2_SNPP_OC.nc,2020-03-15T17:18:00Z,2020-03-15T17:23:59Z,https://oceandata.sci.gsfc.nasa.gov/cmr/getfile/V2020075171800.L2_SNPP_OC.nc,,,DAY,
2018_VIIRSN_L2_OC_V2020075172400.L2_SNPP_OC.nc,V2020075172400.L2_SNPP_OC.nc,2020-03-15T17:24:01Z,2020-03-15T17:29:59Z,https://oceandata.sci.gsfc.nasa.gov/cmr/getfile/V2020075172400.L2_SNPP_OC.nc,,,DAY,
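If all you want are the granule names for later download, the CSV can be trimmed down with standard tools, e.g. (my addition, a sketch; column 2 is the Producer Granule ID):

$ curl -s "https://cmr.earthdata.nasa.gov/search/granules.csv?provider=OB_DAAC&short_name=VIIRSN_L2_OC&point=-69.578,43.86&temporal%5B%5D=2020-03-15T00:00:00Z,2020-03-16T00:00:00Z" | tail -n +2 | cut -d, -f2
V2020075154200.L2_SNPP_OC.nc
V2020075171800.L2_SNPP_OC.nc
V2020075172400.L2_SNPP_OC.nc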
The trick is figuring out which "short_name" to use; a list of them can be obtained with various incantations of the API.
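For example, one such incantation (a sketch, not from the original post; the jq filter is my addition and assumes jq is installed):

$ curl -s "https://cmr.earthdata.nasa.gov/search/collections.json?provider=OB_DAAC&page_size=2000" | jq -r '.feed.entry[].short_name' | sort -u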
Not all of our data sets are in CMR yet (e.g. VIIRS L1 data) but we will be adding them. Once the next OC reprocessing is done, we should be fully in CMR.
Another caveat is that some of our L1/2 "collections" were inserted with bounding box geolocation information, so there will be more "false positives" than browse.pl will produce.
Those are being redone with geo polygons (GRings in MODIS parlance) which will be much more accurate - the SST data (MODIS and VIIRS) are already done.
BTW, this API is how we'll be doing it on the backend for the new browser to which Norman alluded.
Sean