Welcome to the Earthdata User Forum! Here, subject matter experts from several NASA Distributed Active Archive Centers (DAACs) discuss general questions, research needs, and data applications. Users can ask how to access, view, and interpret the data.
by alaroy » Thu Oct 14, 2021 5:12 pm America/New_York
Sometime late in the UTC day on 2021-10-10, or early on 2021-10-11, modis_GEO.py mysteriously stopped working (we hadn't touched the system in months). Here's an example of what happens when I try to process almost any data with it:
/marine-services/ocssw/scripts/modis_GEO.py --threshold=10 -e /data/NC/L1/OBPG/FAI/2021-10-10/S20211010140000-EPH-MRT-OBPG.gbad -a /data/NC/L1/OBPG/FAI/2021-10-10/S20211010140000-ATT-MRT-OBPG.gbad -o /data/NC/L1/OBPG/FAI/2021-10-10/S20211010143000-GEO-MRT-OBPG.nc.tmp /tmp/ssscratch/ocssw_GEO-working/S20211010143000-L1A-MRT-OBPG.nc
Searching for first requested record .........
10 EPHEMERIS RECORDS WRITTEN. LAST JED = 2436496.50
20 EPHEMERIS RECORDS WRITTEN. LAST JED = 2436816.50
MODIS GEO version 6.1.0, built Aug 16 2019 12:46:24
scan: 0 out of 203 Wed Oct 13 02:35:41 2021
scan: 10 out of 203 Wed Oct 13 02:35:41 2021
scan: 200 out of 203 Wed Oct 13 02:35:41 2021
Percent valid data (0.00) is less than threshold (10.00)
ERROR: MODIS geolocation processing failed.
Thinking it was related to the letsencrypt issue, I tested it on CentOS 7, RHEL 8, and SUSE (I forget the version); it failed the same way in all three contexts. I also tried re-retrieving ocssw from git twice (which has fixed this issue for us before), but no such luck.
Thank you in advance for any advice you can give us on this!
by alaroy » Thu Oct 14, 2021 10:09 pm America/New_York
OK, this is embarrassing...
Our download script was retrieving HTML error messages instead of gbad files and happily injecting them into the production system as if they were actual gbad files. Needless to say, modis_GEO.py didn't read or process them properly.
Anyway, I'm off to add a trap to the download script to identify and reject HTML files masquerading as gbad files...
by gnwiii » Fri Oct 15, 2021 8:43 am America/New_York
If you are using wget, there is an option to have the file extension reset to match the contents. For bulk downloads, you can use the file search to download a list of files with checksums. At one time I was seeing corrupt downloads at a rate of about 10%, so using checksums was very helpful.
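The checksum comparison described above can be done in a few lines of Python with the standard-library hashlib module. This is just a minimal sketch, assuming the file listing gives you a SHA-256 digest; substitute hashlib.md5 if that's what your source provides:

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Compute the SHA-256 digest of a file, reading in 1 MiB chunks
    so large granules don't have to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_download(path, expected_hex):
    """Return True if the file's digest matches the published checksum."""
    return sha256_of(path) == expected_hex.lower()
```

A failed comparison means the download should be discarded and retried rather than passed on to processing.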
by alaroy » Fri Oct 15, 2021 1:00 pm America/New_York
Good tips, thank you!
At least for now I'm testing that the gbad files are 19968 bytes or more long and start with EOSAM1 or EOSAM2. If I find that's not working, I'll come up with a plan B.
In the case of netcdf/hdf files I have an automated test that ncdump -h doesn't return an error code. In addition to truncated (or otherwise corrupted) downloads, that also traps situations where a bad file on the source system is being transmitted correctly (I've run across that a few times; I can't recall if any of them were OBPG's systems or not).
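Since the rest of the tooling here is Python, the same exit-code test can be wrapped with subprocess. A minimal sketch; the validator command (e.g. ncdump, which must be on PATH) is passed in rather than hard-coded:

```python
import subprocess

def passes_validator(cmd, path):
    """Return True if the validator command exits 0 for the file,
    e.g. passes_validator(["ncdump", "-h"], "granule.nc").
    The tool's own output is discarded; only the exit code matters."""
    result = subprocess.run(
        cmd + [path],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0
```

Because the command is a parameter, the same helper works for any format that has a quick header-reading tool with a meaningful exit status.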