Download script for MODIS/Terra LST Daily 1km has different download URLs from collection granules shown

jldechow
Posts: 4
Joined: Tue Jun 15, 2021 8:14 pm America/New_York
Answers: 0

Download script for MODIS/Terra LST Daily 1km has different download URLs from collection granules shown

by jldechow » Thu Mar 07, 2024 2:27 pm America/New_York

I have been working with the MODIS/Terra Land Surface Temperature/Emissivity Daily L3 Global 1km SIN Grid V061 data product and was trying to download another year's worth of data yesterday. In Earthdata Search I am spatially subsetting and then setting my start and end dates to the correct range (2022/09/01 - 2023/05/31). If I manually download a single granule from my subsetted collection, I get the correct download URL and granule. When I attempt to download the whole collection (519 granules), however, the generated download script has incorrect URLs for all granules. Judging by the granule IDs, the download script seems to be pulling 519 random granules, with no spatial subsetting, from the first year of data availability. I have posted one of the random/incorrect URLs below, followed by a correct URL. I have tried deleting my download history and regenerating the collection and download script, but the new download script has the same incorrect URLs.

My study area is spanned by two MODIS tiles (h10v04 and h09v04). I have not downloaded any MODIS data in ~6 months, but back then I was able to successfully download and work with the data for WY21 and WY22. Unfortunately I need too many granules to download them manually. Does anyone have any insight/suggestions on this matter? Thanks in advance.

Incorrect:
https://data.lpdaac.earthdatacloud.nasa.gov/lp-prod-protected/MOD11A1.061/MOD11A1.A2000113.h21v07.061.2020044204225/MOD11A1.A2000113.h21v07.061.2020044204225.hdf

Correct:
https://data.lpdaac.earthdatacloud.nasa.gov/lp-prod-protected/MOD11A1.061/MOD11A1.A2023151.h10v04.061.2023153210450/MOD11A1.A2023151.h10v04.061.2023153210450.hdf
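The mismatch can also be checked programmatically: the tile and acquisition date are encoded in each granule file name (MOD11A1.AYYYYDDD.hXXvYY....hdf). The small shell filter below is only a sketch for diagnosing a generated URL list, not part of the Earthdata script; the tile names and date range come from this post.

```shell
#!/bin/sh
# Keep only URLs whose granule name matches the requested tiles
# (h10v04 / h09v04) and date range (2022-09-01 .. 2023-05-31).
# MODIS file names encode the acquisition date as AYYYYDDD (day of year),
# so YYYYDDD compares correctly as a plain integer.
start=2022244   # 2022-09-01 is day 244 of 2022
end=2023151     # 2023-05-31 is day 151 of 2023

filter_urls() {
    while read -r url; do
        file="${url##*/}"                      # e.g. MOD11A1.A2023151.h10v04.061....hdf
        adate="${file#*.A}"                    # strip leading "MOD11A1.A"
        adate="${adate%%.*}"                   # keep "2023151"
        tile="$(echo "$file" | cut -d. -f3)"   # keep "h10v04"
        case "$tile" in
            h10v04|h09v04)
                if [ "$adate" -ge "$start" ] && [ "$adate" -le "$end" ]; then
                    echo "$url"
                fi
                ;;
        esac
    done
}
```

Piping the generated script's URL list through `filter_urls` would show how many of the 519 URLs actually match the subset (for the list described above, apparently none).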


LP DAAC - lien
User Services
Posts: 188
Joined: Thu Jun 25, 2020 9:51 am America/New_York
Answers: 1
Been thanked: 6 times

Re: Download script for MODIS/Terra LST Daily 1km has different download URLs from collection granules shown

by LP DAAC - lien » Thu Mar 07, 2024 2:47 pm America/New_York

Hello,
Could you send the search parameters you are using and also copy and paste the script? We would like to emulate exactly what you are doing.
Thanks,
Brett

jldechow
Posts: 4
Joined: Tue Jun 15, 2021 8:14 pm America/New_York
Answers: 0

Re: Download script for MODIS/Terra LST Daily 1km has different download URLs from collection granules shown

by jldechow » Thu Mar 07, 2024 2:59 pm America/New_York

Hi Brett,

For my spatial subset I am using a rectangle in WA state east of the cascades. Coords are
SW: 46.7048,-121.86914
NE: 48.77902,-117.68555

Temporal subset
Start: 2022-09-01 00:00:00
End: 2023-05-31 23:59:59

Data Product: MODIS/Terra Land Surface Temperature/Emissivity Daily L3 Global 1km SIN Grid V061

Total result: 519 granules, all h10v04 and h09v04 on the MODIS SIN grid. Download size is 2.2 GB (actual download was 1.4 GB I believe). I will post the script as a separate reply.
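As a cross-check, the same subset can be expressed as a direct CMR granule search, independent of Earthdata Search's script generator. This is a sketch: the parameters simply mirror the subset above, with `bounding_box` given as west,south,east,north.

```shell
#!/bin/sh
# Build a CMR granule-search URL with the same product, date range, and
# bounding box (west,south,east,north) used in Earthdata Search.
base="https://cmr.earthdata.nasa.gov/search/granules.json"
query="short_name=MOD11A1&version=061&page_size=2000"
query="$query&temporal=2022-09-01T00:00:00Z,2023-05-31T23:59:59Z"
query="$query&bounding_box=-121.86914,46.7048,-117.68555,48.77902"
echo "$base?$query"
# e.g. curl -s "$base?$query" to inspect which granule IDs CMR itself returns
```

If CMR returns the expected h10v04/h09v04 granules for these dates, the problem is isolated to the script generation step rather than the search itself.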

jldechow
Posts: 4
Joined: Tue Jun 15, 2021 8:14 pm America/New_York
Answers: 0

Re: Download script for MODIS/Terra LST Daily 1km has different download URLs from collection granules shown

by jldechow » Thu Mar 07, 2024 3:00 pm America/New_York

Script up until the first lines of fetch_urls (hitting the character limit per post)

#!/bin/bash

GREP_OPTIONS=''

cookiejar=$(mktemp cookies.XXXXXXXXXX)
netrc=$(mktemp netrc.XXXXXXXXXX)
chmod 0600 "$cookiejar" "$netrc"
function finish {
  rm -rf "$cookiejar" "$netrc"
}

trap finish EXIT
WGETRC="$wgetrc"

prompt_credentials() {
  echo "Enter your Earthdata Login or other provider supplied credentials"
  read -p "Username (jldechow): " username
  username=${username:-jldechow}
  read -s -p "Password: " password
  echo "machine urs.earthdata.nasa.gov login $username password $password" >> $netrc
  echo
}

exit_with_error() {
  echo
  echo "Unable to Retrieve Data"
  echo
  echo "$1"
  echo
  echo "https://data.lpdaac.earthdatacloud.nasa.gov/lp-prod-protected/MOD11A1.061/MOD11A1.A2000055.h12v10.061.2020043121102/MOD11A1.A2000055.h12v10.061.2020043121102.hdf"
  echo
  exit 1
}

prompt_credentials
detect_app_approval() {
  approved=`curl -s -b "$cookiejar" -c "$cookiejar" -L --max-redirs 5 --netrc-file "$netrc" https://data.lpdaac.earthdatacloud.nasa.gov/lp-prod-protected/MOD11A1.061/MOD11A1.A2000055.h12v10.061.2020043121102/MOD11A1.A2000055.h12v10.061.2020043121102.hdf -w '\n%{http_code}' | tail -1`
  if [ "$approved" -ne "200" ] && [ "$approved" -ne "301" ] && [ "$approved" -ne "302" ]; then
    # User didn't approve the app. Direct users to approve the app in URS
    exit_with_error "Please ensure that you have authorized the remote application by visiting the link below "
  fi
}

setup_auth_curl() {
  # First, check whether the URL requires URS authentication
  status=$(curl -s -z "$(date)" -w '\n%{http_code}' https://data.lpdaac.earthdatacloud.nasa.gov/lp-prod-protected/MOD11A1.061/MOD11A1.A2000055.h12v10.061.2020043121102/MOD11A1.A2000055.h12v10.061.2020043121102.hdf | tail -1)
  if [[ "$status" -ne "200" && "$status" -ne "304" ]]; then
    # URS authentication is required. Now check whether the application/remote service is approved.
    detect_app_approval
  fi
}

setup_auth_wget() {
  # The safest way to authenticate with wget is netrc. Note: there is no check or
  # feedback if login is unsuccessful
  touch ~/.netrc
  chmod 0600 ~/.netrc
  credentials=$(grep 'machine urs.earthdata.nasa.gov' ~/.netrc)
  if [ -z "$credentials" ]; then
    cat "$netrc" >> ~/.netrc
  fi
}

fetch_urls() {
  if command -v curl >/dev/null 2>&1; then
    setup_auth_curl
    while read -r line; do
      # Get everything after the last '/'
      filename="${line##*/}"

      # Strip everything after '?'
      stripped_query_params="${filename%%\?*}"

      curl -f -b "$cookiejar" -c "$cookiejar" -L --netrc-file "$netrc" -g -o "$stripped_query_params" -- "$line" && echo || exit_with_error "Command failed with error. Please retrieve the data manually."
    done;
  elif command -v wget >/dev/null 2>&1; then
    # wget cannot probe the provider server for URS integration without
    # downloading at least one of the files.
    echo
    echo "WARNING: Can't find curl, use wget instead."
    echo "WARNING: Script may not correctly identify Earthdata Login integrations."
    echo
    setup_auth_wget
    while read -r line; do
      # Get everything after the last '/'
      filename="${line##*/}"

      # Strip everything after '?'
      stripped_query_params="${filename%%\?*}"

      wget --load-cookies "$cookiejar" --save-cookies "$cookiejar" --output-document "$stripped_query_params" --keep-session-cookies -- "$line" && echo || exit_with_error "Command failed with error. Please retrieve the data manually."
    done;
  else
    exit_with_error "Error: Could not find a command-line downloader. Please install curl or wget"
  fi
}

fetch_urls <<'EDSCEOF'
https://data.lpdaac.earthdatacloud.nasa.gov/lp-prod-protected/MOD11A1.061/MOD11A1.A2000055.h12v10.061.2020043121102/MOD11A1.A2000055.h12v10.061.2020043121102.hdf
https://data.lpdaac.earthdatacloud.nasa.gov/lp-prod-protected/MOD11A1.061/MOD11A1.A2000055.h08v05.061.2020043120932/MOD11A1.A2000055.h08v05.061.2020043120932.hdf
https://data.lpdaac.earthdatacloud.nasa.gov/lp-prod-protected/MOD11A1.061/MOD11A1.A2000055.h33v08.061.2020043121151/MOD11A1.A2000055.h33v08.061.2020043121151.hdf
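All of the URLs in the list share one path layout (product directory, then a directory named after the granule, then the granule file), so a corrected URL list could be rebuilt from known-good granule file names. A minimal sketch, assuming only the layout visible in the URLs above:

```shell
#!/bin/sh
# Rebuild an lp-prod-protected download URL from a MOD11A1 granule file
# name, following the directory layout seen in the script's URL list:
#   <base>/<granule-name-without-.hdf>/<granule-name>.hdf
granule_url() {
    base="https://data.lpdaac.earthdatacloud.nasa.gov/lp-prod-protected/MOD11A1.061"
    echo "$base/${1%.hdf}/$1"
}
# Example: granule_url MOD11A1.A2023151.h10v04.061.2023153210450.hdf
```

Feeding such a corrected list into the script's `fetch_urls` (in place of the generated heredoc) would be a possible workaround until the generator is fixed.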

LP DAAC - lien
User Services
Posts: 188
Joined: Thu Jun 25, 2020 9:51 am America/New_York
Answers: 1
Been thanked: 6 times

Re: Download script for MODIS/Terra LST Daily 1km has different download URLs from collection granules shown

by LP DAAC - lien » Tue Mar 12, 2024 10:13 am America/New_York

Hi,
May I point you in a slightly different direction? We have seen this issue with Earthdata Search before when users try to subset. We are the LP DAAC, so the data comes from us, but Earthdata Search is out of our control. Could you try our AppEEARS tool: https://appeears.earthdatacloud.nasa.gov

This lets you spatially subset your exact area of interest, so you wouldn't need to download all the extra data that comes with the complete tiles when you only need a small area. You can also reproject and reformat the data, and it creates charts and a .csv file along with the raster images. Please let me know if you have any questions.
Thanks,
Brett
