Welcome to the Earthdata Forum! Here, the scientific user community, subject matter experts from NASA Distributed Active Archive Centers (DAACs), and other contributors discuss research needs, data, and data applications.
by oceani » Tue Dec 03, 2019 2:30 pm America/New_York
I am in the process of modifying our scripts that download our subscription data from your server using the new file naming convention. All is working well, except for one subscription number (1385). The command I am using to download the list of available files for subscription 1385 to a file (muamf1) is:
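The command itself did not come through in the post. For reference, a representative file-search request against the endpoint shown in the error output might look like the sketch below. The parameter names (`subID`, `results_as_file`) are assumptions for illustration, not taken from the original post; only the subscription number (1385) and output file name (muamf1) come from the thread.

```shell
#!/bin/sh
# Hypothetical reconstruction -- parameter names are assumed, not confirmed.
SUB_ID=1385        # subscription number from the post
OUTFILE=muamf1     # output file name from the post

# --post-data makes wget issue an HTTP POST with this body.
POST_DATA="subID=${SUB_ID}&results_as_file=1"

CMD="wget --post-data=${POST_DATA} -O ${OUTFILE} https://oceandata.sci.gsfc.nasa.gov/search/file_search.cgi"

# Print the command rather than running it, so the sketch stays offline.
echo "$CMD"
```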
For this ID only I am getting the following error:

--2019-12-03 12:16:25--  https://oceandata.sci.gsfc.nasa.gov/search/file_search.cgi
Resolving oceandata.sci.gsfc.nasa.gov... xx.xxx.xx.xx
Connecting to oceandata.sci.gsfc.nasa.gov|xx.xxx.xx.xx|:443... connected.
HTTP request sent, awaiting response... 403 Forbidden
2019-12-03 12:16:25 ERROR 403: Forbidden.
My other subscription IDs work fine using the same wget command string. Please let me know what I may be doing wrong in this case.
by oceani » Tue Dec 03, 2019 4:05 pm America/New_York
Yep. Tried about 6-7 times. Just tried again a few minutes ago. That is quite strange. The exact same call works (so far ... still updating) for all of my other subscriptions. Could the folder containing those subscription files have different permissions than the rest? Perhaps I should remove that subscription and create a new one?
Thanks for the look. Let me know if you have any other suggestions.
I, too, just tried again from another machine on a completely different network...still works.
There is no folder for the non-extracted subscription. In fact, if there are 30 files in your retrieval, they could reside on 30 different machines. There is no need to recreate the subscription; it would have no effect on the API other than giving you a different subscription ID to pull.
All I can suggest is that perhaps the command you put in the forum post is not the command that's being executed.
by oceani » Tue Dec 03, 2019 6:35 pm America/New_York
OK. I actually copied the command from my script file and pasted it into the post. I also copied the wget command from a script that is working and just swapped out the subscription ID number. Same result. I'll see what I can figure out. I'll try from a different machine as well.
My version of wget is GNU Wget 1.17.1, built on linux-gnu.
I was also wondering if this subscription is Nth in a script of successive calls, where N is somehow exceeding a frequency limit. That's probably not it, as I assume you've tried just the one subscription URL outside of the script.
I can see the 403s in the logs for your IP (well, IPs in the same last octet), but nothing to indicate why. Are you still seeing issues? Have you tried John's suggestion of using the "GET" method instead of "POST"?
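For anyone following along, the difference between the two methods mentioned above is only where the parameters travel: POST puts them in the request body, GET puts them in the URL's query string. A side-by-side sketch, using the same hypothetical parameter names as above (assumed for illustration, not confirmed by the thread):

```shell
#!/bin/sh
# Sketch of the two request styles; parameter names are assumptions.
BASE="https://oceandata.sci.gsfc.nasa.gov/search/file_search.cgi"
PARAMS="subID=1385&results_as_file=1"

# POST: wget sends the parameters in the HTTP request body.
POST_CMD="wget --post-data=${PARAMS} -O muamf1 ${BASE}"

# GET: the same parameters ride in the URL's query string instead.
GET_CMD="wget -O muamf1 \"${BASE}?${PARAMS}\""

echo "POST: $POST_CMD"
echo "GET:  $GET_CMD"
```

Switching methods can matter when a proxy, firewall, or server-side rule treats POST bodies differently from query strings, which is one plausible (unconfirmed) reason a GET retry was suggested for the 403.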