wget --post-data="subID=1385&results_as_file=1" -O - https://oceandata.sci.gsfc.nasa.gov/api/file_search > /users/staff/nasamodissst/munetcdf/muamf1
For this ID only, I am getting the following error:
--2019-12-03 12:16:25-- https://oceandata.sci.gsfc.nasa.gov/search/file_search.cgi
Resolving oceandata.sci.gsfc.nasa.gov... xx.xxx.xx.xx
Connecting to oceandata.sci.gsfc.nasa.gov|xx.xxx.xx.xx|:443... connected.
HTTP request sent, awaiting response... 403 Forbidden
2019-12-03 12:16:25 ERROR 403: Forbidden.
My other subscription IDs work fine using the same wget command string. Please let me know what I may be doing wrong in this case.
Thanks for the look. Let me know if you have any other suggestions.
I, too, just tried again from another machine on a completely different network...still works.
There is no folder for the non-extracted subscription. In fact, if there are 30 files in your retrieval, they could reside on 30 different machines.
No need to recreate the subscription; it would have no effect on the API, other than giving you a different subscription ID to pull.
All I can suggest is that perhaps the command you put in the forum post is not the command that's being executed.
- Subject Matter Expert
What version of wget do you have, and do you get different results with the GET method? Note that quotes around the URL become necessary to protect the special characters.
wget -O - "https://oceandata.sci.gsfc.nasa.gov/api/file_search?subID=1385&results_as_file=1"
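As an aside on why those quotes matter: in an unquoted URL, the shell treats `&` as a control operator that backgrounds the command, so wget would never see `results_as_file=1`. A minimal sketch (using subscription ID 1385 from this thread; no network request is made here):

```shell
#!/bin/sh
# Quoting keeps the '&' inside the URL instead of splitting the command.
url='https://oceandata.sci.gsfc.nasa.gov/api/file_search?subID=1385&results_as_file=1'

# GET form (quoted URL):
#   wget -O - "$url"
# Equivalent POST form, as used earlier in the thread:
#   wget --post-data="subID=1385&results_as_file=1" -O - https://oceandata.sci.gsfc.nasa.gov/api/file_search

echo "$url"
```

Without the quotes, `subID=1385` would be the whole query string and `results_as_file=1` would run as a separate (backgrounded) command.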
My version of wget is GNU Wget 1.17.1, built on linux-gnu.
I was also wondering if this subscription is the Nth in a script of successive calls, where N is somehow exceeding a frequency limit. That's probably not it, as I assume you've tried just the one subscription URL outside of the script.
Are you still seeing issues?
Have you tried John's suggestion of using the "GET" method instead of "POST"?