Crafting URLs to download data? the page automatically blocks me

Use this Forum to find information on, or ask a question about, NASA Earth Science data.
marcsandoval
Posts: 15
Joined: Wed Sep 19, 2018 11:22 am America/New_York
Answers: 0

Crafting URLs to download data? the page automatically blocks me

by marcsandoval » Mon Jan 14, 2019 10:40 am America/New_York

Hello Paul,

Yes, I get that; that is why I attached my script, to see if anyone can spot a problem with it.
It is strange, because I had been working with this script without problems since November, but now it takes very long to load with "wget" (and sometimes it does not load anything at all), yet I can access and download from "http://oceancolor.gsfc.nasa.gov/cgi/browse.pl" with my browser without issue.
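For context, the kind of call being discussed looks roughly like the sketch below. This is an illustration only: the filename is the one mentioned later in the thread, and the getfile base URL is an assumption about the service layout; the real script may also need Earthdata login credentials.

```shell
# Sketch of a getfile download via wget (base URL layout is an assumption).
BASE="https://oceandata.sci.gsfc.nasa.gov/ob/getfile"
FILE="A2009001192000.L2_LAC_OC.nc"
URL="${BASE}/${FILE}"
echo "${URL}"
# Uncomment to actually download (may require Earthdata login cookies):
# wget --tries=3 --timeout=30 -O "${FILE}" "${URL}"
```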

regards!

Tags:

OB ODPS - jgwilding
Subject Matter Expert
Subject Matter Expert
Posts: 139
Joined: Fri Feb 19, 2021 1:09 pm America/New_York
Answers: 0
Been thanked: 1 time

Crafting URLs to download data? the page automatically blocks me

by OB ODPS - jgwilding » Mon Jan 14, 2019 11:14 am America/New_York

Hi Marco,

I ran your script on my system, and it downloaded the file A2009001192000.L2_LAC_OC.nc, so I don't think you have any syntactical errors in the script. I did change the URL for browse.pl to https, however. Your URL to the getfile service already has https. Not having https in the browse.pl URL did not prevent the script from working for me.

Are you trying to set something up to acquire data going forward, something that a subscription might support, or are you trying to obtain older data from our archive, something that a data order might support?

john

marcsandoval
Posts: 15
Joined: Wed Sep 19, 2018 11:22 am America/New_York
Answers: 0

Crafting URLs to download data? the page automatically blocks me

by marcsandoval » Mon Jan 14, 2019 3:27 pm America/New_York

Hello John,

It is strange, because when I wrote that script, I found that without the "s" my script worked and with the "s" it didn't... but now it is the opposite!
So, it is working now,

thank you !

healthhakim
Posts: 1
Joined: Mon Feb 04, 2019 8:47 pm America/New_York
Answers: 0

Crafting URLs to download data? the page automatically blocks me

by healthhakim » Tue Feb 05, 2019 3:29 pm America/New_York

Maybe your group could consider something like the Blended Sea Winds product, with different ways to subset the data (https://www.ncei.noaa.gov/thredds/catalog/uv/daily_agg/catalog.html?dataset=uv/daily_agg/Aggregation_of_Daily_Ocean_Wind_best.ncd), or something like the marine.copernicus.eu web page with the Python package called "motuclient (GitHub Link)".

Thank you,

Amol Patel
National Institute of Ocean Technology
Health Hakim

marcsandoval
Posts: 15
Joined: Wed Sep 19, 2018 11:22 am America/New_York
Answers: 0

Crafting URLs to download data? the page automatically blocks me

by marcsandoval » Thu Apr 11, 2019 5:07 pm America/New_York

Hello! It is me again.

I'm sorry, but I'm confused about something.
I changed my script and now it downloads files from Aqua, Terra, and VIIRS, and I ran a test against some data I ordered before from the Level 1&2 Browser.
My script works fine, but I noticed that it downloads fewer (and some different) files than my order. In fact, it is downloading:

A2014020193500.L2_LAC_SST.nc
T2014020152000.L2_LAC_SST.nc
V2014020185400.L2_SNPP_SST.nc
V2014020190000.L2_SNPP_SST.nc

And my order gave me these files:

A2014020175500.L2_LAC_SST.nc
A2014020193000.L2_LAC_SST.nc
A2014020193500.L2_LAC_SST.nc
T2014020134500.L2_LAC_SST.nc
T2014020152000.L2_LAC_SST.nc
T2014020152500.L2_LAC_SST.nc
V2014020185400.L2_SNPP_SST.nc
V2014020203000.L2_SNPP_SST.nc
V2014020203600.L2_SNPP_SST.nc
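One quick way to see which ordered granules the script missed is to diff the two sorted name lists. A self-contained sketch using the file names above (abbreviated to the Aqua granules on the order side); `comm -23` prints names present in the order but absent from the download:

```shell
# Files the script actually downloaded
printf '%s\n' \
  A2014020193500.L2_LAC_SST.nc \
  T2014020152000.L2_LAC_SST.nc \
  V2014020185400.L2_SNPP_SST.nc \
  V2014020190000.L2_SNPP_SST.nc > downloaded.txt
# Files the order contained (Aqua granules only, for brevity)
printf '%s\n' \
  A2014020175500.L2_LAC_SST.nc \
  A2014020193000.L2_LAC_SST.nc \
  A2014020193500.L2_LAC_SST.nc > ordered.txt
# comm requires sorted input
sort ordered.txt > ordered.sorted
sort downloaded.txt > downloaded.sorted
# Granules in the order that were never downloaded:
comm -23 ordered.sorted downloaded.sorted
```

Running `comm -13` on the same pair would list the opposite case: downloads that were not part of the order.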

I want to know why this is happening and how to prevent it.

Thanks!

OB ODPS - jgwilding
Subject Matter Expert
Subject Matter Expert
Posts: 139
Joined: Fri Feb 19, 2021 1:09 pm America/New_York
Answers: 0
Been thanked: 1 time

Crafting URLs to download data? the page automatically blocks me

by OB ODPS - jgwilding » Thu Apr 11, 2019 5:56 pm America/New_York

You should be downloading files named like
requested_files_N.tar

where N is a number between 1 and 161. The tar files contain the extracted data for your order. When you extract the tar files, you should find files like these. The "x" in the file name indicates that it is an extracted file.

% tar -tf requested_files_39.tar
requested_files/A2013306201500.L2_LAC_SST.x.nc
requested_files/A2013308200000.L2_LAC_SST.x.nc
requested_files/A2013308200500.L2_LAC_SST.x.nc
requested_files/A2013310181000.L2_LAC_SST.x.nc
requested_files/A2013308182500.L2_LAC_SST.x.nc
requested_files/A2013309191000.L2_LAC_SST.x.nc
requested_files/A2013312194000.L2_LAC_SST.x.nc
requested_files/A2013283183000.L2_LAC_SST.x.nc
requested_files/A2013283183500.L2_LAC_SST.x.nc
requested_files/A2013310195500.L2_LAC_SST.x.nc
requested_files/A2013310181500.L2_LAC_SST.x.nc
requested_files/A2013310195000.L2_LAC_SST.x.nc
requested_files/A2013313202000.L2_LAC_SST.x.nc
requested_files/A2013312180500.L2_LAC_SST.x.nc
requested_files/A2013314192500.L2_LAC_SST.x.nc
requested_files/A2013312180000.L2_LAC_SST.x.nc
requested_files/A2013311190000.L2_LAC_SST.x.nc
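To unpack archives like the one listed above, a self-contained sketch (it fabricates a tiny stand-in tarball so the loop has something to extract; real order tarballs come from the order's download area):

```shell
# Build a stand-in archive shaped like an order tarball (demo only).
mkdir -p requested_files
echo demo > requested_files/A2013306201500.L2_LAC_SST.x.nc
tar -cf requested_files_1.tar requested_files
rm -r requested_files
# Extract every order tarball in the current directory.
for t in requested_files_*.tar; do
  [ -e "$t" ] || continue   # skip if the glob matched nothing
  tar -xf "$t"              # unpacks into requested_files/
done
ls requested_files
```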

Is this what you are doing? I'm assuming this is for order 807060a0c4a9cf58.

john

marcsandoval
Posts: 15
Joined: Wed Sep 19, 2018 11:22 am America/New_York
Answers: 0

Crafting URLs to download data? the page automatically blocks me

by marcsandoval » Fri Apr 12, 2019 10:14 am America/New_York

Hello John,

Thanks for your response.
I was talking about the script mentioned above, which automatically downloads L2 files without going through the L1&L2 Browser. I just compared one request I made with the results of this script and they did not match... but I found the problem, so now it is working!

Best regards,

Marco

Post Reply