
Posted: Mon Jun 01, 2020 11:01 pm America/New_York
by avmehta
   I have been trying to run OCSSW on the NCCS/ADAPT system. I had help installing SeaDAS and OCSSW on the system. When I try to run it, I get the following messages - it seems that some database is missing. This does not happen when I use it on my Linux server. I'd appreciate your help.
Traceback (most recent call last):
  File "/usr/local/cb/ocssw/scripts/", line 196, in <module>
  File "/usr/local/cb/ocssw/scripts/", line 189, in main
  File "/usr/local/cb/ocssw/scripts/modules/", line 491, in findweb
  File "/usr/local/cb/ocssw/scripts/modules/", line 18, in openDB
    conn = sqlite3.connect(self.dbfile, timeout=30)
sqlite3.OperationalError: unable to open database file

Posted: Tue Jun 02, 2020 9:02 am America/New_York
by OB.DAAC - SeanBailey
The script creates the SQLite database file.  This error indicates that it cannot do so.  Check that the user running the script has write permission for the file.  The default location is $OCSSWROOT/var/ancillary_data.db
The path in the error message suggests the installation was done as root in a system-wide location.  That may explain the permission issue.
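A quick way to check this from the account that runs the script (a sketch; it assumes $OCSSWROOT is set in your environment and uses the default database location mentioned above):

```shell
# Show ownership and permissions of the directory that must hold the database.
ls -ld "$OCSSWROOT/var"

# Try to create the database file the ancillary script would create.
# touch is harmless here: it creates an empty file or updates a timestamp.
if touch "$OCSSWROOT/var/ancillary_data.db" 2>/dev/null; then
    echo "writable: the script can create its database here"
else
    echo "no write permission: ask the admin to chown/chmod $OCSSWROOT/var"
fi
```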


Posted: Tue Jun 02, 2020 9:25 am America/New_York
by avmehta
Thank you Sean. Will check about the write permission.

Posted: Tue Jun 02, 2020 2:41 pm America/New_York
by avmehta
Hi Sean,
   We were able to resolve this and execute! Thanks a lot for your help.

    I am having a problem getting OLCI Level-1 data. I select dates/region in the L1&L2 Browser, then order the data and get a list of URLs for OLCI L1 zip files. These are supposed to be directories. When I use wget to download these files, I do get the files, but when I try to unzip them I get the following message:

  End-of-central-directory signature not found.  Either this file is not
  a zipfile, or it constitutes one disk of a multi-part archive.  In the
  latter case the central directory and zipfile comment will be found on
  the last disk(s) of this archive.
unzip:  cannot find zipfile directory in one of or, and cannot find, period.


Posted: Tue Jun 02, 2020 4:52 pm America/New_York
by gnwiii
It would be good to start a new topic for the new problem.  

When downloaded files don't behave, it helps to provide as much detail as possible about the state of your misbehaving file.  The Linux "file" command should confirm or deny that the downloaded file is a zip archive, and "ls -l" could also help.

There are a couple of ways a downloaded file can get replaced by an HTML document.  If the "file" command says you have an HTML file, you should be able to rename it to replace ".zip" with ".html" and open it in a browser.  The two cases I have encountered are 1) a message from the IT people saying your file was blocked by some policy, or 2) a login page, which may mean your downloader doesn't handle the EOSDIS single sign-on system, or just that EOSDIS was having a bad day.
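For example, here is what that failure mode looks like (a sketch; the filename is hypothetical, simulating an HTML page saved under a ".zip" name):

```shell
# Simulate the failure mode: an HTML login/error page saved as ".zip".
printf '<html><body>Login required</body></html>' > sample_download.zip

# "file" reads the magic bytes, so it reports "HTML document",
# not "Zip archive", regardless of the file extension.
file sample_download.zip

# "ls -l" shows a size far too small for an OLCI Level-1 scene.
ls -l sample_download.zip

# If it is HTML, rename it and open it in a browser to read the message.
mv sample_download.zip sample_download.html
```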

Posted: Tue Jun 02, 2020 5:32 pm America/New_York
by avmehta
Thanks George for the reply.

  - I have been having this problem for several days, so it may not be EOSDIS

  - Also, I download other data from EOSDIS (Earthdata/GES DISC) with wget without any problem.  But these are not simple files - they are directories, as I mentioned in my previous post. I see these directories on  When I order data through the L1&L2 Browser, I get a list of URLs - the names point to particular swaths (directories) from this L1 collection.  The list of URLs when I order data looks as shown below. But if I type in my browser (which is on the URL list), I get:
OceanColor Biology Processing Group (OBPG)
Sorry, an error has occurred. Use the back button to return to the previous page or go to the Ocean Color Home Page.

Not sure how to get to those swaths and do a bulk download.
Thanks again.

List of URLs from the L1&L2 Data Browser/selector


Posted: Wed Jun 03, 2020 12:01 pm America/New_York
by OB.DAAC - SeanBailey

The data access page and the L1/L2 browser point to the same files.  While the unzipped OLCI "files" are in fact directories, the zip file is what you download - and that is, well, a file.
The getfile interface requires the file to be retrieved, so yes, by itself it will result in an error.

The "sentinel" bit in the URL tells getfile that it expects the user to have accepted the EULA for Sentinel data access in their Earthdata Login profile.  If that is not true, the download will fail (well, you'll download the login page as an html file).  You can verify your EULA acceptance by checking your profile at (on the profile summary page, but also under Applications->Accepted EULAs).  If you don't see the Sentinel EULA as accepted, you can edit your profile and accept it (select Edit Profile and scroll down to the "Agreements" section).

As for bulk downloading, the page describes various methods, including a Python script that should make it easy...
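One common pattern for bulk-downloading a saved URL list with wget (a sketch, not the official script: "urls.txt" is a hypothetical file holding one getfile URL per line, and Earthdata Login credentials are assumed to be in ~/.netrc):

```shell
# Bulk-download every URL in urls.txt (one URL per line, as saved from
# an L1&L2 browser order).  Assumes ~/.netrc holds Earthdata Login
# credentials, e.g.:
#   machine login YOUR_USER password YOUR_PASSWD
# The cookie file lets wget reuse the single-sign-on session across URLs.
while read -r url; do
    wget --load-cookies ~/.urs_cookies --save-cookies ~/.urs_cookies \
         --auth-no-challenge=on --content-disposition "$url"
done < urls.txt
```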


Posted: Thu Jun 04, 2020 10:02 am America/New_York
by avmehta
Hi Sean,
  Thanks for the clarification.
   I checked on Earthdata and I do have accepted EULA for Sentinel.
   I have looked at the download methods, and that is how I used wget. With any of the methods, I cannot figure out how to specify a geographical region (I don't want global swaths); that is why I am using the L1/L2 Browser to get specific URLs for the region. Is there a way I can download those specific swaths using the Python script?
   Another question I have is about the lonlat2pixline function. I am using the following for OLCI:
   lonlat2pixline -r 300 -x 1 -y 1 xfdumanifest.xml -76.94 36.84 -75.66 38.53
   I do get line/pix for most images when the area is covered. I just don't see any return code! How can I check that?

  Thanks a lot for your help.

Posted: Thu Jun 04, 2020 10:20 am America/New_York
by OB SeaDAS - dshea
In bash, the return code of the most recently executed command is stored in $?

So, to see the return code you can:

echo $?
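For instance, a minimal sketch of acting on the status (0 conventionally means success; the specific nonzero codes lonlat2pixline uses aren't listed here):

```shell
# Run the command, then capture $? immediately - any intervening command
# (even an echo) would overwrite it.
lonlat2pixline -r 300 -x 1 -y 1 xfdumanifest.xml -76.94 36.84 -75.66 38.53
status=$?

if [ "$status" -eq 0 ]; then
    echo "lonlat2pixline succeeded"
else
    echo "lonlat2pixline exited with status $status" >&2
fi
```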


Posted: Thu Jun 04, 2020 10:51 am America/New_York
by avmehta
Thanks Don.