The following example illustrates a scriptable command-line method for downloading all of the files in a directory.
This example assumes the 'wget' utility is installed on the computer performing the download.
wget -r -np -nH -w1 --cut-dirs=2 --reject="index.html*"
A directory named f89ef2fc-a0ed-4430-b158-f46ef32606fd will be created within the directory from which the wget command was executed.
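For reference, here is an annotated restatement of the wget options above. The command is assembled into a string and echoed rather than run, because the directory URL (also not shown in the original command) is only a placeholder here:

```shell
# Annotated restatement of the wget command above:
#
#   -r                      recurse through the directory listing
#   -np                     "no parent": never ascend above the starting directory
#   -nH                     don't create a host-name directory locally
#   -w1                     wait 1 second between requests, to be polite to the server
#   --cut-dirs=2            drop the first two remote path components when saving
#   --reject="index.html*"  don't keep the auto-generated listing pages
#
# The URL below is a placeholder, not the real ASDC directory address.
cmd='wget -r -np -nH -w1 --cut-dirs=2 --reject="index.html*" https://example.invalid/DIR/'
echo "$cmd"
```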
Additionally, there are other options for downloading the CALIPSO data files you are looking for. The ASDC's Direct Data Download application lets you download data files directly via a web browser. Since you are requesting such a large amount
of CALIPSO data, we also have scripts available for performing bulk downloads:
ASDC DDD Application:
Scripts for Downloading Data:
We hope you find this information helpful.
The problem I now have is that my internet keeps dropping after downloading, say, 10-20% of the files. Then I have to start the download over. Is there any way to specify certain files to download? For instance, if the download dropped after getting all files from April 1 to April 19, and if it were FTP, I would say "get CAL_LID_L1-Standard-V4-10.2020-04-2*" to get just the files after the 19th.
Are you attempting to use FTP to download the CALIPSO data files? The FTP protocol for data and information access at NASA’s ASDC DAAC was retired a few years ago. We are now using HTTPS for data transfer.
Unfortunately, there is not much we can do from our end to improve internet speeds or data throughput.
Which method are you currently using to download the data files?
Alternatively, you may find the ASDC's CALIPSO Subsetting tool of use: https://subset.larc.nasa.gov/calipso/ . It allows you to specify data products and parameters (including temporal subsets) for data download.
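If you prefer to stay with wget, its accept list (-A) takes shell-style globs, so the FTP-style selection you describe can be expressed over HTTPS as well. A sketch, again echoing the assembled command because the URL is a placeholder rather than the real ASDC directory address:

```shell
# Shell-style glob matching only the April 20-29 files, as in the FTP example.
pattern='CAL_LID_L1-Standard-V4-10.2020-04-2*'
# -A restricts which files are saved during the recursive crawl.
# The URL is a placeholder, not the real ASDC directory.
cmd="wget -r -np -nH -w1 --cut-dirs=2 -A '$pattern' https://example.invalid/calipso/DIR/"
echo "$cmd"
```

Note that with -A, wget still fetches the listing pages in order to discover links, but files that don't match the accept list are deleted after inspection rather than kept.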
Well, after I asked that question, I read the wget man page, which said a log is kept in the directory the wget command is executed from, containing the status of the download, including the last file successfully downloaded. So when the internet connection is broken and you re-enter the wget command, it knows where it is in the download and starts with the file after the last one successfully downloaded. So even though my internet connection went down more than once during the download process, all I needed to do was re-issue the wget command each time until all the files were downloaded.
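That re-issue step can also be scripted so the command loops automatically until wget exits cleanly. A minimal sketch of the retry loop, with a stub standing in for the real wget invocation so the loop itself can be seen to work:

```shell
# Sketch: keep re-issuing the same download command until it succeeds.
attempts=0
download() {
  attempts=$((attempts + 1))
  # In real use, replace this stub with the wget command from earlier, e.g.:
  #   wget -c -r -np -nH -w1 --cut-dirs=2 --reject="index.html*" <URL>
  # (-c asks the server to resume partially downloaded files.)
  # Here we pretend the connection drops twice, then succeeds:
  [ "$attempts" -ge 3 ]
}
until download; do
  echo "download interrupted, retrying (attempt $attempts)..."
  sleep 1
done
echo "finished after $attempts attempts"
```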
Thanks for your help
Great to hear you were able to successfully download the files!
Please let us know if you have any additional questions or need any further assistance.