
This repo is no longer under development

and is probably broken given the pace of the Google Earth Engine team. I am now the CEO of Earthscope, a startup company from Entrepreneur First, so I have no time to squash any bugs that I introduced (sorry) or that have since appeared...

The rest of the repo is 'as was'; use it at your own peril.


Atmospheric Correction of Sentinel2 and Landsat

Consider using gee-atmcorr-S2 if you are atmospherically correcting a small number of images (e.g. tens). It uses Py6S directly and has a shorter setup time.
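
For comparison, a direct per-image Py6S call of the kind gee-atmcorr-S2 makes looks roughly like the sketch below; the atmospheric values and band wavelength are illustrative placeholders (not repo defaults), and a local 6S executable is required.

# Illustrative sketch of a direct per-image 6S run via Py6S (placeholder values)
from Py6S import SixS, AtmosProfile, AeroProfile, Wavelength

s = SixS()
s.atmos_profile = AtmosProfile.UserWaterAndOzone(2.5, 0.3)  # water vapour (g/cm2), ozone (cm-atm)
s.aero_profile = AeroProfile.PredefinedType(AeroProfile.Continental)
s.aot550 = 0.2                                              # aerosol optical thickness at 550 nm
s.wavelength = Wavelength(0.56)                             # e.g. a green band centre (microns)
s.run()                                                     # roughly 2 seconds per call

# outputs typically used to convert at-sensor radiance to surface reflectance
Edir = s.outputs.direct_solar_irradiance
Edif = s.outputs.diffuse_solar_irradiance
Lp   = s.outputs.atmospheric_intrinsic_radiance
tau2 = s.outputs.transmittance_total_scattering.upward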

Purpose

This repo is for atmospherically correcting large numbers (e.g. hundreds) of Sentinel2 and Landsat images. Although fully automated, it has a longer setup time because it downloads and then interpolates lookup tables. Once that is done, however, it runs considerably faster.

Bonus

This approach might also be more suitable for onboard processing (e.g. drones, nanosats) as the computational heavy lifting can be done in advance.

Installation

Install Docker, then build the image (docker build needs a build context directory as well as the Dockerfile):

docker build -f /path/to/Dockerfile -t atmcorr-timeseries .

Usage

Run the Docker container.

docker run -i -t -p 8888:8888 atmcorr-timeseries

and authenticate the Earth Engine API.

earthengine authenticate
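
Inside the container you can optionally confirm that authentication worked from Python before opening the notebook; this is just a quick sanity check, assuming the earthengine-api package is installed (as it is in the Docker image).

# Optional sanity check that the Earth Engine API is authenticated and reachable
import ee

ee.Initialize()
print(ee.String('Earth Engine is ready').getInfo())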

Then grab the source code

git clone https://github.com/samsammurphy/ee-atmcorr-timeseries

and run the Jupyter Notebook:

cd ee-atmcorr-timeseries/jupyter_notebooks
jupyter-notebook ee-atmcorr-timeseries.ipynb --ip='*' --port=8888 --allow-root

This will print out a URL that you can copy/paste into your web browser to run the code.

If the URL is of the form http://(something_in_parentheses) then replace the parentheses and their contents with localhost. A valid URL should look something like:

http://localhost:8888/?token=...

Notes on setup time vs. run time

This code is optimized for atmospheric correction of large image collections. It trades setup time (~30 mins) for run time. Setup is performed only once and is fully automated. This avoids running the radiative transfer code separately for every image, which takes ~2 secs/scene: 500 scenes would therefore take over 16 minutes, every time.

It does this using the 6S emulator, which is based on n-dimensional interpolated lookup tables (iLUTs). These iLUTs are automatically downloaded and constructed locally.
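
As an illustration of the idea only (not the repo's actual iLUT format or variable names), a lookup table can be built once from slow radiative transfer runs and then interpolated almost instantly per scene, e.g. with SciPy:

# Illustrative sketch of the iLUT idea (hypothetical grid axes and dummy values)
import numpy as np
from scipy.interpolate import RegularGridInterpolator

def slow_6s_run(aot, h2o):
    """Stand-in for a full 6S run (~2 s each in reality); returns a dummy coefficient."""
    return 0.1 * aot + 0.02 * h2o

# setup time: evaluate the slow model over a coarse grid (done once)
aot_grid = np.linspace(0.0, 1.0, 11)   # aerosol optical thickness
h2o_grid = np.linspace(0.0, 5.0, 11)   # water vapour
lut = np.array([[slow_6s_run(a, w) for w in h2o_grid] for a in aot_grid])

# build the interpolated lookup table (iLUT) from the grid
ilut = RegularGridInterpolator((aot_grid, h2o_grid), lut)

# run time: per-scene evaluation is now effectively instantaneous
coefficient = ilut([[0.25, 2.3]])[0]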