
I am using xee, an extension of xarray for working with data from Google Earth Engine. I am trying to test an NDVI computation on an xarray of Landsat imagery, but I keep getting this error:

EEException: Total request size (56623104 bytes) must be less than or equal to 50331648 bytes.

Am I just working with too much data? I had assumed that xarray, in tandem with Dask, would let me work with data of any size. I'm attaching my code in case you would like to replicate the error yourself. Also, I know I could just use pre-generated NDVI products, or compute NDVI in Earth Engine before creating an xarray Dataset. I'm computing NDVI only as a test for future custom functions I want to run, so I'm starting with something simple.

import ee
import xarray

ee.Initialize(opt_url='https://earthengine-highvolume.googleapis.com')

def prep_sr_l8(image):
    # Develop masks for unwanted pixels (fill, cloud, cloud shadow).
    qa_mask = image.select('QA_PIXEL').bitwiseAnd(int('11111', 2)).eq(0)
    saturation_mask = image.select('QA_RADSAT').eq(0)

    # Apply the scaling factors to the appropriate bands.
    def get_factor_img(factor_names):
        factor_list = image.toDictionary().select(factor_names).values()
        return ee.Image.constant(factor_list)
    
    scale_img = get_factor_img([
        'REFLECTANCE_MULT_BAND_.|TEMPERATURE_MULT_BAND_ST_B10'])
    offset_img = get_factor_img([
        'REFLECTANCE_ADD_BAND_.|TEMPERATURE_ADD_BAND_ST_B10'])
    scaled = image.select('SR_B.|ST_B10').multiply(scale_img).add(offset_img)

    # Replace original bands with scaled bands and apply masks.
    return image.addBands(scaled, None, True)\
    .updateMask(qa_mask).updateMask(saturation_mask)


CALIFORNIA = ee.FeatureCollection("projects/calfuels/assets/Boundaries/California")
#LTBMU = ee.FeatureCollection("projects/calfuels/assets/Boundaries/park_lane_tahoe")

ic = (ee.ImageCollection('LANDSAT/LC08/C02/T1_L2').map(prep_sr_l8).filterBounds(CALIFORNIA.geometry())
      .filterDate('2019-01-01', '2019-12-31'))

ic_xr = xarray.open_dataset(ic, engine="ee", crs="EPSG:3310", scale=30, chunks="auto")

# NDVI = (NIR - Red) / (NIR + Red); note the "+" in the denominator.
ndvi = (ic_xr['SR_B5'] - ic_xr['SR_B4']) / (ic_xr['SR_B5'] + ic_xr['SR_B4'])

ic_xr['NDVI'] = ndvi

ic_xr_result = ic_xr.compute()
  • Why would you try to do that math on your client instead of doing it on the server? Commented Apr 30 at 10:41
  • I explained why in the last sentence of my question: this is just a test before I write custom functions that Earth Engine does not have built into its API. Commented Apr 30 at 17:25

1 Answer


I know why this happens now. Earth Engine enforces a maximum request payload of 50331648 bytes (exactly 48 MiB). The Earth Engine documentation states a maximum payload size of 10 MB (see the doc here), but the error message shows the effective limit is larger. Because xee is essentially a wrapper around Earth Engine's ee.data.computePixels(), that method is called once per chunk when you use open_dataset(engine="ee"). To fix my problem, I need to make sure that each chunk's request, including the query overhead, fits within the 48 MiB limit.
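As a rough sanity check, you can estimate a chunk's payload yourself before opening the dataset. This is a sketch under assumptions: the dimension names (`time`, `X`, `Y`) and the chunk sizes below are illustrative, and I assume one band per request with float64 pixels (the worst case; xee may return narrower dtypes depending on the source bands):

```python
import numpy as np

# Per-request limit reported by the error message.
MAX_REQUEST_BYTES = 50331648  # 48 MiB

def chunk_bytes(n_time, n_x, n_y, dtype=np.float64):
    """Approximate bytes of raw pixel data in one (time, X, Y) chunk of a single band."""
    return n_time * n_x * n_y * np.dtype(dtype).itemsize

# Candidate chunk sizes (illustrative values, not xee defaults).
chunks = {"time": 24, "X": 512, "Y": 256}

nbytes = chunk_bytes(chunks["time"], chunks["X"], chunks["Y"])
print(nbytes, nbytes < MAX_REQUEST_BYTES)  # 25165824 True
```

If the estimate is close to or over the limit, shrink the chunks and pass them explicitly instead of `chunks="auto"`, e.g. `xarray.open_dataset(ic, engine="ee", crs="EPSG:3310", scale=30, chunks={"time": 24, "X": 512, "Y": 256})`. Check `ic_xr.dims` first, since the actual dimension names depend on the CRS you open the dataset with.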
