I am using ArcGIS 10.1 for Desktop.
I have a value raster and a polygon feature class, both in a geographic coordinate system (GCS). The raster resolution is 0.5 decimal degrees, which is much larger than most of my polygons. The raster holds discrete floating-point values. So I resampled the raster to a finer resolution of 0.01 degrees using the NEAREST option and ran Zonal Statistics as Table to get the mean value for each polygon. However, when I compare the means for the larger polygons, which get zonal statistics from both the original and the resampled raster, the resampled values come out higher. My raster values are in mm per sq km, so the small difference introduced by resampling is amplified when I scale the value up by the polygon area. Any ideas how to prevent this?
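For reference, this is roughly how I ran the resampling and zonal statistics with arcpy; the workspace, dataset names and the zone field (POLY_ID) are placeholders for my actual data.

```python
# Minimal sketch of the workflow described above (ArcGIS 10.1 / arcpy).
# Paths, dataset names and the zone field are placeholders.
import arcpy
from arcpy.sa import ZonalStatisticsAsTable

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\temp\zonal.gdb"   # placeholder workspace

value_raster = "value_raster_05deg"          # 0.5 degree value raster
polygons     = "my_polygons"                 # zone polygons
zone_field   = "POLY_ID"                     # placeholder zone ID field

# Resample the coarse raster to 0.01 degrees with nearest-neighbour assignment
resampled = "value_raster_001deg"
arcpy.Resample_management(value_raster, resampled, "0.01", "NEAREST")

# Mean value per polygon, once against the original and once against the resampled raster
ZonalStatisticsAsTable(polygons, zone_field, value_raster, "zs_original", "DATA", "MEAN")
ZonalStatisticsAsTable(polygons, zone_field, resampled, "zs_resampled", "DATA", "MEAN")
```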
I have also tried projecting both my raster and my polygons to WGS and then repeating the resampling and zonal statistics, but I get the same trend in the results. Am I missing something?
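The projected run was essentially the same, preceded by something like the following; the spatial reference used here (3857) is only a placeholder for the projection I actually applied.

```python
# Sketch of the projected variant; the output coordinate system is a placeholder.
import arcpy

sr = arcpy.SpatialReference(3857)  # placeholder projected CRS

arcpy.Project_management("my_polygons", "my_polygons_prj", sr)
arcpy.ProjectRaster_management("value_raster_05deg", "value_raster_prj", sr, "NEAREST")
```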