Understanding the TimeWholeYear configuration in ArcGIS Area Solar Radiation

I am a bit confused about the interpretation of my results, and it would be great to get some clarification!

I used the TimeWholeYear configuration (without selecting the {each_interval} option, so the result should be a single output for the whole year) for the 'Area Solar Radiation' tool in ArcGIS 10. I set the analysis to run with a day interval of 14 days and an hour interval of 0.5 hours.

My understanding is that every 14 days, at 0.5-hour steps through those days, the tool calculates the radiation falling on each cell of the raster.
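
For reference, here is roughly what that setup looks like in arcpy. This is just a minimal sketch: the input raster, output name, and year are placeholders rather than my actual values.

```python
# Minimal sketch of the configuration described above, assuming the
# Spatial Analyst extension; raster names and the year are placeholders.
import arcpy
from arcpy.sa import AreaSolarRadiation, TimeWholeYear

arcpy.CheckOutExtension("Spatial")

# TimeWholeYear with a 14-day and 0.5-hour interval; leaving
# each_interval at "NOINTERVAL" gives one output raster for the
# whole year instead of one raster per interval.
global_rad = AreaSolarRadiation(
    in_surface_raster="dem",                 # placeholder input DEM
    time_configuration=TimeWholeYear(2011),  # placeholder year
    day_interval=14,
    hour_interval=0.5,
    each_interval="NOINTERVAL",
)
global_rad.save("solar_whole_year")          # placeholder output name
```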

Now that I have the results, I am a little confused about what they mean. I have two questions about how to interpret them:


  1. Is the value in each cell the average or the sum of the radiation received by that cell over the whole year?

  2. If I go to Layer Properties > Statistics on the output raster, I can see the minimum, mean, and maximum values of the global radiation. How do I interpret these values (see the sketch below the two options)?

a) Do they refer to the minimum, mean, and maximum radiation measured over the course of the year (i.e., across the samples taken every 14 days at 0.5-hour steps)?

b) Or are they the minimum, mean, and maximum values among all the cells of the raster?
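
In case it matters for interpreting the question, this is roughly how I read those same statistics with arcpy (a sketch; "solar_whole_year" is the placeholder output name from the snippet above):

```python
# Sketch: query the statistics shown under Layer Properties > Statistics.
import arcpy

for prop in ("MINIMUM", "MEAN", "MAXIMUM"):
    result = arcpy.GetRasterProperties_management("solar_whole_year", prop)
    print(prop, result.getOutput(0))  # the value is returned as a string
```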

I'm sorry if these are very basic questions!

Thanks, JJ


