When I try to dissolve a polygon through an arcpy script, the operation fails with the 999999 error on computers with a small amount of RAM (~3 GB), which is a known issue.
Unfortunately neither this, that nor that work for me, so I was thinking about dividing the polygon into smaller pieces first, processing them, and then merging the results back. But I have absolutely no idea how to do that. Any ideas? Maybe there's another approach that hasn't occurred to me?
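To sketch what I mean by the chunked approach: something like the helper below (the name `dissolve_in_chunks` is mine, and this is an untested sketch assuming the standard `Select_analysis`, `Dissolve_management` and `Merge_management` tools) would dissolve each "Godziny" value separately and then merge the partial results, so no single dissolve has to hold the whole dataset:

```python
def dissolve_in_chunks(arcpy_mod, src_shp, out_shp, field="Godziny"):
    """Dissolve one attribute value at a time to keep memory use low.

    arcpy_mod is the imported arcpy module; it is passed in as a
    parameter only so this sketch stays importable on machines
    without ArcGIS.
    """
    # collect the distinct values of the dissolve field
    values = sorted({row[0] for row in
                     arcpy_mod.da.SearchCursor(src_shp, [field])})
    parts = []
    for i, val in enumerate(values):
        selected = "in_memory/sel_{0}".format(i)
        dissolved = "in_memory/diss_{0}".format(i)
        # pull out one class and dissolve it on its own
        arcpy_mod.Select_analysis(src_shp, selected,
                                  '"{0}" = {1}'.format(field, val))
        arcpy_mod.Dissolve_management(selected, dissolved, field)
        parts.append(dissolved)
    # merge the per-class results back into one feature class
    arcpy_mod.Merge_management(parts, out_shp)
```

Since every output feature keeps its "Godziny" value and each value is dissolved in isolation, the merged result should be equivalent to a single dissolve on that field.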
An example polygon for testing can be found here; I'm dissolving on the "Godziny" field with the default settings:
arcpy.Dissolve_management(path_to_original_shp, path_to_dissolved_shp, "Godziny")

ADDED: Unfortunately, when I tested Geog's solution and it worked, that was just a fluke; 99 out of 100 dissolves still fail. The conditions under which it fails are quite weird, though. If I run the dissolve as a tool in ArcMap, it finishes just fine (in about 30 seconds). If I run it from a Python interpreter (inside ArcMap or not), it also works fine. However, if I run it from my script as one of the functions in a processing chain, it fails: it's stuck for about 20-30 minutes with memory usage growing slowly but constantly, and then gives the 999999 error ("Invalid topology").
The way I run this script is a bit unusual: I have a GUI built with wxPython that I run in ArcMap, and the processing chain with the dissolve is run from that GUI. I run it on a Windows XP SP3 machine with 3 GB of RAM. If I use a machine with more RAM and Windows 7 or 8 (64-bit, but without 64-bit background geoprocessing), everything works just fine. In addition, if I comment out the dissolve and the rest of the chain, run it, then comment out the first part and run it from the dissolve point, it also works fine. That gave me the idea that I might somehow be locking resources, so I tried gc.collect(), but with no success.
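To spell out what I mean by releasing the resources: gc.collect() on its own can't free the intermediate rasters while names like ccm1_cut or ccm_merged still point at them, so the references would have to be dropped first. A rough sketch (the helper name is mine; Delete_management("in_memory") is the documented way to clear the scratch workspace, and the module is passed in only so the sketch runs without ArcGIS):

```python
import gc

def release_intermediates(arcpy_mod, namespace, names):
    """Drop references to intermediate results, then clean up.

    namespace is the dict holding the references (e.g. vars() of the
    processing function's module); names are the variables to drop.
    """
    # remove the named references so nothing keeps the rasters alive
    for name in names:
        namespace.pop(name, None)
    # clear ArcGIS's in-memory scratch workspace
    arcpy_mod.Delete_management("in_memory")
    # now a collection pass can actually reclaim the objects
    gc.collect()
```

This didn't solve my case, but it at least rules out lingering Python references as the cause.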
I've just tried running the script from the command line, and it also works fine... So I guess something goes wrong when dissolving from my extension's GUI, but I have no idea how my extension could influence the function, since I'm not modifying the environment in any way.
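Since the command-line run succeeds, one workaround I'm considering is launching the dissolve in a fresh interpreter from the GUI, so it shares no memory or locks with the ArcMap/wxPython process. A minimal sketch (the child code here is a placeholder that just echoes its arguments; the real one would `import arcpy` and call `Dissolve_management`, and on the old ArcMap Python 2 install `subprocess.check_output` would replace the Python 3 `subprocess.run` used below):

```python
import subprocess
import sys
import textwrap

# Placeholder child script; the real version would do:
#   import arcpy, sys
#   arcpy.Dissolve_management(sys.argv[1], sys.argv[2], sys.argv[3])
CHILD = textwrap.dedent("""
    import sys
    print("dissolved {0} on {1}".format(sys.argv[1], sys.argv[3]))
""")

def dissolve_out_of_process(src_shp, out_shp, field):
    # Run the step in a separate interpreter so state held by the GUI
    # process cannot interfere; raises CalledProcessError on failure.
    result = subprocess.run(
        [sys.executable, "-c", CHILD, src_shp, out_shp, field],
        capture_output=True, text=True, check=True)
    return result.stdout.strip()
```

The parent only sees the child's exit code and output, so even if the dissolve leaks memory it is reclaimed when the child exits.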
The processing chain I'm doing looks like this:
arcpy.CheckOutExtension("spatial")  # getting necessary licences
ccm1_cut = ExtractByMask(project_manager.conf_dict["CCM1"], project_manager.conf_dict["hordist"])
ccm2_cut = ExtractByMask(project_manager.conf_dict["CCM2"], project_manager.conf_dict["hordist"])
ccm3_cut = ExtractByMask(project_manager.conf_dict["CCM3"], project_manager.conf_dict["hordist"])
ccm4_cut = ExtractByMask(project_manager.conf_dict["CCM4"], project_manager.conf_dict["hordist"])
ccm5_cut = ExtractByMask(project_manager.conf_dict["CCM5"], project_manager.conf_dict["hordist"])
# merging ccm layers
ccm_merged = Con(IsNull(ccm1_cut),
                 Con(IsNull(ccm2_cut),
                     Con(IsNull(ccm3_cut),
                         Con(IsNull(ccm4_cut),
                             Con(IsNull(ccm5_cut),
                                 project_manager.conf_dict["lu_speed"],
                                 ccm5_cut),
                             ccm4_cut),
                         ccm3_cut),
                     ccm2_cut),
                 ccm1_cut)
walk_speed_cut = ExtractByMask(project_manager.conf_dict["walkspeed"], project_manager.conf_dict["hordist"])
travel_speed_raster = walk_speed_cut / (Float(ccm_merged) / 100)
travel_speed_raster.save(travel_speed)
# creating path distance
ipp = project_manager.get_ipp_path()
path_dist_raster = PathDistance(ipp, travel_speed)
# normalising to hours
path_dist_norm_raster = Divide(path_dist_raster, 3600)
path_dist_norm_raster.save(travel_time_normalised)
# creating hour classes
remap_range = RemapRange(create_remap_range_ccm())
temp_reclass = Reclassify(travel_time_normalised, "VALUE", remap_range, "NODATA")
temp_reclass.save(travel_time_classes)
# converting travel time raster to polygons
arcpy.RasterToPolygon_conversion(travel_time_classes, travel_time_classes_shp, "SIMPLIFY", "VALUE")
# calculating time value
arcpy.AddField_management(travel_time_classes_shp, "Godziny", "SHORT")
arcpy.CalculateField_management(travel_time_classes_shp, "Godziny", "[GRIDCODE]", "VB")
# dissolving polygons
arcpy.Dissolve_management(travel_time_classes_shp, travel_time_classes_dissolved, "Godziny")
...