I am doing something similar to the code below to create a really large (larger than the memory on my machine) shapefile:
```python
import shapefile

shapes = shapefile.Writer(shapefile.POINT)
shapes.autoBalance = 1
shapes.field('MyId', 'N')
shapes.field('OtherData', 'C', 20)

for item in largeIteration():
    # ...
    shapes.record(item['id'], item['desc'])
    shapes.point(item['lat'], item['lon'])
```

The problem is that while I'm using `yield` in `largeIteration()` so the iteration itself doesn't hold onto memory, the entire shapefile seems to remain open in memory: `shapes` grows with each call to `shapes.record()` and `shapes.point()`. Is there any way I can have pyshp not hold the entire shapefile in memory? I see OGR and its virtual file systems, but I don't see a "stream to disk" VFS option.
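For context, `largeIteration()` in the snippet above is assumed to be a generator roughly like the following sketch; the field names match the dict keys used above, but the data source itself is hypothetical:

```python
def largeIteration():
    """Hypothetical generator: yields one record dict at a time instead of
    building a list, so the iteration itself stays memory-friendly."""
    for i in range(1_000_000):
        yield {
            'id': i,
            'desc': 'point %d' % i,
            'lat': 40.0 + i * 1e-6,   # made-up coordinates for illustration
            'lon': -75.0 + i * 1e-6,
        }
```

Because it yields lazily, only one `item` dict exists at a time; the memory growth observed comes from the `Writer` accumulating records, not from the loop.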