I often am unable to delete files due to a lock or need to reset the data frame coordinate system. Very often starting a new session of Arc will allow certain geoprocessing or data management tasks to run effectively. What solutions exist to reset the local memory usage in ArcMap 10 other than closing and starting a new session? For the Python programmers out there, is there a way to add code to scripts that clears local memory before or after running geoprocessing tasks?

asked Jul 5, 2012 at 21:49
  • Can you be more specific about what you are having trouble with? Better to tackle the issues individually rather than take a hammer to it. Commented Jul 5, 2012 at 22:47
  • I'm looking for more of a hammer approach on this one. It seems that I often resort to starting a new session of Arc as part of my geoprocessing troubleshooting routine--very often this last resort works! This leads me to believe that local memory storage may be the underlying issue behind many unusual geoprocessing quirks (e.g. the failed tasks that shoot back a generic error message). Commented Jul 6, 2012 at 0:15

1 Answer 1

Geoprocessing can run in two modes: foreground and background. Someone with more experience than I have can comment on the specific nuances of either setting in relation to memory consumption.

However, many geoprocessing tools store temporary data in memory, and it could be the tool itself that causes memory to reach capacity and errors to occur.

I can address the arcpy issues, as I've been tackling them myself lately. It's important to use the in_memory workspace when possible to speed up processing, but always remember to delete temporary data from that workspace once it's no longer needed, using a "del object" statement or the Delete_management() function.
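As a rough sketch of that cleanup pattern (assuming ArcGIS 10.x with arcpy available; the tool names and the `in_memory_path` helper are illustrative, not part of any specific workflow):

```python
def in_memory_path(name):
    """Build a dataset path inside arcpy's in_memory workspace.

    Hypothetical helper: in_memory datasets are addressed as
    "in_memory/<name>" rather than as files on disk.
    """
    return "in_memory/{0}".format(name)

# Typical use inside an arcpy script (requires ArcGIS, shown for
# illustration only -- the input/output names are made up):
#
# import arcpy
# tmp = in_memory_path("roads_buffered")
# arcpy.Buffer_analysis("roads.shp", tmp, "100 Meters")
# ...do further processing with tmp...
# arcpy.Delete_management(tmp)          # frees that one in-memory dataset
# arcpy.Delete_management("in_memory")  # or clear the whole workspace at once

print(in_memory_path("roads_buffered"))
```

Note that, as the comments below point out, `del object` only removes the Python reference; it is `Delete_management()` that actually releases the in_memory dataset.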

Inside ArcMap, a failed script or tool can be caused by excessive memory consumption not allowing the tool to finish its job. Sometimes the input format of the data makes a large difference in processing costs (Excel table vs. CSV vs. DBF). What tools are failing on you? There are known issues with ones like XY To Line.

More information would be helpful. Please update with specific tasks that are causing you trouble.

answered Jul 6, 2012 at 3:05
  • +1 This is really helpful, thanks hhart. Regarding specific problem tasks, multi-ring buffering large polygon files immediately comes to mind as a real troublemaker, yet manageable in a fresh work session. Commented Jul 6, 2012 at 3:55
  • Specifically regarding the in_memory workspace, the only way to clear it out is with Delete_management; del object will not do anything. Commented Jul 6, 2012 at 4:04
  • I would also be careful before using the term memory leak as that has a very specific technical meaning; I would suggest "memory consumption" instead. Commented Jul 6, 2012 at 4:10
  • Or just "memory usage". Just because an application uses a lot of memory doesn't mean it is leaking memory. Commented Jul 7, 2012 at 2:20
