Java garbage collector fails to clean up memory while loading ontology files
I am facing a java.lang.OutOfMemoryError in my application every time I try to load and process a bunch of .owl files. I have 500-odd .owl files in a directory and load them one by one into memory using the loadOntologyFromOntologyDocument() method of the OWL API inside a for loop. However, after the method has loaded a few ontologies, memory starts getting exhausted: unused object references are not getting cleaned up by the garbage collector. I googled the problem and used -Xmx to increase the heap size up to 5 GB, as suggested by many. The problem still persists. Any help in this regard is appreciated.
import java.io.File;

import org.semanticweb.owlapi.apibinding.OWLManager;
import org.semanticweb.owlapi.model.OWLOntology;
import org.semanticweb.owlapi.model.OWLOntologyManager;

OWLOntologyManager owlManager = OWLManager.createOWLOntologyManager();
File folder = new File("G:\\OWL and OBO");
File[] listOfFiles = folder.listFiles();
for (int i = 0; i < listOfFiles.length; i++) {
    if (listOfFiles[i].isFile()) {
        System.out.println("File " + listOfFiles[i].getName());
        try {
            File sourceFile = new File("G:\\OWL and OBO\\" + listOfFiles[i].getName());
            OWLOntology ontology = owlManager.loadOntologyFromOntologyDocument(sourceFile);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
It seems the OWLOntologyManager is hanging on to memory between files. Have you tried moving the first line inside the listOfFiles loop? It'll be "more expensive" to create a new manager instance for each file, but that's better than "broken".
File folder = new File("G:\\OWL and OBO");
File[] listOfFiles = folder.listFiles();
for (int i = 0; i < listOfFiles.length; i++) {
    if (listOfFiles[i].isFile()) {
        // A fresh manager per file: ontologies loaded for previous files
        // become unreachable and can be garbage-collected.
        OWLOntologyManager owlManager = OWLManager.createOWLOntologyManager();
        System.out.println("File " + listOfFiles[i].getName());
        try {
            File sourceFile = new File("G:\\OWL and OBO\\" + listOfFiles[i].getName());
            OWLOntology ontology = owlManager.loadOntologyFromOntologyDocument(sourceFile);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
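If creating a new manager per file proves too slow, a lighter-weight sketch (assuming each ontology is fully processed before the next one is loaded; the "process" step below is a hypothetical placeholder) is to keep a single manager and call removeOntology() once you are done with each file, so the manager drops its reference and the GC can reclaim the ontology:

File[] listOfFiles = new File("G:\\OWL and OBO").listFiles();
OWLOntologyManager owlManager = OWLManager.createOWLOntologyManager();
for (File sourceFile : listOfFiles) {
    if (sourceFile.isFile()) {
        try {
            OWLOntology ontology = owlManager.loadOntologyFromOntologyDocument(sourceFile);
            // ... process the ontology here ...
            // Drop the manager's reference so the ontology becomes collectible.
            owlManager.removeOntology(ontology);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}

Note that loading an ontology typically pulls its imports closure into the manager as well, and removeOntology() removes only the single ontology, so the fresh-manager approach above is the more thorough cleanup.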