An Ontology Editor for Android. 4. Mobile Considerations - Ontology Load Times

In my last post, I showed the beginnings of an ontology editor for Android. When I developed that first version, I did so mainly against relatively small ontologies, such as the General Formal Ontology (GFO), the Basic Formal Ontology (BFO) and others.

Once this was stable, I focused in particular on testing the loading of ontologies without import statements (which would need to be resolved over the web) from a location on the device. As I had been part of the ChEBI team for a short while, it was natural to try ChEBI.

Well, I was in for a rude shock. Once I had selected the ontology and started to load it - well - nothing happened. The application didn't crash, and the developer console clearly indicated that it was running - and, more importantly, that the memory allocated to the process just kept expanding. So I stopped the loading and decided to dig into this a bit more.

Logging on Android

Android has a rather wonderful logging framework built in, and getting hold of the timing data was a breeze using the android.util.TimingLogger class:

// Start a timer under the "LOAD" tag.
TimingLogger timings = new TimingLogger("LOAD", "ontology loading");

// Load the ontology via the OWL API and record the elapsed time.
ontology = manager.loadOntologyFromOntologyDocument(in);
timings.addSplit("LoadTime");
timings.dumpToLog();
Log.d("LOAD", "loading finished");

The output from the logger will then show up in the LogCat console. Note that TimingLogger only writes its output if the tag is loggable at VERBOSE level, so the "LOAD" tag may need to be enabled first (for example with adb shell setprop log.tag.LOAD VERBOSE).
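The snippet above assumes that an OWLOntologyManager (manager) and an InputStream (in) for the ontology file already exist; since that setup isn't shown, here is a minimal, self-contained sketch of how the timed load might look. The class name and file handling are illustrative, not the actual OntoDroid code.

import android.util.Log;
import android.util.TimingLogger;

import org.semanticweb.owlapi.apibinding.OWLManager;
import org.semanticweb.owlapi.model.OWLOntology;
import org.semanticweb.owlapi.model.OWLOntologyCreationException;
import org.semanticweb.owlapi.model.OWLOntologyManager;

import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;

public class TimedOntologyLoader {

    // Loads an ontology from a file on the device and logs how long it took.
    public OWLOntology load(File ontologyFile)
            throws OWLOntologyCreationException, IOException {
        OWLOntologyManager manager = OWLManager.createOWLOntologyManager();
        TimingLogger timings = new TimingLogger("LOAD", "ontology loading");
        InputStream in = new FileInputStream(ontologyFile);
        try {
            OWLOntology ontology = manager.loadOntologyFromOntologyDocument(in);
            timings.addSplit("LoadTime");
            timings.dumpToLog();
            Log.d("LOAD", "loading finished");
            return ontology;
        } finally {
            in.close();
        }
    }
}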


Ontology Load Times

To get a better handle on loading behaviour, I started to look at load times for ChEBI (I can't remember which version - this is not a completely scientific experiment), GFO and DOLCE-Light. I loaded each ontology from the shared storage area on the device into the application and observed the load process, repeating the experiment 10 times for each ontology. The link to the data is here.

The bottom line: GFO and DOLCE-Light, with 78 and 37 classes respectively, had reasonably similar load times (within one standard deviation): 483 (+/- 38) ms and 540 (+/- 35) ms. This is still somewhat surprising, as DOLCE-Light is the smaller ontology and should have loaded faster (but maybe there are other factors at play here). ChEBI, by contrast, required a whopping 370391 (+/- 7039) ms - about 6 minutes - to load. The device didn't crash and I could explore the class list just fine. I did, however, have to remove the memory restrictions on the application and allow OntoDroid to take over whatever device memory it could get hold of.
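As an aside, "removing the memory restrictions" on Android typically amounts to requesting a large heap for the application (the android:largeHeap attribute on the application element in the manifest); I'm describing the general mechanism here rather than the exact OntoDroid configuration. The following sketch, with illustrative names, shows how one might check the per-app heap budget at runtime.

import android.app.ActivityManager;
import android.content.Context;
import android.util.Log;

public class HeapInfo {

    // Logs the standard and large-heap per-app memory budgets in megabytes.
    public static void logHeapLimits(Context context) {
        ActivityManager am =
                (ActivityManager) context.getSystemService(Context.ACTIVITY_SERVICE);
        Log.d("LOAD", "standard heap limit: " + am.getMemoryClass() + " MB");
        Log.d("LOAD", "large heap limit: " + am.getLargeMemoryClass() + " MB");
    }
}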

This is a problem which - in time - will probably solve itself as devices get more powerful. Until then, however, it has consequences for how to program: OWLOntology is not serialisable and hence cannot simply be passed around via Intents.
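One common workaround - and this is a sketch of the general pattern, not necessarily how OntoDroid will end up doing it - is to keep the loaded OWLOntology in a process-wide holder, for example a custom Application subclass, and only pass lightweight identifiers through Intents. The class name below is hypothetical, and the subclass would need to be registered via android:name in the manifest.

import android.app.Application;

import org.semanticweb.owlapi.model.OWLOntology;

// Hypothetical Application subclass that keeps the loaded ontology in memory
// so that Activities can share it without serialising it into an Intent.
public class OntoDroidApplication extends Application {

    private OWLOntology currentOntology;

    public void setCurrentOntology(OWLOntology ontology) {
        this.currentOntology = ontology;
    }

    public OWLOntology getCurrentOntology() {
        return currentOntology;
    }
}

An Activity can then retrieve the ontology with ((OntoDroidApplication) getApplication()).getCurrentOntology() rather than expecting it in the Intent extras.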

More about this later.