
By Example


We start by initializing the ontology manager, either with a local connection or with a connection to a remote server instance. In this example a local connection is used, configured for RDF. In addition, the option for ground module names (IConfig.MODULE_NAMES_GROUND) is activated.

Properties properties = new Properties();
properties.setProperty(IConfig.MODULE_NAMES_GROUND, IConfig.ON);
OntologyManager manager = KAON2Manager.newOntologyManager(OntologyLanguage.RDF, properties);

A new ontology can be created using the initialized manager. The first argument of createOntology, "module", corresponds to the RDF context or graph identifier; simply set this value to a valid absolute IRI. The second argument allows you to specify further options for the new ontology. For our example we do not need any and set the argument to null.

Ontology ontology = manager.createOntology(module, null);

The created ontology is empty. There are several ways to add new statements, i.e. triples. Let's start by importing the contents of an RDF ontology file, here referred to as "sample.rdf" in the current directory. The additional arguments of the import method are not required in this example, so we pass null.

ontology.importContentsFrom(new File("sample.rdf"), null, null);
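For illustration, "sample.rdf" could look like the following. The resource and property names used here are hypothetical; any valid RDF/XML document would work.

```xml
<?xml version="1.0"?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:ex="http://example.org/demo#">
  <!-- a single statement: the resource ex:cain has the German name "Kain" -->
  <rdf:Description rdf:about="http://example.org/demo#cain">
    <ex:name xml:lang="de">Kain</ex:name>
  </rdf:Description>
</rdf:RDF>
```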

New statements can also be created using the API. The following lines of code show the construction of several RDF terms which are finally combined into a new RDF statement. This statement is then added to the ontology.

KAON2Factory factory = KAON2Manager.factory();
Constant subject = factory.constantIRI("");
Constant predicate = factory.constantIRI("");
Constant object = factory.constantRDFText("Kain", "de");
FPropertyMember stmt = factory.fpropertyMember(subject, predicate, object);


Using Turtle syntax, which is compact, easy to read, and avoids some issues of the RDF/XML serialization, new statements can be inserted by executing a SPARQL Update command.

manager.execute("sparqlUpdate \"\"\"PREFIX : <> "
    + "INSERT DATA { GRAPH <urn:demo#> { :able :age 32; :knows :cain. } }\"\"\"");
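Statements can be removed in the same way with a DELETE DATA command. The following is a sketch only; it assumes the same graph as above, and the prefix IRI is a hypothetical example.

```sparql
PREFIX : <http://example.org/demo#>
DELETE DATA { GRAPH <urn:demo#> { :able :knows :cain . } }
```

Such an update string would be passed to manager.execute in the same "sparqlUpdate" wrapper shown above.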

An ontology can be serialized using the method saveOntology.

ontology.saveOntology(OntoBrokerOntologyFileFormat.RDF_TURTLE, new File("sample-modified.ttl"));

Finally, we query the ontology using SPARQL. This example specifies a simple query including a filter. The results are printed to the console. For details on queries, see the SPARQL compatibility overview; non-standard query extensions are also explained there. For example, the 'debug' option generates more verbose output and shows details of the internally used rules and query optimizations, which is quite handy especially when optimizing complex queries. For better legibility, exception and error handling has been omitted.


// Execute SPARQL query
Reasoner reasoner = ontology.createReasoner();
String queryText = "SELECT * WHERE { ?s ?p ?o . FILTER (?o > 30) } OPTIONS 'debug'";
Query query = reasoner.createQuery(Namespaces.INSTANCE, queryText);

// show variables used
Variable[] vars = query.getDistinguishedVariables();
for (Variable var : vars) {
    System.out.print(var.getVariableName() + "\t");
}
System.out.println();

// show results
query.open();
Term[] buffer = query.tupleBuffer();
Formatting format = ontology.getOntologyFormatting().getDefaultFormatting();
while (!query.afterLast()) {
    for (Term term : buffer) {
        System.out.print(term.toString(format) + "\t");
    }
    System.out.println();
    query.next();
}
query.close();