To highlight the resemblance between the Ontology and the API models, from the perspective of outcome and utility, let us consider an example from an existing ontology.
FOAF (Friend of a Friend) is an ontology defined on top of RDF and OWL which aims to describe people and their relations in the context of the Internet and on-line presence. It defines a hierarchy of types, which can be complex or simple and are connected to one another via properties and relations:

  • Example 23, Snippet from the FOAF Ontology
  • Class: foaf:Image
    Class: foaf:OnlineAccount
    Class: foaf:OnlineGamingAccount (Subclass of: foaf:OnlineAccount)
    Property: foaf:mbox_sha1sum
    Property: foaf:msnChatID
    Property: foaf:lastName
    Property: foaf:account (Range: every value of this property is a foaf:OnlineAccount)

From a computer’s point of view, however, the semantic content is similar or identical in nature to that of other classical APIs, modelled in XSD or UML (Unified Modeling Language). This is not a problem of the RDF or OWL languages, but rather a problem born from the way the industry sees data modelling: abstracting a slice of reality into a custom, proprietary model.
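To make the equivalence concrete, the FOAF snippet from Example 23 can be re-expressed as an ordinary, UML-style class hierarchy. The sketch below is hypothetical (the grouping of the properties under a Person class is an assumption, not part of the snippet), but to the machine both encodings carry exactly the same information: opaque names related by subclassing and typed properties.

```python
# The FOAF classes from Example 23, re-expressed as an ordinary
# class hierarchy. The machine sees exactly what RDF/OWL gives it:
# a set of opaque names related by subclassing and typed properties.

class Thing:
    pass

class Image(Thing):
    pass

class OnlineAccount(Thing):
    pass

class OnlineGamingAccount(OnlineAccount):  # Subclass of: OnlineAccount
    pass

class Person(Thing):
    # Hypothetical placement of the snippet's properties.
    def __init__(self, last_name, mbox_sha1sum, account):
        self.last_name = last_name        # Property: foaf:lastName
        self.mbox_sha1sum = mbox_sha1sum  # Property: foaf:mbox_sha1sum
        self.account = account            # Range: OnlineAccount

# The subclass relation is the only thing the machine can "know":
assert issubclass(OnlineGamingAccount, OnlineAccount)
```

Nothing in either representation tells a program what an account *is*; both are equally proprietary models.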

FOAF grabs a piece of reality, that of people and the web, conceptually disjoints it from all the rest of reality and attempts to represent it in such a way that applications can make sense of it without the aid of humans. Applications specifically built to interpret FOAF will undoubtedly be capable of interpreting FOAF and giving their users relevant responses within the realm of the FOAF reality, but this is no different from any other application implementing any other API. Beyond that, any application that is not strictly designed to conform to the types defined in the FOAF ontology will be incapable of interpreting any result. The reason lies in the ontology’s reliance on humans to interpret it: there is no consistency in naming, no real continuity between types, and no formal link to a larger reality (ontology); references to that reality are made only through suggestive naming and explanation (documentation).

The “OnlineGamingAccount” in Example 23 is a derivative of “OnlineAccount”, which in turn is a derivative of “Thing” in the FOAF ontology. While the textual descriptions of these types make complete sense to a person, from a computer’s perspective they could mean anything. For a computer to perform any operation with a foaf:OnlineGamingAccount in an automated fashion, it has to have some operation defined that takes it as input and some mechanism that triggers an automated reaction. If it has no such operation, it should at least be capable of doing something with foaf:OnlineAccount. If neither is implemented (hard-coded) to take these very types as input, no operation can be performed, because there is no other computationally intelligible information associated with the type. These are self-contained types: they either make sense on their own (operations exist) or they do not. They are completely disjoint from the reality (the human reality) from which they originate.
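The mechanism just described can be sketched in a few lines. In this hypothetical dispatcher (the type names come from FOAF, everything else is an assumption for illustration), a program can only climb the subclass chain looking for a hard-coded operation; if none exists anywhere in the chain, the type is computationally meaningless.

```python
# A minimal sketch of the only reaction a program can have to these
# types: operations hard-coded per type name, with a fallback up the
# declared subclass chain. All handlers here are hypothetical.

SUBCLASS_OF = {"foaf:OnlineGamingAccount": "foaf:OnlineAccount"}

OPERATIONS = {
    # Suppose an operation is defined only for the superclass.
    "foaf:OnlineAccount": lambda obj: f"listing account {obj['id']}",
}

def handle(type_name, obj):
    """Walk up the type hierarchy until a hard-coded operation is found."""
    t = type_name
    while t is not None:
        if t in OPERATIONS:
            return OPERATIONS[t](obj)
        t = SUBCLASS_OF.get(t)  # climb to the superclass, if any
    return None  # no operation in the chain: the type is unintelligible

print(handle("foaf:OnlineGamingAccount", {"id": "alice#steam"}))
# -> "listing account alice#steam" (falls back to foaf:OnlineAccount)
print(handle("foaf:Image", {"id": "img1"}))
# -> None: nothing can be done with this type
```

The fallback is the entire extent of the “reasoning” available; beyond the subclass arrow, the type carries no machine-usable information.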

A person, on the other hand, is capable of deriving a lot of information from the simple name “OnlineGamingAccount”. Although the designation is merged into a single word, slightly distorting the meaning, it is sufficiently similar to “on-line gaming account” for a person to deduce that this is what the type is about. The person can immediately draw the conclusion that it is:

  • an account, something representing him or her or another person;
  • on-line, in the on-line world (the Internet);
    the two points above hint at a plethora of collateral information: people log in with these accounts, they have a profile, information accumulates in these accounts, actions can be performed with that information, and so on;
  • something to do with gaming, as in play, fun, and so forth.

None of these distinct concepts (account, on-line or gaming) is defined in the FOAF ontology. It is either a foaf:OnlineGamingAccount, a foaf:OnlineAccount or nothing at all, and as such, a computer built around the FOAF ontology cannot draw any additional information from an object of this type. There is simply no reality behind these types within the ontology.
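This can be checked mechanically. In the sketch below (the term set is just the classes from Example 23), the only way a program could relate “OnlineGamingAccount” to “account”, “on-line” or “gaming” is by splitting the CamelCase label, a string heuristic over a name, not a lookup in the ontology, because none of those constituent concepts exists there as a term.

```python
# The constituent concepts of "OnlineGamingAccount" do not exist in the
# ontology; splitting the CamelCase label is a heuristic over a string,
# not knowledge. FOAF_TERMS holds just the classes from Example 23.

import re

FOAF_TERMS = {"Image", "OnlineAccount", "OnlineGamingAccount"}

def camel_parts(term):
    """Split a CamelCase identifier into its word-like parts."""
    return re.findall(r"[A-Z][a-z]*", term)

for part in camel_parts("OnlineGamingAccount"):
    print(f"{part} defined in FOAF: {part in FOAF_TERMS}")
# Online defined in FOAF: False
# Gaming defined in FOAF: False
# Account defined in FOAF: False
```

Each part of the name that carries meaning for a human resolves to nothing for the machine.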

What is worse is that the ontology is organizationally controlled: the concept “foaf:OnlineGamingAccount” is the property of somebody. The term “xyz:OnlineGamingAccount” could be the property of somebody else, who might choose (not that they will, but the possibility exists) to define it in terms of carrots and potatoes rather than gaming and on-line presence. The example may be exaggerated, but it is meant to show that such terminology has nothing to do with reality, or a reality. These are proprietary terms used by particular organizations who presume that the entire world will conform to their standard and thus enable seamless communication within that particular sector.

The very concept of ontology is slightly misplaced here, as it implies knowledge about a slice of reality. In the human world an ontology is usually a small, segregated part of the global reality, particular to a group of people, and it only emphasizes the particularities of that group within the context of this global reality. In IT, ontologies are defined outside the context of this general reality, which continues to exist only in the human world. This segregationist view is very akin to an API: a proprietary type hierarchy that can be used for information interchange, but only if the parties doing the interchange are sufficiently aware of the context to recreate information from data, because at the root that is what is being interchanged. For anybody else, putting reality together from these separate ontologies will inevitably lead to confusion, double definitions, subjectivity, conflicts of interest and all the other problems of classical APIs.

Ontologies need to emerge naturally from the custom needs of various groups of people, and it is the more basic, general world whose terms need to be coined first. As long as the way of creating them stays as it currently is, these ontologies cannot bring the awaited revolution, because people will never be interested in using them, or able to. For example, the specification describing FOAF (Dan Brickley, Libby Miller, 2010) is so long that no human will be enticed to read and learn it just to be able to say a few words about themselves on the new semantic web. It is many times longer than the semantic meaning (dictionary definition) of the objects it defines, and it represents only a fraction of the knowledge within the human world. If all knowledge were standardized like this, the result would be an unreadable document. If the alternative method is presumed, namely that an application will help humans create these files, then we have to assume that in a complete Web 3.0 environment humans will be using thousands of custom-made applications, designed specifically for each ontology out there, which is just as unreasonable.

In either case, it is unlikely that the current approach will yield any real results outside small-scale studies or strict lab conditions.