The World Wide Web has been made possible through a set of widely established standards which
guarantee interoperability at various levels. For example, the TCP/IP protocol has ensured interoper-
ability at the transport level, while HTTP and HTML have provided a standard way of retrieving and
presenting hyperlinked text documents. Applications have been able to use this common infrastructure
and this has made possible the World Wide Web as we know it now.
The "first generation" Web consisted largely of handwritten HTML pages. The current Web,
which can be described as the second generation, has made the transition to machine generated and
often active HTML pages. Both the first and second generation Web were meant for direct human
processing (reading, browsing, form-filling, etc.). The third generation Web aims to make Web resources
more readily accessible to automated processes by adding meta-data annotations that describe their
content. This idea was first delineated, and named the Semantic Web, in Tim Berners-Lee's recent
book "Weaving the Web" [5].
If meta-data annotations are to make resources more accessible to automated agents, it is essential
that their meaning can be understood by such agents. This is where ontologies will play a crucial
role, providing a source of shared and precisely defined terms that can be used in meta-data. An
ontology typically consists of a hierarchical description of important concepts in a domain, along with
descriptions of the properties of each concept. The degree of formality employed in capturing these
descriptions can be quite variable, ranging from natural language to logical formalisms, but increased
formality and regularity obviously facilitates machine understanding. Examples of the use of ontologies
could include e-commerce sites [16], search engines [17] and Web services [19].
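To make the notion of "a hierarchical description of important concepts, along with descriptions of the properties of each concept" concrete, the following is a minimal sketch in Python. The domain (e-commerce, echoing the example application above) and all concept and property names are illustrative assumptions, not drawn from any particular published ontology or ontology language:

```python
# A minimal sketch of an ontology as a concept hierarchy with properties.
# All concept and property names here are illustrative assumptions.

class Concept:
    """A named concept with an optional parent (subsumption) and local properties."""
    def __init__(self, name, parent=None, properties=None):
        self.name = name
        self.parent = parent
        self.properties = properties or {}

    def all_properties(self):
        """Properties are inherited down the hierarchy from more general concepts."""
        inherited = self.parent.all_properties() if self.parent else {}
        return {**inherited, **self.properties}

    def is_a(self, other):
        """Subsumption test: does this concept specialise `other`?"""
        concept = self
        while concept is not None:
            if concept is other:
                return True
            concept = concept.parent
        return False

# A tiny hierarchy for an (assumed) e-commerce domain.
product = Concept("Product", properties={"price": "decimal"})
book = Concept("Book", parent=product, properties={"isbn": "string"})
textbook = Concept("Textbook", parent=book, properties={"subject": "string"})

print(textbook.is_a(product))              # subsumption via the hierarchy
print(sorted(textbook.all_properties()))   # inherited plus local properties
```

Real ontology languages add exactly what this sketch lacks: a formal, machine-interpretable semantics for the hierarchy and the property descriptions, which is what allows automated agents to draw reliable conclusions from meta-data.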