To: | "'[ontolog-forum] '" <ontolog-forum@xxxxxxxxxxxxxxxx> |
---|---|
From: | "Patrick Cassidy" <pat@xxxxxxxxx> |
Date: | 2010年1月26日 00:52:24 -0500 |
Message-id: | <035e01ca9e4b$bc349960349ドルdcc20$@com > |
David,
>> I want something--MT? Ontology support?--that can read Fortran, Jovial, COBOL, Java, PHP, Ruby, C, etc. (oops... that's a computer language) documents & make (more) sense out of said documents. These are textual artifacts (therefore "documents"?) which may or may not be written by humans, they're decidedly NOT edited for readability, and they are really not intended for human consumption.
I believe that current ontology technology, or extensions of it (to include procedural attachments), has the technical capability to do such things. But non-trivial applications will be quite labor-intensive to implement.
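For concreteness, here is a minimal toy sketch (in Python, with entirely made-up names; this is not the API of any existing tool) of what a "procedural attachment" amounts to: most relations are answered from asserted facts, but some are answered by running an attached procedure, for example one that inspects a source file directly.

```python
# Toy sketch of procedural attachments, for illustration only.
# All names (facts, lines_of_code, query) are invented.

facts = {
    ("payroll.cob", "written_in", "COBOL"),
    ("payroll.cob", "part_of", "LegacyPayrollSystem"),
}

def lines_of_code(path):
    """Procedural attachment: compute the answer instead of storing it as a fact."""
    with open(path, encoding="utf-8", errors="ignore") as f:
        return sum(1 for _ in f)

# Relations whose answers come from code rather than from asserted triples.
attachments = {"lines_of_code": lines_of_code}

def query(subject, relation):
    """Answer a relation either from asserted facts or from an attached procedure."""
    if relation in attachments:
        return attachments[relation](subject)
    return [o for (s, r, o) in facts if s == subject and r == relation]

# query("payroll.cob", "written_in")    -> ["COBOL"]  (asserted fact)
# query("payroll.cob", "lines_of_code") -> computed by the attached procedure
```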
As I see it, ontology technology is still in its infancy – or perhaps still embryonic. I have had great difficulty finding any publicly inspectable (open-source) applications that go much beyond an advanced version of database information retrieval – adding a little logical inference, but not using that inference to do anything conspicuously more impressive than RDBs themselves. CYC suggests it has built applications that do more, but we do not have them available for public testing – and much of CYC is still proprietary, a big turn-off for those who need a language that can be used freely.
John Sowa has told us that he uses a combination of techniques to solve knotty problems efficiently. I believe that approach will be very effective in general, but for it to work outside the confines of a single group – i.e., to enable multiple separately developed agents to cooperate in solving a problem – the agents will also need a common language to communicate information accurately.
The problem, as I perceive it, is that although there has been great progress up to now in understanding the science (the mathematical properties) of inference – for which we can be grateful to the mathematicians and logicians – understanding inference only provides a **grammar** and a minimal basic **semantics** for a language that computers can understand. What we have very little agreement on is the **vocabulary**, without which there is no useful language. For computers to properly interpret each other's data, it is necessary to have a common vocabulary – or vocabularies that can be **accurately** translated. Such a translation mechanism would be possible if a common foundation ontology were adopted, one containing representations of all the fundamental concepts needed to logically describe the domain concepts of the ontologies in programs that need to communicate data (see the sketch below).

It is a measure of the pre-scientific nature of the field that there is even disagreement about the need for a common foundation ontology. To me it is blindingly obvious – one cannot communicate without a common language (including vocabulary); there are no exceptions. But most current efforts at interoperability among separately developed ontologies focus on developing mappings in some automated manner – and any inspection immediately reveals that such mappings cannot be produced with enough accuracy for machines to make mission-critical decisions based on them. Accurate mappings are possible via a common foundation ontology. But for reasons that I believe are not based on relevant technical considerations, there is little enthusiasm for developing such an ontology at present.

Past efforts have failed because they depended on participants volunteering a great deal of time in order to find common ground among a large enough user community. What will work is for a large development community to be **paid** to build and test a common foundation ontology and demonstrate its capability for broad, general semantic interoperability. I am certain that such an ontology will be developed sometime, because the need for it and the benefits of it are so compelling. The only question for me is how much time and money will be wasted before such a widely used foundation ontology is developed and tested in multiple applications – and who will pay for it.
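As a rough illustration of what I mean by accurate translation through a common foundation ontology, here is a toy sketch (all names are invented, and real definitions would be full logical axioms, not feature sets). Two independently developed ontologies define their local terms over the same foundation vocabulary, so a translator can compare definitions rather than guess from label similarity.

```python
# Toy sketch: translation via a shared foundation vocabulary.
# All ontology and term names below are hypothetical.

# Foundation-ontology primitives everyone commits to (listed for illustration).
FOUNDATION = {"Person", "Organization", "worksFor", "paidBy"}

# Each domain ontology defines its local terms using only foundation concepts.
ONTOLOGY_A = {
    "Employee": frozenset({("isa", "Person"),
                           ("worksFor", "Organization"),
                           ("paidBy", "Organization")}),
}
ONTOLOGY_B = {
    "StaffMember": frozenset({("isa", "Person"),
                              ("paidBy", "Organization"),
                              ("worksFor", "Organization")}),
    "Volunteer": frozenset({("isa", "Person"),
                            ("worksFor", "Organization")}),
}

def translate(term, source, target):
    """Map a term by comparing its foundation-level definition, not its label."""
    definition = source[term]
    return [t for t, d in target.items() if d == definition]

# translate("Employee", ONTOLOGY_A, ONTOLOGY_B) -> ["StaffMember"]
# "Volunteer" does not match, because its definition genuinely differs.
```

The point of the sketch is only that equivalence is decided at the level of shared foundation concepts, which is what makes a mapping accurate enough to trust.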
So, I believe that current ontology technology provides the basis to tackle the problems you cite, but I don’t know of any off-the-shelf programs that can do that now. Perhaps someone has developed one?
Pat
Patrick Cassidy
MICRA, Inc.
908-561-3416
cell: 908-565-4053
cassidy@xxxxxxxxx