I have just returned from presenting Deep Content on both coasts of North America, at the TAUS Annual Conference in Portland, Oregon, and the 32nd Localization World in Montréal, Canada.
Deep Content is the combination of natural language processing (NLP) tools and Linked Data. Services such as terminology spotting, Named Entity Recognition (NER), and machine translation can consume and produce data in a common protocol called the NLP Interchange Format (NIF). A digital text document is sent to each of the services, either individually or sequentially in a pipeline. Entities identified by the various services are passed to a graph query service that searches for related information. Finally, all of this data is used to enrich the original document.
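To make the protocol concrete, here is a minimal sketch of what a NIF annotation looks like. NIF is RDF-based, so the example is in Turtle; the document URI, the sample sentence, and the DBpedia link are my own illustrations, not output from Deep Content, and namespace details should be checked against the NIF 2.0 Core specification. The idea is that the whole text becomes a "context" resource, and each entity mention is a substring resource, addressed by character offsets, that points back to the context and out to a Linked Data identifier:

```turtle
@prefix nif:    <http://persistence.uni-leipzig.org/nlp2rdf/ontologies/nif-core#> .
@prefix itsrdf: <http://www.w3.org/2005/11/its/rdf#> .
@prefix xsd:    <http://www.w3.org/2001/XMLSchema#> .

# The full text of the (hypothetical) document, as a NIF context.
<http://example.org/doc#char=0,28>
    a nif:Context ;
    nif:isString   "Berlin is a city in Germany."@en ;
    nif:beginIndex "0"^^xsd:nonNegativeInteger ;
    nif:endIndex   "28"^^xsd:nonNegativeInteger .

# An entity mention found by an NER service, anchored by character offsets
# and linked to a Linked Data identifier for graph queries.
<http://example.org/doc#char=0,6>
    nif:anchorOf         "Berlin" ;
    nif:beginIndex       "0"^^xsd:nonNegativeInteger ;
    nif:endIndex         "6"^^xsd:nonNegativeInteger ;
    nif:referenceContext <http://example.org/doc#char=0,28> ;
    itsrdf:taIdentRef    <http://dbpedia.org/resource/Berlin> .
```

Because every service in the pipeline reads and writes this same RDF shape, annotations from terminology spotting, NER, and machine translation can simply accumulate on the same resources.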
Deep Content uses open standards, and enriched content can be serialized as valid HTML5 and made available like any other page on the web.
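For the HTML5 side, a natural fit is the W3C ITS 2.0 standard, which defines `its-ta-ident-ref` attributes for carrying text-analysis annotations inline. The snippet below is a sketch under that assumption (the sentence and DBpedia URI are again illustrative), showing how an entity identified during enrichment could survive serialization as an ordinary web page:

```html
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="utf-8">
    <title>Enriched document (illustrative)</title>
  </head>
  <body>
    <!-- ITS 2.0 Text Analysis: the mention keeps its Linked Data identifier -->
    <p>
      <span its-ta-ident-ref="http://dbpedia.org/resource/Berlin">Berlin</span>
      is a city in Germany.
    </p>
  </body>
</html>
```

The page renders normally in any browser, while downstream tools can still recover the entity links from the attributes.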
We are currently running some beta pilot projects with customers, and I’ll post on their results soon. If you’d like to know more, leave a comment.