Tag Archives: ADAPT

Polymath Service Provider

Over the Christmas break I started to reflect on the nature of service provision in the Language Services industry, in the light of new technologies emerging from advances in machine learning and artificial intelligence, my own predictions of their influence on the industry, and the industry’s likely response.

Recent announcements include adaptive and neural machine translation; pervasive cloud platforms with ubiquitous connectivity and cognitive capabilities; an upsurge in low-cost, high-benefit open source tooling and frameworks; and many mature APIs and standards.

All of these sophisticated opportunities really do mean that, as a company providing services, you have to be informed, adaptable, and agile; employ clever, enthusiastic people; and derive joy and satisfaction from harnessing disruptive influences to your own benefit and that of your customers.

I do have concerns: how do we sustain the level of investment necessary to stay abreast of all these influences, and to produce novel services and solutions from them, in an environment of very small margins and low tolerance for increased or additional costs?

Don’t get me wrong though. Having spent the last 10 years engaging with world-class research centres such as ADAPT, working alongside thought-leading academics and institutions such as DFKI and InfAI, participating in European-level Innovation Actions and Projects, and generally ensuring that our company has the required awareness, understanding and expertise, I continue to be positive and enthusiastic in my approach to these challenges.

I am satisfied that we are active in all of the spaces that industry analysts see as currently significant. To wit: ongoing evaluations of adaptive translation environments and NMT; agile platforms powered by distributed services and serverless architectures; Deep Content (semantic enrichment and NLP); and Review Sentinel (machine learning and text classification).

Lest I sound complacent: we have much more in the pipeline, and my talented and knowledgeable colleagues are excited about the future.

Positive Thoughts for Blue Monday

Just before Christmas I joined the OASIS XLIFF Object Model and Other Serializations Technical Committee. I think it reflects the maturity and degree of adoption of XLIFF that this TC has been convened. It’s another opportunity to work with some technical thought leaders of the localization industry.

On Wednesday 13th I attended the launch of the ADAPT Centre in the Science Gallery at Trinity College. ADAPT is the evolution of the Centre for Next Generation Localisation (CNGL) CSET into a national Research Centre. Vistatec was a founding industrial partner of CNGL back in 2007, and I’m happy to continue our collaboration on topics such as machine translation, natural language processing, analytics, digital media and personalization. Unexpectedly but happily, I was interviewed by RTÉ News and appeared on national television.

Like millions of people, I am saddened at the passing of David Bowie and Alan Rickman. Kudos to Bowie for releasing Blackstar and bequeathing such unique and thought-provoking art. The positive angle? The lesson to live and appreciate life to the full.

To a large extent my development plans for Q1 were approved. This includes extending the deployment of SkyNet to other accounts within Vistatec.

On January 11th we released Ocelot 2.0.

My Italian is coming along, slowly but surely. There are a number of Italian bistros and ristoranti in the town where I live, so I have every opportunity to try it out.

On the coding front I’ve been looking at OAuth2, OpenID and ASP.NET MVC 6. I continue to be impressed by Microsoft’s transformation from a “not invented here” company to one that both embraces and significantly contributes to open source.

Onward and Upward!

Public Defrag

I’m using this post to both keep my blog live and also organise my own thoughts on everything that’s been going on over the last six weeks.

We deployed the distributed components of our XTM integration to production and have pushed a number of projects through it. We delivered this project through a series of backlog-driven sprints. It wasn’t without its challenges: requirements changes, unforeseen constraints and aggressive timelines. Some elements of test-driven development were employed (and proved very useful), as was domain-driven design.

On Wednesday I was delighted to receive a tweet from @kevin_irl announcing the open source publication of their .NET XLIFF 2.0 Object Model on GitHub. Coincidentally, Wednesday was also the day that my copy of Bryan Schnabel’s “A Practical Guide to XLIFF 2.0” arrived from Amazon. One of my developers, Marta Borriello, is currently working on a prototype of the XLIFF Change Tracking module, including support for inline markup, inside Ocelot, with the hope that this will be part of XLIFF 2.1.
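For context, the shape of an XLIFF 2.0 document is easy to see in a few lines. Here is a sketch in Python that parses a minimal document; the element and attribute names come from the XLIFF 2.0 specification, while the sample strings are my own invention:

```python
import xml.etree.ElementTree as ET

# A minimal XLIFF 2.0 document: srcLang/trgLang on the root,
# then <file>/<unit>/<segment> containing <source> and <target>.
XLIFF_NS = "urn:oasis:names:tc:xliff:document:2.0"
doc = """<xliff xmlns="{ns}" version="2.0" srcLang="en" trgLang="it">
  <file id="f1">
    <unit id="u1">
      <segment>
        <source>Hello world</source>
        <target>Ciao mondo</target>
      </segment>
    </unit>
  </file>
</xliff>""".format(ns=XLIFF_NS)

root = ET.fromstring(doc)
source = root.find(".//{%s}source" % XLIFF_NS).text
target = root.find(".//{%s}target" % XLIFF_NS).text
print(source, "->", target)  # Hello world -> Ciao mondo
```

An object model like the one above wraps exactly this hierarchy (document, file, unit, segment) in typed classes rather than raw XML nodes.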

Machine Learning is one of my primary interests, but sadly a tertiary focus after the “day job” and family (don’t you listen to their denials). Hot on the heels of machine learning are programming paradigms and languages. So, with a peak in travel, I decided to combine both: I downloaded “Machine Learning Projects for .NET Developers” by Mathias Brandewinder and four F# courses from Pluralsight, and got myself up to speed on functional programming. This turned out to be a really valuable exercise, because I came to understand that functional programming gives you much more than immutable data and functions as first-class objects: there are sequences, partial application and composition, to name a few. One day I plan to re-implement Review Sentinel’s core algorithms in F#, but don’t hold your breath for a post about that.
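Two of those ideas, partial application and composition, can be sketched in a few lines — here in Python rather than F#, since the concepts carry across:

```python
from functools import partial, reduce

# Partial application: fix some arguments of a function to get a new function.
def scale(factor, x):
    return factor * x

double = partial(scale, 2)  # a new one-argument function

# Composition: chain small functions into a single pipeline, left to right.
def compose(*fns):
    return reduce(lambda f, g: lambda x: g(f(x)), fns)

pipeline = compose(double, lambda x: x + 1)

print(double(21))    # 42
print(pipeline(10))  # double(10) + 1 = 21
```

In F# the same pipeline falls out of the built-in `>>` operator and curried functions; the point is that programs become pipelines of small, reusable functions.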

I’m just back from a FREME Project hackathon in Leipzig. Two days of enthusiasm-fuelled fun creating and querying linked data with people at the forefront of this exciting technology. We hacked together topic/domain detection and identification using the services of the FREME platform.



FREME Team: Francisc Tar, Felix Sasaki, me, Jan Nehring, Martin Brümmer, Milan Dojchinovski and Papoutsis Georgios.

Today I attended an ADAPT Governance Board meeting. ADAPT is the evolution of the CNGL CSET into a Research Centre. I think ADAPT has a great research agenda and has numerous world-class domain thought leaders. I’m looking forward to working with the ADAPT team during 2016 to push a few envelopes. My engagement with academia and research bodies at both national and European level over the last 10 years has been of great tangible and intangible value (no I don’t just mean drinking buddies). I have to thank Peter Reynolds (who will never let me forget it), Fred Hollowood (who will have an opinion about it) and Dave Lewis (who will be typically English and modest about it) for helping me overcome the initial inertia and Felix Sasaki who has made the bulk of it demanding, rewarding and enjoyable.

There, that’s better. Neuron activity stabilized and synapses clear.

2015 R&D Agenda

2015 is set to be a very industrious year for my Research and Development team.

On the development side we have ambitious plans for a substantial amount of distributed, cloud-based automation and integration. It’s exciting but at the same time a little frustrating, as so much of the code will have to be written from scratch as opposed to being available in existing libraries. Our first distributed cloud platform, Synthesis, has lived up to expectations and delivered cost and time savings, scalability and reliability. The new event and action rules engine will provide powerful and flexible real-time configuration. This has set a high bar for the new systems, but I am optimistic and enthusiastic to get started.
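The core shape of an event and action rules engine is simple to sketch. The following is purely a hypothetical illustration (not Synthesis code): each rule pairs a predicate over events with an action, and dispatching an event runs the action of every matching rule — which is what makes the configuration flexible at runtime, since rules can be added or removed without redeploying:

```python
# Hypothetical sketch of an event/action rules engine. Rules can be
# registered and removed at runtime, giving real-time configurability.
class RulesEngine:
    def __init__(self):
        self.rules = []  # list of (predicate, action) pairs

    def add_rule(self, predicate, action):
        self.rules.append((predicate, action))

    def dispatch(self, event):
        # Run the action of every rule whose predicate matches the event.
        return [action(event)
                for predicate, action in self.rules
                if predicate(event)]

engine = RulesEngine()
engine.add_rule(lambda e: e["type"] == "job.completed",
                lambda e: "notify:%s" % e["job"])

print(engine.dispatch({"type": "job.completed", "job": "1234"}))
```

A non-matching event (say, `{"type": "job.started", ...}`) simply fires no actions.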

From a research perspective, in addition to a targeted project with the new ADAPT Centre, we will kick off a European Commission Horizon 2020 project code-named “FREME” (Open Framework of E-Services for Multilingual and Semantic Enrichment of Digital Content). This is an exciting opportunity to work again with the Deutsches Forschungszentrum für Künstliche Intelligenz (DFKI), the Institut für Angewandte Informatik (InfAI) and Tilde, as well as some new collaborators.

Ocelot will restate its commitment to industry standards and interoperability by supporting XLIFF 2.0. It will also form a prototype client for some of the envisioned FREME services.

Before that, however, is a well-earned break and the chance to play with some tools and books Santa got for me: JetBrains’ WebStorm, “Python 3 Text Processing with NLTK 3 Cookbook”, and “AngularJS UI Development” from Packtpub.

If you celebrate Christmas, have a wonderful one!