I’ve not updated my blog for a couple of months because I’ve been binge-learning and sailing the East and South Coasts of Ireland.
Despite several unsuccessful past attempts to complete a Deep Learning course, thwarted by a lack of free time, I’m delighted to have now finished and passed Neural Networks and Deep Learning, the first course in deeplearning.ai’s Deep Learning specialization on Coursera.
I blew most of my annual leave on cruising between Dún Laoghaire and Schull, Co. Cork: my justification being that the only way to improve my technique is to get out onto the sea. A great combination of learning and adventure.
Dolphins off of the South Coast of Ireland
Rounding the Fastnet Rock Lighthouse
Deep Learning and Neural Networks: two terms that have dominated the press and social media in the localization industry for the last couple of weeks. Google Research’s blog post last week about their success in using neural networks to power a production-scale machine translation engine sparked a lot of conversation.
I’ve been interested in neural networks for the last couple of years, researching what they’re good at and thinking of potential use cases within Vistatec. I’m not one for wading into debates, particularly when I don’t have first-hand experience to substantiate my view or add any unique insights. I will say that I’m very excited about this development, though. It reinforces again that you cannot stand still in the technology business. New paradigms will shake the ground beneath you.
One of the aspects of NMT that intrigued me was how the encoding and decoding of variable-length sentences is achieved, given that neural networks essentially work with fixed-size vectors. It turns out Word Embeddings (or word vectors) play a part. [Dear readers, if you fully understand the creation and implementation of these, please leave me a comment or reference.] Now, I get semi-obsessed when I think I haven’t fully understood a concept, and so ensued my journey of the last week:
Binge-watching Stanford’s CS224D and Coursera’s Machine Learning course; revising my secondary school and university calculus; and reading everything I could find on logistic regression, backpropagation and neural network fundamentals, including Neural Networks and Machine Learning and Best Machine Learning Resources for Getting Started.
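The word-embedding idea that sent me down this rabbit hole can be sketched in a few lines: each word in a vocabulary maps to a fixed-size vector, so a variable-length sentence becomes a variable-length sequence of fixed-size vectors that a network can consume. A minimal sketch (the toy vocabulary, dimension, and random vectors below are my own illustration; in a real system the embeddings are learned):

```python
import numpy as np

# Toy vocabulary: in practice this would be tens of thousands of words,
# and the vectors would be learned during training, not random.
vocab = {"the": 0, "fastnet": 1, "rock": 2, "lighthouse": 3}
embedding_dim = 4
rng = np.random.default_rng(0)
embeddings = rng.standard_normal((len(vocab), embedding_dim))

def embed(sentence):
    """Map a variable-length sentence to a (length, dim) matrix of word vectors."""
    tokens = sentence.lower().split()
    return np.stack([embeddings[vocab[t]] for t in tokens])

short = embed("the lighthouse")
long_ = embed("the fastnet rock lighthouse")
print(short.shape)  # (2, 4): two words, four dimensions each
print(long_.shape)  # (4, 4): sentence length varies, vector size does not
```

The sentence length changes, but every word vector has the same dimensionality, which is what lets the encoder process arbitrary-length input one vector at a time.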
Having filled up my head with concepts and a rough mental map of how it all fits together, the next step was to play with a framework and get my hands dirty. But which one? It seems the ML framework du jour is Google’s TensorFlow. So, sleeves rolled up, Diet Coke and nibbles, off we go… Linux! I have to have Linux?!
OK, I knew I’d have to assimilate Python, but what ensued was another intravenous intake of not unknown but unfamiliar tasks. Provisioning a basic Linux box on AWS and remoting into it from Windows using PuTTY so I could install the Nvidia TensorFlow Tutorial. Install Docker. Learn the basics of Docker. Install a Linux GUI and figure out how to remote into that from Windows by configuring Remote Desktop protocols. Install Python and TensorFlow and … I have to stop to attend to other commitments.
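For the record (and for future me), the setup steps above look roughly like this. This is a sketch, not a tested recipe: the key name, hostname, and package names are placeholders and will vary by distro and AMI:

```shell
# Remote into the AWS box from Windows (PuTTY, or ssh as below;
# key file and hostname are placeholders)
ssh -i my-key.pem ubuntu@ec2-xx-xx-xx-xx.compute.amazonaws.com

# Install Docker (Ubuntu package name; other distros differ)
sudo apt-get update
sudo apt-get install -y docker.io
sudo usermod -aG docker $USER

# Rather than installing TensorFlow by hand, pull an image and
# check it works inside a container
docker pull tensorflow/tensorflow
docker run -it tensorflow/tensorflow \
  python -c "import tensorflow as tf; print(tf.__version__)"
```

Using the Docker image sidesteps most of the Python dependency wrangling, which is exactly the part that ate my evening.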
So, like all great weekly television series, this project will have to be continued in another exciting installment.