Monthly Archives: December 2013

Christmas Clear Out

I’ve been doing a long overdue clear out this holiday. I love to read, but the rate at which items make it onto my reading list far exceeds my consumption rate. Undeterred, I have paper stalagmites around my house and office of topics once deemed worthy of reading. These Doric columns of knowledge work according to the principles of a LIFO stack, and so some things towards the base are pretty old. To wit: Tony Jewtushenko’s LRC XI paper, “What’s New in XLIFF 1.2?”! Cull required!

Pleased with my reduction of these paper mountains, I compensated by adding a few megabytes to my digital reading lists (encouraged by the holiday pricing promotions – thank you Amazon and Packt): “When The Devil Holds The Candle” by Karin Fossum; “ASP.NET Web API: Build RESTful web applications and services on the .NET framework” by Joydip Kanjilal; and “Building Mobile Applications Using Kendo UI Mobile and ASP.NET Web API” by Nishanth Nair.

One blog post I got a lot from is Scott Hanselman’s “2014 Ultimate Developer and Power Users Tool List for Windows”.

A nice present from Atlassian was the grant of an open source Jira and Service Desk licence for tracking our Ocelot project development. Thank you, Atlassian!

As returning to work starts to make its way into my thoughts, I am drafting my 2014 Research and Development Manifesto. These are high-level strategic desires and goals that I want myself and my team to strive for during the coming year.

My best wishes for a rewarding and prosperous 2014!

No Response Is Good Response

Last week I took a couple of full days to relax and code. I’ve been wanting to finish a long-running project that keeps getting interrupted by having to earn a living.
I figured a couple of days would do it as on the surface it was only integration that was required. Enter stage left: Windows 8.1 and 64-bit.
I got a new laptop a couple of months ago and, whilst I’m very happy with it, random parts of random applications have stopped working: Okapi CheckMate not serialising ITS metadata, for one.
So I started to run final integration tests and got the “Cannot load file or assembly…” error.
My application calls into native C++ libraries so I guessed they were a good place to start. Two downloads and a few configuration changes related to target platform and… compiler and linker errors abound.
It turned out the problem was that I had failed to select the “Copy settings from…” option and had lost all of my include and library paths and compiler switches. Although they were still there in my 32-bit configurations, if you’ve seen how many options there are for building C++ code, you’ll know restoring them was tedious.
Back to tests, and only after rebuilding my C# wrapper library and C# client application could I get rid of the BadImageFormat exceptions. Odd, as I understood from various postings that I could leave my main application target as Any CPU. So, ready to throw in the towel, I ran my tests again, and although everything built fine and I got no other errors, my tests would not report success or failure – no response.
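For anyone puzzling over the same exceptions, the bitness of the running process is easy to check, and it explains the behaviour: an Any CPU assembly JITs to match the OS, so on 64-bit Windows it runs as a 64-bit process and the loader throws BadImageFormatException the moment it touches a 32-bit native DLL. A minimal sketch:

```csharp
using System;

class BitnessCheck
{
    static void Main()
    {
        // Environment.Is64BitProcess reports the bitness the process actually
        // runs at, which for an Any CPU build depends on the host OS.
        Console.WriteLine(Environment.Is64BitProcess ? "x64 process" : "x86 process");
        Console.WriteLine(Environment.Is64BitOperatingSystem ? "x64 OS" : "x86 OS");
    }
}
```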
Although my test project was set to x64 in my build configuration, there is a separate switch under Tests -> Settings -> Platform. Changing that, closing and re-launching Visual Studio, and bingo!
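For reference, the same processor-architecture setting can also be captured in a .runsettings file and selected from the Test Settings menu, which keeps it out of individual developers’ IDE state. A sketch (the file name is up to you):

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- example.runsettings: forces the test host to run tests as x64 -->
<RunSettings>
  <RunConfiguration>
    <TargetPlatform>x64</TargetPlatform>
  </RunConfiguration>
</RunSettings>
```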
My final integration still isn’t done and this week is set to be very busy so in one sense I made no progress. It may need to be my Christmas present to myself!

eXtensively Maddening Language

After spending a very long day with the System.Xml and System.Xml.Linq namespaces, I now understand why ExpandoObject and JSON were invented!

What I wanted to achieve was quite simple – I thought: read an existing XLIFF file from top to bottom looking for any trans-units, process those in some way, write the modified versions back into the XLIFF and leave everything else as it was.

The plan: use an XmlTextReader and XmlTextWriter in unison. This approach would utilise streaming and thus deal with large files.
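The read/write-in-unison loop looks roughly like this – a sketch, not my actual code, with illustrative names. The hand-rolled node-by-node copy in the switch is exactly where the verbosity (and the namespace trouble) creeps in:

```csharp
using System;
using System.IO;
using System.Xml;
using System.Xml.Linq;

static class StreamingSketch
{
    // Streams input to output, handing each <trans-unit> subtree to a
    // callback and copying every other node through by hand.
    public static void ProcessTransUnits(
        TextReader input, TextWriter output, Func<XElement, XElement> process)
    {
        using (var reader = XmlReader.Create(input))
        using (var writer = XmlWriter.Create(output))
        {
            while (reader.Read())
            {
                if (reader.NodeType == XmlNodeType.Element &&
                    reader.LocalName == "trans-unit")
                {
                    // Load just this subtree, transform it, write it back.
                    // Afterwards the outer reader sits on the unit's end tag,
                    // which the next Read() skips past.
                    process(XElement.Load(reader.ReadSubtree())).WriteTo(writer);
                    continue;
                }
                switch (reader.NodeType)
                {
                    case XmlNodeType.Element:
                        // Capture before WriteAttributes moves the reader.
                        bool isEmpty = reader.IsEmptyElement;
                        writer.WriteStartElement(reader.Prefix, reader.LocalName, reader.NamespaceURI);
                        writer.WriteAttributes(reader, true);
                        if (isEmpty) writer.WriteEndElement();
                        break;
                    case XmlNodeType.Text:
                        writer.WriteString(reader.Value);
                        break;
                    case XmlNodeType.Whitespace:
                    case XmlNodeType.SignificantWhitespace:
                        writer.WriteWhitespace(reader.Value);
                        break;
                    case XmlNodeType.EndElement:
                        writer.WriteFullEndElement();
                        break;
                    // Comments, CDATA, processing instructions and the rest
                    // all need their own cases too.
                }
            }
        }
    }
}
```

Only one trans-unit subtree is ever materialised at a time, so memory stays flat however large the file is.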

It turns out that if you have multiple namespaces in your files this gets messy quickly, particularly if you construct modified portions of the XML as isolated fragments and then try to integrate them again. I was ending up with all kinds of locally declared namespaces, though at the end of the day’s work I figured this could have been because I was looking at the non-reintegrated fragments. Perhaps if I’d been more patient and waited until I had a final integrated document they would have all been resolved correctly. Also, the low-level methods on XmlTextReader and XmlTextWriter mean you end up with very verbose code.

My final implementation used XmlTextReader for reading the existing file and then XDocument and LINQ to XML to build and write out the modified version. LINQ to XML is just so elegant.
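To show what I mean by elegant, here is a sketch in the same spirit (not my actual code – the file names and the seed-the-target transformation are illustrative). XNamespace plus the element name gives you fully qualified lookups without any prefix bookkeeping:

```csharp
using System.Xml.Linq;

static class LinqSketch
{
    // The XLIFF 1.2 namespace; x + "name" builds a fully qualified XName.
    static readonly XNamespace X = "urn:oasis:names:tc:xliff:document:1.2";

    // Example transformation: seed a missing <target> from <source>.
    public static void SeedTargets(XDocument doc)
    {
        foreach (var unit in doc.Descendants(X + "trans-unit"))
        {
            var source = unit.Element(X + "source");
            if (source != null && unit.Element(X + "target") == null)
                unit.Add(new XElement(X + "target", (string)source));
        }
    }

    static void Main()
    {
        var doc = XDocument.Load("input.xlf");
        SeedTargets(doc);
        doc.Save("output.xlf");
    }
}
```

Compare that with the switch statement above: the whole trans-unit query is one Descendants call.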

Trying to declare simple namespaces and enforce prefix usage in the main file seemed overly difficult with both System.Xml and LINQ to XML.
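For what it’s worth, the least painful route I found in LINQ to XML is to declare every namespace once on the root element, which pins the prefixes for the whole document so the serialiser stops sprinkling local xmlns declarations. A sketch, with an illustrative ITS attribute standing in for real metadata:

```csharp
using System.Xml.Linq;

static class PrefixSketch
{
    public static XElement BuildRoot()
    {
        XNamespace xlf = "urn:oasis:names:tc:xliff:document:1.2";
        XNamespace its = "http://www.w3.org/2005/11/its"; // ITS 2.0

        // Declaring both namespaces on the root means every descendant in
        // those namespaces reuses these prefixes (default for XLIFF, "its"
        // for ITS) instead of getting a local declaration of its own.
        return new XElement(xlf + "xliff",
            new XAttribute("xmlns", xlf.NamespaceName),
            new XAttribute(XNamespace.Xmlns + "its", its.NamespaceName),
            new XElement(xlf + "file",
                new XElement(xlf + "body",
                    new XElement(xlf + "trans-unit",
                        new XAttribute(its + "locQualityIssueComment", "check me"),
                        new XElement(xlf + "source", "Hello")))));
    }
}
```

Serialising the result shows a single xmlns:its declaration at the top and the its: prefix on the attribute, with no namespace noise further down.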